Search results for

Popping Sound

19,349 results found

Post · Replies · Boosts · Views · Activity

Reply to Screen Time Counting App Usage While Shield Is Displayed
[quote='794802021, cspeir1818, /thread/794802, /profile/cspeir1818'] Is this expected behavior? Is there a way to prevent Screen Time from counting time while the shield is shown? [/quote] This sounds like a bug to me. Have you filed a feedback request already? I'm experiencing the same problem, and I would say it's unexpected behavior. My feedback is tracked under FB19200003.
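For context, here's a minimal sketch of putting up the shield being discussed with ManagedSettings; it assumes FamilyControls authorization was already granted and that `selection` came from a FamilyActivityPicker:

```swift
import FamilyControls
import ManagedSettings

// Minimal sketch: apply and clear an application shield.
// Assumes FamilyControls authorization is in place and `selection`
// came from a FamilyActivityPicker.
let store = ManagedSettingsStore()

func applyShield(to selection: FamilyActivitySelection) {
    // While this shield is shown, the thread reports that Screen Time
    // still counts the covered app's usage (tracked under FB19200003).
    store.shield.applications = selection.applicationTokens
}

func removeShield() {
    store.shield.applications = nil
}
```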
Topic: App & System Services SubTopic: General Tags:
Jul ’25
Reply to Drop file not found on MacBook Air
What specific error are you getting? I can see a couple of potential issues here: Sandboxed apps have to be careful when accessing files that are dropped onto them. That drop represents a dynamic extension to the sandbox (see On File System Permissions for an explanation of that). If you don’t handle the file correctly, you could run into a permissions error. However, you mentioned “file not found”, which doesn’t sound like that, so it’s possible you’re hitting some other issue entirely. Share and Enjoy — Quinn “The Eskimo!” @ Developer Technical Support @ Apple let myEmail = eskimo + 1 + @ + apple.com
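As an illustration of handling the drop correctly, here's a generic SwiftUI sketch (not the OP's code); the drop itself extends the sandbox to cover the dropped URL, so reading it in the drop handler should not hit a permissions error:

```swift
import SwiftUI
import UniformTypeIdentifiers

// Minimal drop target for a sandboxed app.
struct DropTargetView: View {
    @State private var status = "Drop a file here"

    var body: some View {
        Text(status)
            .padding()
            .onDrop(of: [UTType.fileURL], isTargeted: nil) { providers in
                guard let provider = providers.first else { return false }
                _ = provider.loadObject(ofClass: URL.self) { url, error in
                    guard let url = url else {
                        print("Drop failed:", error?.localizedDescription ?? "unknown error")
                        return
                    }
                    do {
                        let data = try Data(contentsOf: url)
                        DispatchQueue.main.async {
                            status = "Read \(data.count) bytes from \(url.lastPathComponent)"
                        }
                    } catch {
                        // A "file not found" here would point at something other
                        // than the sandbox, as the reply suggests.
                        DispatchQueue.main.async {
                            status = "Read failed: \(error.localizedDescription)"
                        }
                    }
                }
                return true
            }
    }
}
```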
Topic: App & System Services SubTopic: Core OS Tags:
Jul ’25
Reply to Autofill Extension can't find bundle while the main app can
That’s some pretty complicated code you have there. It sounds like you want the same code to run in both your app and your appex. If so, the canonical way to do that is to put the code, and its resources, into a framework. Your framework can then access its resources as it usually would. A key advantage of this approach is that your app only ships with a single copy of the framework’s code. So, I recommend that you aim for a structure like this:

MyApp.app/
   Info.plist
   MyApp
   PlugIns/
      MyExtension.appex/
         Info.plist
         MyExtension
   Frameworks/
      MyFramework.framework/
         Info.plist
         MyFramework
         MyBundle.bundle/
            Info.plist
            …

Within your framework, you can either use the +bundleForClass: trick (à la CurrentBundleFinder) or use Swift Package Manager’s Bundle.module support. Share and Enjoy — Quinn “The Eskimo!” @ Developer Technical Support @ Apple let myEmail = eskimo + 1 + @ + apple.com
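A minimal sketch of the +bundleForClass: trick mentioned above; the class and resource names are placeholders that mirror the structure shown:

```swift
import Foundation

// Lives inside MyFramework; its only job is to anchor Bundle(for:).
final class CurrentBundleFinder {}

extension Bundle {
    /// The framework's own bundle, found via a class compiled into it.
    static let myFramework = Bundle(for: CurrentBundleFinder.self)

    /// A resource bundle shipped inside the framework (name is illustrative).
    static let myResources: Bundle = {
        guard let url = myFramework.url(forResource: "MyBundle", withExtension: "bundle"),
              let bundle = Bundle(url: url) else {
            fatalError("MyBundle.bundle is missing from MyFramework.framework")
        }
        return bundle
    }()
}
```

If the framework's code is built as a Swift package instead, Bundle.module replaces this finder entirely.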
Topic: App & System Services SubTopic: Core OS Tags:
Jul ’25
Reply to Clarification on Timing for Critical Alerts Approval in iOS
How you structure the development process is a business decision: it depends on how you want to manage the risk of wasted time and resources if the entitlement does not come through. You can start development with regular notifications. The code for those is identical to critical alerts save for a single property you will add (and a special sound file if you want to play a specific sound), so if you get the entitlement, you can turn them into critical alerts very simply. Alternatively, if you only want to develop the app once you have the entitlement, you can apply for it and wait before committing resources to development. That is a business decision we can't make for you. You don't need a prototype to apply for the entitlement.
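As a sketch of that single-property difference, assuming the critical alerts entitlement has been granted, authorization was requested with the .criticalAlert option, and "alarm.caf" is a placeholder sound file:

```swift
import UserNotifications

// Sketch: the same notification content, with and without the critical sound.
func makeContent(critical: Bool) -> UNMutableNotificationContent {
    let content = UNMutableNotificationContent()
    content.title = "Alert"
    content.body = "Something needs your attention."

    if critical {
        // Plays at the given volume even when the device is muted;
        // requires the critical alerts entitlement.
        content.sound = UNNotificationSound.criticalSoundNamed(
            UNNotificationSoundName("alarm.caf"), withAudioVolume: 1.0)
    } else {
        content.sound = .default
    }
    return content
}
```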
Jul ’25
Reply to Speak Screen gesture not working
Thanks for writing about your experience; that certainly does not sound right to me, and I'd like to take a closer look. As with your other post, please file a bug report using the Feedback Assistant tool and upload additional information about your issue, including logs from the device: https://developer.apple.com/bug-reporting/ Reply here with the Feedback ID and I can take a look.
Jul ’25
AVAudioEngine failing with -10877 on macOS 26 beta, no devices detected via AVFoundation but HAL works
I’m developing a macOS audio monitoring app using AVAudioEngine, and I’ve run into a critical issue on macOS 26 beta where AVFoundation fails to detect any input devices, and AVAudioEngine.start() throws the familiar error -10877. FB#: FB19024508 Strange behavior: AVAudioEngine.inputNode shows no channels or input format on bus 0. AVAudioEngine.start() fails with -10877 (AudioUnit connection error). AVCaptureDevice.DiscoverySession returns zero audio devices. Microphone permission is granted (authorized), and the app is properly signed and sandboxed with com.apple.security.device.audio-input. However, CoreAudio HAL does detect all input/output devices: using AudioObjectGetPropertyDataSize and AudioObjectGetPropertyData with kAudioHardwarePropertyDevices, I can enumerate 14+ devices, including AirPods, USB DACs, and BlackHole. This suggests the lower-level audio stack is functional. I have tried: resetting CoreAudio with sudo killall coreaudiod, rebuilding and re-signing the app, cleari…
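For reference, a minimal sketch of the HAL enumeration described above, which keeps returning devices even when AVFoundation sees none:

```swift
import CoreAudio

// List all HAL device IDs via kAudioHardwarePropertyDevices.
func listHALDevices() -> [AudioDeviceID] {
    var address = AudioObjectPropertyAddress(
        mSelector: kAudioHardwarePropertyDevices,
        mScope: kAudioObjectPropertyScopeGlobal,
        mElement: kAudioObjectPropertyElementMain)

    // Ask how many bytes of device IDs the system object holds.
    var dataSize: UInt32 = 0
    var status = AudioObjectGetPropertyDataSize(
        AudioObjectID(kAudioObjectSystemObject), &address, 0, nil, &dataSize)
    guard status == noErr else { return [] }

    // Fetch the device ID array itself.
    let count = Int(dataSize) / MemoryLayout<AudioDeviceID>.size
    var deviceIDs = [AudioDeviceID](repeating: 0, count: count)
    status = AudioObjectGetPropertyData(
        AudioObjectID(kAudioObjectSystemObject), &address, 0, nil, &dataSize, &deviceIDs)
    guard status == noErr else { return [] }
    return deviceIDs
}
```

Running this from the affected app shows whether the lower-level stack still sees the 14+ devices mentioned in the post.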
0 replies · 0 boosts · 228 views
Jul ’25
Guideline 4.3(a) - Design - Spam
Hi all, I'm looking for guidance or experiences from others who’ve run into this issue. I'm developing a series of educational apps that teach different languages from an English base, using a simple flashcard-based format. Each app is fully offline, has no ads, no subscriptions, and contains native audio and culturally relevant images for that specific language. Each language pack (audio + images + data) is around 50MB, and I’m planning to support 50 languages. Because of size constraints and my offline-first approach, it’s not feasible to combine all languages into a single app. To stay user-friendly and efficient: each app contains only one language; each has its own name and icon (e.g., “Babel Bash Chinese”, “Babel Bash Thai”); and all use the same visual structure (by design) for brand consistency and usability. Despite this, I’ve had an app rejected under Guideline 4.3(a) – Spam, with the reasoning that it duplicates the functionality of another app I've submitted (even though the language, audio…
2 replies · 0 boosts · 151 views
Jul ’25
SpeechTranscriber/SpeechAnalyzer being relatively slow compared to FoundationModel and TTS
So, I've been wondering how fast an offline STT -> ML Prompt -> TTS roundtrip would be. Interestingly, in many tests the SpeechTranscriber (STT) takes the bulk of the time, compared to generating a FoundationModel response and creating the audio using TTS. E.g. InteractionStatistics:
- listeningStarted: 21:24:23 4480 2423
- timeTillFirstAboveNoiseFloor: 01.794
- timeTillLastNoiseAboveFloor: 02.383
- timeTillFirstSpeechDetected: 02.399
- timeTillTranscriptFinalized: 04.510
- timeTillFirstMLModelResponse: 04.938
- timeTillMLModelResponse: 05.379
- timeTillTTSStarted: 04.962
- timeTillTTSFinished: 11.016
- speechLength: 06.054
- timeToResponse: 02.578
- transcript: This is a test.
- mlModelResponse: Sure! I'm ready to help with your test. What do you need help with?
Here, between my audio input ending and the text-to-speech starting to play (using AVSpeechUtterance), the total response time was 2.5s. Of that, it took the SpeechAnalyzer 2.1s to get the transcript finalized, Foundat…
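For reference, the TTS leg uses AVSpeechUtterance as noted above; a minimal sketch of probing the timeTillTTSStarted / timeTillTTSFinished numbers via the synthesizer delegate might look like this (the voice is an assumption):

```swift
import AVFoundation

// Sketch: time the TTS leg of the pipeline with the synthesizer delegate.
final class TimedSpeaker: NSObject, AVSpeechSynthesizerDelegate {
    private let synthesizer = AVSpeechSynthesizer()
    private var requestedAt = Date()

    override init() {
        super.init()
        synthesizer.delegate = self
    }

    func speak(_ text: String) {
        requestedAt = Date()
        let utterance = AVSpeechUtterance(string: text)
        utterance.voice = AVSpeechSynthesisVoice(language: "en-US") // assumption
        synthesizer.speak(utterance)
    }

    func speechSynthesizer(_ synthesizer: AVSpeechSynthesizer,
                           didStart utterance: AVSpeechUtterance) {
        print("timeTillTTSStarted:", Date().timeIntervalSince(requestedAt))
    }

    func speechSynthesizer(_ synthesizer: AVSpeechSynthesizer,
                           didFinish utterance: AVSpeechUtterance) {
        print("timeTillTTSFinished:", Date().timeIntervalSince(requestedAt))
    }
}
```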
2 replies · 0 boosts · 512 views
Jul ’25
Reply to Test my app without ADP membership
The OP wants to use the FamilyControls entitlement. The request page requires a purchased membership to access. I'm not familiar with this feature, but it doesn't sound like something a developer could just turn on for debug builds. Maybe you could in the simulator? From said request page, it looks like one would have to have the app ready first and then request the entitlement. And most importantly, it's just a request. Judging from other people's experiences, these requests can take anywhere from 2 days to 4 months to be granted, and they may be rejected.
Topic: Code Signing SubTopic: Entitlements Tags:
Jul ’25
Adaptive automatic corner radius in containers with insets/paddings
With the correct corner radius changing in iOS 26, I wondered whether there is a way to get properly rounded corners inside containers like sheets without hard-coding a constant value. Here are the results of some experiments I did; example code below. The ConcentricRectangle shape, new in Beta 4, seems nice. Notably, it doesn't pick up the larger corner radii from the device corners. If you want all the corners rounded, the isUniform parameter of ConcentricRectangle seems helpful. It doesn't apply the corners to a View in the middle, though; I'm not sure whether this is an oversight or has some purpose. ContainerRelativeShape looks ... interesting ... as of Beta 4, with the larger bottom corners rounded according to the device corners, but the actual bottom corners not fitting the device corners. With ContainerRelativeShape you can also get the middle part to have proper rounded corners in this example ... if you set the outer .containerShape(RoundedRectangle(cornerRadius: 36)) yourself. Notable here is that it t…
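A minimal sketch of the last ContainerRelativeShape experiment, using the hard-coded outer radius of 36 from the post (ConcentricRectangle is omitted here since its Beta 4 API is still in flux):

```swift
import SwiftUI

// Sketch: an inset view picks up concentric rounded corners from the
// container shape set on the outer view (radius 36 as in the post).
struct InsetCornerDemo: View {
    var body: some View {
        ZStack {
            RoundedRectangle(cornerRadius: 36)
                .fill(.gray.opacity(0.2))
            ContainerRelativeShape()
                .fill(.blue.opacity(0.4))
                .padding(20) // the inset; the inner radius shrinks accordingly
        }
        .containerShape(RoundedRectangle(cornerRadius: 36))
        .padding()
    }
}
```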
Topic: UI Frameworks SubTopic: SwiftUI
3 replies · 0 boosts · 510 views
Jul ’25
Reply to Background service on MacOS
In the context of macOS, when we say application (or app) we mean something that’s double-clickable in the Finder, shows up in the Dock, has a menu bar, and so on. Your product doesn’t do that, so it’s not an app. How do you want to manage install and uninstall? I've packaged this Rust application into a PKG installer, which installs a valid terracotta.app bundle into the Applications directory. Users can therefore run this application by double-clicking the icon in the Finder, and it will show up in the Dock. So, THIS IS AN APPLICATION. What is the desired lifecycle of your program? I want my program to run in the background ONLY AFTER users manually launch the application. That way, there won't be much wasted performance when my application hasn't been launched since the computer started. And the application will exit and no longer run in the background once a certain condition is met, for example when there's no traffic on the application's port-forward feature. (Of course, there will be a pop…
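Not the OP's design, but one conventional AppKit sketch of "keep running with no windows after a manual launch, then quit when idle" looks roughly like this; isIdle() is a placeholder for the real port-forward traffic check:

```swift
import AppKit

// Sketch: launch normally, keep running after the last window closes,
// and terminate once an app-specific idle condition is met.
final class AppDelegate: NSObject, NSApplicationDelegate {
    private var idleTimer: Timer?

    func applicationDidFinishLaunching(_ notification: Notification) {
        // Poll the idle condition once a minute.
        idleTimer = Timer.scheduledTimer(withTimeInterval: 60, repeats: true) { _ in
            if isIdle() {
                NSApp.terminate(nil)
            }
        }
    }

    // Keep running in the background even after the last window closes.
    func applicationShouldTerminateAfterLastWindowClosed(_ sender: NSApplication) -> Bool {
        false
    }
}

// Placeholder: replace with the real "no port-forward traffic" check.
func isIdle() -> Bool {
    false
}
```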
Jul ’25
Why Does WebView Audio Get Quiet During RTC Calls? (AVAudioSession Analysis)
I developed an educational app that implements audio-video communication through RTC, while using WebView to display course materials during classes. However, some users are experiencing an issue where the audio playback from WebView is very quiet. I've checked that the AVAudioSessionCategory is set by RTC to AVAudioSessionCategoryPlayAndRecord, and the AVAudioSessionCategoryOption also includes AVAudioSessionCategoryOptionMixWithOthers. What could be causing the WebView audio to be suppressed, and how can this be resolved?
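For reference, a minimal sketch of the session configuration described in the post; the .voiceChat mode is an assumption (RTC SDKs commonly use it), and its voice processing is often reported to lower the level of other playback such as WebView audio:

```swift
import AVFoundation

// Sketch of the category/options the post describes, plus an assumed mode.
func configureSessionForRTC() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord,
                            mode: .voiceChat,          // assumption, typical for RTC
                            options: [.mixWithOthers]) // as stated in the post
    try session.setActive(true)
}
```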
0 replies · 0 boosts · 535 views
Jul ’25
Reliable 30-minute background data fetching for safety-critical monitoring app?
I'm developing a safety-critical monitoring app that needs to fetch data from government APIs every 30 minutes and trigger emergency audio alerts for threshold violations. The app must work reliably in the background, since users depend on it for safety alerts even while sleeping. Main challenge: iOS background limitations seem to prevent consistent 30-minute intervals. Standard BGTaskScheduler tasks and timers get suspended after a few minutes in the background. Question: what's the most reliable approach to ensure consistent 30-minute background monitoring for a safety-critical app where missed alerts could have serious consequences? Are there special entitlements or frameworks for emergency/safety applications? The app needs to function like an alarm clock, working reliably even when backgrounded, with emergency audio override capabilities.
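For reference, a minimal BGAppRefreshTask sketch; the task identifier and fetch function are placeholders, and earliestBeginDate is only a request, not a guarantee, which is exactly the limitation described above:

```swift
import BackgroundTasks

// Placeholder identifier; must also appear in Info.plist under
// BGTaskSchedulerPermittedIdentifiers.
let refreshTaskID = "com.example.monitor.refresh"

func registerRefreshTask() {
    // Must be called before the app finishes launching.
    let registered = BGTaskScheduler.shared.register(
        forTaskWithIdentifier: refreshTaskID, using: nil) { task in
        guard let refresh = task as? BGAppRefreshTask else { return }
        scheduleRefresh() // always queue the next run first
        refresh.expirationHandler = {
            // Cancel in-flight work here if the system runs out of time.
        }
        fetchLatestReadings { success in
            refresh.setTaskCompleted(success: success)
        }
    }
    if !registered { print("BGTask registration failed; check the identifier") }
}

func scheduleRefresh() {
    let request = BGAppRefreshTaskRequest(identifier: refreshTaskID)
    request.earliestBeginDate = Date(timeIntervalSinceNow: 30 * 60) // a request, not a guarantee
    try? BGTaskScheduler.shared.submit(request)
}

// Placeholder for the app's own networking.
func fetchLatestReadings(completion: @escaping (Bool) -> Void) {
    completion(true)
}
```

The system decides when (and whether) the task actually runs, so this alone cannot promise a strict 30-minute cadence.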
1 reply · 0 boosts · 514 views
Jul ’25
iOS 26 HLS Audio Track Display Behavior: EXT-X-MEDIA NAME vs LANGUAGE Attributes
Hello Apple Developer Community, I am seeking clarification on the intended display behavior of HLS audio tracks within the iOS 26 (or current beta) native player, specifically concerning the NAME and LANGUAGE attributes of the EXT-X-MEDIA tag. In our HLS manifests, we define alternative audio tracks using EXT-X-MEDIA tags, like so:
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID=audio,LANGUAGE=ja,NAME=AUDIO-1,DEFAULT=YES,AUTOSELECT=YES,URI=audio_ja.m3u8
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID=audio,LANGUAGE=en,NAME=AUDIO-2,URI=audio_en.m3u8
Our observation is that when an audio track is selected and its name is displayed in the native iOS media controls (e.g., Control Center or within a full-screen video player's UI), the value specified in the NAME attribute (AUDIO-1, AUDIO-2) does not seem to be used. Instead, the display appears to derive from the LANGUAGE attribute (ja, en), often showing the system's localized string for that l…
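For comparison, a minimal sketch of what AVFoundation itself reports for each alternate audio rendition; the stream URL is a placeholder:

```swift
import AVFoundation

// Dump displayName and language tag for each audible media selection option.
func dumpAudioOptions() async throws {
    let asset = AVURLAsset(url: URL(string: "https://example.com/master.m3u8")!)
    guard let group = try await asset.loadMediaSelectionGroup(for: .audible) else { return }
    for option in group.options {
        // Compare displayName with the NAME attribute from the EXT-X-MEDIA tag
        // to see which value the framework is surfacing.
        print(option.displayName, option.extendedLanguageTag ?? "no language tag")
    }
}
```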
2 replies · 0 boosts · 357 views
Jul ’25