Search results for "Popping Sound" (19,349 results found)


Reply to CallKit does not activate audio session with higher probability after upgrading to iOS 18.4.1
Hi @DTS Engineer, thank you so much for checking this. Regarding the setActive(YES) workaround: we only attempt it 5 seconds after our app detects that the audio session was not activated when the call started. We agree that the key issue is the SessionID of 0x0. I have created a feedback report for this issue and attached sysdiagnose/console logs to it. However, the issue happened on Apr 4 and the sysdiagnose was generated on Apr 7, so we are not sure it contains the key data from Apr 4. Please help us check it. The issue happens intermittently and is hard to reproduce; we can update the feedback again once we are able to reproduce it. Feedback ticket: FB19429215 (CallKit does not activate audio session with higher probability after upgrading to iOS 18.4.1)
Topic: App & System Services · SubTopic: General
Aug ’25
Reply to How to use a folder generated by an Xcode Aggregate Target as a resource in another target?
A housekeeping note: here is a link to a related thread Caleb started on a different slice of the problem, for anyone who comes to this in the future. I reread what you originally wrote, and it sounds like you had a reason to pull this web bundle build out into the aggregate target. Is there a reason that couldn't be a script in the main app target that I'm not seeing? It seems like the file-level input and output analysis of this step in the main app's build phase would be enough. The one downside I see is that if the script does need to run in full, that could be a point where you don't get many concurrently running build tasks doing other things, and thus it takes up more wall-clock time than is ideal, but that also may not be much different from the pre-build script you started with. Another idea would be to question why this script needs to run via an Xcode build at all. If the source files don't change often, could other techniques like a git commit hook targeting the source fi
Aug ’25
Reply to Delay in Microphone Input When Talking While Receiving Audio in PTT Framework (Full Duplex Mode)
Thank you for the detailed reply. I've submitted a bug report as requested: FB19421676 – Push-to-Talk Framework: Microphone activation tone does not play when sending while audio session is active in full duplex mode. Thanks to the context you provided regarding how the PTT framework functions, I was able to identify the cause of the transmission delay I was experiencing. It turns out that isVoiceProcessingInputMuted was set to true when starting a transmission, and only reverted to false once audio output stopped. This was the source of the delay between initiating transmission and receiving valid microphone input. By manually setting isVoiceProcessingInputMuted to false on the input node at the start of transmission, I was able to eliminate this delay and begin receiving microphone samples immediately. I'm still relatively new to Swift and iOS audio development, and I was wondering if there are any sample projects or best practices that demonstrate integrating audio with
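A minimal sketch of the workaround described above, assuming iOS 17 or later and an AVAudioEngine with voice processing enabled on the input node; the PTT channel-manager plumbing and encoder are omitted, and the class name is illustrative:

import AVFoundation

final class TransmitAudio {
    private let engine = AVAudioEngine()

    func prepare() throws {
        let input = engine.inputNode
        // Voice processing must be enabled for isVoiceProcessingInputMuted to have any effect.
        try input.setVoiceProcessingEnabled(true)

        // Tap the mic so the engine has an active input chain; forward buffers to the PTT pipeline.
        let format = input.outputFormat(forBus: 0)
        input.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
            // Hand `buffer` to your transmission pipeline here.
        }
        try engine.start()
    }

    // Call this when the PTT framework reports that transmission has begun.
    func beginTransmission() {
        // The input can remain muted while received audio is still draining,
        // so unmute it explicitly to start receiving valid microphone samples immediately.
        engine.inputNode.isVoiceProcessingInputMuted = false
    }
}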
Topic: Media Technologies · SubTopic: Audio
Aug ’25
Reply to CallKit does not activate audio session with higher probability after upgrading to iOS 18.4.1
Could you please review the following processes for any potential issues? Yes, it's very simple: your app should NEVER do this: try to setActive(YES) as a workaround. The CallKit audio session is NOT a standard audio session. It's a restricted audio session configuration that has a different, higher session priority than any other audio session, and that is allowed to activate in the background IF that activation comes from the properly authorized daemon. That means sometimes it won't succeed: in the standard case, activation will fail because you don't have the proper authorization. However, that activation attempt can still disrupt the system and interfere with other activations. And sometimes it will succeed. These APIs are inherently race-condition prone, which means activation could succeed because, for example, your app was entering the foreground and was able to activate a PlayAndRecord session, or timing issues inside the audio system meant that it allowed activation under circumst
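In other words, configure the session up front and let CallKit own activation. A minimal sketch of where audio work belongs in the CXProviderDelegate callbacks; the class name is illustrative and error handling is omitted:

import AVFoundation
import CallKit

final class ProviderDelegate: NSObject, CXProviderDelegate {
    func providerDidReset(_ provider: CXProvider) {
        // Stop audio and tear down any in-flight calls here.
    }

    func provider(_ provider: CXProvider, perform action: CXAnswerCallAction) {
        // Configure the session, but do NOT call setActive(true) yourself.
        let session = AVAudioSession.sharedInstance()
        try? session.setCategory(.playAndRecord, mode: .voiceChat)
        action.fulfill()
    }

    // CallKit activates the restricted audio session on your behalf and tells you here.
    func provider(_ provider: CXProvider, didActivate audioSession: AVAudioSession) {
        // Start your audio I/O (for example, an AVAudioEngine) now.
    }

    func provider(_ provider: CXProvider, didDeactivate audioSession: AVAudioSession) {
        // Stop audio I/O; the session is no longer yours.
    }
}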
Topic: App & System Services · SubTopic: General
Aug ’25
AlarmKit Not Playing Custom Sound or Named System Sounds - iOS 26 beta 4 (23A5297i)
I'm using the new AlarmKit framework to schedule and trigger alarms in my Swift app on iOS 26 beta 4 (23A5297i). I'm trying to customize the alarm sound using a sound file embedded in the app bundle or by referencing known system tones. Problem: No matter what I pass to .named(sound-2), whether a bundle file URL, .named(sound-2.caf) (I tried .mp3, .caf, and .aiff), or a known iOS system sound like .named(Radar) (Chimes, etc.), the alarm always plays the default system alert tone. There's no error or warning; the custom or specified sound is silently ignored. sound: .named(sound-2) Questions: What is the correct approach to play a custom sound or music when the alarm triggers? What does .named(...) expect: a file name, a file path URL, or a system sound name? Is there a specific audio file length or format that is accepted? Challenge: The alarm functionality feels incomplete without support for custom sounds.
Replies: 4 · Boosts: 0 · Views: 138
Aug ’25
Incorrect 5.1 / Atmos channel mapping on Apple TV 4K (2022)
I ran 5.1 audio tests in both YouTube and Apple Music, and I noticed that when sound is supposed to play from the rear or front surround speakers, it’s also duplicated in the front left and right channels. I’m absolutely sure the issue is with the Apple TV, because I played the same video directly through my TV’s native system, and the channel separation was correct. Everything used to work perfectly before, so this must be a software issue. I’m currently on tvOS 26 Developer Beta 5, but I’m certain the problem also existed on the stable tvOS 18.5. I’ve already reset and updated my Apple TV, and I also tried switching the audio format to forced Dolby Atmos 5.1. On the forums, I mostly see complaints about Dolby Atmos not working at all — in my case, everything technically works, but not the way it’s supposed to.
Replies: 1 · Boosts: 0 · Views: 66
Aug ’25
Reply to Gatekeeper acts against .app package developed by a freelancer for our company
[quote='852314022, VBFSDEV, /thread/795578?answerId=852314022#852314022, /profile/VBFSDEV'] do you mean adding the freelancer to my app store connect account? [/quote] Yes. Just like you would do for iOS. [quote='852314022, VBFSDEV, /thread/795578?answerId=852314022#852314022, /profile/VBFSDEV'] which role would you recommend … ? [/quote] That’s a balance between what authority you want to grant them and how much time you want to spend servicing their requests for credential manipulation. Although, having said that, I’ll note that this is no different than it is for iOS. IMPORTANT There’s one thing to watch out for here. If you make them an Admin, don’t explicitly allow them to create Developer ID certificates. See the “Create cloud-managed Developer ID certificates” row in Developer > Support > Articles > Program Roles. [quote='852314022, VBFSDEV, /thread/795578?answerId=852314022#852314022, /profile/VBFSDEV'] You also say that we should not grant access to any certificates [/quote] There are multi
Topic: Code Signing · SubTopic: General
Aug ’25
Am I allowed to use FFMPEG and FFPROBE binaries in Mac App Store?
I have built an app that does audio analysis. I have stripped the GPL files from these binaries because of license issues. The binaries are in the root folder of my build, next to the .app file (MacOS/). I cannot run this in sandbox mode. Everything works except for the audio analysis. Can I still submit this to the App Store and get accepted? Will users who download the app be able to run the audio analysis? I asked this on Apple support, but they gave me a general answer pointing to App Store documentation. Hope someone here has experience with this.
Replies: 2 · Boosts: 0 · Views: 101
Aug ’25
Reply to Am I allowed to use FFMPEG and FFPROBE binaries in Mac App Store?
There are two parts to questions like this: What’s technically feasible? What’s required by App Review? I don’t work for App Review, so I can’t offer definitive advice on their policies. I recommend that you review their published guidelines. I’m happy to wade into technical questions. And on that front, you wrote: [quote='795751021, maarten1987, /thread/795751, /profile/maarten1987'] I can not run this in sandbox mode. [/quote] Why not? In general, sandboxed apps are allowed to: Play and record audio Have embedded helper executables Spawn child processes Given that, I can’t see any fundamental reason why this won’t work. So I recommend that you dig further into the failure, because that’s likely to be a lot more fun than the alternatives. Note It’s better to reply as a reply, rather than in the comments; see Quinn’s Top Ten DevForums Tips for this and other titbits. [quote='852254022, Tomato, /thread/795751?answerId=852254022#852254022, /profile/Tomato'] you can have your users download the command-
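On the technical side, here is a minimal sketch of launching an embedded helper from a sandboxed app, assuming ffprobe has been copied into the bundle as an auxiliary executable (Contents/MacOS/); the child process inherits the app's sandbox, and the flags shown are ordinary ffprobe options:

import Foundation

// Launch an ffprobe helper bundled as an auxiliary executable.
// Assumes the binary was copied into Contents/MacOS/ at build time.
func probe(fileAt url: URL) throws -> String {
    guard let tool = Bundle.main.url(forAuxiliaryExecutable: "ffprobe") else {
        throw CocoaError(.fileNoSuchFile)
    }
    let process = Process()
    process.executableURL = tool
    process.arguments = ["-hide_banner", "-show_format", url.path]

    let pipe = Pipe()
    process.standardOutput = pipe
    try process.run()
    process.waitUntilExit()

    let data = pipe.fileHandleForReading.readDataToEndOfFile()
    return String(decoding: data, as: UTF8.self)
}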
Aug ’25
Errors reading not-yet-sync'd iCloud files get cached
I have an app which uses ubiquitous containers and files in them to share data between devices. It's a bit unusual in that it indexes files in directories the user grants access to, which may or may not exist on a second device - those files are identified by SHA-1 hash. So a second device scanning before iCloud data has fully sync'd can create duplicate references which lead to an unpleasant user experience. To solve this, I store a small binary index in the root of the ubiquitous file container of the shared data, containing all of the known hashes, and as the user proceeds through the onboarding process, a background thread is attempting to prime the ubiquitous container by calling FileManager.default.startDownloadingUbiquitousItemAt() for each expected folder and file in a sane order. This likely creates a situation not anticipated by the iOS/iCloud integration's design, as it means my app has a sort of precognition of files it should not yet know about. In the common case, it works, but there is a corner
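For readers following along, a minimal sketch of the priming loop described above, with placeholder names; startDownloadingUbiquitousItem(at:) only requests the download, so errors or stale state still have to be handled separately:

import Foundation

// Ask iCloud to start downloading the expected items, in a deliberate order.
// The relative paths passed in here are placeholders.
func primeUbiquitousContainer(expectedRelativePaths: [String]) {
    let fm = FileManager.default
    // Call off the main thread; returns nil if iCloud is unavailable.
    guard let containerURL = fm.url(forUbiquityContainerIdentifier: nil) else { return }

    for relativePath in expectedRelativePaths {
        let itemURL = containerURL.appendingPathComponent(relativePath)
        do {
            try fm.startDownloadingUbiquitousItem(at: itemURL)
        } catch {
            // A failure here may be cached by the system; log it and retry later.
            print("Failed to start download for \(relativePath): \(error)")
        }
    }
}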
Replies: 1 · Boosts: 0 · Views: 71
Aug ’25
How to display a full-screen light-based (sunrise) alarm notification at a specific time (like the Clock app)? Can Critical Alerts help with visuals too?
I'm building a light-based (sunrise) alarm iOS app using SwiftUI. The idea is to wake users not with sound, but with a full-screen bright-light UI (mimicking sunrise or a light alarm clock), replicating behavior similar to the native Clock app. My goal: when the scheduled time is reached, forcefully display a full-screen light screen, even if the device is locked, running another app, or the app is backgrounded. The problem: so far I can show a full-screen AlarmView only if the app is open, but I cannot automatically wake the screen when the app is closed. My confusion: I've read that Critical Alerts allow bypassing Do Not Disturb and Silent Mode, but that's only for sound, right? Can Critical Alerts also help with waking the screen or displaying visuals like a full-screen UI automatically? If not, is there any way to simulate this kind of alarm, a light-based screen effect (sunrise alarm clock), triggered automatically at a specific time without needing the user to manually tap the notifi
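On the Critical Alerts part of the question: they require a special entitlement granted by Apple and, as far as I know, affect sound and Silent Mode/Do Not Disturb behavior rather than presenting your own full-screen UI. A sketch of the notification side only, assuming the entitlement has been granted; names and times are illustrative:

import UserNotifications

// Request critical-alert authorization (requires the
// com.apple.developer.usernotifications.critical-alerts entitlement)
// and schedule a repeating time-based alarm notification.
func scheduleSunriseAlarm(hour: Int, minute: Int) {
    let center = UNUserNotificationCenter.current()
    center.requestAuthorization(options: [.alert, .sound, .criticalAlert]) { granted, _ in
        guard granted else { return }

        let content = UNMutableNotificationContent()
        content.title = "Sunrise Alarm"
        content.body = "Time to wake up"
        content.sound = .defaultCritical   // plays even in Silent Mode / Do Not Disturb

        var date = DateComponents()
        date.hour = hour
        date.minute = minute
        let trigger = UNCalendarNotificationTrigger(dateMatching: date, repeats: true)

        let request = UNNotificationRequest(identifier: "sunrise-alarm", content: content, trigger: trigger)
        center.add(request)
    }
}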
Replies: 1 · Boosts: 0 · Views: 67
Aug ’25