I am integrating a camera with HomeKit. The audio and video streams work in the HomeKit Accessory Tester, but they do not work when using the Home app on an iPhone (Failed to select audio config – Could not find the right match in the supported list. Session is not in progress). I have a single audio configuration: Opus 16kHz mono with a constant bitrate of 24kbps.
Search results for "Popping Sound" (19,350 results found)
I am developing a VoIP feature using PushKit and CallKit, but the CallKit UI is not showing when the app is in the background or terminated. In the foreground I can call reportNewIncomingCall from pushRegistry(_:didReceiveIncomingPushWith:) and it works as expected, but in the background or terminated state it does not.

So, there are a few things that can happen here: If you've just gotten this working, it's pretty common that you failed the call-report requirement enough times that the system stopped delivering new pushes. To reset that count, delete the app completely, turn the device off, turn it back on, then reinstall. Note that while the full restart is not specifically required, I recommend doing it any time you need to be SURE things have reset properly. Your app should include the audio background category as well as voip. There are weird entanglements between CallKit and the audio system that make CallKit work even without audio; however, that behavior isn't really intentional.
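For reference, a minimal sketch of the reporting requirement described in the answer: every VoIP push must synchronously report a new incoming call before the delegate method returns. The inline provider configuration and the "callerName" payload key are invented for illustration.

import PushKit
import CallKit

final class PushHandler: NSObject, PKPushRegistryDelegate {
    private let provider = CXProvider(configuration: CXProviderConfiguration())

    func pushRegistry(_ registry: PKPushRegistry,
                      didReceiveIncomingPushWith payload: PKPushPayload,
                      for type: PKPushType,
                      completion: @escaping () -> Void) {
        let update = CXCallUpdate()
        update.remoteHandle = CXHandle(
            type: .generic,
            value: payload.dictionaryPayload["callerName"] as? String ?? "Unknown")

        // Report immediately; failing to do so counts against the app and
        // eventually causes the system to stop delivering VoIP pushes.
        provider.reportNewIncomingCall(with: UUID(), update: update) { error in
            if let error { print("Report failed: \(error)") }
            completion()
        }
    }

    func pushRegistry(_ registry: PKPushRegistry,
                      didUpdate pushCredentials: PKPushCredentials,
                      for type: PKPushType) {
        // Send pushCredentials.token to the VoIP push server (omitted).
    }
}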
Topic:
App & System Services
SubTopic:
Notifications
Tags:
Thank you for your response, Kevin. From what I can see, the document appears to focus only on video calls, while my use case involves audio-only communication. Could you confirm whether it's acceptable to display a Picture-in-Picture view for an audio-only call, even though the document primarily covers video calls?
Topic:
App Store Distribution & Marketing
SubTopic:
App Review
Tags:
Hello, these sound like issues related to the legacy Vision API, so consider updating your implementation. See "Original Objective-C and Swift API" to distinguish the legacy API from the contemporary one.
Topic:
Machine Learning & AI
SubTopic:
General
Tags:
Hello, this sounds like unexpected behavior on iPad. Please take a moment to file a bug report against the iPadOS 26 beta using Feedback Assistant.
Topic:
UI Frameworks
SubTopic:
UIKit
Tags:
Hello, I have an AppIntent that uses AudioPlaybackIntent to trigger my app to open and start an AVPlayer that plays back a media stream I control. When the phone is unlocked, everything works as I expect: the app opens and plays the audio. However, when the phone is locked, any attempt to invoke the intent causes a Request Code dialog to be displayed. This seems counter to what I would expect from AudioPlaybackIntent. Am I able to accomplish what I'm after here with App Intents? Does the fact that I'm using openAppWhenRun require the phone to be unlocked somehow?

import AppIntents
import Foundation

struct PlayStationAppIntent: AudioPlaybackIntent {
    static var title: LocalizedStringResource = "Play radio station"
    static var description: IntentDescription = .init("Play radio station")
    static var notification: Notification.Name = .init("playStation")
    static var openAppWhenRun: Bool = true

    init() {}

    func perform() async throws -> some IntentResult {
        AudioPlayerService.shared.play()
        return .result()
    }
}
Hi Apple review team, I'm developing an app with audio calling functionality, and I'd like to take advantage of Picture-in-Picture (PiP) so that when the user moves the app to the background, the ongoing call can remain minimized on the Home screen. Based on my research, it seems possible to display a view in PiP mode and have it play, and I haven't found any documentation stating that this is prohibited. Could you please confirm whether this is allowed?
It sounds like your launchd job is failing to start. There are two common reasons for that:

launchd tries to start it and it crashes.

launchd is unable to start it.

A good place to… ahem… start is to look for a crash report. In the first case, and many times in the second case as well, a failure will generate a crash report. If that doesn't show up anything, add a 'first light' log point to your job [1]. Then try to communicate with the named XPC endpoint. That'll divide the problem in half:

If you see a log entry, you know that your job started and you can debug your code.

If not, you know that your code never got running and you can debug your configuration.

Share and Enjoy
—
Quinn "The Eskimo!" @ Developer Technical Support @ Apple
let myEmail = "eskimo" + "1" + "@" + "apple.com"

[1] I explain this 'first light' concept in Debugging a Network Extension Provider.
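For reference, a 'first light' log point is just an unconditional log line as the very first statement of the job's entry point; a minimal Swift sketch, where the subsystem and category strings are placeholders:

import os

// 'First light': log before doing anything else, so a matching entry in
// the unified log proves launchd actually started the process.
let logger = Logger(subsystem: "com.example.my-launchd-job", category: "startup")
logger.log("first light")

// ... the job's real startup (XPC listener setup, etc.) goes here ...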
Topic:
App & System Services
SubTopic:
Processes & Concurrency
Tags:
Hi, I'm working on a Safari extension for macOS, and I'd like the app to use specific system notification settings right after installation. I'm wondering if there's a way in Swift to programmatically configure the default notification preferences (as seen in System Settings > Notifications > [my app]). Here are the desired settings:

Only "Desktop" – without "Notification Center" or "Lock Screen"
Alert Style: Temporary
Badge App Icon: Enabled
Play Sound for Notifications: Disabled
Show Previews: When Unlocked
Notification Grouping: Off (I don't want them to accumulate in Notification Center)

Here is the code I'm currently using to display a basic notification:

private func handleNotificationRequest(_ message: [String: Any]) {
    guard let title = message["title"] as? String,
          let body = message["body"] as? String else { return }

    UNUserNotificationCenter.current().requestAuthorization(options: [.alert, .badge, .sound]) { granted, error in
        if granted {
            self.showNotification(title: title, body: body)
        }
    }
}
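For what it's worth, a short sketch of the related API surface: apps can read the user's current notification preferences via UNNotificationSettings, though to my knowledge there is no API to write the System Settings values listed above.

import UserNotifications

// Reading (not writing) the user's current notification preferences.
UNUserNotificationCenter.current().getNotificationSettings { settings in
    print("Authorized:", settings.authorizationStatus == .authorized)
    print("Alert style:", settings.alertStyle)            // .banner, .alert, or .none
    print("Badges enabled:", settings.badgeSetting == .enabled)
    print("Sound enabled:", settings.soundSetting == .enabled)
    print("Previews:", settings.showPreviewsSetting)      // .always / .whenAuthenticated / .never
}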
Topic:
Safari & Web
SubTopic:
General
Tags:
Notification Center
User Notifications
Safari Extensions
I'm building a cross-platform app targeting macOS, iPad, and iPhone. My app currently uses both 2-level and 3-level navigation workflows:

3-level navigation:
First level: Categories
Second level: List of items in the selected category
Third level: Detail view for a specific item

2-level navigation:
First level: Category
Second level: A singleton detail view (for example, StatusView). It does not have the concept of a list.

After watching a couple of WWDC videos about multi-platform navigation, I decided to go with NavigationSplitView. However, on macOS, a 3-column NavigationSplitView felt a bit overwhelming to my eyes when the third column was empty, especially for the occasional 2-level navigation case. So I removed the third column and instead embedded a NavigationStack in the second column. According to the official Apple documentation, this is supported: "You can also embed a NavigationStack in a column." The code with a NavigationStack in a NavigationSplitView works fine on macOS. But on iPhone, for the same code I'm
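For context, a minimal sketch of the setup this post describes: a two-column NavigationSplitView with a NavigationStack embedded in the detail column. The category names, item data, and the stand-in for StatusView are all invented for illustration.

import SwiftUI

struct ContentView: View {
    let categories = ["Music", "Podcasts", "Status"]
    @State private var selectedCategory: String?

    var body: some View {
        NavigationSplitView {
            // First level: categories in the sidebar column.
            List(categories, id: \.self, selection: $selectedCategory) { category in
                Text(category)
            }
        } detail: {
            // Second column: a NavigationStack hosts either the item list
            // (3-level flow) or a singleton detail view (2-level flow).
            NavigationStack {
                if selectedCategory == "Status" {
                    Text("Status details")   // stand-in for StatusView
                } else {
                    List(1..<5, id: \.self) { item in
                        NavigationLink("Item \(item)", value: item)
                    }
                    .navigationDestination(for: Int.self) { item in
                        Text("Detail for item \(item)")   // third level
                    }
                }
            }
        }
    }
}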
Topic:
UI Frameworks
SubTopic:
SwiftUI
Sequoia 15.4.1 (24E263), Xcode 16.3 (16E140), Logic Pro 11.2.1

I've been developing a complex Audio Unit for macOS that works perfectly well in its own bespoke host app and is now well into its beta-testing stage. It took some effort to get it to work well in Logic Pro, however, and all was fine and working well until:

The AU part is an empty app extension with a framework containing its code. The framework contains Swift code for the UI and C code for the DSP parts. When the framework is compiled with the Swift 5 compiler, the AU runs in Logic with no problems. (I should also mention that the AU passes the strictest auval tests.) But when the framework is compiled with Swift 6, Logic Pro cannot load it: Logic displays a message saying the audio unit could not be loaded and to contact the developer. My own host app loads the AU perfectly well with the Swift 6 version, so I know there's nothing wrong with the audio unit itself. I cannot find any differences in any of the built output f
Hi, I'm new to this. I updated my iPad Pro (5th gen) to the latest iPadOS 26 developer beta, and today, after using it like I regularly do, it now has a black screen. Is this related to the beta? Do I need to downgrade? The iPad is still working: my Mac detects it, and it makes a sound when I unlock it. Has anyone had this same issue?
Topic:
Community
SubTopic:
Apple Developers
I am using AlarmKit in my app. When I access:

AlarmManager.shared.authorizationState

it always returns notDetermined, even when I have previously granted the app permission to use alarms via:

try await AlarmManager.shared.requestAuthorization()

Calling this API again grants me the permission, though, without showing the permission prompt to the user. This sounds like a bug: if the permission has been granted, accessing authorizationState should return .authorized. It shouldn't require me to call requestAuthorization() again just to refresh the authorization status.

Environment: iOS 26 beta 3, Xcode 26 beta 3
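A minimal sketch of the workaround the post implies, using only the two APIs it mentions; the helper name is invented for illustration:

import AlarmKit

// Hypothetical helper: re-request authorization whenever the cached state
// does not read .authorized. Per the post, no prompt is shown if the user
// already granted permission.
func ensureAlarmAuthorization() async throws {
    if AlarmManager.shared.authorizationState != .authorized {
        _ = try await AlarmManager.shared.requestAuthorization()
    }
}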
Topic:
App & System Services
SubTopic:
General
I am using AlarmKit to schedule alarms in an app I am working on; however, my scheduled alarms only show up on the lock screen. If I am on the home screen or elsewhere, I only hear the sound of the alarm, but no UI shows up.

Environment: iOS 26 beta 3, Xcode 26 beta 3
Topic:
UI Frameworks
SubTopic:
SwiftUI
At WWDC25 we launched a new type of Lab event for the developer community: Group Labs. A Group Lab is a panel Q&A designed for a large audience of developers. Group Labs are a unique opportunity for the community to submit questions directly to a panel of Apple engineers and designers. Here are the highlights from the WWDC25 Group Lab for Apple Intelligence.

Can I integrate writing tools in my own text editor?
UITextView, NSTextView, and SwiftUI TextEditor automatically get Writing Tools on devices that support Apple Intelligence. For custom text editors, check out Enhancing your custom text engine with Writing Tools.

Given that Foundation Models are on-device, how will Apple update the models over time? And how should we test our app against the model updates?
Model updates are in sync with OS updates. As for testing with updated models, watch our WWDC session about prompt engineering and safety, and read the Human Interface Guidelines to understand best practices in prompting the on-device model. What
Topic:
Machine Learning & AI
SubTopic:
Apple Intelligence