In iOS 18, CarPlay shows an error: “There was a problem loading this content” after playback starts. Audio works fine, but the Now Playing screen doesn’t load. I’m using MPPlayableContentManager. This worked fine in iOS 17. Anyone else seeing this error in iOS 18?
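For reference, a minimal sketch of the Now Playing metadata path that the CarPlay Now Playing screen reads. This does not address the iOS 18 regression itself; the title and duration values are placeholders.

import MediaPlayer

// Sketch: publish Now Playing metadata after playback starts.
// The keys are the standard MPMediaItem / MPNowPlayingInfo constants.
func updateNowPlayingInfo(title: String, duration: TimeInterval, elapsed: TimeInterval) {
    MPNowPlayingInfoCenter.default().nowPlayingInfo = [
        MPMediaItemPropertyTitle: title,
        MPMediaItemPropertyPlaybackDuration: duration,
        MPNowPlayingInfoPropertyElapsedPlaybackTime: elapsed,
        MPNowPlayingInfoPropertyPlaybackRate: 1.0
    ]
}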
Search results for "Popping Sound" (19,356 results found)
Environment:
Device: iPad (10th generation)
OS: iOS 18.3.2

We're using AVAudioPlayer to play a sound when a button is tapped. In our use case, this button can be tapped very frequently, roughly every 0.1 to 0.2 seconds. Each tap triggers the following function:

var audioPlayer: AVAudioPlayer?

func soundPlay(resource: String, type: String) {
    guard let path = Bundle.main.path(forResource: resource, ofType: type) else { return }
    do {
        audioPlayer = try AVAudioPlayer(contentsOf: URL(fileURLWithPath: path))
        audioPlayer!.delegate = self
        try audioSession.setCategory(.playback)
    } catch {
        return
    }
    self.audioPlayer!.play()
}

The issue is that under high-frequency tapping (especially around 0.1–0.15 s intervals), the app occasionally crashes. The crash does not occur every time; it happens randomly, sometimes within 30 seconds, within 1 minute, or even 3 minutes of continuous tapping. Interestingly, adding a delay of 0.2 seconds between button taps seems to prevent the crash entirely. Delays shorter than 0.2
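Not an answer to the crash itself, but for comparison, a hedged variation on the code above that configures the audio session once and reuses a single prepared player instead of allocating a new AVAudioPlayer on every tap; the type name and resource parameters are placeholders.

import AVFoundation

final class TapSoundPlayer {
    private let player: AVAudioPlayer

    init?(resource: String, type: String) {
        // Configure the session once rather than on every tap.
        try? AVAudioSession.sharedInstance().setCategory(.playback)
        guard let url = Bundle.main.url(forResource: resource, withExtension: type),
              let player = try? AVAudioPlayer(contentsOf: url) else { return nil }
        player.prepareToPlay()
        self.player = player
    }

    func play() {
        // Rewind and retrigger the same player on each tap.
        player.currentTime = 0
        player.play()
    }
}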
Hi, I am trying to remove the audio controls for my app on the lock screen. Since I use WKWebView, there are 3 audio tags in my HTML and I play and pause them via JS. If I do not play any sound after app launch, there are no audio controls on the lock screen. But if I play one of those 3 files (they are even less than 3-second sound effects, e.g. for buttons), the audio controls appear on the lock screen. Note that even when the sounds are paused (pause()) or not playing, they are still listed on the lock screen. What I have tried so far without success:

MPNowPlayingInfoCenter.default().nowPlayingInfo = [:]

and

try audioSession.setCategory(.playback, mode: .default, options: [])
try audioSession.setActive(false, options: .notifyOthersOnDeactivation)

and

UIApplication.shared.endReceivingRemoteControlEvents()

Another problem is that the app scales with the iOS system display zoom setting. Is there a way to prevent that? This is the latest Xcode version 16.3 and iOS 18. I have no bac
I created a virtual audio device to capture system audio with a sample rate of 44.1 kHz. After capturing the audio, I forward it to the hardware sound card using AVAudioEngine, also with a sample rate of 44.1 kHz. However, due to the clock sources being unsynchronized, problems occur after a period of playback. How can I retrieve the clock source of the hardware device and set it for the virtual device?
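For the hardware side of this question, a minimal sketch of reading a device's clock domain via the HAL property kAudioDevicePropertyClockDomain (devices in the same domain share a clock). How the virtual device then reports that domain depends on the AudioServerPlugIn implementation and is not shown here.

import CoreAudio

// Returns the clock domain of an audio device, or nil on error.
func clockDomain(of deviceID: AudioObjectID) -> UInt32? {
    var address = AudioObjectPropertyAddress(
        mSelector: kAudioDevicePropertyClockDomain,
        mScope: kAudioObjectPropertyScopeGlobal,
        mElement: kAudioObjectPropertyElementMain)
    var domain: UInt32 = 0
    var size = UInt32(MemoryLayout<UInt32>.size)
    let status = AudioObjectGetPropertyData(deviceID, &address, 0, nil, &size, &domain)
    return status == noErr ? domain : nil
}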
This issue still exists in Xcode Version 16.3 (16E140). To fix the Audio Unit Extension App template so it can be installed on an iOS device, you must reset these two build settings in the Signing - App Sandbox section:

Enable App Sandbox
Enable User Selected Files

To reset, highlight the row and press Delete. These build settings need to be reset for both targets:

App target
Audio Unit Extension target

As Quinn mentions, if you want to deploy to the macOS App Store, you still need to enable App Sandbox: go to the Signing & Capabilities tab, click + Capability, and choose App Sandbox.
Topic: Code Signing, SubTopic: Entitlements
Currently working on a dating app which needs VoIP for audio and video calls on iOS. The VoIP notifications only reach the app in the active and inactive states, but they don't wake the device in the background or terminated state. After debugging, I noticed that the com.apple.developer.voip entitlement wasn't included, which I later added. When trying to create a build, I get the EAS error that the entitlement wasn't added to the identifier's capabilities. My issue now is that I can't seem to find the VoIP capability to check in the identifier's capabilities list for the bundle ID.
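For context, a rough sketch (assuming PushKit and CallKit) of the registration and handling path that the com.apple.developer.voip entitlement gates. On iOS 13 and later, a VoIP push is expected to report an incoming call to CallKit, otherwise the system may stop delivering pushes while the app is in the background; names here are illustrative.

import PushKit
import CallKit

final class VoIPPushHandler: NSObject, PKPushRegistryDelegate {
    private let registry = PKPushRegistry(queue: .main)
    private let provider = CXProvider(configuration: CXProviderConfiguration())

    func start() {
        registry.delegate = self
        registry.desiredPushTypes = [.voIP]
    }

    func pushRegistry(_ registry: PKPushRegistry,
                      didUpdate pushCredentials: PKPushCredentials,
                      for type: PKPushType) {
        // Send pushCredentials.token to the VoIP push server.
    }

    func pushRegistry(_ registry: PKPushRegistry,
                      didReceiveIncomingPushWith payload: PKPushPayload,
                      for type: PKPushType,
                      completion: @escaping () -> Void) {
        // Report the incoming call to CallKit; "Caller" is a placeholder handle.
        let update = CXCallUpdate()
        update.remoteHandle = CXHandle(type: .generic, value: "Caller")
        provider.reportNewIncomingCall(with: UUID(), update: update) { _ in completion() }
    }
}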
I tried to install the profiles, but they both seem to be the same profile - Facetime and Call Activity Logging

Yeah, that's fine. What actually happens here is that at some point in the past we had different profiles that included slightly different logging behavior, but at some point the team decided to just consolidate onto one profile for everything. However, these profiles are very widely used in very different contexts (for example, App Development vs. AppleCare), so it's easier to post the same profile with two names instead of trying to explain that one case should now use the other profile.

actually, when checking crashes_and_spins there are crashes 20:01, 20:02 and 20:03... my bad

Good. So, it sounds like the system side of this is working correctly.

What is the recommended way to detect that we are launched before first unlock?

So, my first and most important recommendation is that you NOT think about this issue in terms of any specific state. Yes, that's what's triggering the specific failure
Topic: App & System Services, SubTopic: General
I'm trying to implement a custom back button in a CPInformationTemplate using CPBarButton. It works as expected on iOS 18 — the button appears in the navigation bar and correctly pops the template when tapped. However, on iOS 16 devices, the custom back button does not respond to taps at all. The reason I need a custom back button is because I want to perform cleanup and reset logic before popping the template. Is there any workaround for making this work on iOS 16? Or is there a recommended way to hook into the system back button behavior so that I can execute code when the template is dismissed?
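Not a fix for the iOS 16 behavior, but for reference, a minimal sketch of the pattern described above, assuming CPInformationTemplate's CPBarButtonProviding conformance (iOS 14 and later) and an interfaceController captured from the CarPlay scene delegate; the title and layout are placeholders.

import CarPlay

func makeInfoTemplate(interfaceController: CPInterfaceController) -> CPInformationTemplate {
    let template = CPInformationTemplate(title: "Details", layout: .leading, items: [], actions: [])
    template.backButton = CPBarButton(title: "Back") { _ in
        // Run cleanup / reset logic here, then pop the template.
        interfaceController.popTemplate(animated: true) { _, _ in }
    }
    return template
}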
Actually this is a duplicate of https://developer.apple.com/forums/thread/106537 but in the web-specific forums section. Are there any video/audio codec best practices, guides, or recommendations for app/web developers for best performance (taking advantage of HW acceleration) and power savings? What are the officially supported media containers? What are the video encoding profiles, video dimensions, and frame rates? The only official source I have found is https://developer.apple.com/documentation/webkit/delivering-video-content-for-safari?language=objc. But H.264 is pretty old. I experimentally found that the VP9 video format is also supported on newer iOS versions. But is this a requirement? Can I be sure that the video will play on all devices? My goal is to provide web media content (which will be rendered in my application using the WKWebView API) that is supported by most devices (both iOS and macOS), takes advantage of features such as hardware decode acceleration, and is efficient. Any hints/info is hi
[quote='785582021, bobh, /thread/785582, /profile/bobh'] I am a paid $99 a year developer … I am not part of a team. [/quote] You've misunderstood the terminology here. Every standard Apple Account is a member of a team:

If you haven't paid, you're a member of a Personal Team (aka free provisioning).
If you paid for yourself, you're a member of an Individual team.
If someone else paid and then invited you, you're a member of an Organization team (or an In-House (Enterprise) team).

It sounds like you paid for yourself and thus should be a member of your Individual team. However, Swift Playgrounds isn't picking that up for some reason. Are you sure you're signed in using the same Apple Account that joined the programme? One good test here is to go to Developer > Account, sign in, and click the popup at the top right. That'll show you all the teams of which you're a member. And if you select a team, the Membership details pane will show you its Team ID and team type (in the "Enrolled as" field). If you'v
Topic: Developer Tools & Services, SubTopic: Swift Playground
To avoid creating a new thread, I want to bump this one. Ideally I expect an answer from Apple developers, because I think such information should be pinned in the WebKit documentation as well. Also, I appreciate experience sharing from other developers. So: are there any video/audio codec best practices, guides, or recommendations for app/web developers for best performance (taking advantage of HW acceleration) and power savings? What are the officially supported media containers? What are the video encoding profiles, video dimensions, and frame rates? The only official source I have found is https://developer.apple.com/documentation/webkit/delivering-video-content-for-safari?language=objc. But H.264 is pretty old. I experimentally found that the VP9 video format is also supported on newer iOS versions. But is this a requirement? Can I be sure that the video will play on all devices? Any hints/info is highly appreciated. Best regards.
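Not an official answer, but one way to probe on-device decode support from native code is AVFoundation's extended-MIME-type check. Note that what AVFoundation reports can differ from what WKWebView/Safari's media stack will actually play (VP9, for example, is generally a WebKit/MSE matter), so treat this only as a rough sketch; the codec strings are illustrative.

import AVFoundation

let candidates = [
    "video/mp4; codecs=\"avc1.640028\"",        // H.264 High profile
    "video/mp4; codecs=\"hvc1.1.6.L120.B0\""    // HEVC
]
for mime in candidates {
    // true if AVFoundation believes this container/codec combination is playable
    print(mime, AVURLAsset.isPlayableExtendedMIMEType(mime))
}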
Topic: UI Frameworks, SubTopic: UIKit
Still frustrated with this one. I now realise a matrix mixer won't help, as while it can shuffle the outputs, they would still come out on a single bus. I'm now wondering if the only option is to write an audio unit? Surely there's a way to do this with existing nodes? It's a required function of every DAW to be able to run each mic through a separate channel strip. Any hints?
Topic: Media Technologies, SubTopic: Audio
Hi, I think I’m experiencing a very similar issue. I created a detailed thread here: https://developer.apple.com/forums/thread/785312 - feel free to check it out and see if it sounds familiar. In my case, I can see my Team ID when I go to this URL (you’ll need to be logged in): https://appstoreconnect.apple.com/access/users//settings However, Xcode acts like the Team ID doesn’t exist, and when I try accessing certificates here: https://developer.apple.com/account/resources/certificates/list I get this error: Unable to find a team with the given Team ID ‘XXXXXX’ to which you belong. Please contact Apple Developer Program Support. If you also see your Team ID in App Store Connect settings, and if your Mac shows an active Apple Developer membership under System Settings > Media & Purchases (this only applies if you’re the one who paid), and the expiration date is valid, then everything should work - whether the account is personal or business shouldn’t matter. In my case, the issue turned out to
Topic: App Store Distribution & Marketing, SubTopic: TestFlight
I am working on an app that needs to play an alert sound in the background. During debug testing, the function works fine, but when I run the app standalone without the debugger attached, it does not work; the sound only plays once I return to the app. Is there any way to trace this issue?
import SwiftUI
import AVFoundation
import UIKit

@main
struct RaceTimerAppApp: App {
    init() {
        configureAudioSession()
    }

    var body: some Scene {
        WindowGroup {
            ContentView()
        }
    }

    func configureAudioSession() {
        do {
            try AVAudioSession.sharedInstance().setCategory(.playback, mode: .default)
            try AVAudioSession.sharedInstance().setActive(true)
            DispatchQueue.main.async {
                UIApplication.shared.beginReceivingRemoteControlEvents()
            }
            print("✅ Audio session configured for background playback.")
            print("🎧 Audio category: \(AVAudioSession.sharedInstance().category.rawValue)")
        } catch {
            print("❌ Failed to set audio session: \(error)")
        }
    }
}
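One way to trace this without the debugger attached is to log audio-session events with os.Logger and read them later in Console.app, since print output is not visible when the app runs standalone. A rough sketch, with the subsystem name as a placeholder:

import AVFoundation
import os

let audioLog = Logger(subsystem: "com.example.racetimer", category: "audio")

func observeAudioSession() {
    let center = NotificationCenter.default
    // Log interruptions (e.g. calls, other audio) and route changes, which are
    // common reasons background playback silently stops.
    center.addObserver(forName: AVAudioSession.interruptionNotification,
                       object: nil, queue: .main) { note in
        audioLog.info("Interruption: \(String(describing: note.userInfo))")
    }
    center.addObserver(forName: AVAudioSession.routeChangeNotification,
                       object: nil, queue: .main) { note in
        audioLog.info("Route change: \(String(describing: note.userInfo))")
    }
}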
Topic: Developer Tools & Services, SubTopic: Xcode