Search results for

Popping Sound

19,739 results found

Post

Replies

Boosts

Views

Activity

Reply to AlarmKit play sound only once
Thank you for your post. When you do not provide your own file to play and instead use the default sound with sound: AlertConfiguration.AlertSound = .default, the sound will be the default sound. Is the sound looping until you stop it? At what point do you stop it? https://developer.apple.com/documentation/alarmkit/alarmmanager/alarmconfiguration/timer(duration:attributes:stopintent:secondaryintent:sound:) Do you have time to put the sound file you are using into a focused project and share it, so people can give you ideas? If you're not familiar with preparing a test project, take a look at Creating a test project. Albert Pascual
  Worldwide Developer Relations.
Nov ’25
Reply to iOS 26.1 PHPickerConfiguration.preselectedAssetIdentifiers doesn't select previous pictures in the PHPickerViewController
This sounds like a regression. Have you tried verifying whether the identifiers are still valid when you try to use them again? You can do this by retrieving the assets using their identifiers with the fetchAssets(withLocalIdentifiers:options:) method. See the article on Fetching Assets for more details. If the identifiers are valid and you’re still encountering the issue, please file a bug report. Include a minimal, reproducible sample project and share the Feedback ID here so I can pass it on to the appropriate engineering team.
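For reference, a minimal sketch of that validation step, assuming the identifier strings were persisted by the app (storedIdentifiers is a placeholder name):

import Photos
import PhotosUI

// Returns only the persisted identifiers that still resolve to assets, so stale
// identifiers are not passed to preselectedAssetIdentifiers.
func validPreselectedIdentifiers(from storedIdentifiers: [String]) -> [String] {
    let fetched = PHAsset.fetchAssets(withLocalIdentifiers: storedIdentifiers, options: nil)
    var stillValid = Set<String>()
    fetched.enumerateObjects { asset, _, _ in
        stillValid.insert(asset.localIdentifier)
    }
    // Preserve the original ordering while dropping identifiers that no longer resolve.
    return storedIdentifiers.filter { stillValid.contains($0) }
}

// Builds a configuration that preselects only the identifiers that are still valid.
func makePickerConfiguration(preselecting identifiers: [String]) -> PHPickerConfiguration {
    var configuration = PHPickerConfiguration(photoLibrary: .shared())
    configuration.selectionLimit = 0
    configuration.preselectedAssetIdentifiers = validPreselectedIdentifiers(from: identifiers)
    return configuration
}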
Topic: UI Frameworks SubTopic: UIKit
Nov ’25
Is there an error with SpatialAudioCLI?
Hi everyone, I downloaded the source code EditingSpatialAudioWithAnAudioMix.zip from https://developer.apple.com/documentation/Cinematic/editing-spatial-audio-with-an-audio-mix, and when I ran the action named process from the command line, the program crashed. From the source code, I found that the value of componentType is set to kAudioUnitType_FormatConverter: // The actual `AudioUnit`. public var auAudioMix = AVAudioUnitEffect() init() { // Generate a component description for the audio unit. let componentDescription = AudioComponentDescription( componentType: kAudioUnitType_FormatConverter, componentSubType: kAudioUnitSubType_AUAudioMix, componentManufacturer: kAudioUnitManufacturer_Apple, componentFlags: 0, componentFlagsMask: 0) auAudioMix = AVAudioUnitEffect(audioComponentDescription: componentDescription) } But in the documentation at https://developer.apple.com/documentation/avfaudio/avaudiouniteffect/init(audiocomponentdescription:), it seems that componentType can not
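In case it helps with comparisons, here is a minimal sketch that instantiates the same component description through the generic AVAudioUnit factory instead of AVAudioUnitEffect; whether this sidesteps the crash is only an assumption on my part:

import AVFAudio

// The same component description the sample uses.
let componentDescription = AudioComponentDescription(
    componentType: kAudioUnitType_FormatConverter,
    componentSubType: kAudioUnitSubType_AUAudioMix,
    componentManufacturer: kAudioUnitManufacturer_Apple,
    componentFlags: 0,
    componentFlagsMask: 0)

// AVAudioUnitEffect documents its initializer for effect-type components, so this
// experiment goes through the generic factory, which accepts any component description.
AVAudioUnit.instantiate(with: componentDescription, options: []) { audioUnit, error in
    if let error {
        print("Failed to instantiate AUAudioMix: \(error)")
    } else if let audioUnit {
        print("Instantiated \(audioUnit.name)")
    }
}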
1
0
149
Nov ’25
playSoundFileNamed not working on Tahoe?
I have published a number of games that use SpriteKit for everything important. Since the release of macOS Tahoe, I've had a lot of end user reports saying that sound effects have stopped working in many (but not all) of my titles. I'm not doing anything unusual here – typical code is: sndGameOver = [SKAction playSoundFileNamed:@"Audio/GameOver.wav" waitForCompletion:YES]; Then at the appropriate time: [self runAction:sndGameOver]; Has anyone else encountered this? The code still works fine on previous operating systems, and appears to be fine on iOS too. Has something changed in macOS Tahoe? I'm at a bit of a loss. There's nothing obviously different between the titles that do work and the titles that don't. Suggestions welcomed! Thanks
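For anyone comparing against Swift, the equivalent pattern I'd expect to behave the same way looks roughly like this (the file name is a placeholder):

import SpriteKit

class GameScene: SKScene {
    // Preload the action once, mirroring the Objective-C sndGameOver above.
    private let gameOverSound = SKAction.playSoundFileNamed("Audio/GameOver.wav",
                                                            waitForCompletion: true)

    func playGameOverSound() {
        // Same as [self runAction:sndGameOver] at the appropriate time.
        run(gameOverSound)
    }
}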
5
0
1.2k
Nov ’25
How to integrate Apple Immersive Video into the app you are developing.
Hello, Let me ask you a question about Apple Immersive Video. https://www.apple.com/newsroom/2024/07/new-apple-immersive-video-series-and-films-premiere-on-vision-pro/ I am currently considering implementing a feature to play Apple Immersive Video as a background scene in the app I developed, using 3DCG-created content converted into Apple Immersive Video format. First, I would like to know if it is possible to integrate Apple Immersive Video into an app. Could you provide information about the required software and the integration process for incorporating Apple Immersive Video into an app? It would be great if you could also share any helpful website resources. I am considering creating Apple Immersive Video content and would like to know about the necessary equipment and software for producing both live-action footage and 3DCG animation videos. As I mentioned earlier, I’m planning to play Apple Immersive Video as a background in the app. In doing so, I would also like to place some 3D models as RealityKit
2
0
758
Nov ’25
Spatial Audio on iOS 18 doesn't work as intended
I’m facing a problem while trying to achieve spatial audio effects in my iOS 18 app. I have tried several approaches to get good 3D audio, but the effect never felt good enough or it didn’t work at all. What troubles me most is that the AirPods I have don’t recognize my app as one that uses spatial audio (in audio settings it shows Spatial Audio Not Playing), so I guess my app doesn't use its spatial audio potential. The first approach uses AVAudioEnvironmentNode with AVAudioEngine. Changing the position of the player, as well as the listener's, doesn't seem to change anything in how the audio plays. Here's how I initialize AVAudioEngine: import Foundation import AVFoundation class AudioManager: ObservableObject { // important class variables var audioEngine: AVAudioEngine! var environmentNode: AVAudioEnvironmentNode! var playerNode: AVAudioPlayerNode! var audioFile: AVAudioFile? ... //Sound set up func setupAudio() { do { let session = A
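For context, a simplified sketch of the audio session setup that accompanies this engine code; setSupportsMultichannelContent(true) is an assumption on my part about what helps the system recognize spatial playback, not a confirmed requirement:

import AVFAudio

// Configure the audio session before starting the engine.
func configureAudioSession() {
    let session = AVAudioSession.sharedInstance()
    do {
        try session.setCategory(.playback, mode: .moviePlayback)
        // Assumption: advertising multichannel content is part of what lets
        // AirPods report spatial audio for the app.
        try session.setSupportsMultichannelContent(true)
        try session.setActive(true)
    } catch {
        print("Audio session configuration failed: \(error)")
    }
}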
4
0
969
Nov ’25
'tabViewBottomAccessory' leaves an empty accessory area when conditionally hidden
We use SwiftUI's .tabViewBottomAccessory in our iOS apps for displaying an audio mini-player view (like in the Apple Music app). TabView(selection: $viewModel.selectedTab) { // Tabs here } .tabViewBottomAccessory { if viewModel.showAudioMiniPlayer { MiniPlayerView() } } The problem: this code works perfectly on iOS 26.0. When viewModel.showAudioMiniPlayer is false, the accessory is completely hidden. However, on iOS 26.1 (23B5059e), when viewModel.showAudioMiniPlayer becomes false, the MiniPlayerView disappears, but an empty container remains, leaving a blank space above the tab bar. Is this a known bug in iOS 26.1, and are there any effective workarounds?
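A possible workaround sketch (untested beyond a quick check): attach the modifier only when there is content to show, instead of returning nothing from inside the accessory closure. MiniPlayerView and the view model flag are the same names as above; iOS 26 availability for the whole target is assumed:

import SwiftUI

extension View {
    // Attaches the bottom accessory only while it should be visible, so no empty
    // accessory container is left behind when the flag is false.
    @ViewBuilder
    func bottomAccessory<Accessory: View>(isVisible: Bool, accessory: Accessory) -> some View {
        if isVisible {
            self.tabViewBottomAccessory { accessory }
        } else {
            self
        }
    }
}

// Usage with the same view model as above:
// TabView(selection: $viewModel.selectedTab) { /* Tabs here */ }
//     .bottomAccessory(isVisible: viewModel.showAudioMiniPlayer, accessory: MiniPlayerView())

Note that branching like this changes the identity of the TabView hierarchy when the flag flips, which can reset transient state, so it is only a stopgap until the 26.1 behavior is clarified.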
Topic: UI Frameworks SubTopic: SwiftUI
8
0
808
Nov ’25
Not able to write AAC audio with 96 kHz sample rate using AVAudioRecorder or Extended Audio File Services
I'm not able to record audio in AAC format at a 96 kHz sample rate using AVAudioRecorder or Extended Audio File Services, with 96 kHz input audio from the input device. The audio recording settings used are: let settings: [String: Any] = [ AVFormatIDKey: Int(kAudioFormatMPEG4AAC), AVSampleRateKey: sampleRate, AVNumberOfChannelsKey: 1, AVEncoderAudioQualityKey: AVAudioQuality.high.rawValue ] When I tried using AVAudioEngine with AVAudioFile, guard let audioFile = try? AVAudioFile(forWriting: fileURL, // file extension .m4a settings: fileSettings, commonFormat: AVAudioCommonFormat.pcmFormatFloat32, interleaved: interleaved) else { return } I got the error: CodecConverterFactory.cpp:977 unable to select compatible encoder sample rate AudioConverter.cpp:1017 Failed to create a new in process converter -> from 1 ch, 96000 Hz, Float32 to 1 ch, 96000 Hz, aac (0x00000000) 0 bits/channel, 0 bytes/packet, 0 frames/packet, 0 bytes/frame, with status 1718449215
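For comparison, a minimal sketch of the recorder setup described above, with the settings dictionary completed; falling back to 48 kHz when the 96 kHz request fails is my assumption about a workaround, not confirmed encoder behavior:

import AVFAudio

// Attempts to create an AAC recorder at the requested rate, falling back to 48 kHz
// if the encoder rejects it. The output URL is a placeholder.
func makeRecorder(at url: URL, sampleRate: Double = 96_000) throws -> AVAudioRecorder {
    func settings(for rate: Double) -> [String: Any] {
        [
            AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
            AVSampleRateKey: rate,
            AVNumberOfChannelsKey: 1,
            AVEncoderAudioQualityKey: AVAudioQuality.high.rawValue
        ]
    }
    do {
        return try AVAudioRecorder(url: url, settings: settings(for: sampleRate))
    } catch {
        // Assumed fallback: retry at a rate the AAC encoder accepts.
        return try AVAudioRecorder(url: url, settings: settings(for: 48_000))
    }
}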
1
0
245
Nov ’25
Reply to Wakes (CalendarDate), although related UI settings are off
Are you building a product for macOS where this is relevant? If so, can you explain more about the background to that? If not, I suspect that the Apple Developer Forums aren’t the right place for you, and I encourage you to pop on over to Apple Support Community, run by Apple Support. Share and Enjoy — Quinn “The Eskimo!” @ Developer Technical Support @ Apple let myEmail = eskimo + 1 + @ + apple.com
Topic: App & System Services SubTopic: Core OS Tags:
Nov ’25
Reply to Spatial Audio on iOS 18 doesn't work as intended
For completeness, this is the view I'm experimenting with: import AVFoundation import SwiftUI class EngineerPlayer { let audioEngine = AVAudioEngine() let playerNode = AVAudioPlayerNode() let environmentNode = AVAudioEnvironmentNode() init(_ url: URL) throws { let audioFile = try AVAudioFile(forReading: url) let mono = AVAudioFormat(standardFormatWithSampleRate: audioFile.processingFormat.sampleRate, channels: 1) let stereo = AVAudioFormat(standardFormatWithSampleRate: audioFile.processingFormat.sampleRate, channels: 2) audioEngine.attach(playerNode) audioEngine.attach(environmentNode) audioEngine.connect(playerNode, to: environmentNode, format: mono) audioEngine.connect(environmentNode, to: audioEngine.mainMixerNode, format: stereo) audioEngine.prepare() try audioEngine.start() environmentNode.renderingAlgorithm = .HRTFHQ playerNode.pointSourceInHeadMode = .mono playerNode.position = AVAudio3DPoint(x: 0, y: 2, z: 10) playerNode.scheduleFile(audioFile, at: nil, completionHandler: nil) } func updatePosition(_
Topic: Media Technologies SubTopic: Audio Tags:
Nov ’25
Reply to "Signing certificate" and post-installation assignment fail due to IOPCIPrimaryMatch
So, let me start with the causes of these failures: But when I do, the signing certificate status in Xcode shows as failed. Why is that? This post has more detail, but in basic terms, your app’s signing is created from two components: Your Entitlement.plist -> This is used to create the actual code signature attached to your app bundle. The provisioning profile -> This is a file that we've signed and which is then embedded inside your app’s bundle. (Note: not all signed apps have or need one of these, but all DEXTs do.) The system validates the entitlement by comparing the entitlement values contained in those two files (your app’s code signature and its embedded provisioning profile). That's why the profile (#2) is signed by us - it proves we want your app to have the entitlement. Xcode checks that your app is properly signed by comparing those values and reports this failure when they don't match. What I wanted to convey in my previous post was that I think I should inc
Topic: Code Signing SubTopic: Entitlements Tags:
Nov ’25
Will the new Automix feature in iOS 26 be available for third-party apps using MusicKit?
Hi everyone, We’re currently developing a music-based app using MusicKit, and we recently noticed that iOS 26 beta introduces a new “Automix” feature in the Apple Music app. This enables seamless DJ-style transitions between songs—beyond the standard crossfade functionality. We’re trying to understand: Will this Automix feature be accessible to third-party apps that use MusicKit? If not available in the initial iOS 26 release, is there a plan to expose it through public APIs in a future update? Is there any technical documentation, WWDC session, or roadmap info regarding Automix support via MusicKit? This functionality would be a significant enhancement for our app, especially for intelligent audio transitions and curated playlists. Thanks.
2
0
721
Nov ’25
Reply to How to check if a sandboxed app already has the access permission to a URL
@Etresoft You can use the isReadable value from URLResourceValues. Thank you, this is exactly the API I was looking for. However, it sounds like you're not approaching this from the right direction. I'm not trying to open random files; I just want to use it to prompt the user to obtain access permission if they don't have it for the file they tried to open (the file linked by an alias).
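For anyone who lands here later, a minimal sketch of that check, assuming the alias has already been resolved to a file URL (promptForAccess is a hypothetical placeholder for the app's own prompting flow):

import Foundation

// True when the sandboxed process can already read the file at this URL,
// based on the isReadable resource value.
func hasReadAccess(to url: URL) -> Bool {
    let values = try? url.resourceValues(forKeys: [.isReadableKey])
    return values?.isReadable ?? false
}

// Usage sketch:
// if !hasReadAccess(to: resolvedURL) {
//     promptForAccess(to: resolvedURL)  // hypothetical: e.g. an NSOpenPanel flow
// }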
Topic: UI Frameworks SubTopic: AppKit Tags:
Nov ’25
Reply to CLLocationManager didVisit no longer invoked reliably after iOS 26 updates
This sounds like a regression that our engineering teams need to investigate, as it might indicate an issue with iOS 26.x. We'd greatly appreciate it if you could open a bug report, include any logs and sample code or models that reproduce the issue, and post the FB number here once you do. We would also like a diagnostic log. It would be very helpful if you could go to https://developer.apple.com/bug-reporting/profiles-and-logs/ and follow the instructions for Location Services for iOS to install a logging profile on your device. Then reproduce the issue, follow the instructions at the above link to create a sysdiagnose, and attach that to the Feedback report as well. Bug Reporting: How and Why? has tips on creating a successful bug report. Important: For feedback related to a specific framework or API, select Developer Technologies & SDKs as your Topic, then select the specific technology and relevant OS. For feedback related to Xcode, App Store Connect, or other developer too
Nov ’25