Search results for "Popping Sound"
19,599 results found

Swift Playground preview issues
I am a high school teacher from China, currently teaching programming courses at my school. Several issues are preventing the course from being completed on schedule. As shown in the image, there are two problems: first, the preview has been broken ever since I updated to the latest software and system version, and I don't know how to resolve it; second, after renaming a Swift file, this window keeps popping up, even after I close it. Neither problem existed last year. How can I fix this? Thank you for your reply!
Replies: 3 · Boosts: 0 · Views: 59 · Activity: 2d
iOS BGProcessingTask + Background Upload Not Executing Reliably on TestFlight (Works in Debug)
We are facing an issue with BGTaskScheduler and BGProcessingTask when trying to perform a background audio-upload flow on iOS; the behavior is inconsistent between Debug builds and TestFlight (Release) builds. Our application records long audio files (up to 1 hour) and triggers a background upload using BGTaskScheduler, BGProcessingTaskRequest, a background URLSession (background with identifier), a URLSession background upload task, and AppDelegate.handleEventsForBackgroundURLSession. In Debug mode (Xcode → Run on device), everything works as expected: the BGProcessingTask executes, handleEventsForBackgroundURLSession fires, the background URLSession continues uploads reliably, and long audio files upload successfully even when the app is in the background or terminated. However, in TestFlight / Release mode, the system does not reliably launch the BGProcessingTask or Background URL
Replies: 1 · Boosts: 0 · Views: 33 · Activity: 2d
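For readers landing on this result: below is a minimal sketch of the BGTaskScheduler + BGProcessingTask registration and scheduling pattern the post describes. The task identifier, Info.plist entry, and upload hook are assumptions, not taken from the post, and the actual background URLSession work is elided.

import BackgroundTasks

// Hypothetical identifier; it must also be listed under the app's
// BGTaskSchedulerPermittedIdentifiers key in Info.plist.
let audioUploadTaskID = "com.example.audio-upload"

// Register once, early in app launch.
func registerAudioUploadTask() {
    _ = BGTaskScheduler.shared.register(forTaskWithIdentifier: audioUploadTaskID, using: nil) { task in
        guard let task = task as? BGProcessingTask else { return }
        task.expirationHandler = {
            // Clean up if the system reclaims the time budget.
        }
        // Kick off or resume the background URLSession upload here (elided),
        // then report the outcome.
        task.setTaskCompleted(success: true)
    }
}

// Submit a request whenever a recording is waiting to upload.
func scheduleAudioUpload() {
    let request = BGProcessingTaskRequest(identifier: audioUploadTaskID)
    request.requiresNetworkConnectivity = true   // the upload needs the network
    request.requiresExternalPower = false
    do {
        try BGTaskScheduler.shared.submit(request)
    } catch {
        print("Could not schedule the audio upload task: \(error)")
    }
}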
Testing Age Assurance in Sandbox Failed
According to Apple's documentation at https://developer.apple.com/documentation/storekit/testing-age-assurance-in-sandbox?language=objc, the testing steps and expected responses under "Test app consent revocation" are outlined as follows. To test the notification when a parent or guardian revokes access to your app on behalf of their child: (1) start with a Sandbox account; (2) from the Age Assurance settings, tap Revoke App Consent; (3) enter your app's Bundle ID (for example, com.example.bundle); (4) tap Revoke Consent to simulate the revocation; (5) confirm that the system displays "Notification Triggered" with the message "A notification will be sent to the developer server soon." I followed the steps exactly as described above, but during the fifth step, instead of seeing the prompt "A notification will be sent to the developer server soon," a pop-up dialog with only a confirmation button appeared. After clicking it, there was no further response, and our server did not receive any notification (neither f
Replies: 1 · Boosts: 0 · Views: 32 · Activity: 3d
Reply to [Core Bluetooth]The Application Playing a Notification Tone (AVAudioPlayer, System sounds) Should Automatically Route Audio to the Connected BLE accessory which uses HFP Profile
"How do we programmatically ensure that these notification tones are automatically and reliably routed to the connected HFP headset?" Take a look at Routing audio to specific devices in multidevice sessions, which covers the direct issue of audio routing. Kevin Elliott, DTS Engineer, CoreOS/Hardware
Topic: App & System Services · SubTopic: Core OS
Activity: 3d
[Core Bluetooth]The Application Playing a Notification Tone (AVAudioPlayer, System sounds) Should Automatically Route Audio to the Connected BLE accessory which uses HFP Profile
The iOS application is a Central Manager connected to a Bluetooth Low Energy (BLE) accessory that utilizes the Hands-Free Profile (HFP). When the application plays a notification tone (using AVAudioPlayer or System Sounds), the audio is incorrectly routed to the device's internal speaker instead of the active HFP headset. How do we programmatically ensure that these notification tones are automatically and reliably routed to the connected HFP headset?
Replies: 2 · Boosts: 0 · Views: 56 · Activity: 3d
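Not from the thread itself, but a common starting point when a play-and-record app needs its audio on an HFP accessory is opting the shared audio session in to Bluetooth routing. The sketch below makes that explicit; the category and mode choice are assumptions, not taken from the post, and the routing article referenced in the reply covers the topic in more depth.

import AVFoundation

// Minimal sketch: opt the shared session in to Bluetooth HFP routing.
// Category and mode are assumptions; adjust to the app's actual needs.
func configureSessionForHFP() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord,
                            mode: .default,
                            options: [.allowBluetooth])   // permits HFP routes
    try session.setActive(true)
}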
Reply to AlarmKit play sound only once
Thank you for your post. When you do not provide your own file to play and you use the default sound with sound: AlertConfiguration.AlertSound = .default, the sound will be the default sound. Is the sound looping until you stop it, and at what time? https://developer.apple.com/documentation/alarmkit/alarmmanager/alarmconfiguration/timer(duration:attributes:stopintent:secondaryintent:sound:) Do you have time to put this into a focused project, with the sound file you are using, and share it so people can give you ideas? If you're not familiar with preparing a test project, take a look at Creating a test project. Albert Pascual
  Worldwide Developer Relations.
Activity: 3d
Reply to iOS 26.1 PHPickerConfiguration.preselectedAssetIdentifiers doesn't select previous pictures in the PHPickerViewController
This sounds like a regression. Have you tried verifying that the identifiers are still valid when you try to use them again? You can do this by retrieving the assets using the identifiers with the fetchAssets(withLocalIdentifiers:options:) method. See the article on Fetching Assets for more details. If the identifiers are valid and you’re still encountering the issue, please file a bug report. Include minimal, reproducible sample code and share the Feedback ID here so I can pass it on to the appropriate engineering team.
Topic: UI Frameworks · SubTopic: UIKit
Activity: 3d
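A minimal sketch of the verification step the reply suggests; the stored identifiers below are placeholders for the values the app saved from earlier PHPickerResult.assetIdentifier values.

import Photos

// Placeholder identifiers; replace with the identifiers your app actually stored.
let storedIdentifiers = ["identifier-1", "identifier-2"]

// Fetch the assets for the stored identifiers, as the reply suggests.
let fetchResult = PHAsset.fetchAssets(withLocalIdentifiers: storedIdentifiers, options: nil)

if fetchResult.count != storedIdentifiers.count {
    // Some identifiers no longer resolve to assets, so the picker can't preselect them.
    print("Only \(fetchResult.count) of \(storedIdentifiers.count) identifiers are still valid")
}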
Is there an error with SpatialAudioCLI?
Hi everyone, I downloaded the source code EditingSpatialAudioWithAnAudioMix.zip from https://developer.apple.com/documentation/Cinematic/editing-spatial-audio-with-an-audio-mix, and when I ran one of the actions, named process, on the command line, the program crashed! From the source code, I found that the value of componentType is set to kAudioUnitType_FormatConverter:

// The actual `AudioUnit`.
public var auAudioMix = AVAudioUnitEffect()

init() {
    // Generate a component description for the audio unit.
    let componentDescription = AudioComponentDescription(
        componentType: kAudioUnitType_FormatConverter,
        componentSubType: kAudioUnitSubType_AUAudioMix,
        componentManufacturer: kAudioUnitManufacturer_Apple,
        componentFlags: 0,
        componentFlagsMask: 0)
    auAudioMix = AVAudioUnitEffect(audioComponentDescription: componentDescription)
}

But in the documentation at https://developer.apple.com/documentation/avfaudio/avaudiouniteffect/init(audiocomponentdescription:), it seems that componentType can not
Replies: 1 · Boosts: 0 · Views: 92 · Activity: 3d
playSoundFileNamed not working on Tahoe?
I have published a number of games that use SpriteKit for everything important. Since the release of macOS Tahoe, I've had a lot of end user reports saying that sound effects have stopped working in many (but not all) of my titles. I'm not doing anything unusual here – typical code is: sndGameOver = [SKAction playSoundFileNamed:@"Audio/GameOver.wav" waitForCompletion:YES]; Then, at the appropriate time: [self runAction:sndGameOver]; Has anyone else encountered this? The code still works fine on previous operating systems, and appears to be fine on iOS too. Has something changed in macOS Tahoe? I'm at a bit of a loss. There's nothing obviously different between the titles that do work and the titles that don't. Suggestions welcomed! Thanks
Replies: 5 · Boosts: 0 · Views: 988 · Activity: 3d
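For reference, here is the same preload-then-run pattern the post shows, written in Swift; the file name is taken from the post, while the scene class and method name are illustrative.

import SpriteKit

class GameScene: SKScene {
    // Preload the action once, as in the post above.
    let sndGameOver = SKAction.playSoundFileNamed("Audio/GameOver.wav", waitForCompletion: true)

    func gameOver() {
        // Run the preloaded action at the appropriate time.
        run(sndGameOver)
    }
}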
How to integrate Apple Immersive Video into the app you are developing.
Hello, Let me ask you a question about Apple Immersive Video. https://www.apple.com/newsroom/2024/07/new-apple-immersive-video-series-and-films-premiere-on-vision-pro/ I am currently considering implementing a feature to play Apple Immersive Video as a background scene in the app I developed, using 3DCG-created content converted into Apple Immersive Video format. First, I would like to know if it is possible to integrate Apple Immersive Video into an app. Could you provide information about the required software and the integration process for incorporating Apple Immersive Video into an app? It would be great if you could also share any helpful website resources. I am considering creating Apple Immersive Video content and would like to know about the necessary equipment and software for producing both live-action footage and 3DCG animation videos. As I mentioned earlier, I’m planning to play Apple Immersive Video as a background in the app. In doing so, I would also like to place some 3D models as RealityKit
Replies: 2 · Boosts: 0 · Views: 723 · Activity: 3d
Spatial Audio on iOS 18 doesn't work as intended
I’m facing a problem while trying to achieve spatial audio effects in my iOS 18 app. I have tried several approaches to get good 3D audio, but the effect never felt good enough or it didn’t work at all. What mostly troubles me is that the AirPods I have don’t recognize my app as one with spatial audio (in the audio settings it shows "Spatial Audio Not Playing"), so I guess my app isn't using its spatial audio potential. The first approach uses AVAudioEnvironmentNode with AVAudioEngine. Changing the position of the player, as well as the listener’s, doesn’t seem to change anything in how the audio plays. Here's how I initialize AVAudioEngine:

import Foundation
import AVFoundation

class AudioManager: ObservableObject {
    // important class variables
    var audioEngine: AVAudioEngine!
    var environmentNode: AVAudioEnvironmentNode!
    var playerNode: AVAudioPlayerNode!
    var audioFile: AVAudioFile?
    ...

    // Sound set up
    func setupAudio() {
        do {
            let session = A
Replies: 4 · Boosts: 0 · Views: 901 · Activity: 3d
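As a point of comparison, here is a minimal, self-contained sketch of the AVAudioEnvironmentNode graph the post describes; the node names, positions, format choice, and rendering algorithm are illustrative assumptions, since the post's own setup code is truncated above.

import AVFoundation

// Minimal environment-node graph; values are illustrative, not from the post.
final class SpatialAudioSketch {
    let engine = AVAudioEngine()
    let environment = AVAudioEnvironmentNode()
    let player = AVAudioPlayerNode()

    func start(fileURL: URL) throws {
        let file = try AVAudioFile(forReading: fileURL)

        engine.attach(environment)
        engine.attach(player)

        // Mono sources are required for 3D positioning; the environment node
        // then renders into the engine's main mixer.
        let monoFormat = AVAudioFormat(standardFormatWithSampleRate: file.processingFormat.sampleRate,
                                       channels: 1)
        engine.connect(player, to: environment, format: monoFormat)
        engine.connect(environment, to: engine.mainMixerNode, format: nil)

        // Pick a spatialization algorithm and position the source and the listener.
        player.renderingAlgorithm = .HRTFHF
        player.position = AVAudio3DPoint(x: 2, y: 0, z: -1)
        environment.listenerPosition = AVAudio3DPoint(x: 0, y: 0, z: 0)

        try engine.start()
        player.scheduleFile(file, at: nil)
        player.play()
    }
}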