Search results for "Popping Sound"

19,600 results found

Post · Replies · Boosts · Views · Activity

Reply to AVAudioSessionCategoryPlayback is not allowed while CallKit call is active
Thank you for the clear explanation. We must emphasize that, for a mission-critical Push-to-Talk (PTT) application, the perceived volume and clarity of the audio are fundamental user requirements. We currently achieve the necessary amplification and clarity by applying a custom gain factor to the PCM samples. Therefore, the ability for our application to apply a specific gain factor to PCM samples for amplification is essential. Could you please advise which framework or API Apple recommends for a third-party application to intercept and process the audio stream (specifically, to apply a gain factor to PCM samples) during an active CallKit call?
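For illustration, here is a minimal sketch of the kind of per-sample gain processing described above, assuming Float32 PCM data in an AVAudioPCMBuffer (the helper name is hypothetical, and this does not answer the CallKit routing question itself):

    import AVFoundation
    import Accelerate

    // Hypothetical helper: scale every Float32 sample in a PCM buffer by `gain`.
    func applyGain(_ gain: Float, to buffer: AVAudioPCMBuffer) {
        guard let channels = buffer.floatChannelData else { return }
        var factor = gain
        let frameCount = vDSP_Length(buffer.frameLength)
        for channel in 0..<Int(buffer.format.channelCount) {
            // vDSP_vsmul multiplies the whole channel in place by `factor`.
            vDSP_vsmul(channels[channel], 1, &factor, channels[channel], 1, frameCount)
        }
    }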
Topic: Media Technologies SubTopic: Audio
2w
UI tests blocked by “bash requesting screen access” popup on macOS 15

On macOS, I get a system popup when running UI tests in GitHub Actions saying: “bash” is requesting to bypass the system private window picker and directly access your screen and audio. How can I prevent these login and screen-access popups from appearing during automated UI tests? Is there an official setup or configuration for running IntelliJ UI tests in CI environments (macOS, Linux, Windows) that avoids such dialogs? My builds run on GitHub Actions VMs, so I can’t grant these permissions manually, and they block the tests.
Replies: 0 · Boosts: 0 · Views: 45 · 2w
Reply to Is it possible to update ModelContainer?
"So the injection is finished right now; although I changed the container in the singleton actor, that's the reason it won't affect anything."

Your assessment sounds about right. If your model container is a @State of the view, as shown in the API specification I linked above, and the view injects the container into a subview hierarchy, I'd expect that changing the container triggers an update of the subview with the new container.

"So it's not possible to save the model data from another ModelContext to a new ModelContext, is that right?"

Right. If you save a model instance with a model context and need to access the model instance from a second context, consider grabbing the persistentModelID of the instance and converting it to a new instance with the second context using model(for:).

Best,
—— Ziqiao Chen  Worldwide Developer Relations.
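To illustrate the persistentModelID round trip, here is a minimal sketch with a hypothetical Event model (not your actual types), assuming both contexts are backed by the same container:

    import SwiftData

    @Model
    final class Event {
        var title: String
        init(title: String) { self.title = title }
    }

    // Save in one context, then re-materialize the same instance in another
    // context by handing over only its persistentModelID.
    func copyAcrossContexts(first: ModelContext, second: ModelContext) throws -> Event? {
        let event = Event(title: "Demo")
        first.insert(event)
        try first.save()

        let id = event.persistentModelID          // the ID is safe to pass around
        return second.model(for: id) as? Event    // looked up again in the second context
    }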
2w
Can't add a new version for my app and can't upload from Xcode to the App Store

It's been over two weeks since I accepted the updated agreement, BUT when I am in App Store Connect trying to add a new version for my app, clicking the + sign still shows this pop-up: "Agreement Update: The Apple Developer Program License Agreement has been updated and needs to be reviewed. For more information, go to your account on the Apple Developer website." There is no way to bypass it, meaning no way for me to add a new version at all. I need to deploy a new version for some critical bug fixes, particularly to address compatibility issues with iOS 26. In Xcode, after archiving a new build and trying to upload it to either Distribution or TestFlight, it keeps failing and shows me the error "You don't have required contract for this operation." I called Apple Support last week and was told by a senior staff member that this happened to many developers (?) due to some maintenance issue and should resolve after Nov. 3rd; except the issue persists. I am so frustrated. :(
[Screenshot: exactly what happened in App Store Connect]
Replies: 0 · Boosts: 0 · Views: 33 · 2w
Reply to CoreML regression between macOS 26.0.1 and macOS 26.1 Beta causing scrambled tensor outputs
We're seeing the same issue on a couple of models we own for audio processing. Output diverges between 26.0.1 and 26.1. In our tests, a couple of models work as expected on CPU but produce corrupt/degraded data on the NPU/GPU. We would also like to know whether Apple is aware of this issue, as it is affecting not only our development but also our customer experience, since our models are shared with clients and used in production.
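As a diagnostic, restricting the model to the CPU and comparing its output against the default compute units is one way to confirm the divergence comes from the GPU/Neural Engine path. A minimal sketch (the model URL is a placeholder):

    import CoreML

    // Load the same compiled model twice: once CPU-only, once with all compute
    // units, so the two outputs can be compared for divergence.
    func loadForComparison(modelURL: URL) throws -> (cpuOnly: MLModel, all: MLModel) {
        let cpuConfig = MLModelConfiguration()
        cpuConfig.computeUnits = .cpuOnly

        let allConfig = MLModelConfiguration()
        allConfig.computeUnits = .all

        return (try MLModel(contentsOf: modelURL, configuration: cpuConfig),
                try MLModel(contentsOf: modelURL, configuration: allConfig))
    }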
Topic: Machine Learning & AI SubTopic: Core ML
2w
Not able to write AAC audio with a 96 kHz sample rate using AVAudioRecorder or Extended Audio File Services

Not able to record audio in AAC format with a 96 kHz sample rate using AVAudioRecorder or Extended Audio File Services when the input device provides 96 kHz audio. The recording settings used are:

    let settings: [String: Any] = [
        AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
        AVSampleRateKey: sampleRate,
        AVNumberOfChannelsKey: 1,
        AVEncoderAudioQualityKey: AVAudioQuality.high.rawValue
    ]

When I tried AVAudioEngine with AVAudioFile:

    guard let audioFile = try? AVAudioFile(forWriting: fileURL,   // file extension .m4a
                                           settings: fileSettings,
                                           commonFormat: AVAudioCommonFormat.pcmFormatFloat32,
                                           interleaved: interleaved) else { return }

I got the error:

    CodecConverterFactory.cpp:977  unable to select compatible encoder sample rate
    AudioConverter.cpp:1017  Failed to create a new in process converter -> from 1 ch, 96000 Hz, Float32 to 1 ch, 96000 Hz, aac (0x00000000) 0 bits/channel, 0 bytes/packet, 0 frames/packet, 0 bytes/frame, with status 1718449215
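One way to check what the system AAC encoder actually offers, which may explain the "unable to select compatible encoder sample rate" message, is to query kAudioFormatProperty_AvailableEncodeSampleRates. A sketch (the function name is mine):

    import AudioToolbox

    // Ask Core Audio which sample-rate ranges the AAC encoder supports.
    func availableAACEncodeSampleRates() -> [AudioValueRange] {
        var formatID: AudioFormatID = kAudioFormatMPEG4AAC
        let specifierSize = UInt32(MemoryLayout.size(ofValue: formatID))
        var dataSize: UInt32 = 0
        guard AudioFormatGetPropertyInfo(kAudioFormatProperty_AvailableEncodeSampleRates,
                                         specifierSize, &formatID, &dataSize) == noErr else { return [] }
        var ranges = [AudioValueRange](repeating: AudioValueRange(),
                                       count: Int(dataSize) / MemoryLayout<AudioValueRange>.stride)
        guard AudioFormatGetProperty(kAudioFormatProperty_AvailableEncodeSampleRates,
                                     specifierSize, &formatID, &dataSize, &ranges) == noErr else { return [] }
        return ranges   // each range's mMinimum/mMaximum is a supported sample rate
    }

If 96000 Hz does not fall inside any returned range, the converter failure above is expected rather than a configuration mistake.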
Replies: 1 · Boosts: 0 · Views: 205 · 2w
Reply to Correct SwiftData Concurrency Logic for UI and Extensions
Hi Ziqiao, your previous guidance to use a ModelActor and PersistentIdentifiers has successfully resolved all my data-race crashes. However, I'm left with one persistent UI-level race condition, specifically related to deletion.

My view structure: my setup is a standard master-detail pattern, which on iPhone (where I'm seeing this bug) acts like a NavigationStack. ScheduleView (master) uses @Query to fetch and display all events in a List; each row is a NavigationLink, and the view also has a swipe-to-delete action on its rows. EventDetailsView (detail) is the destination of the NavigationLink and contains a Delete button.

The problem is a dismiss() vs. @Query race. When I tap the Delete button in EventDetailsView, I call my actor (e.g., await databaseManager.deleteEvent(id: event.persistentModelID)) and then immediately call dismiss() to pop the view. The bug: when the app returns to ScheduleView, the row for the just-deleted event is still visibly active in the list for at least 5 seconds before it finally…
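The structure being described is roughly the following compressed sketch, with hypothetical names (Event, DatabaseManager); it only pins down the setup and is not a fix for the race:

    import SwiftUI
    import SwiftData

    @Model
    final class Event {
        var title: String
        var date: Date
        init(title: String, date: Date) { self.title = title; self.date = date }
    }

    @ModelActor
    actor DatabaseManager {
        func deleteEvent(id: PersistentIdentifier) throws {
            guard let event = self[id, as: Event.self] else { return }
            modelContext.delete(event)
            try modelContext.save()
        }
    }

    struct ScheduleView: View {                    // master: @Query-driven list
        @Query(sort: \Event.date) private var events: [Event]
        let databaseManager: DatabaseManager

        var body: some View {
            List(events) { event in
                NavigationLink(event.title) {
                    EventDetailsView(event: event, databaseManager: databaseManager)
                }
            }
        }
    }

    struct EventDetailsView: View {                // detail: delete, then pop
        @Environment(\.dismiss) private var dismiss
        let event: Event
        let databaseManager: DatabaseManager

        var body: some View {
            Button("Delete", role: .destructive) {
                Task {
                    try? await databaseManager.deleteEvent(id: event.persistentModelID)
                    dismiss()   // popping immediately is what exposes the stale row
                }
            }
        }
    }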
2w
Reply to Wi-Fi Aware demo pairing issue
It sounds like you’re building an accessory and you want to use Wi-Fi Aware to talk to that accessory from iOS. If so, go to Developer > Accessories, download Accessory Design Guidelines for Apple Devices, and review the Wi-Fi Aware chapter. Share and Enjoy — Quinn “The Eskimo!” @ Developer Technical Support @ Apple let myEmail = eskimo + 1 + @ + apple.com
2w
Real-time notification
I need to create a background notification that counts down time and has buttons to add or subtract time. Currently I'm developing in React Native and using Expo Go to build my app. I managed to display a simple notification, but I can't get it to work in real time so that, when the time is up, it plays a sound indicating that the break is over. How can I implement this feature?
[Screenshots: my application now / my goal]
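For reference, the native building block underneath this, which Expo's notifications module wraps as far as I know, looks roughly like the sketch below (identifiers and wording are placeholders), assuming a fixed break duration:

    import Foundation
    import UserNotifications

    // Schedule a local notification that fires with a sound when the break ends.
    func scheduleBreakOverNotification(after seconds: TimeInterval) {
        let content = UNMutableNotificationContent()
        content.title = "Break over"
        content.body = "Time to get back to work."
        content.sound = .default

        let trigger = UNTimeIntervalNotificationTrigger(timeInterval: seconds, repeats: false)
        let request = UNNotificationRequest(identifier: "break-timer",
                                            content: content,
                                            trigger: trigger)
        UNUserNotificationCenter.current().add(request)
    }

Notification actions (UNNotificationAction) would be the native counterpart of the add/subtract buttons.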
Replies: 1 · Boosts: 0 · Views: 57 · 2w
Reply to Creating a URL bookmark in macOS 26.1 of a Windows NTFS file share returns a bookmark with access to the local drive
Creating bookmarks to the root directory of a volume was broken in earlier versions of 26. It looks like a change was made, and that change broke NTFS. Good thing I haven't released yet; I need to double-check. My workaround assumed that creating the bookmark would fail; I didn't anticipate success with invalid data. I don't know if my workaround would work for you. I have a special case where I don't actually need to access the data: I just want to know the URL, and I can get the data I need without a security-scoped bookmark. And it sounds like I need to redo it anyway.
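For anyone following along, the round trip under discussion is roughly this sketch (assuming a folder URL the user has already granted access to, e.g. via NSOpenPanel); the report above is that for an NTFS share the resolved URL can end up granting access to the local drive instead:

    import Foundation

    // Security-scoped bookmark round trip on macOS.
    func roundTripBookmark(for url: URL) throws -> URL {
        let data = try url.bookmarkData(options: .withSecurityScope,
                                        includingResourceValuesForKeys: nil,
                                        relativeTo: nil)
        var isStale = false
        return try URL(resolvingBookmarkData: data,
                       options: .withSecurityScope,
                       relativeTo: nil,
                       bookmarkDataIsStale: &isStale)
    }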
Topic: App & System Services SubTopic: Core OS
2w
Reply to PWA video playback stopped working after updating iOS to 26.0.1
I'm having the same issue with a few of my PWAs. They work fine in Safari, but when installed as a homescreen app, audio playback won't start. It does work once if I restart the phone, but then stops working again if I close and reopen the app. Additionally, it seems that sequential playback doesn't work when the phone is locked. Instead of playing the next audio track, playback will stop until the app is re-focused, at which point playback will begin again.
Topic: Safari & Web SubTopic: General
2w
Reply to Unable to upload an app with ExtensionFoundation
Hmmm, that doesn’t sound right. While ExtensionKit on iOS has significant limitations, my understanding is that we do allow iOS apps to use extensions that are bundled within the app, and that’s what you’re trying to do. I’m gonna consult with Kevin about this and one of us will get back to you. Share and Enjoy — Quinn “The Eskimo!” @ Developer Technical Support @ Apple let myEmail = eskimo + 1 + @ + apple.com
Topic: App & System Services SubTopic: General
2w
Reply to No mic capture on iOS 18.5
We discovered that the iOS version is not the culprit. All models from the iPhone 14 series onward are affected by this problem, even on iOS 17. It must be some glitched interaction between a hardware and software feature. Affected models do not produce any audio, and magnitude logging from the C++ engine doesn't produce anything.

    static OSStatus mixerPlaybackCallack(void *inRefCon,
                                         AudioUnitRenderActionFlags *ioActionFlags,
                                         const AudioTimeStamp *inTimeStamp,
                                         UInt32 inBusNumber,
                                         UInt32 inNumberFrames,
                                         AudioBufferList *ioData)
    {
        AudioEngine *self = (AudioEngine *)inRefCon;
        uint32_t toget = inNumberFrames << 2;
        //printf("mixerPlaybackCallack: bus: %u, frames: %u\n", (uint32_t)inBusNumber, (uint32_t)inNumberFrames);
        if (inBusNumber < kMaxRemoteLines) {
            auto& audiobuffer = self->m_remoteBuffers[inBusNumber];
            int32_t len = 0;
            if (uint8_t *buf = audiobuffer.tail(len)) {
                int32_t tocopy = len > toget ? toget : len;
                memcpy(ioData->mBuffers[0].mData, buf, tocopy);
                audiobuffer.consume(tocopy);
                if (tocopy &…
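For what it's worth, a capture-side sanity check on an affected device could look like the sketch below (assuming an AVAudioSession-based audio unit graph like the callback above; the function name is mine). If no input route shows up here, the engine never receiving microphone data would be expected:

    import AVFoundation

    // Log the session's input state on an affected device.
    func logCaptureState() throws {
        let session = AVAudioSession.sharedInstance()
        try session.setCategory(.playAndRecord, mode: .voiceChat, options: [.allowBluetooth])
        try session.setActive(true)
        print("input available:", session.isInputAvailable)
        print("input channels:", session.inputNumberOfChannels)
        print("inputs:", session.currentRoute.inputs.map(\.portName))
        print("sample rate:", session.sampleRate)
    }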
2w