Search results for “Popping Sound”

19,744 results found


Reply to Linker nondeterminism (ld_new) involving branch islands
Hello. On the arm64 architecture, branch islands are needed only if the total size of the executable code in a binary exceeds 128 MB. How many islands are actually inserted depends on the exact code layout, and that is also a contributing factor to the potentially non-deterministic output. The output is supposed to be deterministic, but it sounds like you might be hitting some edge cases. It'd be great if you could file a feedback report with your sample so that we could try to reproduce it. You can add the -debug_snapshot option (-Wl,-debug_snapshot in Xcode's Other Linker Flags build setting) to create a minimal linker reproducer, which will be saved in a /tmp/*.ld-snapshot directory. One note: in Xcode 26 we've made significant underlying changes to how linker snapshots are generated, and there were cases where snapshots created by older ld versions weren't quite complete. So upgrading your toolchain might be helpful, but we can also start with a snapshot from Xcode 16.4.
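For an Xcode project the flag goes into Other Linker Flags exactly as described above; as a rough illustration only, a SwiftPM target could forward the same flag to the linker like this (the package and target names are hypothetical, and this is a sketch, not a recommended configuration):

```swift
// swift-tools-version:5.9
// Hypothetical Package.swift sketch: forward -debug_snapshot to ld so that it
// writes a /tmp/*.ld-snapshot reproducer, mirroring the -Wl,-debug_snapshot
// Other Linker Flags setting mentioned above.
import PackageDescription

let package = Package(
    name: "SnapshotExample",
    targets: [
        .executableTarget(
            name: "SnapshotExample",
            linkerSettings: [
                .unsafeFlags(["-Xlinker", "-debug_snapshot"])
            ]
        )
    ]
)
```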
Oct ’25
iMessages Deeplink App Switching for iOS 26.0
OK, so for some background: our app has a keyboard extension where we run a dictation service. Due to iOS limitations, this requires the user to press a button on the keyboard, which brings the user into our app to activate an audio session. Once the audio session has been activated, the app takes the user back to the original app they came from so they can continue using the keyboard and dictation service. The problem we're running into involves iOS 26.0 and the iMessages app. Whenever our app tries to switch back to the iMessages app using a deep link (specifically the messages:// URL), the iMessages app opens a new message compose sheet. This compose sheet replaces the view or message thread that the user was previously looking at, which we don't want. This behavior appears to happen only on iOS 26 and not on any of the previous iOS versions (tested up to iOS 18.6). We know that it should be possible to bring the user back to the messages app without opening this new compose sheet, because si…
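As a rough sketch of the switch-back step being described (the messages:// scheme is from the post; the surrounding code and the function name are illustrative only):

```swift
import UIKit

// Illustrative sketch: after the audio session has been activated in the host
// app, jump back to Messages via its URL scheme. On iOS 26 the poster reports
// this lands on a new compose sheet instead of the previous thread.
func returnToMessages() {
    guard let url = URL(string: "messages://") else { return }
    UIApplication.shared.open(url, options: [:]) { success in
        print("Opened Messages:", success)
    }
}
```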
3 replies · 0 boosts · 199 views · Oct ’25
Mute behavior of Volume button on AVPlayerViewController iOS 26
On older iOS versions, when the user taps the Mute/Volume button on AVPlayerViewController to unmute, the system restores the device volume to the level it had before the user muted. On iOS 26, when the user taps the unmute button on screen, the volume starts from 0 instead of being restored (although it is still restored if the user unmutes by pressing the physical volume buttons). As I understand it, the volume bar/button on AVPlayerViewController is an MPVolumeView, which I cannot control, so this is system behavior. But I have received complaints that this is a bug. I could not find documentation describing this change in the Mute button behavior, and I need some basis to explain the situation. Thank you.
0 replies · 0 boosts · 155 views · Oct ’25
Can you include an `alert` with a sound within the `end` event?
Q1. Can you place a sound on an end event? That doesn't seem to work for us. Additionally: Q2. Is there any way, after you send the end event, to have the Live Activity remain in the Dynamic Island until the dismissal date? Currently, when an end event is sent, the activity is abruptly removed from the Dynamic Island without any sound. Users are confused until, minutes or hours later, they see their Lock Screen.
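For reference on the dismissal-date part of Q2, here is a rough sketch using the local ActivityKit API (the post itself is about the remote end event; the function and parameter names below are illustrative):

```swift
import ActivityKit

// Illustrative sketch: end a Live Activity but ask the system to keep it
// visible until a later dismissal date rather than removing it immediately.
func endActivity<A: ActivityAttributes>(_ activity: Activity<A>,
                                        finalState: A.ContentState,
                                        dismissalDate: Date) async {
    let content = ActivityContent(state: finalState, staleDate: nil)
    await activity.end(content, dismissalPolicy: .after(dismissalDate))
}
```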
0 replies · 0 boosts · 150 views · Oct ’25
Reply to Apple-hosted managed assets
I did some further investigation; here is what I did, step by step:
1. Created the manifest file:
   {
     "assetPackID": "assettest",
     "downloadPolicy": { "onDemand": {} },
     "fileSelectors": [ { "file": "audio/test_audio.mp3" } ],
     "platforms": [ "iOS" ]
   }
2. Created the package assettest and uploaded it to App Store Connect via Transporter (verified, delivered, ready for internal testing):
   xcrun ba-package Manifest.json -o assettest.aar
3. Created a new target ManagedAssetPack (Apple-Hosted). The BackgroundDownloadHandler file is:
   import BackgroundAssets
   import ExtensionFoundation
   import StoreKit

   @main
   struct DownloaderExtension: StoreDownloaderExtension {
       func shouldDownload(_ assetPack: AssetPack) -> Bool {
           // Use this method to filter out asset packs that the system would otherwise
           // download automatically. You can also remove this method entirely if you
           // just want to rely on the default download behavior.
           return true
       }
   }
4. Created an app group and successfully associated the main app target and the Background Assets target (I have 3 flavors in the main app, Signing & Capab…
Oct ’25
Reply to CallKit does not activate audio session with higher probability after upgrading to iOS 18.4.1
@DTS Engineer Sure, will do. We have re-implemented the workaround and plan to roll it out to production in two weeks. Before proceeding, could you please help double-check whether the workaround is implemented as expected in our lab app? Thank you for your assistance. The sysdiagnose log file sysdiagnose_2025.10.28_18-28-20_0800_iPhone-OS_iPhone_23A355.tar.gz has been uploaded to the ticket. FB20789841 (CallKit does not activate the audio session; the issue rate increased on iOS 26.)
Topic: App & System Services · SubTopic: General
Oct ’25
Reply to Performance issues when using the Network API used to create a web server
Thanks for the extra. Another question:

[quote='805044021, chevalierpg, /thread/805044, /profile/chevalierpg']
there is no issue when using WiFi
[/quote]

I’d like to clarify what you mean by “no issue” here. In general, I’d expect Wi-Fi to be slower than Ethernet, but it doesn’t seem like it’d be slow enough to mask this issue. So imagine these scenarios:

Server   | Network  | Speed
-------- | -------- | -----
Dispatch | Ethernet | A
Dispatch | Wi-Fi    | B
Network  | Ethernet | C
Network  | Wi-Fi    | D

In both servers I’d expect to see a slight performance loss from Ethernet to Wi-Fi, but it sounds like you’re seeing a much greater performance loss in the Network framework case. That is, the C:D ratio is wildly different from the A:B ratio. Is that right?

Share and Enjoy — Quinn “The Eskimo!” @ Developer Technical Support @ Apple
let myEmail = eskimo + 1 + @ + apple.com
Topic: App & System Services · SubTopic: Core OS
Oct ’25
Reply to PushToTalk
I found the problem.
1. The recording operation requires pressing and holding the Bluetooth button to start recording; releasing the button stops recording.
2. While the Bluetooth device button is held down, the phone continuously receives the data sent by the Bluetooth device, and the audio recorded during this time is noise (the app is running in the background the whole time).
3. However, when the app is running in the foreground and the same recording operation is repeated, there is no noise.
4. If I press and release the Bluetooth device button to start recording, wait, and then press and release again to stop recording, the recording is normal and there is no noise.
5. So how can I avoid changing the audio session mode during a long press (a long press continuously receives the data sent by the Bluetooth device)?
6. How can I keep the audio mode it had at the beginning?
Topic: Media Technologies · SubTopic: Audio
Oct ’25
Reply to TCC Permission Inheritance Failure: Swift Parent -> Python Child
[quote='805245021, sonnylife, /thread/805245, /profile/sonnylife']
our core monitoring logic is in a Python daemon.
[/quote]

I’d like to clarify what you mean by this. On macOS we generally use the term daemon to mean a launchd daemon, that is, something that launchd runs in the global context, usually as the result of a property list file in /Library/LaunchDaemons [1]. However, it sounds like you’re using it in the more general sense of a program that runs in the background.

So, how is this program actually launched? You mentioned it’s a child process, so presumably you’re using fork / exec*, or posix_spawn, or something layered on top of those. In which case, how is the parent process launched?

Share and Enjoy — Quinn “The Eskimo!” @ Developer Technical Support @ Apple
let myEmail = eskimo + 1 + @ + apple.com

[1] Or installed as a daemon via SMAppService.
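For illustration of the kind of parent-to-child launch being asked about, here is a rough sketch of a Swift parent spawning a Python child with Foundation's Process, which is layered on top of posix_spawn (the script path and the function name are hypothetical):

```swift
import Foundation

// Illustrative only: a Swift parent process launching a Python child via
// Process (built on posix_spawn), matching the fork/exec* scenario discussed
// above. The executable and script paths are placeholders.
func launchPythonChild() throws -> Process {
    let child = Process()
    child.executableURL = URL(fileURLWithPath: "/usr/bin/python3")
    child.arguments = ["/path/to/monitor.py"]   // hypothetical script path
    try child.run()
    return child
}
```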
Oct ’25
Reply to iOS App Crashes after install but not when running from Xcode
OK, thank you. So I do have an Enterprise account; on the device where I installed the app by hand, I do have the trust setting, and it's trusted. The app will launch and exit and just sit in the background. To give you a sense of scale, we deploy the application via MDM to our fleet, which is fewer than 10,000 devices. We have been seeing devices more regularly where the app launches and exits right after showing the splash screen.

I think the next step here is to reproduce the problem, collect a sysdiagnose (make sure you don't reboot the device until you've collected the log), then take a look at the console log to see what happened. I don't know what you'll find, but the device WILL log any time it chooses not to run, or can't run, an app.

This also seems to occur on newly enrolled devices. FYI, the devices are all supervised. One last data point: if I sign the app as Ad Hoc using another, non-enterprise developer account, the app installs (did not use MDM), launches, and runs fine.

Sure. All of this sounds like a problem with the enterpr…
Topic: App & System Services · SubTopic: Core OS
Oct ’25
Reply to PushToTalk
3: Use a Bluetooth device to call up the microphone for audio recording and write the audio data to the file.

What is the “Bluetooth device” here? Is it: (1) a standard, classic Bluetooth, consumer headset (A2DP/HFP)? To support these, all you need to do is enable accessory events using setAccessoryButtonEventsEnabled(...), at which point the PTT framework will start delivering events through the standard delegate methods. The main issue to be aware of here is that the entire process that makes this work is inherently a bit weird and can look somewhat buggy. For example, many A2DP head units automatically send play on connect, which the PTT framework will convert to a start transmission. The problem here is that the system doesn't really have any way to know exactly what kind of device it's dealing with (for example, differentiating between a headset and a head unit), which means any attempt to really fix one edge case ends up breaking other devices. Two points to all of this: Your app is go…
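As a rough sketch of the call mentioned above (the surrounding code, the parameter list, and the function name enableAccessoryEvents are assumptions on my part, not copied from the documentation; check the PushToTalk framework reference for the exact signature):

```swift
import PushToTalk

// Sketch only: opt an existing PTT channel in to accessory (Bluetooth button)
// events so the framework delivers them via the standard delegate callbacks.
// The parameter list below is assumed, not taken from the docs.
func enableAccessoryEvents(manager: PTChannelManager, channelUUID: UUID) {
    manager.setAccessoryButtonEventsEnabled(true, channelUUID: channelUUID) { error in
        if let error {
            print("Failed to enable accessory button events: \(error)")
        }
    }
}
```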
Topic: Media Technologies · SubTopic: Audio
Oct ’25
Prevent SwiftUI from stopping UI rendering when the window loses focus
Hello, I am writing an audio utility, with a typical audio track player, in SwiftUI for macOS 26. My current problem is that SwiftUI stops rendering the main window UI when the window loses focus. This is a problem since even clicking on the app's menu bar makes the window lose focus, and the timer, time cursor, and all animations of the audio piece stop. All background services, audio, timers, and the model continue running (even though there is some crackling on the switch). Once the window regains focus, the animations continue and skip to the current state. I have read that SwiftUI optimizes macOS like iOS and disables the UI run loop, but there must be a way to disable this behavior, since this is obviously not the case for most Mac apps. Is there a solution with either SwiftUI or involving AppKit?
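As a minimal sketch of the kind of setup being described (all names here are illustrative, not the poster's code): a timer-driven playback cursor in a macOS SwiftUI window, i.e. the sort of view that reportedly stops updating while the window is not focused.

```swift
import SwiftUI

// Illustrative sketch: a timer-driven time readout and progress cursor.
// The poster reports that views like this pause while the window is not key.
struct PlaybackCursorView: View {
    @State private var elapsed: TimeInterval = 0
    private let timer = Timer.publish(every: 0.05, on: .main, in: .common).autoconnect()

    var body: some View {
        VStack {
            Text(String(format: "%.2f s", elapsed))
            ProgressView(value: min(elapsed / 180, 1))   // e.g. a 3-minute track
        }
        .padding()
        .onReceive(timer) { _ in
            elapsed += 0.05
        }
    }
}
```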
1 reply · 0 boosts · 155 views · Oct ’25
Reply to Copying files using Finder and Apple Events
Okay, I got it working exactly as I wanted, so I'd like to share the result with everyone participating here, and also with others who might find this while searching for similar content. The first stumbling block was making sure the application can actually send Apple Events in the first place. In that regard, I admit I should have listened better to @Etresoft, and I apologize to him for not listening more carefully and for too easily dismissing his remark about the hoops required so that the app can actually send Apple Events. It turned out that, even though the application is NOT sandboxed, the com.apple.security.automation.apple-events entitlement is mandatory in the entitlements file. I hadn't expected it to be necessary even for a NON-sandboxed app, but it is. From my experience executing AppleScript from another, sandboxed, application, I remember that defining com.apple.security.automation.apple-events in the entitlements file and NSAppleEventsUsageDescription in the Info.plist file always go together, and that's the case here too. S…
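As an illustrative sketch only (not the poster's code): sending an Apple Event to Finder from Swift via NSAppleScript to copy a file, which is the kind of call that requires the com.apple.security.automation.apple-events entitlement plus an NSAppleEventsUsageDescription string described above. The paths and the function name are hypothetical.

```swift
import Foundation

// Sketch: ask Finder to duplicate a file into a folder via an Apple Event.
// Requires the apple-events entitlement and usage description mentioned above.
func copyWithFinder(source: String, destinationFolder: String) {
    let script = """
    tell application "Finder"
        duplicate (POSIX file "\(source)" as alias) to (POSIX file "\(destinationFolder)" as alias)
    end tell
    """
    var error: NSDictionary?
    if let appleScript = NSAppleScript(source: script) {
        appleScript.executeAndReturnError(&error)
    }
    if let error {
        print("Apple Event failed: \(error)")
    }
}
```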
Oct ’25