Search results for "Popping Sound": 19,352 results found


Reply to Split tunnel w/o changing route table
Hi Quinn, Thank you for the reply that clarifies my 'hallucination'. I know the whole thing sounds a little bit strange. However, on Linux (Ubuntu) we have ip rule, which is independent of the route table; on Windows, there is the Windows Filtering Platform. To be honest I haven't looked into it yet, but it supposedly can filter packets and redirect them into a different TUN interface without changing the route table. If macOS has two of everything (I know it doesn't, xD), how could it not have a way to control packet routing manually? Do we have any alternative at all? For example, configuring an anchor in pf.conf, as in the sketch below. I'm also a little bit curious about other VPNs' split tunneling features. Do they not exist, or not work, on macOS at all?
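Here's a rough, untested sketch of the kind of pf.conf anchor I have in mind (the anchor name, tunnel interface, gateway, and subnet are all made up, and I don't know whether route-to even works with utun interfaces on macOS):

# /etc/pf.anchors/com.example.splittunnel (hypothetical)
# Steer traffic for one subnet out the VPN tunnel; leave everything else alone.
pass out quick route-to (utun2 10.8.0.1) inet from any to 203.0.113.0/24 keep state

# Referenced from /etc/pf.conf:
anchor "com.example.splittunnel"
load anchor "com.example.splittunnel" from "/etc/pf.anchors/com.example.splittunnel"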
Jun ’25
FairPlay HLS Downloaded Asset Fails to Play on First Attempt When Offline, Works on Retry
Hello, We're seeing an intermittent issue when playing back FairPlay-protected HLS downloads while the device is offline. Assets are downloaded using AVAggregateAssetDownloadTask with FairPlay protection. After download, asset.assetCache.isPlayableOffline == true.

On the first playback attempt (offline), ~8% of downloads fail. Retrying playback always works. We recreate the asset and player on each attempt. During the playback setup, we try to load variants via:

try await asset.load(.variants)

This call sometimes fails with:

Error Domain=NSURLErrorDomain Code=-1009 "The Internet connection appears to be offline." UserInfo={NSUnderlyingError=0x105654a00 {Error Domain=NSURLErrorDomain Code=-1009 "The Internet connection appears to be offline." UserInfo={NSDescription=The Internet connection appears to be offline.}}, NSErrorFailingURLStringKey=file:///private/var/mobile/Containers/Data/Application/2DDF9D7C-9197-46BE-8690-C23EE75C9E90/Library/com.apple.UserManagedAssets.XVvqfh/Baggage_9DD4E2D3F9C0E68F.movpkg/, NSErro
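The retry workaround we're shipping looks roughly like this (a sketch; loadVariantsWithRetry is our own helper name):

import AVFoundation

// Retry the variant load once with a fresh asset, since the first offline
// attempt intermittently fails with NSURLErrorDomain -1009.
func loadVariantsWithRetry(url: URL) async throws -> [AVAssetVariant] {
    do {
        return try await AVURLAsset(url: url).load(.variants)
    } catch {
        // Recreate the asset and try once more; in our testing the retry succeeds.
        return try await AVURLAsset(url: url).load(.variants)
    }
}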
1 reply · 0 boosts · 147 views
Jun ’25
Reply to When DHCP is used, the Network Extension will cause the machine to fail to obtain an IP address
We’re currently tracking an issue that sounds very similar to the one you’re reporting here, namely that enabling a transparent proxy causes DHCP problems after you disconnect a network interface (r. 150505789). This is not fixed in the latest public release of macOS (15.5) but, as always, I encourage you to re-test on new beta releases as we seed them. Share and Enjoy — Quinn “The Eskimo!” @ Developer Technical Support @ Apple let myEmail = eskimo + 1 + @ + apple.com
Jun ’25
Reply to How can I bundle resources along with my launch agent?
I don't think SharedSupport is the appropriate place. The legacy documentation says that this location is for additional non-critical resources that do not impact the ability of the application to run. Unfortunately, I think the legacy migration documentation is wrong because it explicitly says to use the Resources folder for executable code. Don't do that. Here is the current documentation.

None of that is really what you're asking about, but I think it leads up to it. Typically, launch agents are single-file executables.

I'm not familiar with the taskinfo tool or its output. But what it says makes sense. A Launch Agent would run with a user space UI role. That's how Launch Agents work. That's not the same as a Launch Daemon, which runs as root and does not require a login session.

When I look at how most other apps handle these things, what you're describing sounds more like a Login Item. A login item would live in Contents/Library/LoginItems. Of all the apps I have installed that contain items in Co
Jun ’25
Reply to Network Extension – Delayed Startup Time
However, that's really just a guess. Are you able to reproduce this yourself? Or are you just going on logs returned from this user?

I'm unable to reproduce it; all the above info is from the user's logs.

That is, the system thinks that the tunnel is connecting, so it can't act on the connection immediately. Eventually the first connection attempt times out and then it connects again.

It sounds reasonable, but from the logs, the extension isn't running. Is there any way to solve or detect such cases? Is there any data I can ask the user for to help understand what happened?
Jun ’25
Reply to SIGABORT with ExtAudioFileWrite and .m4a file
Hi Joël, Just to follow up on this issue, I found that the value your code is passing for the inNumberFrames argument to ExtAudioFileWrite is actually the number of frames multiplied by the number of channels. It should be only the number of frames; do not multiply this by the number of channels, and you should find that it solves the issue.

The term frame is sometimes confusing as it's not always clear whether a frame spans all channels or just a single channel. It often depends on the context. However, a good rule of thumb for APIs such as ExtAudioFileWrite, where a frame count is passed along with an audio buffer list, is that the number of frames multiplied by the mBytesPerFrame field of the audio stream basic description should equal the mDataByteSize of the audio buffer.

Thanks, - rhymu
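P.S. A quick worked example of that rule of thumb, assuming interleaved 16-bit stereo PCM (the numbers are made up for illustration):

import AudioToolbox

let channelCount: UInt32 = 2
let bytesPerSample: UInt32 = 2                     // 16-bit samples
let bytesPerFrame = channelCount * bytesPerSample  // one frame spans all channels: 4 bytes
let totalSamples: UInt32 = 1024                    // samples counted across both channels
let frameCount = totalSamples / channelCount       // 512; pass this to ExtAudioFileWrite, not 1024

// Sanity check before calling ExtAudioFileWrite(file, frameCount, &bufferList):
// frameCount * asbd.mBytesPerFrame must equal the buffer's mDataByteSize.
assert(frameCount * bytesPerFrame == totalSamples * bytesPerSample)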
Topic: Media Technologies · SubTopic: Audio
Jun ’25
Reply to Push to talk channelManager(_:didActivate:) doesn't get called
I am implementing the new Push to talk framework and I found an issue where channelManager(_:didActivate:) is not called after I immediately return a non-nil activeRemoteParticipant from incomingPushResult. I have tested it and it could play the PTT audio in foreground and background. This issue only occurs when I join the PTT channel from the app foreground, then kill the app. The channel gets restored via channelDescriptor(restoredChannelUUID:). After the channel gets restored, I send a PTT push. I can see that my device is receiving the incomingPushResult and returning the activeRemoteParticipant, and the notification panel shows that A is speaking, but channelManager(_:didActivate:) never gets called.

Having looked at multiple PTT apps and MANY CallKit apps (which use the same underlying audio infrastructure), the only reason I've ever seen this happen is that your app previously activated (or tried to activate) the audio session itself. The system cannot activate an already active session.
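To make that concrete, the pattern the system expects looks like this (a minimal sketch: configure the session, but never activate it yourself):

import AVFoundation

// Configure the audio session up front, but leave activation to the
// PushToTalk framework: it activates the session and then calls
// channelManager(_:didActivate:).
func configureAudioSession() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord, mode: .voiceChat)
    // Deliberately no session.setActive(true) here; an app-side activation
    // is exactly what prevents the didActivate callback.
}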
Topic: App & System Services · SubTopic: General
Jun ’25
Push to talk channelManager(_:didActivate:) doesn't get called
I am implementing the new Push to talk framework and I found an issue where channelManager(_:didActivate:) is not called after I immediately return a non-nil activeRemoteParticipant from incomingPushResult. I have tested it and it could play the PTT audio in foreground and background. This issue only occurs when I join the PTT channel from the app foreground, then kill the app. The channel gets restored via channelDescriptor(restoredChannelUUID:) (sketched below). After the channel gets restored, I send a PTT push. I can see that my device is receiving the incomingPushResult and returning the activeRemoteParticipant, and the notification panel shows that A is speaking, but channelManager(_:didActivate:) never gets called, resulting in no audio being played. Rejoining the channel fixes the issue, and reopening the app also seems to fix it.
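For reference, the restoration path is just this (a minimal sketch; the class and channel name are placeholders):

import PushToTalk
import UIKit

// Minimal restoration delegate: hand back a descriptor for the channel
// the system is restoring.
final class ChannelRestorer: NSObject, PTChannelRestorationDelegate {
    func channelDescriptor(restoredChannelUUID channelUUID: UUID) -> PTChannelDescriptor {
        // Real code would look up the stored channel for this UUID.
        PTChannelDescriptor(name: "My Channel", image: nil)
    }
}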
1 reply · 0 boosts · 102 views
Jun ’25
Reply to Technical Inquiry Regarding DriverKit USB Serial Communication Issues on iPadOS
App Stops Functioning After Repeated Builds
When I first build and run the sample code without any modifications, it works as expected. However, after making changes and running the app repeatedly on the iPad, it eventually reaches a state where the app stops functioning completely — no logs are printed, and device communication fails.

So, my first and strongest recommendation is that you stop using the iPad and shift your development efforts entirely to the Mac, even if your final target is exclusively iPadOS. Particularly in early development, ALL iPadOS does is make EVERYTHING about DEXT development more difficult. More specifically:

The build/test/debug cycle is slower.
The lack of critical tools like IORegistryExplorer.app makes it very difficult to determine exactly what's going on.
The logging infrastructure is more cumbersome.
Because of how opaque iPadOS is vs. macOS, fixing issues is unnecessarily cumbersome and complex.

Putting all of that another way: A DEXT that is not fully functional on macOS will
Topic: App & System Services · SubTopic: Drivers
Jun ’25
Reply to Alert structure and playing sound
I realized I never got back to this post to show what I ended up with. Like I mentioned before, I was thinking of putting the audio player in shared application data. It looks like this ...

import SwiftUI
import Observation
import os
@preconcurrency import AVFoundation

@main
struct Chain_TimerApp: App {
    @State private var appData = ApplicationData.shared

    var body: some Scene {
        WindowGroup {
            ContentView()
                .environment(appData)
        }
    }
}

@Observable
class ApplicationData: @unchecked Sendable {
    var timers: [TimerData] = []
    var timerRunningStates: [UUID: Bool] = [:]
    var isSerial: Bool = false
    var audioData: AudioData
    let logger = Logger(subsystem: Bundle.main.bundleIdentifier ?? "Chain Timer", category: "ApplicationData")

    static let shared: ApplicationData = ApplicationData()
    . . .

For the AudioData structure:

struct AudioData {
    var audioPlayer: AVAudioPlayer
    var sound: String = "classicAlarm"
    var logger: Logger

    init(logger: Logger) {
        self.logger = logger
        audioPlayer = AVAudioPlayer()
    }

    mutating func setSo
Topic: UI Frameworks · SubTopic: SwiftUI
Jun ’25
Alert structure and playing sound
Hello, I'm working on a SwiftUI iOS app that shows a list of timers. When a timer is up I pop up an Alert struct. The user hits OK to dismiss the alert. I am trying to include an alarm sound using AVFoundation. I can get the sounds to play if I change the code to play on a button click, so I believe I have the URL path correct. But I really want it to play during the alert pop-up. I have not been able to find examples where this is done using an alert, so I suspect I need a custom view, but thought I'd try the alert route first. Anyone try this before?

@State var audioPlayer: AVAudioPlayer?

.alert(isPresented: $showAlarmAlert) {
    playSound() // Calls AVFoundation
    return Alert(title: Text("Time's Up!"))
}

func playSound() {
    let alertSoundPath = Bundle.main.url(forResource: "classicAlarm", withExtension: "mp3")!
    do {
        audioPlayer = try AVAudioPlayer(contentsOf: alertSoundPath)
        audioPlayer?.play()
    } catch {
        appData.logger.debug("Error playing sound: \(alertSoundPath)")
    }
}
4 replies · 0 boosts · 79 views
Jun ’25
Reply to Assistance Needed with Enabling Speech Recognition Entitlement for iOS App
Subject: Clarification on Speech Recognition Capability Requirement for iOS

Hi Quinn, The Eskimo,

Thank you for your reply, and I really appreciate your time. To clarify, I was referring to Apple's official documentation, including:

Asking Permission to Use Speech Recognition
https://developer.apple.com/documentation/speech/asking-permission_to_use_speech_recognition

Recognizing Speech in Live Audio
https://developer.apple.com/documentation/speech/recognizing_speech_in_live_audio

While these documents don't explicitly mention the need to enable the Speech Recognition capability in the Developer Portal, I've come across several trusted sources that do suggest it's required for full and stable functionality. For example:

Apple Developer Forum: Thread discussing Speech Framework entitlement
https://developer.apple.com/forums/thread/116446

Stack Overflow: Speech recognition capability and entitlement setup
https://stackoverflow.com/a/43084875

Both of these sources explain that enabling the Speech Recogni
Jun ’25