Search results for

Popping Sound

19,350 results found

Post

Replies

Boosts

Views

Activity

The menu can't be shown in a background process on macOS 26 (beta)

After I upgraded to macOS 26 (beta), my program caused the system to pop up a window, as shown in the picture below. My application is a process with only a tray icon. I found that my tray icon is not displayed in the current version, even though I clicked the Always Allow button. Here are my questions: 1. Will this behavior remain the same in the official release? 2. How can I create a command-line process that displays only a system tray icon (no main window), like Alfred?
2
0
71
Jul ’25
Reply to Notification Service Extension and the main thread
Question, if I am writing async code in the notification service extension, I understand it terminates after 30 seconds. Correct, though I always recommend that anyone setting up things like timeouts use a shorter value just in case. So I'd probably build around ~25s, not 30s. If I want to wait until these async methods finish before calling the content handler, I believe an option I have is to use dispatch groups. However, I am open to other solutions if there are better options. What are you actually waiting on? In general, I've become very nervous anytime I see code that uses dispatch groups because they seem to be used as a slightly awkward band-aid trying to make something work that doesn't really want to work. Case in point here, the main reason an NSE would be waiting is network activity, in which case the simpler solution would be to simply set the right timeout on that network activity. Having said that.... My question is, if I use dispatch groups, is there any issue in using the main queue here? Or
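To make the advice above concrete, here is a minimal sketch of an NSE that budgets its work well under the ~30 s system limit by putting the timeout on the network request itself rather than on a dispatch group. The enrichment URL and the use of URLSession are hypothetical stand-ins for whatever the extension actually waits on.

```swift
import UserNotifications

final class NotificationService: UNNotificationServiceExtension {
    private var contentHandler: ((UNNotificationContent) -> Void)?
    private var bestAttempt: UNMutableNotificationContent?

    override func didReceive(_ request: UNNotificationRequest,
                             withContentHandler contentHandler: @escaping (UNNotificationContent) -> Void) {
        self.contentHandler = contentHandler
        bestAttempt = request.content.mutableCopy() as? UNMutableNotificationContent

        // Budget well under the ~30 s system limit, per the advice above.
        let config = URLSessionConfiguration.ephemeral
        config.timeoutIntervalForRequest = 20   // seconds; leaves headroom
        let session = URLSession(configuration: config)

        // Hypothetical enrichment endpoint.
        let url = URL(string: "https://example.com/enrich")!
        session.dataTask(with: url) { [weak self] data, _, _ in
            guard let self, let content = self.bestAttempt else { return }
            if let data, let body = String(data: data, encoding: .utf8) {
                content.body = body
            }
            self.contentHandler?(content)   // deliver whatever we have
        }.resume()
    }

    // Called by the system shortly before the time limit expires.
    override func serviceExtensionTimeWillExpire() {
        if let content = bestAttempt { contentHandler?(content) }
    }
}
```

With the timeout on the request, the content handler is always called from the session's delegate queue before the system deadline, and no dispatch group is needed.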
Jun ’25
Method to capture voice input when using CPVoiceControlTemplate
In my navigation CarPlay app I need to capture voice input and process it into text. Is there a built-in way to do this in CarPlay? I did not find one, so I used the following, but I am running into issues where the AVAudioSession will throw an error when I try to set active to false after I have captured the audio. public func startRecording(completionHandler: @escaping (_ completion: String?) -> ()) throws { // Cancel the previous task if it's running. if let recognitionTask = self.recognitionTask { recognitionTask.cancel() self.recognitionTask = nil } // Configure the audio session for the app. let audioSession = AVAudioSession.sharedInstance() try audioSession.setCategory(.record, mode: .default, options: [.duckOthers, .interruptSpokenAudioAndMixWithOthers]) try audioSession.setActive(true, options: .notifyOthersOnDeactivation) let inputNode = self.audioEngine.inputNode // Create and configure the speech recognition request. self.recognitionRequest = SFSpeechAudioBufferR
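A common cause of the error described here is deactivating the session while the engine is still running. A hedged teardown sketch (the property names mirror those in the excerpt, but are otherwise assumptions): stop producing audio before calling setActive(false).

```swift
import AVFoundation
import Speech

// Hypothetical teardown for a recorder like the one above: stop the
// engine and the recognition request *before* deactivating the session,
// otherwise setActive(false) can throw because I/O is still running.
func stopRecording(audioEngine: AVAudioEngine,
                   recognitionRequest: SFSpeechAudioBufferRecognitionRequest?) {
    audioEngine.stop()
    audioEngine.inputNode.removeTap(onBus: 0)
    recognitionRequest?.endAudio()

    do {
        try AVAudioSession.sharedInstance()
            .setActive(false, options: .notifyOthersOnDeactivation)
    } catch {
        // e.g. the "session is busy" error family; log and move on.
        print("Deactivation failed: \(error)")
    }
}
```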
3
0
107
Jun ’25
Graceful shutdown during background audio playback.
Hello. My team and I think we have an issue where our app is asked to shut down gracefully, followed by a SIGTERM. As we’ve learned, this is normally not an issue. However, it seems to also be happening while our app (an audio streamer) is actively playing in the background. From our perspective, starting playback indicates strong user intent. We understand that there can be extreme circumstances where background audio needs to be killed, but should it be considered part of normal operation? We hope that’s not the case. All we see in the logs is the graceful shutdown request. We can say with high certainty that it’s happening, though, as we know that playback is running within 0.5 seconds of the crash, without any other tracked user interaction. Can you verify whether this is intended behavior, and whether there’s something we can do about it on our end? From our logs it doesn’t look to be related to memory usage within either the app or the system as a whole. Best, John
0
0
61
Jun ’25
Reply to Map Switcher MapKit iOS 14 and up
The main point here is that to be able to change the style of a map in iOS 14, you must use MKMapView. Another thing to note is that SwiftUI's Map and MKMapView actually rely on the same underlying technologies. The key difference is that MKMapView (UIKit) gives you much more granular control, whereas the SwiftUI Map is a higher-level, declarative wrapper that, in iOS 14, simply doesn't expose the needed map style API. Remember, SwiftUI was only a year old at that point, so it didn't have half the features it does today, including proper integration with other frameworks. Regarding tracking and user-following: you can definitely replicate this behaviour using MKMapView. It just takes a bit more code, but a quick skim through the documentation will point you in the right direction. You can make use of view properties and delegate methods to handle your tracking logic as needed, as well as any additional features you want to implement. For example, you can set userTrackingMode to .follow or .followWithHeading,
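The tracking behaviour described above can be sketched in a few lines of UIKit. A minimal iOS 14 example (requires location permission, which is not shown):

```swift
import MapKit
import UIKit

// Minimal UIKit sketch: an MKMapView that follows the user and exposes
// the map-style switching SwiftUI's Map lacked in iOS 14.
final class MapViewController: UIViewController {
    private let mapView = MKMapView()

    override func viewDidLoad() {
        super.viewDidLoad()
        mapView.frame = view.bounds
        mapView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(mapView)

        mapView.mapType = .hybrid                 // the style switcher
        mapView.showsUserLocation = true          // needs location permission
        mapView.setUserTrackingMode(.followWithHeading, animated: true)
    }
}
```

From here, MKMapViewDelegate methods such as mapView(_:didChange:animated:) let you react when the user pans away and the tracking mode drops back to .none.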
Topic: UI Frameworks SubTopic: SwiftUI Tags:
Jun ’25
Reply to Removing Matter device artefacts.
I’m a newbie at this and am playing with an Arduino Nano with the Matter examples, which, by default, generate Matter bridges named Matter Accessory. I’ve found that the best way to delete test Matter devices is to delete the bridge within the devices. Long-press the accessory, choose Accessory Settings from the pop-up menu, scroll down to Bridge, and select that. Now you’ll be in the bridge settings and can scroll down there to get to Delete Bridge. What I don’t know how to do is delete bridges, like those shown here, when you don’t have an associated accessory (device) interface.
Topic: App & System Services SubTopic: Hardware Tags:
Jun ’25
Unable to use Bluetooth in watchOS companion app if iOS uses AccessorySetupKit
FB18383742 Setup 🛠️ Xcode 16.4 (16F6) 📱 iPhone 13 mini (iOS 18.0.1) ⌚️ Apple Watch Series 10 (watchOS 11.3.1) Observations As AccessorySetupKit does not request Core Bluetooth permissions, when a watchOS companion app is installed after having installed the iOS app, the toggle in the watch settings for Privacy & Security > Bluetooth is turned off and disabled After removing the iPhone associated with the Apple Watch, Bluetooth works as expected in the watchOS app Upon reinstalling the iOS app, there's a toggle for Bluetooth in the iOS ASK app's settings and the ASK picker cannot be presented 🤨 From ASK Documentation: AccessorySetupKit is available for iOS and iPadOS. The accessory’s Bluetooth permission doesn’t sync to a companion watchOS app. But this doesn't address not being able to use Core Bluetooth in a watch companion app at all 🥲 Reproducing the bug Install the iOS + watchOS apps Launch iOS app, tap start scan, observe devices can be discovered (project is set up to find heart rate monitors
2
0
352
Jun ’25
Reply to Verifying braille output in an iOS app without a physical braille device?
Okay, after more research and reading these articles: Use a braille display with VoiceOver on iPhone Change your VoiceOver settings on iPhone It sounds like braille displays just use the VoiceOver output, and whether it reads parentheses is part of the device's Verbosity › Punctuation setting. Then, looking into whether I can set that programmatically, I came across the .speechAlwaysIncludesPunctuation() modifier, which may do exactly what I want. I'll give that a try and report back! 🤞
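For reference, the modifier mentioned above is applied per view in SwiftUI (iOS 15+). A minimal sketch, with a hypothetical label:

```swift
import SwiftUI

// Ask VoiceOver to always speak punctuation for this view; since braille
// displays render the VoiceOver output, this should flow through to them.
struct CoordinateLabel: View {
    var body: some View {
        Text("(37.33, -122.01)")
            .speechAlwaysIncludesPunctuation()
    }
}
```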
Jun ’25
Reply to Request authorization for the notification center crash iOS app on Swift 6
At the time of posting, this problem still persists. I had to downgrade the Swift Language Version that the Swift compiler uses from Swift 6 to Swift 5. As soon as I invoke this function, I get hosed and the app crashes: UNUserNotificationCenter.current().requestAuthorization(options: [.alert, .sound, .badge]) { granted, error in // ... } I hear that this is a known Apple bug. Is this true?
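One workaround worth trying before downgrading the language version (an assumption on my part, not a confirmed fix): call the async variant of requestAuthorization instead of the completion-handler overload, which sidesteps the closure that Swift 6's concurrency checking trips over.

```swift
import UserNotifications

// Sketch: the async/throws overload of requestAuthorization, as an
// alternative to the completion-handler version that crashes for the OP.
func requestNotificationPermission() async {
    do {
        let granted = try await UNUserNotificationCenter.current()
            .requestAuthorization(options: [.alert, .sound, .badge])
        print("Authorization granted: \(granted)")
    } catch {
        print("Authorization failed: \(error)")
    }
}
```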
Topic: Programming Languages SubTopic: Swift Tags:
Jun ’25
Reply to ASAF Panner Pro Tools Plug In
Bumping this because I haven't found where to access these tools either! I'm surprised the announcement only mentioned AAX/Pro Tools and DaVinci Resolve. (And no mention of Logic Pro?) Please add AU/VST3 versions to support DAWs like REAPER, which has become a go-to audio editor for higher-order ambisonic and spatial work.
Topic: Media Technologies SubTopic: Audio Tags:
Jun ’25
AVAssetResourceLoaderDelegate for radio stream
Hi everyone, I’m trying to use AVAssetResourceLoaderDelegate to handle a live radio stream (e.g. an Icecast/HTTP stream). My goal is to have access to the last 30 seconds of audio data during playback, so I can analyze it for specific audio patterns in near-real-time. I’ve implemented a custom resource loader that works fine for podcasts and static files, where the file size and content length are known. However, for infinite live streams, my current implementation stops receiving new loading requests after the first one is served. As a result, the playback either stalls or fails to continue. Has anyone successfully used AVAssetResourceLoaderDelegate with a continuous radio stream? Or maybe you can suggest a better approach for buffering and analyzing live audio? Any tips, examples, or advice would be appreciated. Thanks!
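A hedged sketch of one way to keep a loader alive for unbounded content (untested against a real Icecast stream; the network-feeding side is assumed): don't report a finite contentLength, and keep each AVAssetResourceLoadingRequest open, feeding it chunks instead of finishing it after the first response.

```swift
import AVFoundation

// Sketch: a loader delegate for an endless stream. Key points for live
// content: (1) don't report a finite contentLength, and (2) keep the
// loading request open, appending data as it arrives from the network.
final class LiveStreamLoader: NSObject, AVAssetResourceLoaderDelegate {
    private var pendingRequests: [AVAssetResourceLoadingRequest] = []

    func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                        shouldWaitForLoadingOfRequestedResource
                        loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        if let info = loadingRequest.contentInformationRequest {
            info.contentType = "public.mp3"        // adjust to your codec
            info.isByteRangeAccessSupported = false
            // No contentLength set: the stream is unbounded.
        }
        pendingRequests.append(loadingRequest)
        return true   // we'll feed data asynchronously
    }

    // Call this from your network layer as chunks arrive; mirror them
    // into your own 30-second ring buffer for the pattern analysis.
    func append(_ chunk: Data) {
        pendingRequests.removeAll { $0.isCancelled }
        for request in pendingRequests {
            request.dataRequest?.respond(with: chunk)
        }
    }
}
```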
0
0
86
Jun ’25
Reply to Permission requirements for LAContext's canEvaluatePolicy
I can give you a good answer for this: [quote='790333021, jonayuan, /thread/790333, /profile/jonayuan'] When exactly does the biometric authentication permission pop-up appear for users - is it when calling canEvaluatePolicy(…) or evaluatePolicy(…)? [/quote] The latter. The purpose of canEvaluatePolicy(…) is to tell you whether a policy is supported or not. It’s the act of evaluating that policy that triggers side effects, like a biometrics user interaction. [quote='790333021, jonayuan, /thread/790333, /profile/jonayuan'] Do I need to include a privacy string in my app to use the LAContext's canEvaluatePolicy(…) function? [/quote] Now, that’s a more subtle question. There’s an implicit assumption in the API that folks would only call canEvaluatePolicy(…) because they want to, at some point, evaluate a policy, and that obviously requires a privacy string. It’s easy to imagine the Local Authentication implementation requiring the privacy string in your unusual case. Your tests confirm that the current i
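The order of operations described above can be sketched as follows; the prompt only appears at the evaluatePolicy(…) call, so NSFaceIDUsageDescription must be in the Info.plist by that point.

```swift
import LocalAuthentication

// canEvaluatePolicy only reports availability; evaluatePolicy is the
// call that actually triggers the Face ID / Touch ID interaction.
func authenticate() {
    let context = LAContext()
    var error: NSError?

    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                                    error: &error) else {
        print("Biometrics unavailable: \(error?.localizedDescription ?? "unknown")")
        return
    }

    // This is what presents the biometric prompt, so the privacy
    // string (NSFaceIDUsageDescription) must be in place by now.
    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Unlock your data") { success, evalError in
        print(success ? "Authenticated" : "Failed: \(String(describing: evalError))")
    }
}
```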
Topic: Privacy & Security SubTopic: General Tags:
Jun ’25
External Mic (Hollyland Lark M2) Not Working After iOS 26 Update on iPhone 11 Pro Max
I am writing to report an issue I’m facing after updating my iPhone 11 Pro Max to iOS 26. I have been using the Hollyland Lark M2 external microphone via the Lightning port, and it was working perfectly before the update. However, after upgrading to iOS 26, the iPhone no longer detects it correctly. The device now recognizes the mic as a pair of wired earphones, and it fails to capture any audio input. The microphone itself works flawlessly on other devices, so this appears to be an iOS-specific issue. Could you please confirm:
• Whether this is a known issue in iOS 26?
• If there are any settings or steps I can take to resolve this?
• Whether a fix is planned in an upcoming iOS patch?
I would appreciate any guidance or solution you can provide. Thank you for your support.
3
0
83
Jun ’25
Permission requirements for LAContext's canEvaluatePolicy
Hi, I am developing an app that checks if biometric authentication capabilities (Face ID and Touch ID) are available on a device. I have a few questions: Do I need to include a privacy string in my app to use the LAContext's canEvaluatePolicy function? This function checks if biometric authentication is available on the device, but does not actually trigger the authentication. From my testing, it seems like a privacy declaration is only required when using LAContext's evaluatePolicy function, which would trigger the biometric authentication. Can you confirm if this is the expected behavior across all iOS versions and iPhone models? When exactly does the biometric authentication permission pop-up appear for users - is it when calling canEvaluatePolicy or evaluatePolicy? I want to ensure my users have a seamless experience. Please let me know if you have any insights on these questions. I want to make sure I'm handling the biometric authentication functionality correctly in my app. Thank you!
2
0
99
Jun ’25