Search results for "Popping Sound"

19,735 results found

hapticpatternlibrary.plist error with Text entry fields in Simulator only
When I have a TextField or TextEditor, tapping into it produces these two console entries about 18 times each:

CHHapticPattern.mm:487 +[CHHapticPattern patternForKey:error:]: Failed to read pattern library data: Error Domain=NSCocoaErrorDomain Code=260 The file “hapticpatternlibrary.plist” couldn’t be opened because there is no such file. UserInfo={NSFilePath=/Library/Audio/Tunings/Generic/Haptics/Library/hapticpatternlibrary.plist, NSURL=file:///Library/Audio/Tunings/Generic/Haptics/Library/hapticpatternlibrary.plist, NSUnderlyingError=0x600000ca1b30 {Error Domain=NSPOSIXErrorDomain Code=2 No such file or directory}}

<_UIKBFeedbackGenerator: 0x600003505290>: Error creating CHHapticPattern: Error Domain=NSCocoaErrorDomain Code=260 The file “hapticpatternlibrary.plist” couldn’t be opened because there is no such file. UserInfo={NSFilePath=/Library/Audio/Tunings/Generic/Haptics/Library/hapticpatternlibrary.plist, NSURL=file:///Library/Audio/Tunings/Generic/Haptics/Libra
0 replies · 0 boosts · 9 views · 21h
Reply to How to properly use PermissionKit to ask permission
The same error occurs for me. I assigned guardians to iPhones logged in with my child's account and activated communication restrictions. In addition, the following errors pop up in the console, even though asks are made only for users under a certain age, checked with the AgeService before calling PermissionKit:

Error Domain=AskToCore.ATMessageComposeValidationError Code=4 The user is in a region that does not support this type of ask.
NSLocalizedDescription: The user is in a region that does not support this type of ask.
NSLocalizedFailureReason: The user must be in a supported region to use this feature.

I'm not living in Texas.
1d
Alarm sounds on iOS without critical alerts
Hello guys, I need a little help. I'm building an alarm clock app, a pretty good one, and I have my own sounds I want to use as the alarm ring. But notifications on Apple platforms can't work when the phone is turned off or the device is in Silent mode (or at least that's how I understand it), unless the app has the feature called Critical Alerts, which lets notifications sound even when the phone is off or silenced. Without this, the phone can do just one beep, and only when you open the notification does it start ringing, but how is this supposed to wake you up? Alarmy has this worked out fine and I can't figure out how; maybe someone here knows. I'm thinking maybe they have Critical Alerts enabled, but then I don't know why Apple would approve theirs and not mine. I tried to apply for the Critical Alerts entitlement, but Apple didn't approve it, saying the app is not the use case, and I'm kind of lost. The whole app could be ruined because of this. So my question is: is there any way I can use my custom sounds
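For reference, a minimal sketch of what the critical-alerts path looks like if the entitlement is granted (it requires com.apple.developer.usernotifications.critical-alerts, which Apple approves only for specific use cases). The sound file name and trigger here are hypothetical:

```swift
import UserNotifications

// Sketch: schedule an alarm notification with a critical alert sound.
// "alarm.caf" is a hypothetical sound file bundled with the app.
func scheduleAlarm(at date: DateComponents) {
    let center = UNUserNotificationCenter.current()
    center.requestAuthorization(options: [.alert, .sound, .criticalAlert]) { granted, _ in
        guard granted else { return }

        let content = UNMutableNotificationContent()
        content.title = "Wake up"
        // Critical sounds play even in Silent mode and Do Not Disturb.
        content.sound = UNNotificationSound.criticalSoundNamed(
            UNNotificationSoundName("alarm.caf"), withAudioVolume: 1.0)

        let trigger = UNCalendarNotificationTrigger(dateMatching: date, repeats: false)
        let request = UNNotificationRequest(identifier: "alarm",
                                            content: content, trigger: trigger)
        center.add(request)
    }
}
```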
1 reply · 0 boosts · 133 views · 1d
Reply to CLMonitor API Missing Geofence Entry Events After Initial Registration
Even when a CLBackgroundActivitySession is instantiated immediately upon background launch, CLCircularGeographicCondition Enter events are suppressed.

It's possible I'm overlooking something here, but what you're describing basically sounds like expected behavior. Quoting the CLBackgroundActivitySession class reference:

Use CLBackgroundActivitySession to start a background activity session that allows a when-in-use authorized app to receive location updates or monitoring events. A when-in-use session can extend into the background (for example, when you background a navigation app that's providing directions) but it cannot start in the background.

Everything else you've described is in line with what I'd expect when a session isn't active.

Kevin Elliott
DTS Engineer, CoreOS/Hardware
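A minimal sketch of the flow described above: create the CLBackgroundActivitySession while the app is still in use, before relying on CLMonitor events in the background. The monitor name, coordinates, and identifier are illustrative:

```swift
import CoreLocation

final class MonitoringController {
    private var backgroundSession: CLBackgroundActivitySession?

    // Call while the app is in the foreground (when-in-use authorized);
    // the session cannot start once the app is already in the background.
    func beginMonitoring() {
        backgroundSession = CLBackgroundActivitySession()

        Task {
            let monitor = await CLMonitor("GeofenceMonitor")
            let condition = CLMonitor.CircularGeographicCondition(
                center: CLLocationCoordinate2D(latitude: 37.33, longitude: -122.03),
                radius: 100)
            await monitor.add(condition, identifier: "office", assuming: .unsatisfied)

            // Enter/exit events are delivered here as long as the session is active.
            for try await event in await monitor.events {
                print("Geofence event: \(event.identifier), state: \(event.state)")
            }
        }
    }

    func endMonitoring() {
        backgroundSession?.invalidate()
        backgroundSession = nil
    }
}
```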
1d
Reply to DCAppAttestService errors: com.apple.devicecheck.error 3 and 4
Error 3 is invalidKey. Under normal circumstances, you receive this error if something goes wrong with generating or retrieving the attestation key. Unfortunately, due to a bug, there are some instances where a user's copy of the app will constantly fail. If this subset of users have had their phones since iOS 17.0 or earlier, this would be explained by a known bug in earlier versions of iOS, which impacts an underlying dependency of DCAppAttestService and is fixed in iOS 17.1. While we don't expect any new issues popping up from this point on, unfortunately any device that got stuck in this state before 17.1 will not be automatically cleared. While there might be actions users could take to clear this error state, experience has taught us that they are just as likely to make matters worse, so we no longer recommend that developers reach out to their customers to resolve the issue. One solution would be to treat any app instance in this state which is running on an earlier iOS, or persistentl
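A minimal sketch of one way to detect this state on the client, using the standard completion-handler API; the fallback policy hinted at in the comments is an assumption, not an Apple prescription:

```swift
import DeviceCheck

// Sketch: generate an attest key and treat DCError.invalidKey as a
// potentially persistent failure state rather than retrying forever.
func generateAttestKey(completion: @escaping (Result<String, Error>) -> Void) {
    let service = DCAppAttestService.shared
    guard service.isSupported else {
        completion(.failure(DCError(.featureUnsupported)))
        return
    }

    service.generateKey { keyId, error in
        if let keyId {
            completion(.success(keyId))
        } else if let error = error as? DCError, error.code == .invalidKey {
            // Error 3: devices stuck in this state since pre-iOS 17.1 can
            // fail persistently. Consider falling back to a non-attested
            // code path instead of retrying.
            completion(.failure(error))
        } else if let error {
            completion(.failure(error))
        }
    }
}
```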
Topic: Privacy & Security SubTopic: General
1d
Broken YouTube audio at anything other than 1x speed
Since updating to tvOS 26.1/26.2, YouTube audio breaks when playing videos at anything other than 1x speed. At 1.25x or higher, the audio is staticky and distorted almost immediately. Strangely, AirPods work fine; it only happens with other audio outputs. Anyone else experiencing this, or have a fix?

tvOS: 26.2 (23J582)
YouTube app: 4.51.08

Speed   Output          Static
1x      TV speakers     No
1x      HomePods        No
1x      AirPods Pro 3   No
1.25x   TV speakers     Yes
1.25x   HomePods        Yes
1.25x   HomePods Mini   Yes
1.25x   Sonos One       Yes
1.25x   AirPods Pro 3   No
1 reply · 0 boosts · 36 views · 2d
AVSpeechSynthesizer system voices (SLA clarification)
Hello, I am building an iOS-only, commercial app that uses AVSpeechSynthesizer with system voices, strictly using the APIs provided by Apple. Before distributing the app, I want to ensure that my current implementation does not conflict with the iOS Software License Agreement (SLA) and is aligned with Apple's intended usage. For a better playback experience (more accurate estimation of utterance duration and smoother skip forward/backward during playback), I currently synthesize speech by:

- Calling AVSpeechSynthesizer.write(_:toBufferCallback:)
- Converting the received AVAudioPCMBuffer buffers into audio data
- Storing the audio inside the app sandbox
- Playing it back using AVAudioPlayer / AVAudioEngine

The cached audio is:

- Generated fully on-device using system voices
- Stored only inside the app's private container
- Used only for internal playback controls (timeline, seek, skip ±5 seconds)
- Never shared, exported, uploaded, or exposed outside the app

The alternative approaches would be: Keeping
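A minimal sketch of the caching pipeline described above, assuming a CAF file in the app sandbox; the class name and error handling are illustrative:

```swift
import AVFoundation

final class SpeechCacher {
    // Keep a strong reference; a locally scoped synthesizer can be
    // deallocated before it finishes rendering.
    private let synthesizer = AVSpeechSynthesizer()
    private var audioFile: AVAudioFile?

    func cache(_ text: String, to url: URL, completion: @escaping (Error?) -> Void) {
        let utterance = AVSpeechUtterance(string: text)
        synthesizer.write(utterance) { [weak self] buffer in
            guard let self, let pcmBuffer = buffer as? AVAudioPCMBuffer else { return }
            // A zero-length buffer signals the end of synthesis.
            guard pcmBuffer.frameLength > 0 else {
                completion(nil)
                return
            }
            do {
                // Create the file lazily so it inherits the synthesis format.
                if self.audioFile == nil {
                    self.audioFile = try AVAudioFile(forWriting: url,
                                                     settings: pcmBuffer.format.settings)
                }
                try self.audioFile?.write(from: pcmBuffer)
            } catch {
                completion(error)
            }
        }
    }
}
```

The resulting file can then be handed to AVAudioPlayer for the seek/skip controls mentioned in the post.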
0 replies · 0 boosts · 219 views · 2d
notarytool is giving me HTTP status error
I am using xcrun notarytool submit --apple-id xxxxx@gmail.com --password xxxxx --team-id xxxxxx --output-format json --wait --no-progress /my/dmg/file to notarize my DMG file. But it always gives me back this error:

Error: HTTP status code: 403. A required agreement is missing or has expired. This request requires an in-effect agreement that has not been signed or has expired. Ensure your team has signed the necessary legal agreements and that they are not expired.

I did log in to my developer account and found no place to sign any agreement. Actually, this morning when I logged in to the developer account, it did pop up an agreement for me to sign, and I signed it. But now it seems I don't have any more agreements to sign. So, any ideas about what I should do?
3 replies · 0 boosts · 373 views · 2d
Reply to Sporadic "no route to host" over ssh
It’s better to reply as a reply, rather than in the comments; see Quinn’s Top Ten DevForums Tips for this and other titbits. Regarding this:

it doesn't seem to be an issue with DenyMulticast

I don’t think multicast is a factor here, either for good or for ill. iOS has additional restrictions around multicast, but those do not apply on macOS.

this command always works for any user

When run how? From Terminal? If so, that’s not the evidence that you think it is. Quoting TN3179, macOS automatically allows local network access by:

- Any daemon started by launchd
- Any program running as root
- Command-line tools run from Terminal or over SSH, including any child processes they spawn

Based on the info you’ve provided so far, it really does sound like you’re bumping into local network privacy issues. This is complicated by the fact that Unix-y programs tend to do things that confuse local network privacy. For example: Common Unix-y techniques, like calling daemon (see its man page), can break responsible code inference. Such p
3d
Reply to Show / Hide HAL Virtual Audio Device Based on App State
I am developing a macOS virtual audio device using an Audio Server Plug-In (HAL). I want the virtual device to be visible to all applications only when my main app is running, and completely hidden from all apps when the app is closed. The goal is to dynamically control device visibility based on app state without reinstalling the driver. What is the recommended way for the app to notify the HAL plug-in about its running or closed state? Any guidance on best-practice architecture for this scenario would be appreciated.

So, full disclaimer: I don't have a lot of direct experience with AudioServer plug-ins, so the advice I'm giving comes from a general device management/architecture perspective, NOT from detailed experience with this particular plug-in type. In any case, the place I would probably start here is the sample project Building an Audio Server Plug-in and Driver Extension. You don't need any kind of DEXT for this, but that sample does show how to deal with device appearan
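One possible signaling mechanism, offered as an assumption rather than a documented pattern for this plug-in type: Darwin notifications, which both an app and a plug-in loaded into coreaudiod can observe without extra IPC setup. The notification names are hypothetical:

```swift
import CoreFoundation

// App side: announce state changes as Darwin notifications.
func announceAppState(running: Bool) {
    let name = running ? "com.example.myapp.running" : "com.example.myapp.stopped"
    CFNotificationCenterPostNotification(
        CFNotificationCenterGetDarwinNotifyCenter(),
        CFNotificationName(rawValue: name as CFString),
        nil, nil, true)
}

// Plug-in side: observe the notification and toggle device visibility
// (for example, by publishing or unpublishing the device via the host).
func observeAppState() {
    CFNotificationCenterAddObserver(
        CFNotificationCenterGetDarwinNotifyCenter(),
        nil,
        { _, _, name, _, _ in
            // React to the app's state change here.
            print("App state notification: \(String(describing: name))")
        },
        "com.example.myapp.running" as CFString,
        nil,
        .deliverImmediately)
}
```

Note that Darwin notifications carry no payload, so any richer state would need a separate channel (shared file, XPC, or a custom HAL property).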
Topic: App & System Services SubTopic: Drivers
3d
Reply to CoreBluetooth multi-peripheral high-frequency BLE streaming shows uneven packet distribution and lag on some A16/A17 iPads
This all sounds like an issue with specific hardware units. Let's go the bug report route on this. First we need a Bluetooth diagnostic log. Please go to https://developer.apple.com/bug-reporting/profiles-and-logs/ and follow the instructions for Bluetooth for iOS to install a logging profile on your device. Then, once the logging profile is installed:

- Reproduce the problem, keeping track of the actual time of the actions you take and the result you see.
- Also include a sniffer log of the same if you have it. Please include the actual log, and an ASCII export of it from the logger.
- Make sure there aren't any extraneous BLE devices around, and no other apps are trying to connect to some other BLE device while you are conducting this test.
- Once the problem is reproduced, follow the instructions at the above link to trigger a sysdiagnose.

Please also let me know the BLE hardware/firmware/SDK you are using, along with version info. Once you have these, along with any other information you think would be us
Topic: App & System Services SubTopic: Core OS
3d
CGContextDrawShading broken on macOS Tahoe for apps built with a macOS SDK older than 14.5
On macOS Tahoe (26.0, 26.1, or 26.2), when loading an application that was built with an SDK older than 14.5, all CGContextDrawShading calls that draw onto the screen (inside of NSView drawRect:) fail silently, filling the path with a single color instead of a gradient. If rendering into a local CGBitmapContext instead of the NSView context on Tahoe, CGShading works as expected. On macOS 15 and earlier, CGShading works as expected too. If the app is built with SDK 14.5 or newer, CGShading works normally on macOS Tahoe. Recent applications can of course be rebuilt with a more recent SDK, which fixes the problem. However, for Audio Units or any other type of plug-in, even if they are built with the appropriate SDK, if they are loaded inside a legacy application that was built with an older SDK, the problem arises, which customers complain about and do not understand. I have noticed that there have been a few changes in macOS Tahoe regarding the CGShading APIs,
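A minimal repro sketch of the kind of drawRect: code affected, drawing an axial CGShading into the view's context; the colors and geometry are arbitrary:

```swift
import AppKit

// Sketch: an axial CGShading drawn in drawRect:. Per the report above, apps
// built against SDKs older than 14.5 see a flat fill on Tahoe instead of a
// gradient, while the same shading into a CGBitmapContext renders correctly.
final class ShadingView: NSView {
    override func draw(_ dirtyRect: NSRect) {
        guard let ctx = NSGraphicsContext.current?.cgContext else { return }

        // Evaluate callback: linear ramp from red to blue.
        var callbacks = CGFunctionCallbacks(version: 0, evaluate: { _, input, output in
            let t = input[0]
            output[0] = 1 - t   // red
            output[1] = 0       // green
            output[2] = t       // blue
            output[3] = 1       // alpha
        }, releaseInfo: nil)

        let domain: [CGFloat] = [0, 1]                  // input t
        let range: [CGFloat] = [0, 1, 0, 1, 0, 1, 0, 1] // RGBA bounds
        guard let function = CGFunction(info: nil,
                                        domainDimension: 1, domain: domain,
                                        rangeDimension: 4, range: range,
                                        callbacks: &callbacks),
              let srgb = CGColorSpace(name: CGColorSpace.sRGB),
              let shading = CGShading(axialSpace: srgb,
                                      start: CGPoint(x: bounds.minX, y: bounds.midY),
                                      end: CGPoint(x: bounds.maxX, y: bounds.midY),
                                      function: function,
                                      extendStart: false, extendEnd: false) else { return }

        ctx.saveGState()
        ctx.addRect(bounds)
        ctx.clip()
        ctx.drawShading(shading)   // fails silently under the conditions above
        ctx.restoreGState()
    }
}
```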
Topic: UI Frameworks SubTopic: AppKit
1 reply · 0 boosts · 43 views · 3d
Logic Pro - discover channel upstream latency
Hello everyone, I've written an audio unit plugin that needs to be aware of any upstream latency caused by heavy plugins before it on the channel. Is there any way to query this? I know that Logic applies PDC at the channel's output (summing point), but I need to know what the accumulated latency is at the point the audio enters my plugin. Thanks!
0 replies · 0 boosts · 299 views · 4d
iOS React Native: Can two WebRTC stacks (Wazo & Jitsi) share media?
Hi everyone, I’m building a React Native iOS app where I’m integrating Wazo (native WebRTC) and Jitsi (WebView / WebRTC).

Use case: Wazo is used to maintain a background call session (mainly signaling + audio keep-alive). Jitsi is used in the foreground for video calls.

Problem: When Jitsi starts, it takes control of the microphone and camera. The Wazo call disconnects after ~5 minutes (likely due to a media / audio session conflict). Even if Wazo audio/video is muted or tracks are disabled, the session still drops.

My questions:

- Is it officially supported or recommended to run two WebRTC stacks (Wazo + Jitsi) simultaneously on iOS?
- Can Wazo stay connected without active audio/video tracks while Jitsi uses the mic/camera?
- Is there a way to release Wazo media streams temporarily (but keep signaling alive) while Jitsi is loading or active?
- Are there any AVAudioSession / background mode limitations on iOS that make this impossible by design?
- If this is not supported, what is the rec
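On the AVAudioSession question: iOS gives an app a single shared audio session, so the two stacks cannot each own one. A sketch of a shared configuration that at least keeps one stack from deactivating the other's audio; whether Wazo and Jitsi tolerate an externally managed session is an assumption to verify against each SDK:

```swift
import AVFoundation

// Sketch: one shared session for both WebRTC stacks.
// .mixWithOthers prevents one stack's activation from
// interrupting the other's audio.
func configureSharedAudioSession() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord,
                            mode: .videoChat,
                            options: [.mixWithOthers, .allowBluetooth, .defaultToSpeaker])
    try session.setActive(true)
}
```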
1 reply · 0 boosts · 299 views · 4d
Is Picture in Picture supported for WebRTC / WebView video on iOS (outside app)?
Hello, I am implementing video calling on iOS and need to support Picture in Picture (PiP) behavior similar to FaceTime or WhatsApp.

What works:
- Audio continues correctly in background
- CallKit UI works as expected
- Video works correctly while the app is in the foreground

What I’m trying to achieve: When the user presses the Home button or switches apps, I want to show a system Picture in Picture window (floating video outside the app).

Current setup:
- Video is rendered via WebRTC
- The video is displayed inside a WKWebView (HTML / JavaScript)
- PiP works only while the app is foregrounded
- When the app backgrounds, the video disappears (only audio remains)

Questions:
1. Does iOS support system Picture in Picture for WebRTC video? For WKWebView / HTML video?
2. Is AVPictureInPictureController limited only to AVPlayerLayer and AVSampleBufferDisplayLayer?
3. If PiP requires native rendering: Is it mandatory to render WebRTC frames natively using AVSampleBufferDisplayLayer? Is PiP explicitly unsupported for WebView / HT
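On questions 2 and 3: since iOS 15, AVPictureInPictureController accepts an AVSampleBufferDisplayLayer content source, so rendering WebRTC frames natively is the commonly used route. A minimal sketch, assuming decoded frames are already converted to CMSampleBuffers; the wrapper class is hypothetical:

```swift
import AVKit
import CoreMedia

// Sketch: host WebRTC frames in a sample-buffer layer that PiP can take over.
// The displayLayer must be installed in a visible view's layer tree.
final class PiPVideoController: NSObject, AVPictureInPictureSampleBufferPlaybackDelegate {
    let displayLayer = AVSampleBufferDisplayLayer()
    private var pipController: AVPictureInPictureController?

    func setUpPiP() {
        guard AVPictureInPictureController.isPictureInPictureSupported() else { return }
        let source = AVPictureInPictureController.ContentSource(
            sampleBufferDisplayLayer: displayLayer,
            playbackDelegate: self)
        pipController = AVPictureInPictureController(contentSource: source)
        pipController?.canStartPictureInPictureAutomaticallyFromInline = true
    }

    // Call for every decoded WebRTC frame, converted to a CMSampleBuffer.
    func enqueue(_ sampleBuffer: CMSampleBuffer) {
        displayLayer.enqueue(sampleBuffer)
    }

    // MARK: AVPictureInPictureSampleBufferPlaybackDelegate (live call: no seeking)
    func pictureInPictureController(_ controller: AVPictureInPictureController,
                                    setPlaying playing: Bool) {}
    func pictureInPictureControllerTimeRangeForPlayback(
        _ controller: AVPictureInPictureController) -> CMTimeRange {
        // An unbounded range tells the system this is a live stream.
        CMTimeRange(start: .negativeInfinity, end: .positiveInfinity)
    }
    func pictureInPictureControllerIsPlaybackPaused(
        _ controller: AVPictureInPictureController) -> Bool { false }
    func pictureInPictureController(_ controller: AVPictureInPictureController,
                                    didTransitionToRenderSize newRenderSize: CMVideoDimensions) {}
    func pictureInPictureController(_ controller: AVPictureInPictureController,
                                    skipByInterval skipInterval: CMTime,
                                    completion completionHandler: @escaping () -> Void) {
        completionHandler()
    }
}
```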
Topic: Safari & Web SubTopic: General
0 replies · 0 boosts · 52 views · 4d