Search results for

Popping Sound

19,350 results found

Post · Replies · Boosts · Views · Activity

Reply to Permission requirements for LAContext's canEvaluatePolicy
I can give you a good answer for this: [quote='790333021, jonayuan, /thread/790333, /profile/jonayuan'] When exactly does the biometric authentication permission pop-up appear for users - is it when calling canEvaluatePolicy(…) or evaluatePolicy(…)? [/quote] The latter. The purpose of canEvaluatePolicy(…) is to tell you whether a policy is supported or not. It’s the act of evaluating that policy that triggers side effects, like a biometrics user interaction. [quote='790333021, jonayuan, /thread/790333, /profile/jonayuan'] Do I need to include a privacy string in my app to use the LAContext's canEvaluatePolicy(…) function? [/quote] Now, that’s a more subtle question. There’s an implicit assumption in the API that folks would only call canEvaluatePolicy(…) because they want to, at some point, evaluate a policy, and that obviously requires a privacy string. It’s easy to imagine the Local Authentication implementation requiring the privacy string in your unusual case. Your tests confirm that the current i
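To make the distinction concrete, here is a minimal sketch of the usual flow. The localized reason string is a placeholder; the key point is that only the evaluatePolicy(…) call triggers the biometric prompt, so the privacy string (NSFaceIDUsageDescription) must be in place before that call:

```swift
import LocalAuthentication

let context = LAContext()
var error: NSError?

// canEvaluatePolicy(...) only reports whether the policy is available;
// it shows no UI and triggers no biometric interaction.
if context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics, error: &error) {
    // evaluatePolicy(...) is what actually presents the Face ID / Touch ID
    // prompt, so NSFaceIDUsageDescription must be present by this point.
    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Unlock your account") { success, evalError in
        // The reply arrives on a private queue; dispatch to main for UI work.
    }
}
```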
Topic: Privacy & Security SubTopic: General Tags:
Jun ’25
Reply to App Notifications - Audible Alert Schedule
If you mark a notification as a Critical Alert, it will override all Focus, Volume, and Mute settings the user may have set; it will break through them all and be audible at the loudness level you set in the notification. If you don't want this to happen, you should not be using Critical Alerts. Also, your use case may not be eligible to receive the special entitlement that allows this in the first place. What you should be using instead is the UNNotificationInterruptionLevel.active setting, where the notification will light up the screen and can play a sound if the device is not muted, while respecting the system-wide notification settings and Focus modes like Sleep, Work, etc. in which the app has been set to be silenced. This way the user can sleep through system maintenance and still get their news alerts by freely configuring your app into their Focus schedules. Ref: Focus on iPhone Argun Tekant /  DTS Engineer / Core Technologies
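As a minimal sketch of the recommended approach (the title, body, and identifier are placeholders), the interruption level is set directly on the notification content:

```swift
import UserNotifications

let content = UNMutableNotificationContent()
content.title = "News alert"
content.body = "Your daily briefing is ready."
content.sound = .default
// .active lights up the screen and may play a sound, but still respects
// mute, Focus modes, and the user's notification settings.
content.interruptionLevel = .active

let request = UNNotificationRequest(identifier: UUID().uuidString,
                                    content: content,
                                    trigger: nil) // nil trigger = deliver immediately
UNUserNotificationCenter.current().add(request)
```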
Topic: Community SubTopic: Apple Developers Tags:
Jun ’25
Reply to About GCD (Grand Central Dispatch) in an extension.
So, let me actually jump back to what you said here: "However, in this App we are confirming that when the iPhone goes into sleep mode, self.activeTimerProc() is not called at 10-second intervals." What you're trying to do here is exactly the kind of behavior the push provider architecture is trying to discourage. That is, on a well-designed network, there's no reason why your push provider couldn't:
Open a connection to the server and perform initial setup.
Leave the connection open without sending or receiving any data.
An arbitrarily long period of time later, the server sends a call notification and the actual call process starts.
Now, many VoIP apps do not actually work that way. In my experience, that's generally caused by one or both of these two factors: The server implementation is being treated as a fixed constraint which cannot be modified, and that (generally VERY old) implementation includes requirements which directly prevent the approach above. The underlying WiFi network infrastructure is very poor/broke
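The "connect once, then stay idle" pattern described above can be sketched with the Network framework. The host, port, and message handling here are hypothetical placeholders for your own signaling protocol; the point is that no timer or polling loop appears anywhere:

```swift
import Network

// Hypothetical signaling endpoint; substitute your own server and port.
let connection = NWConnection(host: "signaling.example.com", port: 5060, using: .tls)

func receiveNextMessage() {
    connection.receive(minimumIncompleteLength: 1, maximumLength: 65536) { data, _, isComplete, error in
        if let data {
            // Data arriving here means the server pushed a call notification;
            // start the actual call flow now.
            _ = data
        }
        if error == nil && !isComplete {
            receiveNextMessage()
        }
    }
}

connection.stateUpdateHandler = { state in
    if case .ready = state {
        // Perform one-time setup (e.g. registration), then simply leave the
        // connection open. The server sends data only when a call arrives.
        receiveNextMessage()
    }
}

connection.start(queue: .main)
```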
Jun ’25
Reply to Can't post
Thank you very much for taking the time to look into it. I tried to create a post about an issue our team is having with Wallet popping up when our application using HCE is in the foreground. The issue seemed to be related to the category I chose: App & system services -> Wallet. No matter what text I tried to enter, each time I got a message about sensitive language. Eventually I changed the category to another one and the post, with its copy-pasted content, was accepted. Strangely, when I checked its category today it's in App & system services -> Wallet, and it starts with a warning about sensitive language...
Jun ’25
Reply to ARKit camera transform orientation vector doesn't match physical device heading (despite `.gravityAndHeading`)
Hello @himanshujnaidu, I can't tell for sure, but it sounds like the matrix you show above is different from how you are interpreting it.

Camera Transform = simd_float4x4(
    [ 0.98446155, -0.030119859,  0.172998,    0.0        ], // column 0
    [ 0.023979114, 0.9990097,    0.037477385, 0.0        ], // column 1
    [-0.17395553, -0.032746706,  0.98420894,  0.0        ], // column 2
    [ 0.024039675, -0.037087332, -0.22780673, 0.99999994 ]  // column 3
)

First up, keep in mind that float4x4's init method is column major, so each 'row' in that code is actually a column of the matrix. In this case the position of the camera relative to the origin is (0.024039675, -0.037087332, -0.22780673). Second, the upper-left part of the float4x4 matrix is a float3x3 and represents the orientation. The eulerAngles property on frame.camera re-interprets this 3x3 matrix as the familiar Euler angles. You can find the orientation of the camera relative to the frame via let rotation = frame.camera.eulerAngles; rotation.y will be zero when facing north and 90° when facing east.
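A short sketch of pulling the position out of that column-major layout (using the numbers from the post; the forward-direction convention is the usual one for camera transforms, where -Z is the look direction):

```swift
import simd

// The matrix from the post, built column by column
// (simd_float4x4's init takes columns, not rows).
let cameraTransform = simd_float4x4(
    simd_float4( 0.98446155, -0.030119859,  0.172998,    0.0),        // column 0
    simd_float4( 0.023979114, 0.9990097,    0.037477385, 0.0),        // column 1
    simd_float4(-0.17395553, -0.032746706,  0.98420894,  0.0),        // column 2
    simd_float4( 0.024039675, -0.037087332, -0.22780673, 0.99999994)  // column 3
)

// Translation (camera position in world space) lives in column 3.
let position = simd_float3(cameraTransform.columns.3.x,
                           cameraTransform.columns.3.y,
                           cameraTransform.columns.3.z)

// The upper-left 3x3 is the rotation; the negated third column is the
// camera's forward (look) direction in world space.
let forward = -simd_float3(cameraTransform.columns.2.x,
                           cameraTransform.columns.2.y,
                           cameraTransform.columns.2.z)
```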
Topic: Spatial Computing SubTopic: ARKit Tags:
Jun ’25
Suspected safari memory leak for new os ver 26
Hi, this is my first post in the community, so please correct me if I am posting this in the wrong manner. I'm using my Apple M1 Pro (14-inch, 2021) and installed OS 26 yesterday. Today I was using Safari and all of a sudden it froze, then the following window popped up. Is this something expected, i.e. is my usage somewhat unusual, or are there any reports of a potential memory leak in Safari? I'd appreciate any suggestions, as Safari is my main browser and it is currently unusable due to this issue. Thanks
2
0
134
Jun ’25
AVSpeechSynthesisMarker - iOS 18 - Sync lost
Hello, in an AVSpeechSynthesisProviderAudioUnit, sending word positions to the host using AVSpeechSynthesisMarker / AVSpeechSynthesisMarker.Mark.word seems to be broken on iOS 18. On the app/client side all the events are received immediately, whereas they should be received synchronised with the audio. The exact same code works perfectly on iOS 17. On the AVSpeechSynthesisProviderAudioUnit side, the AVSpeechSynthesisMarkers are appended with the correct position/sample offset:

let wordPos = NSMakeRange(characterRange.location, characterRange.length)
let marker = AVSpeechSynthesisMarker(markerType: AVSpeechSynthesisMarker.Mark.word, forTextRange: wordPos, atByteSampleOffset: byteSampleOffset)
// also tried with
// let marker = AVSpeechSynthesisMarker(wordRange: wordPos, atByteSampleOffset: byteSampleOffset)
markerArray.append(marker)
print("word : pos \(characterRange) - offset \(byteSampleOffset)")
// send events to host
speechSynthesisOutputMetadataBlock?(markerArray, self.request!)

Sample output: word : pos {7, 7} - offset 2208 word : p
4
0
554
Dec ’24
No notification sound when CarPlay is connected
We're building a taxi driver app. Our goal is simple: play a notification sound when a new ride request arrives, even if the iPhone is connected to CarPlay. We use Firebase push with sound: default and interruption-level: time-sensitive. The app requests .carPlay and uses a category with .allowInCarPlay. Everything works when CarPlay is disconnected: we get sound and banner. But when connected to CarPlay, the push is delivered silently, no sound is played (even on the phone), and although Siri Announce is enabled, nothing is spoken. Questions:
Is notification sound blocked when CarPlay is active unless the app has a CarPlay entitlement?
Is Siri Announce the only way to notify the driver audibly in this case?
Would getting a CarPlay entitlement (e.g. CarPlay.communication) fix this without building a full CarPlay UI?
Thanks; all we need is a reliable sound alert when a new ride comes in.
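For reference, the category setup the post describes would look roughly like this. The "RIDE_REQUEST" identifier is a hypothetical placeholder and must match the category key in the push payload; note that .allowInCarPlay only takes effect for apps carrying a CarPlay entitlement:

```swift
import UserNotifications

// Hypothetical category identifier; must match the "category" key
// inside the push payload's aps dictionary.
let rideRequestCategory = UNNotificationCategory(
    identifier: "RIDE_REQUEST",
    actions: [],
    intentIdentifiers: [],
    options: [.allowInCarPlay]  // honored only for apps with a CarPlay entitlement
)
UNUserNotificationCenter.current().setNotificationCategories([rideRequestCategory])

// Request .carPlay alongside the usual alert/sound permissions.
UNUserNotificationCenter.current().requestAuthorization(options: [.alert, .sound, .carPlay]) { granted, _ in
    // Handle the grant result.
}
```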
2
0
57
Jun ’25
How to capture audio from the stream that's playing on the speakers?
Good day, ladies and gents. I have an application that reads audio from the microphone. I'd like it to also be able to read from the Mac's audio output stream. (A bonus would be if it could detect when the Mac is playing music.) I'd eventually be able to figure it out reading docs, but if someone can give a hint, I'd be very grateful, and would owe you the libation of your choice. Here's the code used to set up the AudioUnit:

-(NSString*) configureAU
{
    AudioComponent component = NULL;
    AudioComponentDescription description;
    OSStatus err = noErr;
    UInt32 param;
    AURenderCallbackStruct callback;
    if (audioUnit) {
        AudioComponentInstanceDispose(audioUnit);
        audioUnit = NULL;
    } // was CloseComponent
    // Open the AudioOutputUnit
    description.componentType = kAudioUnitType_Output;
    description.componentSubType = kAudioUnitSubType_HALOutput;
    description.componentManufacturer = kAudioUnitManufacturer_Apple;
    description.componentFlags = 0;
    description.componentFlagsMask = 0;
    if (component = AudioComponentF
1
0
115
Jun ’25
Reply to Sign in with apple using firebase and angular gives me "Invalid web redirect url" error
I am having a similar issue, and TN3107 is not providing sufficient information to track down the problem. Specifically, the use case I am trying to enable is for users who have an account on my iOS app with their Apple ID to log in to my website and upload an audio file for use in the app. This works for users who have authenticated with email or with Google sign-in. But Sign in with Apple ID gives "Invalid Request, Invalid Web Redirect URL" if I use https://.firebaseapp.com/__/auth/handler as the redirect URL with the Services ID set in Firebase to my bundle identifier, and it gives "Invalid Client" if I set the Services ID in Firebase to the Services ID identifier in my Apple identifiers list. Please provide some additional direction on where I may be going astray. The documentation would indicate that I need the Services ID identifier used in Firebase and the redirect handler set as above, but that is throwing an Invalid Client error. Firebase Authentication is working just fine for ap
Topic: Privacy & Security SubTopic: General Tags:
Jun ’25
DUNS Number Mismatch
I am trying to create a developer account for my organization. I have validated our DUNS number and Entity Name on https://developer.apple.com/enroll/duns-lookup/. However, when I try to add this number to create a developer account, I get a pop-up suggesting a mismatch in the entity and DUNS number. I have tried this several times without any success. Request you to please help me resolve this issue. Thanks. Sanket
4
0
972
Feb ’24
Reply to Background Modes Capability Missing in App ID Configuration
First off, let me quickly expand on what I said here: "The ONLY thing you need to do to enable background location updates is include location in your Background Modes Info.plist key." That statement is also true of VoIP apps, except that they should include the voip and audio Background Modes. Note that failing to include audio doesn't cause an immediate failure, but it can cause weird/unpredictable behavior, particularly across the broader range of iOS versions. Critically, the word ONLY here is important, as VoIP app signing is identical to that of any other standard iOS app. Moving on to your push failure: are you receiving VoIP pushes through PushKit when your app is in the foreground? If you are, then issues like this: "In my case, I specified all these but the VoIP notification doesn't come when the device is in background mode or terminated mode." ...are caused by a logic problem in your app. The details of what that might actually be vary, but some examples I've seen are: Assuming the user will al
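The Background Modes declaration described above is a plain Info.plist entry; a VoIP app would declare both modes like this (a config fragment, shown for illustration):

```xml
<!-- Info.plist: a VoIP app should declare both the voip and audio modes. -->
<key>UIBackgroundModes</key>
<array>
    <string>voip</string>
    <string>audio</string>
</array>
```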
Jun ’25