Search results for "Popping Sound" (19,349 results found)


Reply to Dynamic Library cannot call exposed C function
@DTS Engineer after much tweaking I've managed to reduce the working configuration to: s.user_target_xcconfig = { 'STRIP_STYLE' => 'non-global' } This works, but it sets the stripping style of the whole user project. I took a look at the output of the Xcode build phases for the pod target, and -exported_symbols_list doesn't work with CocoaPods (out of the box) because CocoaPods generates a static lib by default. However, setting s.static_framework = false did produce a dynamic framework, but the symbols are still stripped. I'm not sure what other consequences STRIP_STYLE would have... setting it for anything other than my own library sounds like a bad idea, but I'm out of ideas on how to keep the symbols. I also tried passing each symbol directly to the linker, and that did not work either: s.user_target_xcconfig = { 'OTHER_LDFLAGS' => '$(inherited) -Wl,-u,_ios_prepare_request -Wl,-u,_ios_set_request_header -Wl,-u,_ios_present_webview -W...' }
Topic: Code Signing · SubTopic: General · Aug ’25
iOS 26 UIBarButtonItems in navigation bar flashing wrong background during push/pop
An iOS app has a UINavigationController with a UINavigationBar that is non-translucent (e.g. black). When performing a push (or pop) to navigate to or from another UIViewController, the UIBarButtonItems on the navigation bar flash with a white background. With a dark navigation bar this is very noticeable and not desirable. This only occurs when running on iOS 26 and is related to Liquid Glass. I've created FB19660024 with a minimal Xcode workspace to reproduce it, along with a video showing the behavior. This is a cosmetic bug that doesn't affect functionality, but it is a very undesirable effect in apps with dark, non-translucent navigation bars. Has anyone else seen this and found a workaround?
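A possible mitigation worth trying, sketched below, is to pin every appearance variant of the bar to the same opaque background so UIKit has no lighter variant to cross-fade to during the push/pop transition. This is a hypothetical workaround, not a confirmed fix for the Liquid Glass flash tracked in FB19660024:

```swift
import UIKit

// Hypothetical workaround sketch: force the same opaque appearance on every
// appearance variant (standard, scroll-edge, compact) of the navigation bar.
func applyOpaqueDarkAppearance(to navigationBar: UINavigationBar) {
    let appearance = UINavigationBarAppearance()
    appearance.configureWithOpaqueBackground()
    appearance.backgroundColor = .black
    appearance.titleTextAttributes = [.foregroundColor: UIColor.white]

    navigationBar.standardAppearance = appearance
    navigationBar.scrollEdgeAppearance = appearance
    navigationBar.compactAppearance = appearance
}
```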
3 replies · 0 boosts · 224 views · Aug ’25
Reply to Beta 5 hardEdge Scrollstyle blurs my whole table
When Apple compensates me hourly at your salary to devise sample projects for the bugs I find in your new stuff, maybe I'll take the time to do so. I have a day job, and my hobby is having a top 10 iOS app. I'm already working stupid hours to make your stuff look good (and have had the most fun since you made everything flat in iOS 7 and killed a bunch of joy so that's not actually a complaint). My report was mostly to help others with a workaround if they got themselves in that state. I mean, maybe I'm doing something weird here. I've isolated the bugs I care about in sample projects. It usually takes at least an hour to do a good job because I have a very complex view hierarchy. I'm not making a sample project for every bug for a trillion dollar company. That sounds like work, not something I should spend time away from my family for.
Topic: UI Frameworks · SubTopic: UIKit · Aug ’25
AVAudioUnit host - PCM buffer output silent
Hi, I just started to develop Audio Unit hosting support in my application. Offline rendering seems to work, except that I hear no output. Why? I suspect something goes wrong with the player. I connect to Core Audio in a different location in the code. Here are some error messages I have faced so far:
2025-08-14 19:42:04.132930+0200 com.gsequencer.GSequencer[34358:18611871] [avae] AVAudioEngineGraph.mm:4668 Can't retrieve source node to play sequence because there is no output node!
2025-08-14 19:42:04.151171+0200 com.gsequencer.GSequencer[34358:18611871] [avae] AVAudioEngineGraph.mm:4668 Can't retrieve source node to play sequence because there is no output node!
2025-08-14 19:43:08.344530+0200 com.gsequencer.GSequencer[34358:18614927] AUAudioUnit.mm:1417 Cannot set maximumFramesToRender while render resources allocated.
2025-08-14 19:43:08.346583+0200 com.gsequencer.GSequencer[34358:18614927] [avae] AVAEInternal.h:104 [AVAudioSequencer.mm:121:-[AVAudioSequencer(AVAudioSequencer_Player) startAndReturnE
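The "no output node" errors suggest the graph was never connected through mainMixerNode to the output, and the maximumFramesToRender error suggests that value was changed after render resources were allocated. A minimal offline-rendering sketch (assuming a player node with a file to schedule; not taken from the poster's code) would look roughly like:

```swift
import AVFoundation

// Sketch: connect the graph and enable manual rendering BEFORE engine.start(),
// since maximumFramesToRender cannot change once render resources exist.
func renderOffline(file: AVAudioFile) throws -> AVAudioPCMBuffer {
    let engine = AVAudioEngine()
    let player = AVAudioPlayerNode()
    engine.attach(player)
    // Connecting through mainMixerNode implicitly wires the mixer to outputNode.
    engine.connect(player, to: engine.mainMixerNode, format: file.processingFormat)

    try engine.enableManualRenderingMode(.offline,
                                         format: file.processingFormat,
                                         maximumFrameCount: 4096)
    try engine.start()
    player.scheduleFile(file, at: nil)
    player.play()

    let buffer = AVAudioPCMBuffer(
        pcmFormat: engine.manualRenderingFormat,
        frameCapacity: engine.manualRenderingMaximumFrameCount)!
    // One render pull for illustration; loop until file.length frames in real code.
    _ = try engine.renderOffline(engine.manualRenderingMaximumFrameCount, to: buffer)
    return buffer
}
```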
3 replies · 0 boosts · 365 views · Aug ’25
Reply to Unable to discover the BLE device which is in range
How are you identifying these particular devices? And is your app in the foreground when you are not able to identify them? This problem sounds like one of the two typical cases we see:
1. The advertised name and the GAP name of the device are different. When the device has never been seen before, CoreBluetooth will report the advertised name. Once connected, the GAP name of the device will be cached, and on the next encounter CoreBluetooth will report the cached GAP name instead of the advertised name. If these names are different, and you are specifically looking for a match on the peripheral's name, this could be the mismatch. I would suggest either making both names the same or looking for either of the possible names.
2. What is the advertising rate of the Wiser device? When your app is not in the foreground, the scan rate will drop dramatically, and if the peripheral is not advertising fast enough, 3 minutes may not be enough to reliably detect the advertising. To reliably detect the device under
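Case 1 above can be handled by matching on either name. A sketch (the "Wiser-…" names are placeholders, not the device's real names):

```swift
import CoreBluetooth

// Sketch: accept a match on either the advertised local name or the cached
// GAP name, since CoreBluetooth may report either depending on whether the
// peripheral has been seen (and connected to) before.
final class WiserScanner: NSObject, CBCentralManagerDelegate {
    private let targetNames: Set<String> = ["Wiser-Advertised", "Wiser-GAP"]
    private lazy var central = CBCentralManager(delegate: self, queue: nil)

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        if central.state == .poweredOn {
            central.scanForPeripherals(withServices: nil)
        }
    }

    func centralManager(_ central: CBCentralManager,
                        didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any],
                        rssi RSSI: NSNumber) {
        let advertised = advertisementData[CBAdvertisementDataLocalNameKey] as? String
        let cached = peripheral.name  // may be the cached GAP name
        if [advertised, cached].compactMap({ $0 }).contains(where: targetNames.contains) {
            // Matched the device under either of its names.
        }
    }
}
```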
Topic: App & System Services · SubTopic: Core OS · Aug ’25
macOS VPN apps outside of the App Store
Apple is encouraging VPN apps on macOS to transition to the Network Extension APIs, if they haven't done so yet; see: TN3165: Packet Filter is not API and WWDC25: Filter and tunnel network traffic with NetworkExtension. Using Network Extension is fine for VPN apps that are distributed via the Mac App Store: users get one pop-up requesting permission to add VPN configurations, and that's it. However, VPN apps that are distributed outside of the App Store (using Developer ID) cannot use Network Extension in the same way; such apps need to install a System Extension first (see TN3134: Network Extension provider deployment). Installing a System Extension is a very poor user experience. There is a pop-up informing the user about a system extension, which the user then has to enable manually. The main button is OK, which only dismisses the pop-up, and in that case there is little chance that the user will be able to find the correct place to enable the extension. The other button in that pop-up navigates
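For reference, the Developer ID flow described above starts with an activation request; submitting it is what triggers the system pop-up and the manual approval step. A sketch (the bundle identifier is a placeholder):

```swift
import SystemExtensions

// Sketch of activating a bundled network system extension from a
// Developer ID app; "com.example.vpn.network-extension" is a placeholder.
final class ExtensionActivator: NSObject, OSSystemExtensionRequestDelegate {
    func activate() {
        let request = OSSystemExtensionRequest.activationRequest(
            forExtensionWithIdentifier: "com.example.vpn.network-extension",
            queue: .main)
        request.delegate = self
        OSSystemExtensionManager.shared.submitRequest(request)
    }

    func requestNeedsUserApproval(_ request: OSSystemExtensionRequest) {
        // This is where the poor UX begins: the user must now find and
        // approve the extension in System Settings.
    }

    func request(_ request: OSSystemExtensionRequest,
                 didFinishWithResult result: OSSystemExtensionRequest.Result) {}

    func request(_ request: OSSystemExtensionRequest,
                 didFailWithError error: Error) {}

    func request(_ request: OSSystemExtensionRequest,
                 actionForReplacingExtension existing: OSSystemExtensionProperties,
                 withExtension ext: OSSystemExtensionProperties)
    -> OSSystemExtensionRequest.ReplacementAction { .replace }
}
```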
4 replies · 0 boosts · 113 views · Aug ’25
CMFormatDescription.audioStreamBasicDescription has wrong or unexpected sample rate for audio channels with different sample rates
In my app I use AVAssetReaderTrackOutput to extract PCM audio from a user-provided video or audio file and display it as a waveform. Recently a user reported that the waveform is not in sync with his video, and after receiving the video I noticed that the waveform is in fact twice as long as the video duration, i.e. it shows the audio in slow motion, so to speak. Until now I was using CMFormatDescription.audioStreamBasicDescription.mSampleRate, which for this particular user video returns 22'050. But in this case that value seems to be wrong, because the file has two audio channels with different sample rates, as returned by CMFormatDescription.audioFormatList.map({ $0.mASBD.mSampleRate }). The first channel has a sample rate of 44'100, the second one 22'050. If I use the first sample rate, the waveform is perfectly in sync with the video. The problem is given by the fact that the ratio between the audio data length and the sample rate multipli
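Based on the observation above, a heuristic sketch (not a general fix) is to inspect every entry in audioFormatList rather than trusting audioStreamBasicDescription, and take the highest rate, which matched the video in the reported case:

```swift
import CoreMedia

// Heuristic sketch: pick the highest sample rate found across all entries
// of the format description's audioFormatList.
func probableSampleRate(of formatDescription: CMFormatDescription) -> Float64? {
    formatDescription.audioFormatList
        .map { $0.mASBD.mSampleRate }
        .max()
}
```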
0 replies · 0 boosts · 88 views · Aug ’25
Reply to CallKit UI with speaker button is not functional - Only speaker mode is enabled
Let me start by returning to what I said here: Indeed, the fact this works in iOS 26 is an accidental oversight, not an intentional choice. That is not an exaggeration. The ability to report outgoing calls from the background was a compatibility workaround we preserved in iOS 13 to support PTT apps. The ONLY reason it continues to work in iOS 26 is because I didn't think about it when we were disabling the PTT entitlement. It will not continue to work and continuing to rely on it is a mistake. User has now ended call in CallKit UI, To resume the ongoing audio call - client requires to report to callKi, which is on end call action and no early call process at this moment I'm sorry, but this isn't something CallKit will continue to support. You need to stop doing this, as it WILL break in the future. __ Kevin Elliott DTS Engineer, CoreOS/Hardware
Topic: App & System Services · SubTopic: General · Aug ’25
tvOS 26 Bugs – Persistent UI animation issues: app launch stutter, text rendering jumps, shadow jumps, abrupt swipe transitions
Hello :-) I'm not entirely sure if I'm in the correct place here, but I would like to report some bugs in tvOS 26 (Beta 6) to the Apple engineers!
Description
I am reporting multiple persistent UI animation issues observed in tvOS 26 (Beta 6). These issues have been reproducible across multiple tvOS releases. They are subtle but noticeable, and they affect the overall polish and perceived quality of the system. I am happy to provide high-quality video captures for each of the issues described below.
⸻
Bug #1: App launch animation stutter/jump
Summary: The zoom-in animation from a Springboard icon to the full-screen app stutters or jumps at the moment the app becomes full screen.
Steps to reproduce: 1. On Springboard, select any app icon. 2. Observe the zoom-in animation.
Expected result: Smooth, continuous zoom without frame drops or jumps.
Actual result: The animation visibly stutters/jumps at the full-screen transition.
Possible cause: Timing issue in Core Animation interpolation or an abrupt view hierarchy switch.
3 replies · 0 boosts · 396 views · Aug ’25
Reply to SpeechTranscriber/SpeechAnalyzer being relatively slow compared to FoundationModel and TTS
Ah, nice, let's see. First, the baseline without prepareToAnalyze. The KPI I'm interested in is the time between the last audio above the noise floor and the final transcript (i.e. between the user finishing speaking and the transcription being ready to trigger actions): n: 11, avg: 2.2s, Var: 0.75. Then, with prepareToAnalyze called: n: 11, avg: 1.45s, Var: 1.305 (the delay varied greatly, between 0.05s and 3s). So yeah, based on this small sample, preparing did seem to decrease the delay.
Topic: Media Technologies · SubTopic: Audio · Aug ’25
Reply to CallKit UI with speaker button is not functional - Only speaker mode is enabled
[quote='852438022, DTS Engineer, /thread/793663?answerId=852438022#852438022'] That statement was not a vague warning. We are ACTIVELY shutting down the PTT workarounds we created in iOS 13. That includes starting outgoing calls from the background. Indeed, the fact this works in iOS 26 is an accidental oversight, not an intentional choice. [/quote] The client has already migrated to the PTT framework. We have a use case that actually needs to report audio and video calls to CallKit (in this case the BLE permission is enabled). As mentioned earlier: an incoming video call was reported using CallKit. The user then initiated an audio call (full duplex) using the active CallKit session. The user has now ended the call in the CallKit UI. To resume the ongoing audio call, the client needs to report to CallKit, which happens on the end-call action, and there is no earlier point in the call process at this moment.
Topic: App & System Services · SubTopic: General · Aug ’25
Java remote debugging stymied by connection refused on local network
I am trying to set up remote Java debugging between two machines running macOS (15.6 and 26). I am able to get the Java program to listen on a socket. However, I can connect to that socket only from the same machine, not from another machine on my local network. I use nc to test the connection; it reports "Connection refused" when trying to connect from the other machine. This sounds like it could be caused by the Java program lacking the Local Network system permission. I am familiar with that issue arising when a program attempts to connect to a port on the local network: in that case, a dialog is displayed and System Settings can be used to grant Local Network permission to the client program. I don't know whether the same permission is required on the program that is receiving client requests. If it is, then I don't know how to grant that permission: there is no dialog, and System Settings does not provide any obvious way to grant permission to a program that I specify. Note that a Java applicatio
5 replies · 0 boosts · 325 views · Aug ’25
tvOS 26 – Persistent UI animation issues: app launch stutter, text rendering jumps, shadow jumps, abrupt swipe transitions
Intro
I am reporting multiple persistent UI animation issues observed in tvOS 26 (Beta 6). These issues have been reproducible across multiple tvOS releases. They are subtle but noticeable, and they affect the overall polish and perceived quality of the system. I am happy to provide high-quality video captures for each of the issues described below.
⸻
Bug #1: App launch animation stutter/jump
Summary: The zoom-in animation from a Springboard icon to full-screen app stutters or jumps at the moment the app becomes full screen.
Steps to reproduce: 1. On Springboard, select any app icon. 2. Observe the zoom-in animation.
Expected result: Smooth, continuous zoom without frame drops or jumps.
Actual result: Animation visibly stutters/jumps at the full-screen transition.
Possible cause: Timing issue in Core Animation interpolation or abrupt view hierarchy switch.
⸻
Bug #2: Text rendering weight change (“jump”) during transitions
Summary: Text inside apps changes visual weight mid-transition from scaled preview to fu
0 replies · 0 boosts · 88 views · Aug ’25
Reply to Perspective problem
Hello @brother_z, thank you for your question! If you are seeing a virtual object drift from its original position, that sounds like unexpected behavior, and I would recommend submitting a bug report via Feedback Assistant. However, there are many things that could prevent your device from tracking its position correctly, such as obstructed cameras or high-velocity motion (like wearing Apple Vision Pro on a train), so it's hard to diagnose without more details about what you're trying to do. Access to the main camera requires an entitlement. The extrinsic value of the camera will be a 4x4 matrix representing its pose relative to the device. The math you've shared here looks correct, although I'm not sure what your tag object is. Are the tag objects you are referring to Entities you've created? If you do file a feedback request, I recommend sharing as much of your project in the request as you are able to, and then sharing the number here so we can track it on our end. Thank you!
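The extrinsic composition described above can be sketched as follows; note the pose convention is an assumption here and should be verified against your data:

```swift
import simd

// Sketch: assuming `extrinsic` is the camera's 4x4 pose relative to the
// device and `devicePose` is the device's pose in world space, the camera's
// pose in world space is their composition (conventions vary; verify).
func cameraPoseInWorld(devicePose: simd_float4x4,
                       extrinsic: simd_float4x4) -> simd_float4x4 {
    devicePose * extrinsic
}
```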
Topic: Spatial Computing · SubTopic: ARKit · Aug ’25
Reply to pushkit and callkit
Has iOS 18 introduced new permission requirements or entitlements for VoIP push notifications? No. The last significant change here was the iOS 13 CallKit requirements. There hasn't been any change to them since then. Do I need to explicitly request a new type of user permission for VoIP notifications? No. Are there additional background modes, Info.plist keys, or PushKit changes required for VoIP to work in background and terminated states on iOS 18? No. Looking over your list, I did notice this: "Background modes for Voice over IP and Background Processing are enabled." Did you also include audio/Audio, AirPlay, and Picture in Picture? Historically, the architecture of VoIP apps has always relied on two different background categories in order to function: "voip" -> allows apps to use PushKit for call notifications and CallKit for call management; "audio" -> keeps the app awake in the background while the app is actually on a call (the same way it would keep any long-playing audio
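For reference, the two background categories described above would normally appear together in the app's Info.plist roughly like this (a sketch of the standard keys, not taken from the original thread):

```xml
<key>UIBackgroundModes</key>
<array>
    <string>voip</string>
    <string>audio</string>
</array>
```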
Aug ’25