Search results for “Popping Sound” · 19,735 results found


Local Network Discovery Works in Debug but Not in TestFlight (Wi-Fi Speaker Connection Issue)
Hi team, I’m having an issue with my iOS app related to local network communication and connecting to a Wi-Fi speaker. My app works similarly to the “4Stream” application: the speaker and the mobile device must be on the same Wi-Fi network so the app can discover and connect to the speaker. What’s happening: when I run the app directly from Xcode in debug mode, everything works perfectly. The speaker gets discovered, connects successfully, and the connection flow completes without any problem. But when I upload the same build to TestFlight, the behaviour changes completely: the app gets stuck on the “Connecting…” screen, the speaker is not discovered, and it never moves forward from that state. The same code works fine on Android. So basically: in debug mode, the speaker is detected and connected properly; in TestFlight, the app is stuck at “Connecting…” and the speaker does NOT get connected. This makes me believe something related to local network access, multicast, Wi-Fi info permissions, or Bonjour discovery is not being…
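A quick way to see whether discovery itself is being blocked in the release build is a bare NWBrowser sketch like the one below (the service type "_speaker._tcp" is a placeholder for whatever Bonjour type the accessory actually advertises). When debug builds discover devices but TestFlight builds don't, the usual suspects are the NSLocalNetworkUsageDescription and NSBonjourServices Info.plist entries (the latter must list every service type you browse for) and, for non-Bonjour multicast/broadcast traffic, the com.apple.developer.networking.multicast entitlement.

```swift
import Network

// "_speaker._tcp" is a placeholder; use the accessory's real service type.
let browser = NWBrowser(for: .bonjour(type: "_speaker._tcp", domain: nil),
                        using: .tcp)

browser.browseResultsChangedHandler = { results, _ in
    for result in results {
        print("Discovered:", result.endpoint)
    }
}
browser.stateUpdateHandler = { state in
    // A .waiting state with a local-network policy error here is the
    // classic sign that the permission or declarations are missing.
    print("Browser state:", state)
}
browser.start(queue: .main)
```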
1 reply · 0 boosts · 163 views · Dec ’25
iPhone 17 (iOS 26) Unable to Join Wi-Fi (TKIP)
Device: iPhone 17 series. System: iOS 26.0.0. Wi-Fi: TKIP encryption protocol. Question: unable to join the network. We have several products that are used by connecting to an iPhone via Wi-Fi. Recently, many customers who have purchased the iPhone 17 series have reported that they are unable to connect to Wi-Fi. For Wi-Fi with TKIP encryption, after entering the correct password, a pop-up appears stating “Unable to join the network.” Only Wi-Fi with WPA2-AES works normally. Through the iPhone 11 era and earlier, the TKIP encryption method worked normally, but the new iPhone models are incompatible with it, which causes great inconvenience. I hope the engineers can fix this issue to support Wi-Fi with older encryption protocols.
5 replies · 0 boosts · 523 views · Dec ’25
Reply to Can't Provision A Device
You said you selected it as a preview device - what do you mean here? It sounds like you selected your model of phone for the simulator. At the top middle of the Xcode window, Xcode shows your target name and a run destination. By default, the run destination for an iOS app is a simulator. Plug in your phone. If it doesn't appear in the popup menu as a run destination, choose Manage Run Destinations... from that menu. It should show up as discovered in the list on the left of the Run Destinations window. The first time you pair the phone with Xcode takes quite a while (several minutes for me).
Dec ’25
Reply to CallKit VoIP → App launch → Auto WebRTC/MobileRTC connection: Does Apple allow this flow?
So, the first thing to understand is that what you're describing here: Our app receives a CallKit VoIP call. When the user taps “Answer”, the app launches and automatically connects to a real-time audio session using WebRTC or MobileRTC. ...is NOT what actually happens on iOS. Your app doesn't receive a CallKit call, nor is CallKit something that really controls how your app works. This is how incoming VoIP pushes actually work:
1. The device receives a VoIP push for your app.
2. The system either launches or wakes your app (depending on whether or not your app is running).
3. Your app receives the VoIP push.
4. Your app reports a new call into CallKit.
5. The system presents the incoming call UI, sending updates back to your app about the actions the user takes in that UI.
6. If the user answers the call, the system activates the audio session you previously configured.
The critical thing to understand here is that CallKit is best understood as an interface framework (albeit a very narrowly focused one), N…
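To make the ordering concrete, here is a minimal sketch of steps 3 and 4 above using PushKit and CallKit. The class name and handle value are illustrative, and the CXProviderDelegate side (answer/end actions, audio session activation) is omitted for brevity:

```swift
import PushKit
import CallKit

final class VoIPPushHandler: NSObject, PKPushRegistryDelegate {
    private let provider = CXProvider(configuration: CXProviderConfiguration())
    private let registry = PKPushRegistry(queue: nil)

    override init() {
        super.init()
        registry.delegate = self
        registry.desiredPushTypes = [.voIP]  // register for VoIP pushes
    }

    func pushRegistry(_ registry: PKPushRegistry,
                      didUpdate pushCredentials: PKPushCredentials,
                      for type: PKPushType) {
        // Send pushCredentials.token to your VoIP push server.
    }

    func pushRegistry(_ registry: PKPushRegistry,
                      didReceiveIncomingPushWith payload: PKPushPayload,
                      for type: PKPushType,
                      completion: @escaping () -> Void) {
        // Steps 3-4: the push arrives and MUST be reported to CallKit
        // right away, or the system will terminate the app.
        let update = CXCallUpdate()
        update.remoteHandle = CXHandle(type: .generic, value: "caller")
        provider.reportNewIncomingCall(with: UUID(), update: update) { _ in
            completion()
        }
    }
}
```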
Topic: App & System Services · SubTopic: General
Dec ’25
CallKit VoIP → App launch → Auto WebRTC/MobileRTC connection: Does Apple allow this flow?
Our app receives a CallKit VoIP call. When the user taps “Answer”, the app launches and automatically connects to a real-time audio session using WebRTC or MobileRTC. We would like to confirm whether the following flow (“CallKit Answer → app opens → automatic WebRTC or MobileRTC audio session connection”) complies with Apple’s VoIP Push / CallKit policy. In addition, our service also provides real-time video functionality using the Zoom Meeting SDK (MobileRTC). When an incoming CallKit VoIP call is answered, the app launches and the user is automatically taken into the Zoom-based video lesson flow: the app opens → the user lands in the Zoom Meeting pre-meeting room → MobileRTC initializes immediately. In the pre-meeting room, audio and video streams can already be active and MobileRTC establishes a connection, but the actual meeting screen is not joined until the user explicitly taps “Join”. We would like to confirm whether this flow for video lessons (“CallKit Answer → app…
1 reply · 0 boosts · 68 views · Dec ’25
Reply to Binary executable requires Accessibility Permissions in Tahoe
[quote='868001022, Hylus-Arbor, /thread/808897?answerId=868001022#868001022, /profile/Hylus-Arbor'] maybe the Accessibility is compromised in the same way with 26.1 [/quote] Yes. This bug seems to affect a wide range of privileges controlled by System Settings > Privacy & Security. I haven’t explored the full list, but others have noticed that it also affects Screen & System Audio Recording, so it wouldn’t surprise me if it affected Accessibility as well. [quote='868001022, Hylus-Arbor, /thread/808897?answerId=868001022#868001022, /profile/Hylus-Arbor'] I updated my own Mac from 15.7 to 26.1. [/quote] My recommendation is that you test this stuff in one or more VMs. That way you have direct control over what system software is installed, and you can revert to a known ‘clean’ snapshot between each test. Share and Enjoy — Quinn “The Eskimo!” @ Developer Technical Support @ Apple let myEmail = eskimo + 1 + @ + apple.com
Topic: Privacy & Security · SubTopic: General
Dec ’25
How to Implement Screen Mirroring in iOS for Google TV?
I am developing an iOS application that supports screen mirroring to Google TV (or Chromecast with Google TV). My goal is to mirror the iPhone/iPad screen in real time to a Google TV device. What I Have Tried So Far: I have explored multiple approaches but haven't found a direct way to achieve low-latency screen mirroring. Here are some of my findings:
- Google Cast SDK: primarily designed for casting media (videos, images, audio) rather than real-time mirroring. It supports custom receiver applications, but there are no direct APIs for full screen mirroring. Casting a recorded video is possible, but it introduces latency and is not real-time.
- ReplayKit for screen capture: RPScreenRecorder.shared().startCapture(handler: ...) allows capturing the iPhone screen as a video stream. However, sending this stream to Google TV in real time is a challenge. I could potentially encode the video as HLS and stream it, but the delay is significant.
- RTSP/UDP streaming: some third-party libraries support…
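For reference, the ReplayKit capture mentioned above looks roughly like the sketch below; the encode-and-send step is left as a comment because that transport (and its latency) is exactly the open problem:

```swift
import ReplayKit

let recorder = RPScreenRecorder.shared()

// Receive the screen as CMSampleBuffers in real time.
recorder.startCapture(handler: { sampleBuffer, bufferType, error in
    guard error == nil, bufferType == .video else { return }
    // Encode `sampleBuffer` (e.g. H.264 via VideoToolbox) and hand it
    // to your transport; this encode/ship step is where the delay lives.
}, completionHandler: { error in
    if let error {
        print("Capture failed to start:", error)
    }
})
```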
1 reply · 0 boosts · 630 views · Dec ’25
Reply to How to Implement Dynamic Type for UITextFields Without Resetting Data
Hello! Thanks for taking the time to write about this. This doesn't sound right, and is probably a bug. But to be sure, we need some more information to investigate. Can you please file a bug report that includes a code sample, sysdiagnose logs, and a video recording at the following link? https://developer.apple.com/bug-reporting/ The site will walk you through these steps. You can reply with the Feedback ID for this issue and I'll make sure it gets to the right team to investigate.
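For comparison while this is investigated, the standard Dynamic Type configuration for a UITextField is just the lines below; this is the expected setup, not a workaround for the bug described:

```swift
import UIKit

let textField = UITextField()
// A text-style font plus automatic adjustment when the user changes
// their preferred content size.
textField.font = UIFont.preferredFont(forTextStyle: .body)
textField.adjustsFontForContentSizeCategory = true
```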
Dec ’25
Reply to Are read-only filesystems currently supported by FSKit?
Huh. Out of curiosity, did APFS always do that to prevent that issue? No, at least not entirely. I don't remember if it was an issue on macOS (where booting support wasn't added until later), but there was a short interval in iOS 8 where it was possible to completely fill an iOS device. However, the end result of that wasn't quite as bad as I made it sound (or as it theoretically could be). The nature of COW also means that in a normal I/O flow, the I/O request that uses the last available space is effectively guaranteed to be the write for the next/new data that's going to be written. So what actually ended up happening was:
1. The kernel panicked, since APFS couldn't really do anything else.
2. The file system was left in a dirty state (since it didn't unmount).
3. The fsck prior to remount found that pending data and cleared it (since that's all it could really do).
...which then freed up enough space for the file system to be functional again, at least enough that you could delete files and clear the issue.
Topic: App & System Services · SubTopic: Core OS
Dec ’25
[Core Bluetooth] The Application Playing a Notification Tone (AVAudioPlayer, System sounds) Should Automatically Route Audio to the Connected BLE accessory which uses HFP Profile
The iOS application is a Central Manager connected to a Bluetooth Low Energy (BLE) accessory that uses the Hands-Free Profile (HFP). When the application plays a notification tone (using AVAudioPlayer or System Sounds), the audio is incorrectly routed to the device's internal speaker instead of the active HFP headset. How do we programmatically ensure that these notification tones are automatically and reliably routed to the connected HFP headset?
6 replies · 0 boosts · 183 views · Dec ’25
Reply to [Core Bluetooth] The Application Playing a Notification Tone (AVAudioPlayer, System sounds) Should Automatically Route Audio to the Connected BLE accessory which uses HFP Profile
For Notification Tones, Audio session category is configured to AVAudioSessionCategoryPlayAndRecord with VoiceChat mode and AllowBluetooth option. Tones play using system audio services without an active audio session. No background audio session activation occurs for notification tones. Are you talking about using AudioServicesPlayAlertSound()/AudioServicesPlaySystemSound()? If so, then the answer here is basically no: you can't change any of the details of how those APIs behave. Both of those APIs work by shifting the actual playback out of process, instead of having your app directly play the audio. This allows them to be used in apps that don't have ANY audio configuration at all, as well as in contexts that would otherwise be complex/problematic. For example, it lets apps with arbitrarily complex and configurable audio setups play alert sounds without having to figure out how to mix and route that alert sound with the…
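If the routing matters more than the out-of-process convenience, the usual alternative is to play the tone in-process with AVAudioPlayer under an explicitly configured, active session, so that the category options (including AllowBluetooth) actually apply. A minimal sketch, assuming a hypothetical bundled “tone.caf”:

```swift
import AVFoundation

do {
    // Configure and activate the session so its routing options apply;
    // .allowBluetooth permits routing to an HFP headset.
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord,
                            mode: .voiceChat,
                            options: [.allowBluetooth])
    try session.setActive(true)

    // "tone.caf" is a hypothetical bundled asset; retain the player for
    // the duration of playback in real code.
    let url = Bundle.main.url(forResource: "tone", withExtension: "caf")!
    let player = try AVAudioPlayer(contentsOf: url)
    player.play()
} catch {
    print("Audio setup failed:", error)
}
```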
Topic: App & System Services · SubTopic: Core OS
Dec ’25
Reply to Xcode 26 Coding Assistant & Anthropic
Sounds like you are running up against the Anthropic usage limit. There's no published number for where the limit actually kicks in, but I find myself bumping up against it about once a month or so (depending on how heavily I've been relying on the code assistant). Anthropic has an unlimited tier, but it's something like $200 per month, so...
Dec ’25
Reply to [Core Bluetooth] The Application Playing a Notification Tone (AVAudioPlayer, System sounds) Should Automatically Route Audio to the Connected BLE accessory which uses HFP Profile
How are you doing this? The audio system should not be allowing PlayAndRecord to directly activate in a background app. For Notification Tones:
- Audio session category is configured to AVAudioSessionCategoryPlayAndRecord with VoiceChat mode and AllowBluetooth option
- Tones play using system audio services without an active audio session
- No background audio session activation occurs for notification tones
Topic: App & System Services · SubTopic: Core OS
Dec ’25
Reply to Disable ISO15693Tag Popup
Is there any way to use the hardware RF reading capabilities of an iPhone to read ISO15693 RF tags silently and without a UI pop-up? Perhaps using native iOS libraries other than the NFC library? If not, is there a way for a business to request that this feature be allowed in internally used apps only?
Topic: App & System Services · SubTopic: Drivers
Dec ’25