Hi, I've had a problem since upgrading to iOS 26. My Xiaomi Bluetooth earbuds connect and work fine for everything (watching movies, listening to music) except calls, where Bluetooth doesn't work at all; I can only speak through the handset or the loudspeaker. I've already tried the setting at Audio & Visual > Call Audio Routing > Bluetooth Headset. Is anyone facing the same issue?
Search results for "Popping Sound" (19,350 results found)
Hi! We are planning to build an app for a research project that collects sensitive information (such as symptoms, photos, and audio). We don't want to store this data locally on the phone or within the app, but rather have it securely transferred to a safe SFTP server. Is it possible to implement this in iOS, and if so, does anyone have any recommendations on how to do it?
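For what it's worth, Foundation has no first-party SFTP client, so an SFTP upload on iOS generally means a third-party SSH/SFTP library or an HTTPS endpoint placed in front of the secure server. Below is a minimal sketch of the HTTPS variant using URLSession, keeping the payload in memory rather than on disk; the upload URL is hypothetical.

```swift
import Foundation

// Sketch: uploading captured data without persisting it in the app sandbox.
// Assumes an HTTPS upload endpoint (https://example.com/upload is hypothetical)
// sitting in front of the secure server; a third-party SSH/SFTP library would
// be the alternative if the transfer must be SFTP end to end.
func upload(_ data: Data, completion: @escaping (Error?) -> Void) {
    var request = URLRequest(url: URL(string: "https://example.com/upload")!)
    request.httpMethod = "POST"
    request.setValue("application/octet-stream", forHTTPHeaderField: "Content-Type")

    // The upload task streams the in-memory payload; nothing is written to disk.
    let task = URLSession.shared.uploadTask(with: request, from: data) { _, _, error in
        completion(error)
    }
    task.resume()
}
```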
"if using the in-house app, there may be crashes in the new iOS 18.3 and later versions" This sounds like it could be a known issue affecting some versions of iOS 18 that is specific to in-house (enterprise) distribution. You'll want to read the details on this other thread, which includes steps you can take toward resolution. If that turns out not to be the case, then, as darkpaw says, we'll need more information from you to give us a starting point. Since you're highlighting an enterprise app, I suggest you provide it by filing a bug report with a sysdiagnose attached. You can post the FB number of the report here, and we can take a look from there. — Ed Ford, DTS Engineer
Topic: Accessibility & Inclusion | SubTopic: General
I'm developing an iOS app that uses Siri Shortcuts to enhance the user experience. Currently, I have implemented functionality that allows users to perform certain actions via Siri Shortcuts. My team wants to improve the user experience by giving users an instructional audio prompt (e.g., "say 'Hey Siri, [action name]' if you want to [perform action]"), but we want to ensure this prompt is only played when the user has already enabled Siri Shortcuts. The challenge is determining whether Siri Shortcuts are properly enabled before suggesting their use. We want to avoid situations where users follow our audio instructions to use Siri, only to discover that Siri Shortcuts aren't properly configured on their device. Since we're using Siri Shortcuts for this feature, the standard requestSiriAuthorization(_:) method doesn't apply to our use case (it says "You don't need to request authorization if your app only supports actions in the Shortcuts app." in https://developer.apple.com/documentation/s
Thanks for confirming this is a custom intent. In that case, checking for Siri authorization at the API level isn't required. The system may ask people to confirm their intention when using your intent through Siri for the first time, but that's not something you need to worry about at the API level. "I didn't find the API to check whether Siri is enabled on the user's end (like whether Siri is enabled on their phone); is it because this information is not shared with the app?" A custom SiriKit intent is surfaced in multiple places throughout the system experience, such as Spotlight, certain system-provided widgets that suggest relevant actions to the user based on their usage patterns, and more. Checking for Siri authorization isn't really useful, because that's a binary answer for only one of the many places where your intent could run. Thus, the APIs are set up so that your app doesn't need to reason about all of those situations. I want to come back to your original premise: My team wants to improve the user experience
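For completeness: SiriKit's INPreferences type does expose an authorization status, and a check would look like the sketch below. As the reply above notes, though, that answer only covers one of the many surfaces where a custom intent can run, so it shouldn't be the sole gate for the instructional prompt.

```swift
import Intents

// Sketch: INPreferences reports the app's Siri authorization status. Per the
// DTS reply above, this is a binary answer for only one of the places a custom
// intent can be invoked, so treat it as a hint rather than a hard gate.
func siriLooksAvailable() -> Bool {
    INPreferences.siriAuthorizationStatus() == .authorized
}
```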
Topic: App & System Services | SubTopic: Automation & Scripting
"My app has a network extension, and I've noticed each phone having its connectivity interrupted by calls on the push provider, calling stop with the noNetworkAvailable reason. The point of confusion is that each phone seems to get its interruption at a different time. For example, one will get an interruption at 1:00 while the others are fine, and at 3:00 another will get an interruption while the others are fine." What were the phones physically doing at the time the interruption occurred? The most common reason this occurs is simply that the device moved out of WiFi coverage. A few notes on that point: Providing good WiFi coverage over a large area under real-world conditions is sufficiently difficult that unless a large-scale network has been well surveyed and professionally maintained, significant dead zones are all but guaranteed. In my experience, users' reports of their activity ("I was in range, so it should have worked") are completely unreliable. Network configuration is more complicated and error-prone th
Topic: App & System Services | SubTopic: Networking
I have 3 phones: an iPhone 14 (iOS 18.3), an iPhone Xr (iOS 18.5), and an iPhone Xr (iOS 18.4.1). My app has a network extension, and I've noticed each phone having its connectivity interrupted by calls on the push provider, calling stop with the noNetworkAvailable reason. The point of confusion is that each phone seems to get its interruption at a different time. For example, one will get an interruption at 1:00 while the others are fine, and at 3:00 another will get an interruption while the others are fine. This is confusing, since noNetworkAvailable seems to imply a problem with the router or access point, but if that were the case, one would expect it to affect all the phones on the WiFi. I don't see fewer interruptions on the iPhone 14 vs. the iPhone Xr. Do you believe the iOS version is affecting the performance? Could you please give me some insight as to what could be going on inside these phones? P.S. I also see an error pop up when using NWConnection; this is inside the App. The state update handler will s
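One way to narrow this down, suggested here as a diagnostic sketch rather than a fix: run an NWPathMonitor on each phone and timestamp every path transition, so a provider stop at 1:00 can be checked against whether that particular device actually lost connectivity at that moment.

```swift
import Network
import Foundation

// Sketch: log every network path transition so noNetworkAvailable stops can be
// correlated with real connectivity drops (e.g., the device leaving WiFi
// coverage) on a per-device basis.
let monitor = NWPathMonitor()
monitor.pathUpdateHandler = { path in
    // An .unsatisfied status at the same timestamp as a provider stop points
    // at local connectivity on that device, not at the push provider itself.
    print("\(Date()): path=\(path.status), interfaces=\(path.availableInterfaces)")
}
monitor.start(queue: DispatchQueue(label: "path-monitor"))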
I would like to have a SwiftData predicate that filters against an array of PersistentIdentifiers. A trivial use case could be filtering Posts by one or more Categories. This sounds like something that must be trivial to do. When doing the following, however:

let categoryIds: [PersistentIdentifier] = categoryFilter.map { $0.id }
let pred = #Predicate {
    if let catId = $0.category?.persistentModelID {
        return categoryIds.contains(catId)
    } else {
        return false
    }
}

the code compiles, but produces the following runtime exception (Xcode 26 beta, iOS 26 simulator): 'NSInvalidArgumentException', reason: 'unimplemented SQL generation for predicate : (TERNARY(item != nil, item, nil) IN {}) (bad LHS)'. Strangely, the same code works if the array to filter against is an array of a primitive type, e.g. String or Int. What is going wrong here, and what could be a possible workaround?
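Since the post itself observes that filtering against an array of primitives works, one plausible workaround is to give the model a plain stored identifier and predicate on that instead of on PersistentIdentifier. A sketch, with hypothetical Post/Category models:

```swift
import Foundation
import SwiftData

// Sketch of a workaround: Category carries a plain stored UUID that SwiftData's
// SQL generation can handle, and the predicate compares against [UUID] rather
// than [PersistentIdentifier]. Post and Category are illustrative models.
@Model final class Category {
    var uuid = UUID()   // primitive identifier used for filtering
    init() {}
}

@Model final class Post {
    var category: Category?
    init(category: Category? = nil) { self.category = category }
}

func postsPredicate(for categoryFilter: [Category]) -> Predicate<Post> {
    let categoryUUIDs = categoryFilter.map { $0.uuid }
    return #Predicate<Post> { post in
        if let catUUID = post.category?.uuid {
            return categoryUUIDs.contains(catUUID)
        } else {
            return false
        }
    }
}
```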
Since 17.4 Dev Beta 2, I have been having Bluetooth issues. I had hoped it would have cleared up, but even in 17.4.1 it continues. AirPods and an Echo Auto are the only two audio devices I have. The audio will become choppy, rubber-band, or sound robotic, and sometimes completely disconnect. While driving, it will occur on both audio devices. Sometimes I'm stopped at a red light and the issue occurs. The phone is less than 3 feet from the device at all times. I have read forums and removed and re-added the devices, but that did not help. I really do not want to have to reset my phone, since my 2FA apps do not recover in a restore. Does anyone have any suggestions?
Glad to see that we have the capability to edit rich text in TextEditor with the latest OS update, but I haven't found any way to enable attachments for this TextEditor, whether image/audio/video or other attachments. Is there any solution for this with TextEditor, or do I have to use the UIKit and AppKit alternatives?
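For context, a minimal sketch of the rich-text editing the post refers to, assuming the new AttributedString binding for TextEditor. This covers styled text only; the open question about image/audio/video attachments stands, and those would still point toward a UITextView (NSTextAttachment) wrapper.

```swift
import SwiftUI

// Sketch, assuming the rich-text TextEditor that binds to AttributedString.
// Styled text works through the binding; attachments are not part of this API
// as far as the post (and this sketch) can tell.
struct RichNoteEditor: View {
    @State private var text = AttributedString("Styled text works here")

    var body: some View {
        TextEditor(text: $text)
            .padding()
    }
}
```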
Topic: UI Frameworks | SubTopic: SwiftUI
Hi, I am trying to do something similar here. I want to build a keyboard that records audio and then processes it with the Whisper API to get more robust dictation. My keyboard extension works fine in the Simulator, but when I try to run it on a real device, I always get Error Domain=NSOSStatusErrorDomain Code=561015905 "Session activation failed" UserInfo={NSLocalizedDescription=Session activation failed} when I do:

private var audioRecorder: AVAudioRecorder?
private var audioSession = AVAudioSession.sharedInstance()

try audioSession.setCategory(.playAndRecord, mode: .default, options: [.allowBluetooth])
try audioSession.setActive(true)
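The snippet above, restructured into a runnable shape with error handling. Two hedged observations: 561015905 is the FourCC '!pla', which appears to correspond to AVAudioSession.ErrorCode.cannotStartPlaying; and keyboard extensions run in a constrained environment where microphone capture is commonly reported to fail on device even when the same code works in the Simulator, so activation failure should be treated as an expected path.

```swift
import AVFoundation

// Sketch: the posted session setup wrapped in do/catch. Keyboard extensions
// may not be permitted to activate a recording session on device, so the
// failure branch is a normal outcome to handle, not just a bug to log.
final class Recorder {
    private var audioRecorder: AVAudioRecorder?
    private let audioSession = AVAudioSession.sharedInstance()

    func activate() -> Bool {
        do {
            try audioSession.setCategory(.playAndRecord,
                                         mode: .default,
                                         options: [.allowBluetooth])
            try audioSession.setActive(true)
            return true
        } catch {
            print("Session activation failed: \(error)")
            return false
        }
    }
}
```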
Topic: App & System Services | SubTopic: Hardware
I'm building a SwiftUI app whose primary job is to play audio. I manage all of the Now Playing metadata and the command center manually via the available shared instances: MPRemoteCommandCenter.shared() and MPNowPlayingInfoCenter.default().nowPlayingInfo. In certain parts of the app I also need to display videos, but as soon as I attach another AVPlayer, it automatically pushes its own metadata into Control Center and overwrites my audio info. What I need: a way to show video inline without ever having that video player update the system's Now Playing info (or Control Center). In my app, I start by configuring the shared audio session:

do {
    try AVAudioSession.sharedInstance().setCategory(.playback, mode: .default, options: [.allowAirPlay, .allowBluetoothA2DP])
    try AVAudioSession.sharedInstance().setActive(true)
} catch {
    NSLog("**** Failed to set up AVAudioSession: %@", error.localizedDescription)
}

and then set the MPRemoteCommandCenter commands and MPNowPlayingInfoCenter nowPlayingInfo
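For reference, a minimal sketch of the manual setup the post describes, with illustrative metadata fields. Separately, and unverified for this exact case: on iOS 16+, MPNowPlayingSession with automaticallyPublishesNowPlayingInfo disabled may be worth investigating for the inline video players.

```swift
import MediaPlayer

// Sketch: the app owns both the remote commands and the Now Playing metadata,
// as described above. The title/duration values are illustrative.
func configureNowPlaying(title: String, duration: TimeInterval) {
    let commandCenter = MPRemoteCommandCenter.shared()
    commandCenter.playCommand.addTarget { _ in
        // Resume the audio player here.
        return .success
    }
    commandCenter.pauseCommand.addTarget { _ in
        // Pause the audio player here.
        return .success
    }

    MPNowPlayingInfoCenter.default().nowPlayingInfo = [
        MPMediaItemPropertyTitle: title,
        MPMediaItemPropertyPlaybackDuration: duration,
        MPNowPlayingInfoPropertyPlaybackRate: 1.0
    ]
}
```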
Thank you for your reply. I also just had an AVKit lab, and this indeed seems like an issue with the framework(s). To follow up, I filed a feedback (FB18058056) mentioning yours and this discussion. I also realized that using the CustomAVPlayerView as shown above actually works as long as the app doesn't configure the MPRemoteCommandCenter (by adding targets to the shared instance). In my use case I do have to configure it (as do many others, I reckon), so I had to come up with another fix. Personally, what I did is pass the scenePhase to the views configuring/displaying the videos, so that when the app enters the .inactive or .background state I can pause the AVPlayers and set up the nowPlayingInfo again (either to nil or to the current audio). I also do this cleanup as soon as the view disappears (via .onDisappear {}). Ultimately this is not great, as there are still some side effects; for example, if no audio is playing but the user simply opens the view with the video players, the app will appear as
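A sketch of the scenePhase workaround described above. The restoreAudioNowPlaying hook is hypothetical, standing in for however the app resets its own Now Playing metadata.

```swift
import SwiftUI
import AVKit

// Sketch: pause inline video players and restore the audio Now Playing info
// whenever the scene leaves the active state or the view disappears, as the
// reply describes. `restoreAudioNowPlaying` is a hypothetical hook.
struct InlineVideoView: View {
    @Environment(\.scenePhase) private var scenePhase
    let player: AVPlayer

    var body: some View {
        VideoPlayer(player: player)
            .onChange(of: scenePhase) { _, newPhase in
                if newPhase == .inactive || newPhase == .background {
                    player.pause()
                    restoreAudioNowPlaying()
                }
            }
            .onDisappear {
                player.pause()
                restoreAudioNowPlaying()
            }
    }

    private func restoreAudioNowPlaying() {
        // Reset MPNowPlayingInfoCenter to the audio item (or nil), as described.
    }
}
```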
Topic: Media Technologies | SubTopic: Video
Since it sounds like the behavior changed across iOS versions, you can file a bug report to report a regression. If you do, post the FB number here for the record. That said, multi-megabyte URLs are unusual, and this means your use here likely falls outside the original intention of the openURL functionality; it isn't a general-purpose data exchange API. If you zoom out from your current implementation details, what is the underlying problem that you are solving for? — Ed Ford, DTS Engineer
Topic: UI Frameworks | SubTopic: UIKit
Hello, that's a bit of a semantic conundrum, isn't it? We recommend reaching out to App Review to determine how to use the Captions box in this particular case. It seems incorrect to indicate the presence of captions when they aren't there in an audio or video asset. Perhaps there is a different and more direct way to indicate that the app is accessible to deaf users. Again, seek App Review's guidance.
Topic: Accessibility & Inclusion | SubTopic: General