I'm struggling to convert a project from Swift 5 to Swift 6. As advised in the documentation, I first turned strict concurrency checking ON and got no errors. Then I selected the Swift 6 language mode… and problems popped up.

I have a UIViewController with IBOutlets (e.g. a text field dureeField), a computed var (e.g. duree), and a func using UNNotification (func userNotificationCenter). I get the following error on the declaration line of func userNotificationCenter:

Main actor-isolated instance method 'userNotificationCenter(_:didReceive:withCompletionHandler:)' cannot be used to satisfy nonisolated requirement from protocol 'UNUserNotificationCenterDelegate'

So I declared the func as nonisolated. This func calls another func, func2, which I also had to declare nonisolated. Then I get an error on the computed var used in func2:

Main actor-isolated property 'duree' can not be referenced from a nonisolated context

So I declared duree as nonisolated(unsafe). Now comes the tricky part. The computed var references the IBOutlet dureeField (if dureeField.text == X), leading to the error: Main actor-
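One pattern that avoids nonisolated(unsafe) here is to keep the delegate method nonisolated, as the protocol requires, and hop back to the main actor only when the UI-bound state is actually needed. A minimal sketch (not the poster's code; the outlet, property, and async delegate overload are assumptions based on the description above):

```swift
import UIKit
import UserNotifications

final class DureeViewController: UIViewController, UNUserNotificationCenterDelegate {

    // IBOutlets and the computed var stay main-actor isolated, as UIKit expects.
    @IBOutlet private var dureeField: UITextField!

    var duree: Int {
        Int(dureeField.text ?? "") ?? 0
    }

    // The async overload satisfies the nonisolated protocol requirement.
    nonisolated func userNotificationCenter(_ center: UNUserNotificationCenter,
                                            didReceive response: UNNotificationResponse) async {
        // Hop back to the main actor on demand instead of marking UI state
        // nonisolated(unsafe); awaiting the main-actor property is allowed here.
        let duration = await duree
        // ... act on `duration` off the main actor (scheduling, networking, etc.) ...
        print("Handled notification response, duree = \(duration)")
    }
}
```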
Search results for "Popping Sound" (19,349 results found)
After updating to iOS 18.5, we’ve observed that outgoing audio from our app intermittently stops being transmitted during VoIP calls using AVAudioSession configured with .playAndRecord and .voiceChat. The session is set active without errors, and interruptions are handled correctly, yet audio capture suddenly ceases mid-call. This was not observed in earlier iOS versions (≤ 18.4). We’d like to confirm if there have been any recent changes in AVAudioSession, CallKit, or related media handling that could affect audio input behavior during long-running calls.

```swift
func configureForVoIPCall() throws {
    try setCategory(
        .playAndRecord,
        mode: .voiceChat,
        options: [.allowBluetooth, .allowBluetoothA2DP, .defaultToSpeaker])
    try setActive(true)
}
```
I'm a bit unclear on your exact question, so let me try to outline a few things for you in hopes that you'll identify your path forward from this information.

"I need to upload and push to my mobile, but I can't b/c of the old software. I've moved the files to my wife's newer MacBook, but the app doesn't work."

It sounds like your iPhone is running a version of iOS that is newer than iOS 16.2. I'm picking iOS 16.2 there because that was the most recent version of iOS at the time Xcode 14.2 was released, and an older Xcode won't know how to talk to an iOS version that is newer than that.

You should be able to take your Xcode project, which includes the Xcode project file and all of your source code and assets, and open it in a newer version of Xcode without an issue. I presume the Xcode version on your wife's Mac here is more recent than Xcode 14.2; however, depending on the iOS version of your iPhone, you may need to update that Xcode version even further. We have a table on this support page that ex
Topic:
Developer Tools & Services
SubTopic:
Xcode
Cross-posting this from https://developer.apple.com/forums/thread/795707 per ask from DTS Engineer: Is there any way to ensure iOS apps we develop using Foundation Models can only be purchasable/downloadable on App Store by folks with capable devices? I would've thought there would be a Required Capabilities that App Store would hook into for Apple Intelligence-capable devices, but I don't seem to see it in the documentation here: https://developer.apple.com/documentation/bundleresources/information-property-list/uirequireddevicecapabilities The closest seems to be iphone-performance-gaming-tier as that seems to target all M1 and above chips on iPhone & iPad. There is an ipad-minimum-performance-m1 that would more reasonably seem to ensure Foundation Models is likely available, but that doesn't help with iPhone. So far, it seems the only path would be to set Minimum Deployment to iOS 26 and add iphone-performance-gaming-tier as a required capability, but I'm a bit worried that capability might diverge in
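For what it's worth, the workaround described above would amount to something like this in the app's Info.plist (a sketch only; it assumes iphone-performance-gaming-tier is an acceptable proxy for Apple Intelligence-capable iPhones, which is exactly the uncertainty raised in this post):

```xml
<key>UIRequiredDeviceCapabilities</key>
<array>
    <string>iphone-performance-gaming-tier</string>
</array>
```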
Topic:
App Store Distribution & Marketing
SubTopic:
App Store Connect
Tags:
App Store
iOS
Apple Intelligence
Regarding try to setActive(YES) for workaround, this only happens 5 seconds after our app detects that the audio session was not activated when the call started.

That doesn't change my answer. Directly activating the audio session will never be reliable and can disrupt CallKit's own ability to manage your audio session. It happens to not be the cause of the problem you're currently looking at, but it can and will cause problems that look EXACTLY the same. Putting this another way, the problem with using setActive ISN'T that it doesn't work, it's that it can SORT of work in a way that can mask and create other issues, both now and in the future.

We agree the major key issue is the SessionID 0x0. I have created a feedback for this issue, and submitted sysdiagnose/console logs in it. However, the issue happened on Apr 4 and the sysdiagnose was generated on Apr 7, so we are not sure if it could contain the key data from Apr 4.

I'll try and take a look later today; however, I've seen enough already to raise the
Topic:
App & System Services
SubTopic:
General
[quote='852648022, Etresoft, /thread/795994?answerId=852648022#852648022, /profile/Etresoft'] You were relaying what the end user told you [/quote]

I stated the opposite: this is what happens for me. I only wrote that the user brought the issue to my attention.

"Never use the App Store." That sounds like a very generic statement, and I don't see a reason why not. Would you like to elaborate?
Topic:
App & System Services
SubTopic:
General
Hello @Jir_253, Thanks for your questions. I’d recommend that you start by watching "Set the scene with SwiftUI in visionOS" from this year’s WWDC. Your “CWindow” sounds similar to the Tools window covered in this talk:

```swift
WindowGroup("Tools", id: "tools") {
    ToolsView()
}
.restorationBehavior(.disabled)
.defaultLaunchBehavior(.suppressed)
```

Similar to iOS, where you cannot quit the application yourself, you cannot dismiss the last window of your application. If the user closes the main window while the secondary window is open, that secondary window cannot then be closed, because it is the only window group still open in your application. Consider adding an affordance to the secondary window to reopen the main window in this case.

There's no supported way for you to create parent/child window relationships with the APIs currently available. If you'd like us to consider adding the necessary functionality, please file an enhancement request using Feedback Assistant. Once you file the request, please post the FB number
Topic:
Spatial Computing
SubTopic:
General
In beta 5 the custom sound configuration now works and it actually plays the sound when the alarm goes off, BUT the sound is played only once. Has anyone figured out how to put it on repeat? Or do I have to wait on this for another couple of weeks 💀
Topic:
App & System Services
SubTopic:
General
I’m experiencing the same issue. When you set the alpha parameter to a semi-transparent value (like 0.5), the title becomes slightly visible. This suggests that the background layer is rendered above the title when prefersLargeTitles is enabled. Sounds like a bug in iOS 26 or Xcode. FB19434429
Topic:
UI Frameworks
SubTopic:
UIKit
Hi @DTS Engineer, thank you so much for checking this.

Regarding the "try setActive(YES)" workaround: this only happens 5 seconds after our app detects that the audio session was not activated when the call started.

We agree the key issue is the SessionID 0x0. I have created a feedback for this issue and submitted sysdiagnose/console logs in it. However, the issue happened on Apr 4 and the sysdiagnose was generated on Apr 7, so we are not sure whether it contains the key data from Apr 4. Please help check it. The issue happens intermittently and is hard to reproduce; we will update the feedback once it can be reproduced again.

Feedback ticket: FB19429215 (CallKit does not activate audio session with higher probability after upgrading to iOS 18.4.1)
Topic:
App & System Services
SubTopic:
General
A housekeeping note: here is a link to a related thread Caleb started on a different slice of the problem, for anyone who may come to this in the future.

I reread what you originally wrote, and it sounds like you had a reason to pull this web bundle build out into the aggregate target. Is there a reason that couldn't be a script phase in the main app target that I'm not seeing? It seems like the file-level input and output analysis of this step in the main app's build phases would be enough. The one downside I see is that if the script does need to run in full, that could potentially be a place where you don't get a lot of concurrently running build tasks doing other things, and thus it takes up more wall-clock time than is ideal, but that also maybe isn't that different from the pre-build script you started with.

Another idea would be to question why this script needs to run via an Xcode build at all. If the source files don't change often, could other techniques like a git commit hook targeting the source fi
Topic:
Developer Tools & Services
SubTopic:
Xcode
RealityKit spatial audio crackles and pops on iOS 26.0 beta 5. It works correctly on iOS 18.6 and visionOS 26.0 beta 5. The APIs used are AudioPlaybackController, Entity.prepareAudio, Entity.play Videos of the expected and observed behavior are attached to the feedback FB19423059. The audio should be a consistent, repeating sound, but it seems oddly abbreviated and the volume varies unexpectedly. Thank you for investigating this issue.
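For anyone trying to reproduce the issue above, here is a minimal sketch of the playback path it describes (the asset name and looping configuration are placeholders, and the configuration-based AudioFileResource initializer is an assumption, not taken from the report):

```swift
import RealityKit

// Prepare a looping audio resource on an entity and start playback.
func startLoopingAudio(on entity: Entity) async throws {
    let resource = try await AudioFileResource(
        named: "loop.wav",                          // placeholder asset name
        configuration: .init(shouldLoop: true))
    let controller = entity.prepareAudio(resource)  // AudioPlaybackController
    controller.play()
}
```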
Thank you for the detailed reply. I've submitted a bug report as requested: FB19421676 – Push-to-Talk Framework: Microphone activation tone does not play when sending while audio session is active in full duplex mode. Thanks to the context you provided regarding how the PTT framework functions, I was able to identify the cause of the transmission delay I was experiencing. It turns out that isVoiceProcessingInputMuted was set to true when starting a transmission, and only reverted to false once audio output stopped. This was the source of the delay between initiating transmission and receiving valid microphone input. By manually setting isVoiceProcessingInputMuted to false on the input node at the start of transmission, I was able to eliminate this delay and begin receiving microphone samples immediately. I'm still relatively new to Swift and iOS audio development, and I was wondering if there are any sample projects or best practices that demonstrate integrating audio with
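For others hitting the same transmission delay, a minimal sketch of the workaround described above (the engine setup and call site are placeholders; only the isVoiceProcessingInputMuted property comes from the post):

```swift
import AVFAudio

final class TransmissionAudio {
    private let engine = AVAudioEngine()

    // Call this when the push-to-talk transmission begins.
    func beginTransmission() {
        let input = engine.inputNode
        // Voice processing can leave the input muted while audio output is still
        // draining; clearing the flag lets microphone samples arrive immediately.
        if input.isVoiceProcessingInputMuted {
            input.isVoiceProcessingInputMuted = false
        }
        // ... install a tap / start streaming microphone buffers ...
    }
}
```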
Topic:
Media Technologies
SubTopic:
Audio
Could you please review the following processes for any potential issues?

Yes. It's very simple. Your app should NEVER do this:

try to setActive(YES) for workaround

The CallKit audio session is NOT a standard audio session. It's a restricted audio session configuration that:

- Has a different, higher session priority than any other audio session.
- Is allowed to activate in the background IF that activation comes from the properly authorized daemon.

That means:

- Sometimes it won't work... in the standard case, activation will fail because you don't have the proper authorization. However, that activation attempt can disrupt the system and interfere with other activations.
- However... sometimes it will succeed. These APIs are inherently race-condition prone, which means that activation could succeed because, for example:
  - Your app was entering the foreground and was able to activate a PlayAndRecord session.
  - Timing issues inside the audio system meant that it allowed activation under circumst
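A minimal sketch of the pattern this implies, where CallKit owns session activation and the app only configures the session and reacts in the delegate callbacks (class name and engine details are placeholders, not from this thread):

```swift
import CallKit
import AVFAudio

final class CallAudioController: NSObject, CXProviderDelegate {

    // Configure, but do NOT activate, the session before reporting the call.
    func configureAudioSession() throws {
        let session = AVAudioSession.sharedInstance()
        try session.setCategory(.playAndRecord, mode: .voiceChat,
                                options: [.allowBluetooth])
        // No session.setActive(true) here: CallKit activates the session itself.
    }

    func providerDidReset(_ provider: CXProvider) {
        // Tear down any in-flight calls and stop audio I/O.
    }

    func provider(_ provider: CXProvider, didActivate audioSession: AVAudioSession) {
        // This is the safe point to start audio I/O (AVAudioEngine, VoIP pipeline, etc.).
    }

    func provider(_ provider: CXProvider, didDeactivate audioSession: AVAudioSession) {
        // Stop audio I/O; the session is no longer usable for this call.
    }
}
```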
Topic:
App & System Services
SubTopic:
General
Since iOS 18, the system setting “Allow Audio Playback” (enabled by default) allows third-party app audio to continue playing while the user is recording video with the Camera app. This has created a problem for the app I’m developing.

➡️ The problem: My app plays continuous audio in both foreground and background states. If the user starts recording video using the iOS Camera app, the app’s audio, still playing in the background, gets captured in the video, which is obviously unintended behavior. Yes, the user could stop the app manually before starting the video recording, but that can’t be guaranteed. As a developer, I need a way to stop the app’s audio before the video recording begins. So far, I haven’t found a reliable way to detect when video recording starts if “Allow Audio Playback” is ON.

➡️ What I’ve tried:
- AVAudioSession.interruptionNotification → doesn’t fire
- devicesChangedEventStream → not triggered

I don’t want to request mic permission (ap
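For context, a sketch of the first detection attempt listed above (the observer wiring is illustrative; as the post says, this notification simply does not fire when the Camera app records with “Allow Audio Playback” enabled):

```swift
import AVFAudio

final class RecordingDetector {
    private var observer: NSObjectProtocol?

    func startObserving() {
        observer = NotificationCenter.default.addObserver(
            forName: AVAudioSession.interruptionNotification,
            object: AVAudioSession.sharedInstance(),
            queue: .main
        ) { notification in
            guard let info = notification.userInfo,
                  let rawType = info[AVAudioSessionInterruptionTypeKey] as? UInt,
                  let type = AVAudioSession.InterruptionType(rawValue: rawType),
                  type == .began else { return }
            // Pause the app's audio here; in practice this never triggers for the
            // "Allow Audio Playback" camera-recording case described above.
        }
    }

    deinit {
        if let observer { NotificationCenter.default.removeObserver(observer) }
    }
}
```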