I would like to have a SwiftData predicate that filters against an array of PersistentIdentifiers. A trivial use case could be filtering Posts by one or more Categories. This sounds like something that must be trivial to do. When doing the following, however:

    let categoryIds: [PersistentIdentifier] = categoryFilter.map { $0.id }
    let pred = #Predicate<Post> {
        if let catId = $0.category?.persistentModelID {
            return categoryIds.contains(catId)
        } else {
            return false
        }
    }

the code compiles, but produces the following runtime exception (Xcode 26 beta, iOS 26 simulator):

    'NSInvalidArgumentException', reason: 'unimplemented SQL generation for predicate : (TERNARY(item != nil, item, nil) IN {}) (bad LHS)'

Strangely, the same code works if the array to filter against is an array of a primitive type, e.g. String or Int. What is going wrong here, and what could be a possible workaround?
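A commonly suggested workaround, while SQL generation for PersistentIdentifier arrays is broken, is to filter on a primitive attribute instead, since the post notes that primitive arrays work. The sketch below assumes a hypothetical `uuid: UUID` stored property on Category; it is not part of the original models:

    // Filter on a primitive UUID attribute rather than PersistentIdentifier;
    // `contains` over primitive arrays translates to SQL, as noted above.
    // `uuid` is a hypothetical attribute added to Category for this purpose.
    let categoryUUIDs: [UUID] = categoryFilter.map { $0.uuid }
    let pred = #Predicate<Post> { post in
        if let catUUID = post.category?.uuid {
            return categoryUUIDs.contains(catUUID)
        } else {
            return false
        }
    }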
Search results for "Popping Sound": 19,351 results found
Hi, I am trying to do similar things here. I want to build a keyboard that records audio and then processes it with the Whisper API to get more robust dictation. My keyboard extension works fine in the simulator, but when I try to run it on a real device, I always get:

    Error Domain=NSOSStatusErrorDomain Code=561015905 "Session activation failed" UserInfo={NSLocalizedDescription=Session activation failed}

when I do:

    private var audioRecorder: AVAudioRecorder?
    private var audioSession = AVAudioSession.sharedInstance()

    try audioSession.setCategory(.playAndRecord, mode: .default, options: [.allowBluetooth])
    try audioSession.setActive(true)
Topic: App & System Services, SubTopic: Hardware
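As a diagnostic aside: 561015905 is the four-character code '!pla', which appears to correspond to AVAudioSession.ErrorCode.cannotStartPlaying. A minimal sketch for surfacing the full error, assuming the same session setup as in the post:

    import AVFoundation

    // Catch the activation failure as NSError to log the OSStatus code.
    // 561015905 == 0x21706C61 == '!pla'.
    do {
        let audioSession = AVAudioSession.sharedInstance()
        try audioSession.setCategory(.playAndRecord, mode: .default, options: [.allowBluetooth])
        try audioSession.setActive(true)
    } catch let error as NSError {
        print("Activation failed: domain=\(error.domain) code=\(error.code)")
    }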
Thank you for your reply, I also just had an AVKit lab and this indeed seems like an issue with the framework(s). To follow up, I filed a feedback (FB18058056) mentioning yours and this discussion. I also realized that using the CustomAVPlayerView as shown above actually works as long as the app doesn't configure the MPRemoteCommandCenter (by adding targets to the shared instance). In my use case, I do have to configure it (as do many others, I reckon), so I had to come up with another fix. Personally, what I did is pass the scenePhase to the views configuring/displaying the videos, so that when the app enters the .inactive or .background state I can pause the AVPlayers and set up the nowPlayingInfo again (either to nil or the current audio). I also do this cleanup as soon as the view disappears (via .onDisappear {}). Ultimately this is not great, as there are still some side effects; for example, if no audio is playing but the user simply opens the view with the video players, the app will appear as
Topic: Media Technologies, SubTopic: Video
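A minimal sketch of the pause-on-scene-change workaround described above, assuming a SwiftUI view that owns an AVPlayer (the view and player names are illustrative):

    import SwiftUI
    import AVKit
    import MediaPlayer

    struct VideoPane: View {
        @Environment(\.scenePhase) private var scenePhase
        let player: AVPlayer

        var body: some View {
            VideoPlayer(player: player)
                .onChange(of: scenePhase) { _, newPhase in
                    // Pause and clear now-playing info when leaving the active state.
                    if newPhase == .inactive || newPhase == .background {
                        player.pause()
                        MPNowPlayingInfoCenter.default().nowPlayingInfo = nil
                    }
                }
                .onDisappear {
                    // Same cleanup as soon as the view goes away.
                    player.pause()
                    MPNowPlayingInfoCenter.default().nowPlayingInfo = nil
                }
        }
    }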
Hi all, I've developed an audio DSP application in C++ using AudioToolbox and CoreAudio on macOS 14.4.1 with Xcode 15. I use an AudioQueue for input and another for output. This works great. I'm now adding real-time audio analysis, e.g. spectral analysis. I want this to run independently of my audio processing so it cannot interfere with audio playback. Taps on AudioQueues seem to be a good way of doing this. Since the analytics won't modify the audio data, I am using a Siphon Tap by setting the AudioQueueProcessingTapFlags to:

    kAudioQueueProcessingTap_PreEffects | kAudioQueueProcessingTap_Siphon;

This works fine on my output queue. However, on my input queue the Tap callback is called once and then an EXC_BAD_ACCESS occurs - screenshot below. NB: I believe that a callback should only call AudioQueueProcessingTapGetSourceAudio when not using a Siphon, so I don't call it. Relevant code:

    AudioQueueProcessingTapCallback tap_callback) { // Makes an audio tap fo
Since it sounds like the behavior changed across iOS versions, you can file a bug report to report a regression. If you do, post the FB number here for the record. That said, multi-megabyte URLs are unusual, and this means your use here likely falls outside the original intention of the openURL functionality — it isn't a general purpose data exchange API. If you zoom out from your current implementation details, what is the underlying problem that you are solving for? — Ed Ford, DTS Engineer
Topic: UI Frameworks, SubTopic: UIKit
I have 3 phones:
iPhone 14, iOS 18.3
iPhone Xr, iOS 18.5
iPhone Xr, iOS 18.4.1

My app has a network extension, and I've noticed each phone having its connectivity interrupted by calls on the push provider, calling stop with the noNetworkAvailable reason. The point of confusion is that each phone seems to get its interruption at a different time. For example, one will get an interruption at 1:00 while the others are fine, and at 3:00 another will get an interruption while the others are fine. This is confusing, since noNetworkAvailable seems to imply a problem with the router or access point, but if that were the case, one would expect it to affect all the phones on the Wi-Fi. I don't see fewer interruptions on the iPhone 14 vs the iPhone Xr. Do you believe the iOS version is affecting the performance? Could you please give me some insight as to what could be going on inside these phones? P.S. I also see an error pop up when using NWConnection; this is inside the App. The state update handler will s
Hello, That's a bit of a semantic conundrum, isn't it? We recommend reaching out to App Review to determine how to use the Captions box in this particular case. It seems incorrect to indicate the presence of captions when they aren't there in an audio or video asset. Perhaps there is a different and more direct way to indicate that the app is accessible to deaf users. Again, seek App Review's guidance.
Topic: Accessibility & Inclusion, SubTopic: General
Sounds good! Will try and catch it.
Topic: Developer Tools & Services, SubTopic: Xcode
I am developing an Apple app that has to download 1000s of MP4 files, between 5 and 150 MB each, at one time (can take an hour or more; that's OK). I use Apple's file picker to select the files, and then it freezes and I cannot download anything. I had this working a few months ago but now I cannot get it to work. Can someone give me some advice please?

Unfortunately, you haven't really provided enough detail to go on here. Where are the files coming from? Does this work with a smaller file count? Does everything work fine if you're getting the files from the local device (not the network, as it sounds like you're doing here)? What actually happens in your app when it freezes?

__
Kevin Elliott
DTS Engineer, CoreOS/Hardware
Topic: App & System Services, SubTopic: Core OS
A filelist does not work here because the filelist does not get updated automatically when you add a new header into that folder.

It sounds like you have dynamic inputs to the file list, and you could meet your needs if the file list supported wildcard patterns. Xcode doesn't support wildcards in file lists, so it's a worthwhile enhancement request to file. Please post the FB number here if you do. — Ed Ford, DTS Engineer
Topic: Developer Tools & Services, SubTopic: Xcode
I need to direct text-to-speech generated audio from my app simultaneously to a Bluetooth speaker device AND to the internal iPad speaker. The app uses AVSpeechSynthesizer and several third-party speech engines. How best to do this? I noticed the outputChannels property on AVSpeechSynthesizer... are there any examples of how to use this?
Topic: Accessibility & Inclusion, SubTopic: General
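A hedged sketch of using outputChannels, assuming the channel descriptions are taken from the current audio route; note that whether a single route can drive a Bluetooth device and the built-in speaker at the same time is a separate routing question:

    import AVFoundation

    let synthesizer = AVSpeechSynthesizer()

    // Collect channel descriptions from the current output route and assign
    // them to the synthesizer; nil means "use default routing".
    let route = AVAudioSession.sharedInstance().currentRoute
    let channels = route.outputs.flatMap { $0.channels ?? [] }
    synthesizer.outputChannels = channels.isEmpty ? nil : channels

    synthesizer.speak(AVSpeechUtterance(string: "Hello"))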
In SwiftUI on macOS, a menu-style Picker is drawn as a pop-up button. It generally looks and behaves the same as an NSPopUpButton in AppKit. SwiftUI introduced iOS-like-looking UI for settings on macOS, and consequently, the Picker also has its own style when placed inside a Form. A Form-style Picker displays only up/down chevrons and draws the background only when the mouse hovers over it. It also changes its width dynamically based on the selected item.

    Form {
        Picker("Animal:", selection: $selection) {
            ForEach(["Dog", "Cow"], id: \.self) { Text($0) }
        }
        .pickerStyle(.menu)
    }

You can find it, for instance, in the Print dialog. My question is: I couldn't find a way to draw an NSPopUpButton in AppKit with this style. Does anyone know how to achieve this in AppKit? Some might say I should just use SwiftUI straightforwardly, but I would like to use it in a print panel accessory that currently still avoids SwiftUI, although its dialog has the SwiftUI Form look.
When the distance filter is set to 5, the time interval between GPS data points while walking can sometimes be quite large, which means that Apple Fitness will not display the training route information. However, when cycling, the speed is higher, resulting in smaller time intervals between two GPS data points, so Apple Fitness will display the training route information.

OK, that sounds like an issue in the Fitness app then. Thanks for filing FB17792319. I noticed that your report is mostly in Chinese. It would be even more helpful if you could add an English translation so all the Fitness engineers can read it. Thanks again.

Best,
—— Ziqiao Chen
Worldwide Developer Relations.
Topic: App & System Services, SubTopic: Health & Fitness
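For reference, a minimal sketch of the configuration under discussion (a standard CLLocationManager with a 5-meter distance filter; the surrounding workout-session setup is omitted):

    import CoreLocation

    let manager = CLLocationManager()
    manager.desiredAccuracy = kCLLocationAccuracyBest
    // With a 5 m filter, slow walking produces larger time gaps between
    // delivered locations than faster cycling does.
    manager.distanceFilter = 5
    // Requires the location background mode capability.
    manager.allowsBackgroundLocationUpdates = true
    manager.startUpdatingLocation()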
@Meran: Yeah, the error message makes it clear that Apple Intelligence is not available when starting your Mac from an external volume. The framework won't work unless Apple Intelligence is available. Would you mind sharing the ID of your feedback report that has your sysdiagnose? Thanks.

@Lightandshadow: Xcode is telling me Foundation Models is not available on Mac Catalyst, but the docs say it's supported. Lack of support as a known issue is not listed in Xcode 26's release notes. This sounds like a documentation issue worth a feedback report as well. Would you mind filing a feedback report for folks to track? Thanks.

Best,
—— Ziqiao Chen
Worldwide Developer Relations.
Topic: Machine Learning & AI, SubTopic: Foundation Models
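A minimal sketch of checking availability up front, which should surface the condition described above (assuming the FoundationModels availability API as documented):

    import FoundationModels

    switch SystemLanguageModel.default.availability {
    case .available:
        print("Foundation Models is ready")
    case .unavailable(let reason):
        // e.g. Apple Intelligence not enabled, device not eligible, model not ready
        print("Unavailable: \(reason)")
    }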
Among the millions of users of our online product, we have identified through data metrics that the silent audio data capture rate on iPadOS 18.4.1 or 18.5 has increased abnormally. However, we are unable to reproduce the issue. Has anyone encountered a similar issue? The parameters we used are as follows:

AudioSession:

    category: AVAudioSessionCategoryPlayAndRecord
    mode: AVAudioSessionModeDefault
    options: 77
    preferredSampleRate: 48000.000000
    preferredIOBufferDuration: 0.010000

AudioUnit:

    format.mFormatID = kAudioFormatLinearPCM;
    format.mSampleRate = 48000.0;
    format.mChannelsPerFrame = 2;
    format.mBitsPerChannel = 16;
    format.mFramesPerPacket = 1;
    format.mBytesPerFrame = format.mChannelsPerFrame * 16 / 8;
    format.mBytesPerPacket = format.mBytesPerFrame * format.mFramesPerPacket;
    format.mFormatFlags = kAudioFormatFlagsNativeEndian | kLinearPCMFormatFlagIsPacked | kLinearPCMFormatFlagIsSignedInteger;

    component.componentType = kAudioUnitType_Output;
    component.componentSubType = kAudioUnitSubType_RemoteIO;
    componen
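For comparison, the reported session parameters restated as a Swift sketch (the raw options value 77 is kept as-is, since the exact option set it encodes is not spelled out in the report):

    import AVFoundation

    let session = AVAudioSession.sharedInstance()
    do {
        try session.setCategory(.playAndRecord,
                                mode: .default,
                                options: AVAudioSession.CategoryOptions(rawValue: 77))
        try session.setPreferredSampleRate(48_000)
        try session.setPreferredIOBufferDuration(0.010)
        try session.setActive(true)
    } catch {
        print("Audio session setup failed: \(error)")
    }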