Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.

MPMusicPlayerController prepareToPlay errors
I'm getting a variety of errors when I call prepareToPlay on the MPMusicPlayerController. Sometimes they happen, sometimes they don't. I'm trying to play songs from the Apple Music service, and when I don't get the errors, playback works just fine. I'm on iOS 13.5.1 (iPhone XS) with Xcode 11.5. This is my code:

    let applicationMusicPlayer = MPMusicPlayerController.applicationMusicPlayer
    applicationMusicPlayer.setQueue(with: [trackID])
    applicationMusicPlayer.prepareToPlay(completionHandler: { error in
        if let error = error {
            print(error.localizedDescription)
            return
        }
        DispatchQueue.main.async {
            applicationMusicPlayer.play()
        }
    })

These are the various errors I'm getting:

    [SDKPlayback] Failed to prepareToPlay error: Error Domain=MPMusicPlayerControllerErrorDomain Code=2 "Queue was interrupted by another queue" UserInfo={NSDebugDescription=Queue was interrupted by another queue}
    [SDKPlayback] Failed to prepareToPlay error: Error Domain=MPMusicPlayerControllerErrorDomain Code=9 "Preparing queue timed out" UserInfo={NSDebugDescription=Preparing queue timed out}
    [SDKPlayback] Failed to prepareToPlay error: Error Domain=MPMusicPlayerControllerErrorDomain Code=6 "Failed to prepare to play" UserInfo={NSDebugDescription=Failed to prepare to play}
    [SDKPlayback] applicationQueuePlayer _establishConnectionIfNeeded timeout [ping did not pong]
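Since two of the errors above (codes 9 and 6) read as transient timeouts, one hedged mitigation sketch, not a confirmed fix, is to retry prepareToPlay a bounded number of times. The attempt count and delay below are illustrative assumptions:

```swift
import MediaPlayer

/// Minimal retry sketch, assuming the timeouts are transient.
/// The attempt count and back-off delay are illustrative, not documented values.
func prepareAndPlay(trackID: String, attemptsLeft: Int = 3) {
    let player = MPMusicPlayerController.applicationMusicPlayer
    player.setQueue(with: [trackID])
    player.prepareToPlay { error in
        if let error = error {
            guard attemptsLeft > 1 else {
                print("Giving up: \(error.localizedDescription)")
                return
            }
            // Back off briefly, then retry on the main queue.
            DispatchQueue.main.asyncAfter(deadline: .now() + 1) {
                prepareAndPlay(trackID: trackID, attemptsLeft: attemptsLeft - 1)
            }
            return
        }
        DispatchQueue.main.async { player.play() }
    }
}
```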
7 replies · 0 boosts · 4.8k views · Jun ’20
Could not load some videos
Hello there! I am trying to use PHPickerViewController to load videos, but I have a problem: I can load some videos, but not all. I referred to the existing thread https://developer.apple.com/forums/thread/652695, but it doesn't work. This is the code I use to present the PHPickerViewController:

    var config = PHPickerConfiguration()
    config.selectionLimit = 1
    config.filter = .videos
    config.preferredAssetRepresentationMode = .current
    let picker = PHPickerViewController(configuration: config)
    picker.delegate = self
    present(picker, animated: true, completion: nil)

Below is the relevant implementation of the delegate method:

    func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
        picker.dismiss(animated: true, completion: nil)
        for result in results {
            result.itemProvider.loadFileRepresentation(forTypeIdentifier: UTType.movie.identifier) { (url, error) in
                if let error = error {
                    print(error)
                    return
                }
                guard let url = url else { return }
                let fileName = "\(Date().timeIntervalSince1970).\(url.pathExtension)"
                let newUrl = URL(fileURLWithPath: NSTemporaryDirectory() + fileName)
                try? FileManager.default.copyItem(at: url, to: newUrl)
                DispatchQueue.main.async {
                    self.playVideo(newUrl)
                }
            }
        }
    }

Before my print(error) line runs, Xcode printed three errors:

    [AXRuntimeCommon] Unknown client: TestPHPicker
    [default] [ERROR] Could not create a bookmark: NSError: Cocoa 257 "The file couldn’t be opened because you don’t have permission to view it." }
    Error copying file type public.movie. Error: Error Domain=NSItemProviderErrorDomain Code=-1000 "Cannot load representation of type public.movie" UserInfo={NSLocalizedDescription=Cannot load representation of type public.movie, NSUnderlyingError=0x283a4a610 {Error Domain=NSCocoaErrorDomain Code=4101 "Couldn’t communicate with a helper application." UserInfo={NSUnderlyingError=0x283a48b10 {Error Domain=PHAssetExportRequestErrorDomain Code=2 "(null)" UserInfo={NSUnderlyingError=0x283a4a550 {Error Domain=CloudPhotoLibraryErrorDomain Code=82 "Failed to download CPLResourceTypeOriginal" UserInfo=0x28219b300 (not displayed)}}}}}}

And my print(error) line prints:

    Error Domain=NSItemProviderErrorDomain Code=-1000 "Cannot load representation of type public.movie" UserInfo={NSLocalizedDescription=Cannot load representation of type public.movie, NSUnderlyingError=0x283a4a610 {Error Domain=NSCocoaErrorDomain Code=4101 "Couldn’t communicate with a helper application." UserInfo={NSUnderlyingError=0x283a48b10 {Error Domain=PHAssetExportRequestErrorDomain Code=2 "(null)" UserInfo={NSUnderlyingError=0x283a4a550 {Error Domain=CloudPhotoLibraryErrorDomain Code=82 "Failed to download CPLResourceTypeOriginal" UserInfo=0x28219b300 (not displayed)}}}}}}

Some videos load successfully, while others produce this error, and I don't know why. I am testing this on an iPhone X running iOS 14.0 (18A373), with Xcode 12.0 (12A7209). Thanks for help!
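The innermost failure above (CloudPhotoLibraryErrorDomain Code=82, "Failed to download CPLResourceTypeOriginal") suggests the original video lives in iCloud and couldn't be downloaded. A hedged workaround sketch, not a confirmed fix: initialize the picker with the shared photo library so results carry asset identifiers, then fall back to PHImageManager, which can be told to download from the network. Function names beyond the documented Photos/PhotosUI APIs are illustrative.

```swift
import AVFoundation
import Photos
import PhotosUI

// Configure the picker so results include assetIdentifier values.
// (Requires photo-library authorization, unlike the plain picker.)
var config = PHPickerConfiguration(photoLibrary: .shared())
config.selectionLimit = 1
config.filter = .videos

// Fallback loader: PHImageManager can download iCloud originals
// when isNetworkAccessAllowed is set.
func loadVideoViaPhotoKit(_ result: PHPickerResult) {
    guard let identifier = result.assetIdentifier,
          let asset = PHAsset.fetchAssets(withLocalIdentifiers: [identifier],
                                          options: nil).firstObject else { return }
    let options = PHVideoRequestOptions()
    options.isNetworkAccessAllowed = true
    PHImageManager.default().requestAVAsset(forVideo: asset, options: options) { avAsset, _, info in
        // For locally materialized videos this is typically an AVURLAsset.
        if let urlAsset = avAsset as? AVURLAsset {
            print("Video available at \(urlAsset.url)")
        } else if let info = info {
            print("Request info: \(info)")
        }
    }
}
```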
6 replies · 0 boosts · 3.3k views · Sep ’20
Adding HTTP headers to an AVURLAsset
Hi, for an audio player that uses signed URLs, I need to set an authorization header on the requests made for an AVURLAsset. This works, but not over AirPlay when streaming multiple songs in a queue. For each item I do:

    let headerFields: [String: String] = ["Authorization": getIdToken()!]
    super.init(url: url, options: ["AVURLAssetHTTPHeaderFieldsKey": headerFields])

But only the first two songs in the queue actually get this authorization header sent along; somehow it is removed for subsequent songs. Any ideas on how I can fix this? Thanks, Thomas
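Since the header-fields option only applies to requests the asset makes itself, one workaround often discussed for per-request auth is an AVAssetResourceLoaderDelegate behind a custom URL scheme, so every request is routed through your code with the header attached. This is a hedged sketch, not a confirmed fix: the "streaming-auth" scheme is an arbitrary placeholder, getIdToken() stands in for the poster's token provider, and whether AirPlay keeps routing through the delegate is something to verify.

```swift
import AVFoundation

// Stand-in for the poster's token provider.
func getIdToken() -> String? { nil }

final class AuthResourceLoader: NSObject, AVAssetResourceLoaderDelegate {
    func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                        shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        // Swap the placeholder scheme back to https and attach the header.
        guard let url = loadingRequest.request.url,
              var components = URLComponents(url: url, resolvingAgainstBaseURL: false),
              let token = getIdToken() else { return false }
        components.scheme = "https"
        guard let realURL = components.url else { return false }
        var request = URLRequest(url: realURL)
        request.setValue(token, forHTTPHeaderField: "Authorization")
        URLSession.shared.dataTask(with: request) { data, response, error in
            if let error = error {
                loadingRequest.finishLoading(with: error)
                return
            }
            if let response = response { loadingRequest.response = response }
            if let data = data { loadingRequest.dataRequest?.respond(with: data) }
            loadingRequest.finishLoading()
        }.resume()
        return true
    }
}

// Usage sketch:
// let asset = AVURLAsset(url: URL(string: "streaming-auth://example.com/track.mp3")!)
// asset.resourceLoader.setDelegate(loader, queue: DispatchQueue(label: "auth-loader"))
```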
1 reply · 1 boost · 1.3k views · Jan ’21
What causes "issue_type = overload" in coreaudiod with USB audio interface?
I have a USB audio interface that is causing kernel traps and the audio output to "skip" or drop out every few seconds. This behavior occurs with a completely fresh install of Catalina as well as Big Sur with the stock Music app on a 2019 MacBook Pro 16 (full specs below).

The Console logs show coreaudiod got an error from a kernel trap, a "USB Sound assertion" in AppleUSBAudio/AppleUSBAudio-401.4/KEXT/AppleUSBAudioDevice.cpp at line 6644, and the Music app "skipping cycle due to overload." I've added a short snippet from the Console logs around the time of the audio skip/dropout. The more complete logs are at this gist: https://gist.github.com/djflux/08d9007e2146884e6df1741770de5105

I've also opened a Feedback Assistant ticket (FB9037528): https://feedbackassistant.apple.com/feedback/9037528

Does anyone know what could be causing this issue? Thanks for any help. Cheers, Flux aka Andy.

Hardware Overview:
    Model Name: MacBook Pro
    Model Identifier: MacBookPro16,1
    Processor Name: 8-Core Intel Core i9
    Processor Speed: 2.4 GHz
    Number of Processors: 1
    Total Number of Cores: 8
    L2 Cache (per Core): 256 KB
    L3 Cache: 16 MB
    Hyper-Threading Technology: Enabled
    Memory: 64 GB
    System Firmware Version: 1554.80.3.0.0 (iBridge: 18.16.14347.0.0,0)

System Software Overview:
    System Version: macOS 11.2.3 (20D91)
    Kernel Version: Darwin 20.3.0
    Boot Volume: Macintosh HD
    Boot Mode: Normal
    Computer Name: mycomputername
    User Name: myusername
    Secure Virtual Memory: Enabled
    System Integrity Protection: Enabled

USB interface: Denon DJ DS1

Snippet of Console logs:

    error 21:07:04.848721-0500 coreaudiod HALS_IOA1Engine::EndWriting: got an error from the kernel trap, Error: 0xE00002D7
    default 21:07:04.848855-0500 Music HALC_ProxyIOContext::IOWorkLoop: skipping cycle due to overload
    default 21:07:04.857903-0500 kernel USB Sound assertion (Resetting engine due to error returned in Read Handler) in /AppleInternal/BuildRoot/Library/Caches/com.apple.xbs/Sources/AppleUSBAudio/AppleUSBAudio-401.4/KEXT/AppleUSBAudioDevice.cpp at line 6644
    ...
    default 21:07:05.102746-0500 coreaudiod Audio IO Overload inputs: 'private' outputs: 'private' cause: 'Unknown' prewarming: no recovering: no
    default 21:07:05.102926-0500 coreaudiod CAReportingClient.mm:508 message {
        HostApplicationDisplayID = "com.apple.Music";
        cause = Unknown;
        deadline = 2615019;
        "input_device_source_list" = Unknown;
        "input_device_transport_list" = USB;
        "input_device_uid_list" = "AppleUSBAudioEngine:Denon DJ:DS1:000:1,2";
        "io_buffer_size" = 512;
        "io_cycle" = 1;
        "is_prewarming" = 0;
        "is_recovering" = 0;
        "issue_type" = overload;
        lateness = "-535";
        "output_device_source_list" = Unknown;
        "output_device_transport_list" = USB;
        "output_device_uid_list" = "AppleUSBAudioEngine:Denon DJ:DS1:000:1,2";
    }: (null)
38 replies · 12 boosts · 18k views · Apr ’21
MusicKit: developer token request failed
The MusicKit video states that you just enable "MusicKit" in your application identifier and "you're done!" Ok, so I did that, and I'm seeing the following error when trying to run a song query:

    [DataRequesting] Failed retrieving MusicKit tokens: Error Domain=ICErrorDomain Code=-8200 "Media API Token Service's response was invalid (status code: Unauthorized (401))." UserInfo={NSDebugDescription=Media API Token Service's response was invalid (status code: Unauthorized (401))., NSUnderlyingError=0x6000023a0c60 {Error Domain=AMSErrorDomain Code=301 "Invalid Status Code" UserInfo={NSLocalizedDescription=Invalid Status Code, AMSURL=https://sf-api-token-service.itunes.apple.com/apiToken?REDACTED, AMSStatusCode=401, AMSServerPayload={ status = verificationFailure; }, NSLocalizedFailureReason=The response has an invalid status code}}}. Throwing .developerTokenRequestFailed.

Is this just broken on Apple's side? Is there some other magic string that needs to be added to the plist other than NSAppleMusicUsageDescription?
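For reference, a minimal sketch that exercises the automatic developer-token path, assuming the App ID really has the MusicKit App Service enabled and the app is signed with a matching provisioning profile (NSAppleMusicUsageDescription is still needed for user authorization):

```swift
import MusicKit

// If token generation is configured correctly, this should print songs
// rather than throwing .developerTokenRequestFailed.
func verifyMusicKitSetup() async {
    let status = await MusicAuthorization.request()
    guard status == .authorized else {
        print("Music authorization not granted: \(status)")
        return
    }
    do {
        var request = MusicCatalogSearchRequest(term: "Abbey Road", types: [Song.self])
        request.limit = 1
        let response = try await request.response()
        print(response.songs)
    } catch {
        print("MusicKit request failed: \(error)")
    }
}
```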
7 replies · 0 boosts · 3.9k views · Jun ’21
Upstream Service Error when using MusicDataRequest
Hey there! I'm trying to use MusicDataRequest to fetch the contents of a user's library. Most of the documented endpoints I've tried seem to be working as expected, but the /me/library/artists and /me/library/albums endpoints are consistently giving me a 500 Upstream Service Error. Here's an example of my code, and the resulting error:

    let url = URL(string: "https://api.music.apple.com/v1/me/library/albums")!
    let request = MusicDataRequest(urlRequest: URLRequest(url: url))
    do {
        let response = try await request.response()
        let string = String(data: response.data, encoding: .utf8)!
        print("success: \(string)")
    } catch {
        print("error: \(error)")
    }

    MusicDataRequest.Error(
        status: 500,
        code: 50001,
        title: "Upstream Service Error",
        detailText: "Error fetching library content",
        id: "5OFXMJAGNU2WCTDKNAYYP4BJXI",
        originalResponse: MusicDataResponse(
            data: 153 bytes,
            urlResponse: <NSHTTPURLResponse: 0x0000000280f04dc0>
        )
    )

If I replace /albums with /songs or /playlists in the above code, everything works as expected. Is there something I'm missing from the albums and artists requests? Or is this a bug with the API?
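A hedged aside for later readers (this postdates the thread and is only a possible workaround, not a fix for the endpoint itself): on iOS 16 and later, MusicLibraryRequest fetches library albums without going through the /me/library REST endpoint at all.

```swift
import MusicKit

// Sketch: fetch library albums via MusicLibraryRequest instead of
// the /v1/me/library/albums endpoint. Requires iOS 16+.
@available(iOS 16.0, *)
func fetchLibraryAlbums() async throws -> MusicItemCollection<Album> {
    let request = MusicLibraryRequest<Album>()
    let response = try await request.response()
    return response.items
}
```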
13 replies · 0 boosts · 3.3k views · Jul ’21
How to get the album info for a song fetched by MusicDataRequest?
I'm slowly learning the new MusicKit beta for Swift. I've learned to successfully retrieve tracks of type Song using MusicDataRequest, using the following:

    ...
    let countryCode = try await MusicDataRequest.currentCountryCode
    if let url = URL(string: "https://api.music.apple.com/v1/catalog/\(countryCode)/songs?filter[isrc]=\(isrc)") {
        let dataRequest = MusicDataRequest(urlRequest: URLRequest(url: url))
        let dataResponse = try await dataRequest.response()
    ...

However, when I decode the data, there does not seem to be any album information that I can see. I've tried adding includes=albums to the URL, but I don't think that's the right approach, because when I view the Song struct in MusicKit, I don't see a reference to an Album type anywhere. Any advice on how to retrieve the album information would be most appreciated. Thanks.
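One hedged approach sketch, pieced together from MusicKit's types rather than a confirmed answer: decode the response into MusicItemCollection<Song> (the collection type is Codable), then ask the Song to load its albums relationship with with(_:). The exact shape of the with overload should be verified against the current SDK.

```swift
import Foundation
import MusicKit

// Sketch: fetch a song by ISRC, then load its albums relationship.
func albumsForSong(isrc: String) async throws -> MusicItemCollection<Album>? {
    let countryCode = try await MusicDataRequest.currentCountryCode
    guard let url = URL(string: "https://api.music.apple.com/v1/catalog/\(countryCode)/songs?filter[isrc]=\(isrc)")
    else { return nil }
    let dataRequest = MusicDataRequest(urlRequest: URLRequest(url: url))
    let dataResponse = try await dataRequest.response()
    let songs = try JSONDecoder().decode(MusicItemCollection<Song>.self, from: dataResponse.data)
    guard let song = songs.first else { return nil }
    let detailedSong = try await song.with([.albums])  // loads the relationship on demand
    return detailedSong.albums
}
```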
7 replies · 0 boosts · 3.3k views · Aug ’21
AVAudioPCMBuffer Memory Management
I’m using AVAudioEngine to get a stream of AVAudioPCMBuffers from the device’s microphone using the usual installTap(onBus:) setup. To distribute the audio stream to other parts of the program, I’m sending the buffers to a Combine publisher similar to the following:

    private let publisher = PassthroughSubject<AVAudioPCMBuffer, Never>()

I’m starting to suspect I have some kind of concurrency or memory management issue with the buffers, because when consuming the buffers elsewhere I’m getting a range of crashes that suggest some internal pointer in a buffer is NULL (specifically, I’m seeing crashes in vDSP.convertElements(of:to:) when I try to read samples from the buffer). These crashes are in production and fairly rare — I can’t reproduce them locally. I never modify the audio buffers, only read them for analysis.

My question is: should it be possible to put AVAudioPCMBuffers into a Combine pipeline? Does the AVAudioPCMBuffer class not retain/release the underlying AudioBufferList’s memory the way I’m assuming? Is this a fundamentally flawed approach?
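A hedged sketch of one mitigation, assuming the root cause is the engine recycling the tap's buffer while downstream code still holds a reference: deep-copy each buffer before sending it into the pipeline. This assumes non-interleaved float PCM, which is the usual installTap format.

```swift
import AVFoundation

/// Copy the PCM contents into a freshly allocated buffer so the audio
/// engine can recycle the original without affecting consumers.
func deepCopy(_ buffer: AVAudioPCMBuffer) -> AVAudioPCMBuffer? {
    guard let copy = AVAudioPCMBuffer(pcmFormat: buffer.format,
                                      frameCapacity: buffer.frameLength) else { return nil }
    copy.frameLength = buffer.frameLength
    guard let src = buffer.floatChannelData,
          let dst = copy.floatChannelData else { return nil }  // float, non-interleaved only
    let bytesPerChannel = Int(buffer.frameLength) * MemoryLayout<Float>.size
    for channel in 0..<Int(buffer.format.channelCount) {
        memcpy(dst[channel], src[channel], bytesPerChannel)
    }
    return copy
}

// In the tap: publish the copy, not the engine-owned buffer.
// if let safeBuffer = deepCopy(buffer) { publisher.send(safeBuffer) }
```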
1 reply · 1 boost · 1k views · Oct ’21
MPMusicPlayerController queue information
Is there a way to get the current queue items from an MPMusicPlayerController? I need to know when the items I've set to the queue finish playing completely but cannot find any way to do this. I am not using MusicKit but setting the queue via play parameters. From what I can tell so far, after the queue finishes playing, it pauses and resets to the first item in the queue. So even after playback is done, there is no way to know that it finished on its own.
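A hedged sketch building on the behavior described above (pause plus reset to the first item): there is no documented "queue finished" callback on MPMusicPlayerController, but you can observe playback-state changes and treat that combination as a heuristic for the queue ending on its own.

```swift
import MediaPlayer

let player = MPMusicPlayerController.applicationMusicPlayer
player.beginGeneratingPlaybackNotifications()

// Heuristic, not an API guarantee: after the queue plays out, the player
// pauses and rewinds to item 0 at time 0.
let observer = NotificationCenter.default.addObserver(
    forName: .MPMusicPlayerControllerPlaybackStateDidChange,
    object: player,
    queue: .main
) { _ in
    if player.playbackState == .paused,
       player.indexOfNowPlayingItem == 0,
       player.currentPlaybackTime == 0 {
        print("Queue appears to have finished playing")
    }
}
```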
2 replies · 0 boosts · 1.1k views · Nov ’21
Background audio issues with AVPictureInPictureController
I know that if you want background audio from AVPlayer you need to detach your AVPlayer from either your AVPlayerViewController or your AVPlayerLayer, in addition to having your AVAudioSession configured correctly. I have that all squared away, and background audio is fine until we introduce AVPictureInPictureController or use the PiP behavior baked into AVPlayerViewController. If you want PiP to behave as expected when you put your app into the background by switching to another app or going to the Home Screen, you can't perform the detachment operation, otherwise the PiP display fails.

On an iPad, if PiP is active and you lock your device, you continue to get background audio playback. However, on an iPhone, if PiP is active and you lock the device, the audio pauses. If PiP is inactive and you lock the device, the audio pauses and you have to manually tap play on the Lock Screen controls; this is the same between iPad and iPhone devices.

My questions are:
1. Is there a way to keep background-audio playback going when PiP is inactive and the device is locked? (iPhone and iPad)
2. Is there a way to keep background-audio playback going when PiP is active and the device is locked? (iPhone)
1 reply · 0 boosts · 1.5k views · Dec ’21
AVCaptureVideoDataOutput video-range values exceed the documented range
CVPixelBuffer.h defines:

    kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange = '420v',
    /* Bi-Planar Component Y'CbCr 8-bit 4:2:0, video-range (luma=[16,235] chroma=[16,240]).
       baseAddr points to a big-endian CVPlanarPixelBufferInfo_YCbCrBiPlanar struct */
    kCVPixelFormatType_420YpCbCr10BiPlanarVideoRange = 'x420',
    /* 2 plane YCbCr10 4:2:0, each 10 bits in the MSBs of 16bits, video-range (luma=[64,940] chroma=[64,960]) */

But when I configure the camera output with one of these formats, I find that the output pixel buffer's values exceed the stated ranges: I see [0, 255] for 420YpCbCr8BiPlanarVideoRange and [0, 1023] for 420YpCbCr10BiPlanarVideoRange. Is this a bug, or is something wrong with my output? If it is not, how can I choose the correct matrix to transfer the YUV data to RGB?
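A hedged diagnostic sketch rather than an answer: the buffer's attachments record which YCbCr matrix (and color primaries and transfer function) the capture pipeline tagged it with, which is what a YUV-to-RGB conversion should key off. Whether the out-of-range samples are a tagging mismatch or a genuine bug, inspecting the attachments is a reasonable first step.

```swift
import CoreVideo

// Print the color tagging CoreVideo attached to a captured pixel buffer.
func describeColorInfo(of pixelBuffer: CVPixelBuffer) {
    if let matrix = CVBufferGetAttachment(pixelBuffer, kCVImageBufferYCbCrMatrixKey, nil)?
        .takeUnretainedValue() {
        print("YCbCr matrix: \(matrix)")      // e.g. ITU_R_709_2 / ITU_R_601_4
    }
    if let primaries = CVBufferGetAttachment(pixelBuffer, kCVImageBufferColorPrimariesKey, nil)?
        .takeUnretainedValue() {
        print("Color primaries: \(primaries)")
    }
    if let transfer = CVBufferGetAttachment(pixelBuffer, kCVImageBufferTransferFunctionKey, nil)?
        .takeUnretainedValue() {
        print("Transfer function: \(transfer)")
    }
}
```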
1 reply · 0 boosts · 1.3k views · Dec ’21
How to obtain an AVAudioFormat for a canonical format?
I receive a buffer from [AVSpeechSynthesizer convertToBuffer:fromBuffer:] and want to schedule it on an AVPlayerNode. The player node's output format needs to be something that the next node can handle, and as far as I understand, most nodes can handle a canonical format. The format provided by AVSpeechSynthesizer is not something that AVAudioMixerNode supports, so the following:

    AVAudioEngine *engine = [[AVAudioEngine alloc] init];
    playerNode = [[AVAudioPlayerNode alloc] init];
    AVAudioFormat *format = [[AVAudioFormat alloc] initWithSettings:utterance.voice.audioFileSettings];
    [engine attachNode:self.playerNode];
    [engine connect:self.playerNode to:engine.mainMixerNode format:format];

throws an exception:

    Thread 1: "[[busArray objectAtIndexedSubscript:(NSUInteger)element] setFormat:format error:&nsErr]: returned false, error Error Domain=NSOSStatusErrorDomain Code=-10868 \"(null)\""

I am looking for a way to obtain the canonical format for the platform so that I can use AVAudioConverter to convert the buffer. Since different platforms have different canonical formats, I imagine there should be some library way of doing this; otherwise each developer will have to redefine it for each platform the code will run on (macOS, iOS, etc.) and keep it updated when it changes. I could not find any constant or function which can make such a format, ASBD, or settings. The smartest way I could think of, which does not work:

    AudioStreamBasicDescription toDesc;
    FillOutASBDForLPCM(toDesc, [AVAudioSession sharedInstance].sampleRate,
                       2, 16, 16, kAudioFormatFlagIsFloat, kAudioFormatFlagsNativeEndian);
    AVAudioFormat *toFormat = [[AVAudioFormat alloc] initWithStreamDescription:&toDesc];

Even the provided example for iPhone, in the documentation linked above, uses kAudioFormatFlagsAudioUnitCanonical and AudioUnitSampleType, which are deprecated. So what is the correct way to do this?
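A hedged sketch of one possible answer, not confirmed in the thread: AVAudioFormat's "standard" initializer produces the platform's canonical format (deinterleaved 32-bit float), which AVAudioConverter can then target. The sample-rate choice below is an assumption; on macOS, where AVAudioSession is unavailable, you would pick the rate another way.

```swift
import AVFoundation

// Canonical (deinterleaved Float32) format for the current platform.
func makeCanonicalFormat(channels: AVAudioChannelCount = 2) -> AVAudioFormat? {
    let sampleRate = AVAudioSession.sharedInstance().sampleRate  // iOS-only API
    return AVAudioFormat(standardFormatWithSampleRate: sampleRate, channels: channels)
}

// Convert a synthesizer buffer into the canonical format.
// Assumes equal sample rates; scale frameCapacity when resampling.
func convert(_ buffer: AVAudioPCMBuffer, to format: AVAudioFormat) -> AVAudioPCMBuffer? {
    guard let converter = AVAudioConverter(from: buffer.format, to: format),
          let output = AVAudioPCMBuffer(pcmFormat: format,
                                        frameCapacity: buffer.frameCapacity) else { return nil }
    var conversionError: NSError?
    var consumed = false
    converter.convert(to: output, error: &conversionError) { _, inputStatus in
        // Hand the converter the input buffer exactly once.
        if consumed {
            inputStatus.pointee = .endOfStream
            return nil
        }
        consumed = true
        inputStatus.pointee = .haveData
        return buffer
    }
    return conversionError == nil ? output : nil
}
```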
4 replies · 0 boosts · 2.1k views · Feb ’22
How to set the Portrait effect on/off in live camera view in AVFoundation in Swift?
I am using AVFoundation for a live camera view. I can get my device from the current video input (of type AVCaptureDeviceInput) like:

    let device = videoInput.device

The device's active format has an isPortraitEffectSupported. How can I set the Portrait Effect on and off in the live camera view? I set up the camera like this:

    private var videoInput: AVCaptureDeviceInput!
    private let session = AVCaptureSession()
    private(set) var isSessionRunning = false
    private var renderingEnabled = true
    private let videoDataOutput = AVCaptureVideoDataOutput()
    private let photoOutput = AVCapturePhotoOutput()
    private(set) var cameraPosition: AVCaptureDevice.Position = .front

    func configureSession() {
        sessionQueue.async { [weak self] in
            guard let strongSelf = self else { return }
            if strongSelf.setupResult != .success { return }
            let defaultVideoDevice: AVCaptureDevice? = strongSelf.videoDeviceDiscoverySession.devices.first(where: { $0.position == strongSelf.cameraPosition })
            guard let videoDevice = defaultVideoDevice else {
                print("Could not find any video device")
                strongSelf.setupResult = .configurationFailed
                return
            }
            do {
                strongSelf.videoInput = try AVCaptureDeviceInput(device: videoDevice)
            } catch {
                print("Could not create video device input: \(error)")
                strongSelf.setupResult = .configurationFailed
                return
            }
            strongSelf.session.beginConfiguration()
            strongSelf.session.sessionPreset = AVCaptureSession.Preset.photo
            // Add a video input.
            guard strongSelf.session.canAddInput(strongSelf.videoInput) else {
                print("Could not add video device input to the session")
                strongSelf.setupResult = .configurationFailed
                strongSelf.session.commitConfiguration()
                return
            }
            strongSelf.session.addInput(strongSelf.videoInput)
            // Add a video data output
            if strongSelf.session.canAddOutput(strongSelf.videoDataOutput) {
                strongSelf.session.addOutput(strongSelf.videoDataOutput)
                strongSelf.videoDataOutput.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32BGRA)]
                strongSelf.videoDataOutput.setSampleBufferDelegate(self, queue: strongSelf.dataOutputQueue)
            } else {
                print("Could not add video data output to the session")
                strongSelf.setupResult = .configurationFailed
                strongSelf.session.commitConfiguration()
                return
            }
            // Add photo output
            if strongSelf.session.canAddOutput(strongSelf.photoOutput) {
                strongSelf.session.addOutput(strongSelf.photoOutput)
                strongSelf.photoOutput.isHighResolutionCaptureEnabled = true
            } else {
                print("Could not add photo output to the session")
                strongSelf.setupResult = .configurationFailed
                strongSelf.session.commitConfiguration()
                return
            }
            strongSelf.session.commitConfiguration()
        }
    }

    func prepareSession(completion: @escaping (SessionSetupResult) -> Void) {
        sessionQueue.async { [weak self] in
            guard let strongSelf = self else { return }
            switch strongSelf.setupResult {
            case .success:
                strongSelf.addObservers()
                if strongSelf.photoOutput.isDepthDataDeliverySupported {
                    strongSelf.photoOutput.isDepthDataDeliveryEnabled = true
                }
                if let photoOrientation = AVCaptureVideoOrientation(interfaceOrientation: interfaceOrientation) {
                    if let unwrappedPhotoOutputConnection = strongSelf.photoOutput.connection(with: .video) {
                        unwrappedPhotoOutputConnection.videoOrientation = photoOrientation
                    }
                }
                strongSelf.dataOutputQueue.async {
                    strongSelf.renderingEnabled = true
                }
                strongSelf.session.startRunning()
                strongSelf.isSessionRunning = strongSelf.session.isRunning
                strongSelf.mainQueue.async {
                    strongSelf.previewView.videoPreviewLayer.session = strongSelf.session
                }
                completion(strongSelf.setupResult)
            default:
                completion(strongSelf.setupResult)
            }
        }
    }

Then I set isPortraitEffectsMatteDeliveryEnabled like this:

    func setPortraitAffectActive(_ state: Bool) {
        sessionQueue.async { [weak self] in
            guard let strongSelf = self else { return }
            if strongSelf.photoOutput.isPortraitEffectsMatteDeliverySupported {
                strongSelf.photoOutput.isPortraitEffectsMatteDeliveryEnabled = state
            }
        }
    }

However, I don't see any Portrait Effect in the live camera view! Any ideas why?
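A hedged note with a sketch, based on my reading of the AVFoundation APIs rather than a confirmed answer in this thread: the Portrait video effect (as opposed to the portrait-effects matte used above) is toggled by the user in Control Center on iOS 15+, and apps can only observe its state, via AVCaptureDevice.isPortraitEffectEnabled (class level, the user preference) and isPortraitEffectActive (per device). Verify against current documentation.

```swift
import AVFoundation

var portraitObservation: NSKeyValueObservation?

// Observe whether the Portrait effect is currently applied to this device.
// Apps cannot force it on or off programmatically, as I understand it.
func observePortraitEffect(on device: AVCaptureDevice) {
    guard device.activeFormat.isPortraitEffectSupported else { return }
    portraitObservation = device.observe(\.isPortraitEffectActive,
                                         options: [.initial, .new]) { device, _ in
        print("Portrait effect active: \(device.isPortraitEffectActive)")
    }
}
```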
2 replies · 0 boosts · 1.9k views · May ’22
Thoughts on MusicLibraryRequest as a replacement for MPMediaQuery
I'm very excited about the new MusicLibrary API, but after a couple of days of playing around with it, I have to say that I find the implementation of filtering MusicLibraryRequests a little confusing. MPMediaQuery has a fairly extensive list of predicates that can be applied, including string and persistentID comparisons for artist, album artist, genre, and more. It also lets you filter on an item's title. MusicLibraryRequests let you filter on the item's ID, or on its MusicKit Artist and Genre relationships. To me, this seems like it adds an extra step.

With an MPMediaQuery, if I wanted to fetch every album by a given artist, I'd apply an MPMediaPropertyPredicate looking at MPMediaItemPropertyAlbumArtist and compare the string. It was also easy to change the MPMediaPredicateComparison to .contains to match more widely. If I wanted to surface albums by "Aesop Rock" or "Aesop Rock & Blockhead," I could use that.

In the MusicLibraryRequest implementation, it looks like I need to perform a MusicLibraryRequest<Artist> first in order to get the Artist objects. There's no filter for the name property, so if I don't have their IDs, I've got to use filter(text:). From there, I can take the results of that request and apply them to my MusicLibraryRequest<Album> using the filter(matching:memberOf:) function. I could use filter(text:) on the MusicLibraryRequest<Album>, but that filters across multiple properties (title and artistName?) and is less precise than defining the actual property I want to match against.

I think my ideal version of the MusicLibraryRequest API would offer something like filter(matching:equalTo:) or filter(matching:contains:) that worked off of KeyPaths rather than relationships. That seems more intuitive to me. I'm not saying we need every property from every filterable MPMediaItemProperty key, but I'd love to be able to do it on title, artistName, and other common metadata. That might look something like:

    filter(matching: \.title, contains: "Abbey Road")
    filter(matching: \.artistName, equalTo: "Between The Buried And Me")

I noticed that filter(text:) is case insensitive, which is awesome, and something I've wanted for a long time in MPMediaPropertyPredicate. As a bonus, it would be great if a KeyPath-based filter API supported a case sensitivity flag. This is less of a problem when dealing with Apple Music catalog content, but users' libraries are a harsh environment, and you might have an artist "Between The Buried And Me" and one called "Between the Buried and Me." It would be great to get albums from both with something like:

    filter(matching: \.artistName, equalTo: "Between The Buried And Me", caseSensitive: false)

I've submitted the above as FB10185685. I also submitted another feedback this morning regarding filter(text:) and repeating text as FB10184823.

My last wishlist item for this API (for the time being!) is exposing the MPMediaItemPropertyAlbumPersistentID as an available filter attribute. I know, I know… hear me out. If you take a look at the other thread I made today, you'll see that due to missing metadata in MusicKit, I still have some use cases where I need to be able to reference an MPMediaItem and might need to fetch its containing MPMediaItemCollection to get at other tracks on the album. It would be nice to seamlessly be able to fetch the MPMediaItemCollection or the library Album using a shared identifier, especially when it comes to being able to play the album in MusicKit's player rather than Media Player's. I've submitted that last bit as FB10185789.

Thanks for bearing with my walls of text today. Keep up the great work!
11 replies · 1 boost · 3.2k views · Jun ’22
Issue setting a queue with library and non-library items at the same time (plus a couple more MusicKit issues)
As the summer continues, I have been diving deeper and deeper into MusicKit, largely with great results. A few issues have arisen that I've outlined here; feedbacks are already filed and the numbers are included below. All of this happens on the latest developer beta and latest Xcode beta. Thanks!

FB10967343 - Setting the queue with library and non-library items at the same time doesn't work correctly

In my app, I am working on a feature that lets a user shuffle songs from a collection of albums that may or may not be in their library. However, I've discovered an issue where the queue does not seem to work correctly when mixing these types. I've attempted to load ApplicationMusicPlayer by creating a Queue, and to load applicationQueuePlayer using an MPMusicPlayerPlayParametersQueueDescriptor, but the same issue occurs each time. The queue is able to play songs from the same source, but if it's been playing a library song and tries to move to a non-library song, the queue stops.

The first thing I do is pick random songs from each album, using a MusicLibraryRequest or a MusicCatalogResourceRequest as appropriate, then taking a randomElement() from the ensuing MusicItemCollection for the album. I append each track to an array, which I then cast to MusicItemCollection, so I've now got a MusicItemCollection consisting of the tracks I want. If I'm in MusicKit land, I simply set the queue as follows:

    player.queue = ApplicationMusicPlayer.Queue(for: tracks)

It takes a bit more doing in MediaPlayer, but in theory this should also work, right?

    do {
        let paramObjects = tracks.compactMap {
            $0.playParameters
        }
        let params = try paramObjects.map({ try JSONEncoder().encode($0) })
        let dicts = try params.compactMap {
            try JSONSerialization.jsonObject(with: $0, options: []) as? [String: Any]
        }
        let finalParams = dicts.compactMap {
            MPMusicPlayerPlayParameters(dictionary: $0)
        }
        let descriptor = MPMusicPlayerPlayParametersQueueDescriptor(playParametersQueue: finalParams)
        mediaPlayer.setQueue(with: descriptor)
    } catch {
        print(error)
    }

In either case, the following issue occurs: say that I end up with a queue made up of one library song, then one non-library song. The player will play just the first song, then it acts as if the queue has ended. Say that it has two non-library songs, then one library song. Just the two non-library songs play. Indeed, printing queue.entries shows just the number of items that were from the same source type.

FB10967076 - Publishing changes from background thread error when inserting queue items

When using the .insert method on ApplicationMusicPlayer.Queue on the latest iOS 16 and Xcode betas, it returns a "Publishing changes from background thread" error, even though the function I'm doing it in is marked as a @MainActor and the stack trace indicates it was on the main thread.

FB10967277 - song.with([.albums], preferredSource: .library) generates thousands of lines of EntityQueries in the console

I've noticed that using preferredSource: .library when requesting additional properties on a library item creates ~6,000 "EntityQuery" entries in the console, all in the span of a second. This doesn't seem to be leading to any major performance issues, but it sure seems like something isn't right.

    let request = MusicLibraryRequest<Song>.init()
    do {
        let response = try await request.response()
        guard let song = response.items.first else { return }
        let songWithAlbums = try await song.with([.albums], preferredSource: .library)
    } catch {
        print(error)
    }

generates the following output (except... 6,000 of them):

    2022-07-31 13:02:07.729003-0400 MusicKitFutzing[9405:2192606] [EntityQuery] Finished fetching results in 0s
    2022-07-31 13:02:07.729047-0400 MusicKitFutzing[9405:2192605] [EntityQuery] Finished executing query in 0.00100017s
    2022-07-31 13:02:07.729202-0400 MusicKitFutzing[9405:2192611] [EntityQuery] Finished executing query in 0s
    2022-07-31 13:02:07.729240-0400 MusicKitFutzing[9405:2192605] [EntityQuery] Finished fetching results in 0s
1 reply · 1 boost · 1.4k views · Jul ’22
NSCocoaErrorDomain code 134093, only in iOS 16
I have received a lot of crash logs, only on iOS 16. The crash occurs when I call:

    [[PHImageManager defaultManager] requestImageDataForAsset:asset options:options resultHandler:resultHandler]

Here is the crash log:

    Exception Type: NSInternalInconsistencyException
    ExtraInfo:
    Code Type: arm64
    OS Version: iPhone OS 16.0 (20A5328h)
    Hardware Model: iPhone14,3
    Launch Time: 2022-07-30 18:43:25
    Date/Time: 2022-07-30 18:49:17
    *** Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: Unhandled error (NSCocoaErrorDomain, 134093) occurred during faulting and was thrown: Error Domain=NSCocoaErrorDomain Code=134093 "(null)"
    Last Exception Backtrace:
    0 CoreFoundation 0x00000001cf985dc4 0x1cf97c000 + 40388
    1 libobjc.A.dylib 0x00000001c8ddfa68 0x1c8dc8000 + 96872
    2 CoreData 0x00000001d56d2358 0x1d56cc000 + 25432
    3 CoreData 0x00000001d56fa19c 0x1d56cc000 + 188828
    4 CoreData 0x00000001d5755be4 0x1d56cc000 + 564196
    5 CoreData 0x00000001d57b0508 0x1d56cc000 + 935176
    6 PhotoLibraryServices 0x00000001df1783e0 0x1df0ed000 + 570336
    7 Photos 0x00000001df8aa88c 0x1df85d000 + 317580
    8 PhotoLibraryServices 0x00000001df291de0 0x1df0ed000 + 1723872
    9 CoreData 0x00000001d574e518 0x1d56cc000 + 533784
    10 libdispatch.dylib 0x00000001d51fc0fc 0x1d51f8000 + 16636
    11 libdispatch.dylib 0x00000001d520b634 0x1d51f8000 + 79412
    12 CoreData 0x00000001d574e0a0 0x1d56cc000 + 532640
    13 PhotoLibraryServices 0x00000001df291d94 0x1df0ed000 + 1723796
    14 PhotoLibraryServices 0x00000001df291434 0x1df0ed000 + 1721396
    15 Photos 0x00000001df8a8380 0x1df85d000 + 308096
    16 Photos 0x00000001df89d050 0x1df85d000 + 262224
    17 Photos 0x00000001df87f62c 0x1df85d000 + 140844
    18 Photos 0x00000001df87ee94 0x1df85d000 + 138900
    19 Photos 0x00000001df87e594 0x1df85d000 + 136596
    20 Photos 0x00000001df86b5c8 0x1df85d000 + 58824
    21 Photos 0x00000001df86d938 0x1df85d000 + 67896
    22 Photos 0x00000001dfa37a64 0x1df85d000 + 1944164
    23 Photos 0x00000001dfa37d18 0x1df85d000 + 1944856
    24 youavideo -[YouaImageManager requestImageDataForAsset:options:resultHandler:] (in youavideo) (YouaImageManager.m:0) 27
    25 youavideo -[YouaAlbumTransDataController requstTransImageHandler:] (in youavideo) (YouaAlbumTransDataController.m:0) 27
    26 youavideo -[YouaAlbumTransDataController requstTransWithHandler:] (in youavideo) (YouaAlbumTransDataController.m:77) 11
    27 youavideo -[YouaUploadTransDataOperation startTrans] (in youavideo) (YouaUploadTransDataOperation.m:102) 19
    28 Foundation 0x00000001c9e78038 0x1c9e3c000 + 245816
    29 Foundation 0x00000001c9e7d704 0x1c9e3c000 + 268036
    30 libdispatch.dylib 0x00000001d51fa5d4 0x1d51f8000 + 9684
    31 libdispatch.dylib 0x00000001d51fc0fc 0x1d51f8000 + 16636
    32 libdispatch.dylib 0x00000001d51ff58c 0x1d51f8000 + 30092
    33 libdispatch.dylib 0x00000001d51febf4 0x1d51f8000 + 27636
    34 libdispatch.dylib 0x00000001d520db2c 0x1d51f8000 + 88876
    35 libdispatch.dylib 0x00000001d520e338 0x1d51f8000 + 90936
    36 libsystem_pthread.dylib 0x00000002544b9dbc 0x2544b9000 + 3516
    37 libsystem_pthread.dylib 0x00000002544b9b98 0x2544b9000 + 2968

I can't find the definition of error code 134093, and I don't know what's going wrong in iOS 16. Would anyone have a hint of why this could happen and how to resolve it? Thanks very much.
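A hedged mitigation sketch (the method shown above was deprecated in iOS 13, and whether switching actually avoids this particular iOS 16 faulting crash is an assumption to verify): use the modern requestImageDataAndOrientation with network access allowed and explicit error handling.

```swift
import Photos

// Sketch: modern replacement for requestImageDataForAsset:options:resultHandler:.
func requestImageData(for asset: PHAsset) {
    let options = PHImageRequestOptions()
    options.isNetworkAccessAllowed = true  // allow fetching iCloud originals
    PHImageManager.default().requestImageDataAndOrientation(for: asset,
                                                            options: options) { data, uti, orientation, info in
        if let error = info?[PHImageErrorKey] as? NSError {
            print("Image request failed: \(error)")  // handle instead of crashing
            return
        }
        guard let data = data else { return }
        print("Got \(data.count) bytes, UTI: \(uti ?? "unknown")")
    }
}
```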
7 replies · 5 boosts · 3.4k views · Aug ’22