Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.

MPMusicPlayerController queue information
Is there a way to get the current queue items from an MPMusicPlayerController? I need to know when the items I've set on the queue have finished playing completely, but I can't find any way to do this. I am not using MusicKit; I set the queue via play parameters. From what I can tell so far, after the queue finishes playing, the player pauses and resets to the first item in the queue. So even after playback is done, there is no way to know that it finished on its own.
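As far as I can tell there is no public getter for the queue items themselves, but one possible heuristic, based on the reset-to-first-item behavior described above (a sketch, not a confirmed API answer; the end-of-queue inference is an assumption):

import MediaPlayer

let player = MPMusicPlayerController.applicationMusicPlayer
player.beginGeneratingPlaybackNotifications()

NotificationCenter.default.addObserver(
    forName: .MPMusicPlayerControllerPlaybackStateDidChange,
    object: player,
    queue: .main
) { _ in
    // Assumption (from the behavior described above): a natural end of the
    // queue shows up as a pause with the player reset to the first item.
    // This is a heuristic; it cannot distinguish a user pausing on item 0.
    if player.playbackState == .paused, player.indexOfNowPlayingItem == 0 {
        print("Queue appears to have finished playing on its own")
    }
}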
2 replies · 0 boosts · 922 views · Nov ’21
Background audio issues with AVPictureInPictureController
I know that if you want background audio from AVPlayer, you need to detach your AVPlayer from either your AVPlayerViewController or your AVPlayerLayer, in addition to having your AVAudioSession configured correctly. I have all that squared away, and background audio is fine until we introduce AVPictureInPictureController or use the PiP behavior built into AVPlayerViewController. If you want PiP to behave as expected when you background your app by switching to another app or going to the Home Screen, you can't perform the detachment, otherwise the PiP display fails. On an iPad, if PiP is active and you lock the device, background audio playback continues. On an iPhone, however, if PiP is active and you lock the device, the audio pauses. If PiP is inactive and you lock the device, the audio pauses and you have to manually tap play on the Lock Screen controls; this behavior is the same on iPad and iPhone. My questions are:
Is there a way to keep background audio playing when PiP is inactive and the device is locked? (iPhone and iPad)
Is there a way to keep background audio playing when PiP is active and the device is locked? (iPhone)
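For context, the detachment the post refers to is typically done around app-lifecycle notifications; a minimal sketch, assuming a container object that owns the player and layer (the class name is mine). The conflict described above is exactly that this detachment can't be performed when PiP is in use:

import AVFoundation
import UIKit

final class PlayerContainer {
    let player = AVPlayer()
    let playerLayer = AVPlayerLayer()

    init() {
        playerLayer.player = player
        // Detach the layer on backgrounding so audio keeps playing;
        // reattach on foregrounding. As described above, doing this
        // breaks PiP, which needs the layer to stay attached.
        NotificationCenter.default.addObserver(
            forName: UIApplication.didEnterBackgroundNotification,
            object: nil, queue: .main
        ) { [weak self] _ in
            self?.playerLayer.player = nil
        }
        NotificationCenter.default.addObserver(
            forName: UIApplication.willEnterForegroundNotification,
            object: nil, queue: .main
        ) { [weak self] _ in
            self?.playerLayer.player = self?.player
        }
    }
}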
1 reply · 0 boosts · 1.2k views · Dec ’21
AVCaptureVideoDataOutput video-range pixel values exceed the documented range
CVPixelBuffer.h defines:

kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange = '420v',
    /* Bi-Planar Component Y'CbCr 8-bit 4:2:0, video-range (luma=[16,235] chroma=[16,240]).
       baseAddr points to a big-endian CVPlanarPixelBufferInfo_YCbCrBiPlanar struct */
kCVPixelFormatType_420YpCbCr10BiPlanarVideoRange = 'x420',
    /* 2 plane YCbCr10 4:2:0, each 10 bits in the MSBs of 16bits,
       video-range (luma=[64,940] chroma=[64,960]) */

But when I set the camera output to either of these formats, I find that the output pixel buffer's values exceed the documented range: I see [0, 255] for 420YpCbCr8BiPlanarVideoRange and [0, 1023] for 420YpCbCr10BiPlanarVideoRange. Is this a bug, or is something wrong with my output? If it is not a bug, how can I choose the correct matrix to convert the YUV data to RGB?
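On the matrix question, one approach is to read the colorimetry the buffer itself carries rather than assuming it from the pixel format; a sketch (the helper name is mine, and this addresses only matrix selection, not the out-of-range values):

import CoreVideo

func yCbCrMatrix(for pixelBuffer: CVPixelBuffer) -> String? {
    // Buffers carry their own colorimetry as attachments; read it instead
    // of assuming a matrix from the pixel format alone.
    guard let value = CVBufferGetAttachment(pixelBuffer,
                                            kCVImageBufferYCbCrMatrixKey,
                                            nil)?.takeUnretainedValue() else {
        return nil
    }
    // Typical values: kCVImageBufferYCbCrMatrix_ITU_R_601_4 or
    // kCVImageBufferYCbCrMatrix_ITU_R_709_2.
    return value as? String
}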
1 reply · 0 boosts · 1.1k views · Dec ’21
How to obtain an AVAudioFormat for a canonical format?
I receive a buffer from [AVSpeechSynthesizer convertToBuffer:fromBuffer:] and want to schedule it on an AVAudioPlayerNode. The player node's output format needs to be something the next node can handle, and as far as I understand most nodes can handle a canonical format. The format provided by AVSpeechSynthesizer is not something that AVAudioMixerNode supports. So the following:

AVAudioEngine *engine = [[AVAudioEngine alloc] init];
playerNode = [[AVAudioPlayerNode alloc] init];
AVAudioFormat *format = [[AVAudioFormat alloc] initWithSettings:utterance.voice.audioFileSettings];
[engine attachNode:self.playerNode];
[engine connect:self.playerNode to:engine.mainMixerNode format:format];

throws an exception:

Thread 1: "[[busArray objectAtIndexedSubscript:(NSUInteger)element] setFormat:format error:&nsErr]: returned false, error Error Domain=NSOSStatusErrorDomain Code=-10868 \"(null)\""

I am looking for a way to obtain the canonical format for the platform so that I can use AVAudioConverter to convert the buffer. Since different platforms have different canonical formats, I imagine there should be some library way of doing this; otherwise each developer has to redefine it for each platform the code runs on (macOS, iOS, etc.) and keep it updated when it changes. I could not find any constant or function that produces such a format, ASBD, or settings. The smartest way I could think of, which does not work:

AudioStreamBasicDescription toDesc;
FillOutASBDForLPCM(toDesc, [AVAudioSession sharedInstance].sampleRate,
                   2, 16, 16, kAudioFormatFlagIsFloat, kAudioFormatFlagsNativeEndian);
AVAudioFormat *toFormat = [[AVAudioFormat alloc] initWithStreamDescription:&toDesc];

Even the provided example for iPhone, in the documentation linked above, uses kAudioFormatFlagsAudioUnitCanonical and AudioUnitSampleType, which are deprecated. So what is the correct way to do this?
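For what it's worth, AVAudioFormat has a standard-format initializer that yields the engine's canonical format (deinterleaved 32-bit float). A minimal Swift sketch of converting with it via AVAudioConverter; the helper name is mine, and it keeps the source sample rate, so no rate conversion is involved:

import AVFoundation

// Hypothetical helper: converts a synthesizer buffer to the platform's
// "standard" (canonical) format, which the mixer accepts.
func convertToCanonical(_ buffer: AVAudioPCMBuffer) -> AVAudioPCMBuffer? {
    guard let canonical = AVAudioFormat(standardFormatWithSampleRate: buffer.format.sampleRate,
                                        channels: buffer.format.channelCount),
          let converter = AVAudioConverter(from: buffer.format, to: canonical),
          let out = AVAudioPCMBuffer(pcmFormat: canonical, frameCapacity: buffer.frameLength) else {
        return nil
    }
    var error: NSError?
    var consumed = false
    converter.convert(to: out, error: &error) { _, outStatus in
        // Feed the single input buffer once, then report end of stream.
        if consumed {
            outStatus.pointee = .endOfStream
            return nil
        }
        consumed = true
        outStatus.pointee = .haveData
        return buffer
    }
    return error == nil ? out : nil
}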
4 replies · 0 boosts · 1.8k views · Feb ’22
Low playback volume when using kAudioUnitSubType_VoiceProcessingIO audio unit
I’m developing a voice communication app for the iPad, with both playback and record, using an AudioUnit of type kAudioUnitSubType_VoiceProcessingIO for echo cancellation. When playing audio before initializing the recording audio unit, the volume is high. But if I play audio after initializing the audio unit, or when switching to RemoteIO and then back to VPIO, the playback volume is low. It seems like a bug in iOS; is there any solution or workaround? Searching the net, I only found this post, without any solution: https://developer.apple.com/forums/thread/671836
1 reply · 2 boosts · 1.9k views · Feb ’22
How can I play Apple Music Live Radio Station?
Hello, I'm using systemMusicPlayer to play an Apple Music Live Radio Station obtained from the Apple Music API, but it doesn't work. How can I do this?

Error:

Test[46751:13235249] [SDKPlayback] Failed to prepareToPlay error: Error Domain=MPMusicPlayerControllerErrorDomain Code=6 "Failed to prepare to play" UserInfo={NSDebugDescription=Failed to prepare to play}

My implementation:

let musicPlayerController = MPMusicPlayerController.systemMusicPlayer
musicPlayerController.beginGeneratingPlaybackNotifications()
musicPlayerController.setQueue(with: "ra.978194965")
musicPlayerController.play()

API response:

{
  "id": "ra.978194965",
  "type": "stations",
  "href": "/v1/catalog/us/stations/ra.978194965",
  "attributes": {
    "artwork": {
      "width": 4320,
      "url": "https://is2-ssl.mzstatic.com/image/thumb/Features114/v4/e5/10/76/e5107683-9e51-ebc5-3901-d8fbd65f2c2a/source/{w}x{h}sr.jpeg",
      "height": 1080,
      "textColor3": "332628",
      "textColor2": "120509",
      "textColor4": "33272a",
      "textColor1": "000000",
      "bgColor": "f4f4f4",
      "hasP3": false
    },
    "url": "https://music.apple.com/us/station/apple-music-1/ra.978194965",
    "mediaKind": "audio",
    "supportedDrms": ["fairplay", "playready", "widevine"],
    "requiresSubscription": false,
    "name": "Apple Music 1",
    "kind": "streaming",
    "radioUrl": "itsradio://music.apple.com/us/station/ra.978194965",
    "playParams": {
      "id": "ra.978194965",
      "kind": "radioStation",
      "format": "stream",
      "stationHash": "CgkIBRoFlaS40gMQBA",
      "mediaType": 0
    },
    "editorialNotes": {
      "name": "Apple Music 1",
      "short": "The new music that matters.",
      "tagline": "The new music that matters."
    },
    "isLive": true
  }
}

Thank you! Best regards, MichaelNg
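One avenue worth trying (a sketch, not a confirmed fix: setQueue(with:) store IDs may simply not support stations, so this builds an MPMusicPlayerPlayParameters from the playParams object in the API response instead; whether the system player accepts radio-station play parameters at all is an open question):

import MediaPlayer

// The "playParams" object from the API response above, as raw JSON.
let playParamsJSON = """
{"id": "ra.978194965", "kind": "radioStation", "format": "stream", "stationHash": "CgkIBRoFlaS40gMQBA", "mediaType": 0}
""".data(using: .utf8)!

do {
    if let dict = try JSONSerialization.jsonObject(with: playParamsJSON) as? [String: Any],
       let params = MPMusicPlayerPlayParameters(dictionary: dict) {
        let descriptor = MPMusicPlayerPlayParametersQueueDescriptor(playParametersQueue: [params])
        let player = MPMusicPlayerController.systemMusicPlayer
        player.setQueue(with: descriptor)
        player.play()
    }
} catch {
    print(error)
}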
2 replies · 0 boosts · 2.0k views · Feb ’22
MusicKit Dev Token Refresh
I am trying to follow along with this guide in order to auto-generate my developer token on requests to MusicKit, but I am getting an error about my identifier, which was configured.

Link: https://developer.apple.com/documentation/musickit/using-automatic-token-generation-for-apple-music-api

Error:

2022-04-26 14:12:06.353589-0400 [6885:431407] [DataRequesting] Failed retrieving developer token: Error Domain=ICErrorDomain Code=-8200 "Media API Token Service responded with status code: Not Found (404). This suggests that "<set_bundle_ID>" was likely not registered as a valid client identifier." UserInfo={NSDebugDescription=Media API Token Service responded with status code: Not Found (404). This suggests that "<set_bundle_ID>" was likely not registered as a valid client identifier., NSUnderlyingError=0x2827669a0 {Error Domain=AMSErrorDomain Code=301 "Invalid Status Code" UserInfo={NSLocalizedDescription=Invalid Status Code, AMSURL=https://sf-api-token-service.itunes.apple.com/apiToken?REDACTED, AMSStatusCode=404, AMSServerPayload={
    message = "Client not found";
    status = 40402;
}, NSLocalizedFailureReason=The response has an invalid status code}}}. Throwing .developerTokenRequestFailed.

error getting token
4 replies · 0 boosts · 1.7k views · Apr ’22
How to set the Portrait effect on/off in live camera view in AVFoundation in Swift?
I am using AVFoundation for a live camera view. I can get the device from the current video input (of type AVCaptureDeviceInput) like this:

let device = videoInput.device

The device's activeFormat has isPortraitEffectSupported. How can I toggle the Portrait Effect on and off in the live camera view? I set up the camera like this:

private var videoInput: AVCaptureDeviceInput!
private let session = AVCaptureSession()
private(set) var isSessionRunning = false
private var renderingEnabled = true
private let videoDataOutput = AVCaptureVideoDataOutput()
private let photoOutput = AVCapturePhotoOutput()
private(set) var cameraPosition: AVCaptureDevice.Position = .front

func configureSession() {
    sessionQueue.async { [weak self] in
        guard let strongSelf = self else { return }
        if strongSelf.setupResult != .success {
            return
        }
        let defaultVideoDevice: AVCaptureDevice? = strongSelf.videoDeviceDiscoverySession.devices.first(where: { $0.position == strongSelf.cameraPosition })
        guard let videoDevice = defaultVideoDevice else {
            print("Could not find any video device")
            strongSelf.setupResult = .configurationFailed
            return
        }
        do {
            strongSelf.videoInput = try AVCaptureDeviceInput(device: videoDevice)
        } catch {
            print("Could not create video device input: \(error)")
            strongSelf.setupResult = .configurationFailed
            return
        }
        strongSelf.session.beginConfiguration()
        strongSelf.session.sessionPreset = AVCaptureSession.Preset.photo
        // Add a video input.
        guard strongSelf.session.canAddInput(strongSelf.videoInput) else {
            print("Could not add video device input to the session")
            strongSelf.setupResult = .configurationFailed
            strongSelf.session.commitConfiguration()
            return
        }
        strongSelf.session.addInput(strongSelf.videoInput)
        // Add a video data output.
        if strongSelf.session.canAddOutput(strongSelf.videoDataOutput) {
            strongSelf.session.addOutput(strongSelf.videoDataOutput)
            strongSelf.videoDataOutput.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32BGRA)]
            strongSelf.videoDataOutput.setSampleBufferDelegate(self, queue: strongSelf.dataOutputQueue)
        } else {
            print("Could not add video data output to the session")
            strongSelf.setupResult = .configurationFailed
            strongSelf.session.commitConfiguration()
            return
        }
        // Add a photo output.
        if strongSelf.session.canAddOutput(strongSelf.photoOutput) {
            strongSelf.session.addOutput(strongSelf.photoOutput)
            strongSelf.photoOutput.isHighResolutionCaptureEnabled = true
        } else {
            print("Could not add photo output to the session")
            strongSelf.setupResult = .configurationFailed
            strongSelf.session.commitConfiguration()
            return
        }
        strongSelf.session.commitConfiguration()
    }
}

func prepareSession(completion: @escaping (SessionSetupResult) -> Void) {
    sessionQueue.async { [weak self] in
        guard let strongSelf = self else { return }
        switch strongSelf.setupResult {
        case .success:
            strongSelf.addObservers()
            if strongSelf.photoOutput.isDepthDataDeliverySupported {
                strongSelf.photoOutput.isDepthDataDeliveryEnabled = true
            }
            if let photoOrientation = AVCaptureVideoOrientation(interfaceOrientation: interfaceOrientation) {
                if let unwrappedPhotoOutputConnection = strongSelf.photoOutput.connection(with: .video) {
                    unwrappedPhotoOutputConnection.videoOrientation = photoOrientation
                }
            }
            strongSelf.dataOutputQueue.async {
                strongSelf.renderingEnabled = true
            }
            strongSelf.session.startRunning()
            strongSelf.isSessionRunning = strongSelf.session.isRunning
            strongSelf.mainQueue.async {
                strongSelf.previewView.videoPreviewLayer.session = strongSelf.session
            }
            completion(strongSelf.setupResult)
        default:
            completion(strongSelf.setupResult)
        }
    }
}

Then I set isPortraitEffectsMatteDeliveryEnabled like this:

func setPortraitAffectActive(_ state: Bool) {
    sessionQueue.async { [weak self] in
        guard let strongSelf = self else { return }
        if strongSelf.photoOutput.isPortraitEffectsMatteDeliverySupported {
            strongSelf.photoOutput.isPortraitEffectsMatteDeliveryEnabled = state
        }
    }
}

However, I don't see any Portrait Effect in the live camera view! Any ideas why?
2 replies · 0 boosts · 1.6k views · May ’22
Thoughts on MusicLibraryRequest as a replacement for MPMediaQuery
I'm very excited about the new MusicLibrary API, but after a couple of days of playing around with it, I have to say that I find the implementation of filtering MusicLibraryRequests a little confusing.

MPMediaQuery has a fairly extensive list of predicates that can be applied, including string and persistentID comparisons for artist, album artist, genre, and more. It also lets you filter on an item’s title. MusicLibraryRequests let you filter on the item’s ID, or on its MusicKit Artist and Genre relationships. To me, this seems like it adds an extra step.

With an MPMediaQuery, if I wanted to fetch every album by a given artist, I’d apply an MPMediaPropertyPredicate looking at MPMediaItemPropertyAlbumArtist and compare the string. It was also easy to change the MPMediaPredicateComparison to .contains to match more widely. If I wanted to surface albums by “Aesop Rock” or “Aesop Rock & Blockhead,” I could use that.

In the MusicLibraryRequest implementation, it looks like I need to perform a MusicLibraryRequest<Artist> first in order to get the Artist objects. There’s no filter for the name property, so if I don’t have their IDs, I’ve got to use filter(text:). From there, I can take the results of that request and apply them to my MusicLibraryRequest<Album> using the filter(matching:memberOf:) function (a sketch of this two-step flow follows at the end of this post). I could use filter(text:) on the MusicLibraryRequest<Album>, but that filters across multiple properties (title and artistName?) and is less precise than defining the actual property I want to match against.

I think my ideal version of the MusicLibraryRequest API would offer something like filter(matching:equalTo:) or filter(matching:contains:) that worked off of KeyPaths rather than relationships. That seems more intuitive to me. I’m not saying we need every property from every filterable MPMediaItemProperty key, but I’d love to be able to do it on title, artistName, and other common metadata. That might look something like:

filter(matching: \.title, contains: “Abbey Road”)
filter(matching: \.artistName, equalTo: “Between The Buried And Me”)

I noticed that filter(text:) is case insensitive, which is awesome, and something I’ve wanted for a long time in MPMediaPropertyPredicate. As a bonus, it would be great if a KeyPath-based filter API supported a case sensitivity flag. This is less of a problem when dealing with Apple Music catalog content, but users’ libraries are a harsh environment, and you might have an artist “Between The Buried And Me” and one called “Between the Buried and Me.” It would be great to get albums from both with something like:

filter(matching: \.artistName, equalTo: “Between The Buried And Me”, caseSensitive: false)

I've submitted the above as FB10185685. I also submitted another feedback this morning regarding filter(text:) and repeating text as FB10184823.

My last wishlist item for this API (for the time being!) is exposing the MPMediaItemPropertyAlbumPersistentID as an available filter attribute. I know, I know… hear me out. If you take a look at the other thread I made today, you’ll see that due to missing metadata in MusicKit, I still have some use cases where I need to be able to reference an MPMediaItem and might need to fetch its containing MPMediaItemCollection to get at other tracks on the album. It would be nice to seamlessly be able to fetch the MPMediaItemCollection or the library Album using a shared identifier, especially when it comes to being able to play the album in MusicKit’s player rather than Media Player’s.
I've submitted that last bit as FB10185789. Thanks for bearing with my walls of text today. Keep up the great work!
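For readers skimming this, here is roughly what the current two-step flow described above looks like. A sketch only: the keypath (\.artists) and the exact argument shapes for filter(matching:memberOf:) are assumptions pieced together from the function names mentioned in this post:

import MusicKit

@available(iOS 16.0, *)
func libraryAlbums(byArtistNamed name: String) async throws -> MusicItemCollection<Album> {
    // Step 1: a broad text search, since there's no name-specific filter.
    var artistRequest = MusicLibraryRequest<Artist>()
    artistRequest.filter(text: name)
    let artists = try await artistRequest.response().items

    // Step 2: feed the resulting Artist objects into the album request.
    // Assumption: \.artists is the filterable relationship referred to above.
    var albumRequest = MusicLibraryRequest<Album>()
    albumRequest.filter(matching: \.artists, memberOf: Array(artists))
    return try await albumRequest.response().items
}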
10 replies · 1 boost · 2.7k views · Jun ’22
Editing a Library Playlist (MusicKit: iOS 16 beta)
I've just begun to dip my toes into the iOS 16 waters. One of the first things I've attempted is to edit a library playlist using:

try await MusicLibrary.shared.edit(targetPlaylist, items: tracksToAdd)

where targetPlaylist is of type MusicItemCollection<MusicKit.Playlist>.Element and tracksToAdd is of type [Track]. The targetPlaylist was created, using the new iOS 16 way, here:

let newPlaylist = try await MusicLibrary.shared.createPlaylist(name: name, description: description)

tracksToAdd is derived by performing a MusicLibraryRequest on a specific playlist ID, and then doing something like this:

if let tracksToAdd = try await playlist.with(.tracks).tracks {
    // add tracks to target playlist
}

My problem is that when I attempt the edit, I am faced with a rather sad-looking crash:

libdispatch.dylib`dispatch_group_leave.cold.1:
    0x10b43d62c <+0>:  mov    x8, #0x0
    0x10b43d630 <+4>:  stp    x20, x21, [sp, #-0x10]!
    0x10b43d634 <+8>:  adrp   x20, 6
    0x10b43d638 <+12>: add    x20, x20, #0xfbf          ; "BUG IN CLIENT OF LIBDISPATCH: Unbalanced call to dispatch_group_leave()"
    0x10b43d63c <+16>: adrp   x21, 40
    0x10b43d640 <+20>: add    x21, x21, #0x260          ; gCRAnnotations
    0x10b43d644 <+24>: str    x20, [x21, #0x8]
    0x10b43d648 <+28>: str    x8, [x21, #0x38]
    0x10b43d64c <+32>: ldp    x20, x21, [sp], #0x10
->  0x10b43d650 <+36>: brk    #0x1

I assume that I must be doing something wrong, but I frankly have no idea how to troubleshoot this. Any help would be most appreciated. Thanks. @david-apple?
10 replies · 0 boosts · 2.6k views · Jun ’22
MusicLibraryRequest to get all tracks from a playlist (iOS 16 beta)
Hi there,

tl;dr: What's the best way to get all tracks (with catalog IDs) from a playlist that has more than 100 tracks, using MusicLibraryRequest? I'm doing something dumb, not understanding something, or possibly both.

I've got an existing, kinda-okay function that uses MusicDataRequest and the Apple Music API to fetch all tracks from a playlist, with pagination, like this:

func getTracksFromAppleMusicLibraryPlaylist(playlist: AppleMusicPlaylist) async throws -> [MusicKit.Track]? {
    var tracksToReturn: [MusicKit.Track] = []
    var libraryTracks: [AppleMusicLibraryTrack] = []

    @Sendable
    func fetchTracks(playlist: AppleMusicPlaylist, offset: Int) async -> AppleMusicPlaylistFetchResponse? {
        do {
            let playlistId = playlist.id
            var playlistRequestURLComponents = URLComponents()
            playlistRequestURLComponents.scheme = "https"
            playlistRequestURLComponents.host = "api.music.apple.com"
            playlistRequestURLComponents.path = "/v1/me/library/playlists/\(playlistId)/tracks"
            playlistRequestURLComponents.queryItems = [
                URLQueryItem(name: "include", value: "catalog"),
                URLQueryItem(name: "limit", value: "100"),
                URLQueryItem(name: "offset", value: String(offset)),
            ]
            if let playlistRequestURL = playlistRequestURLComponents.url {
                let playlistRequest = MusicDataRequest(urlRequest: URLRequest(url: playlistRequestURL))
                let playlistResponse = try await playlistRequest.response()
                let decoder = JSONDecoder()
                // print("Get Tracks Dump")
                // print(String(data: playlistResponse.data, encoding: .utf8)!)
                let response = try decoder.decode(AppleMusicPlaylistFetchResponse.self, from: playlistResponse.data)
                return response
            } else {
                print("Bad URL!")
            }
        } catch {
            print(error)
        }
        return nil
    }

    Logger.log(.info, "Fetching initial tracks from \(playlist.attributes.name)")
    if let response = await fetchTracks(playlist: playlist, offset: 0) {
        if let items = response.data {
            libraryTracks = items
        }
        if let totalItemCount = response.meta?.total {
            Logger.log(.info, "There are \(totalItemCount) track(s) in \(playlist.attributes.name)")
            if totalItemCount > 100 {
                let remainingItems = (totalItemCount - 100)
                let calls = remainingItems <= 100 ? 1 : (totalItemCount - 100) / 100
                Logger.log(.info, "Total items: \(totalItemCount)")
                Logger.log(.info, "Remaining items: \(remainingItems)")
                Logger.log(.info, "Calls: \(calls)")
                await withTaskGroup(of: [AppleMusicLibraryTrack]?.self) { group in
                    for offset in stride(from: 100, to: calls * 100, by: 100) {
                        Logger.log(.info, "Fetching additional tracks from \(playlist.attributes.name) with offset of \(offset)")
                        group.addTask {
                            if let response = await fetchTracks(playlist: playlist, offset: offset) {
                                if let items = response.data {
                                    return items
                                }
                            }
                            return nil
                        }
                    }
                    for await fetchedTracks in group {
                        if let tracks = fetchedTracks {
                            libraryTracks.append(contentsOf: tracks)
                        }
                    }
                }
            }
        }
    }

    // props to @JoeKun for this bit of magic
    Logger.log(.info, "Matching library playlist tracks with catalog tracks...")
    for (i, track) in libraryTracks.enumerated() {
        if let correspondingCatalogTrack = track.relationships?.catalog?.first {
            tracksToReturn.append(correspondingCatalogTrack)
            print("\(i) => \(track.id) corresponds to catalog track with ID: \(correspondingCatalogTrack.id).")
        } else {
            Logger.log(.warning, "\(i) => \(track.id) doesn't have any corresponding catalog track.")
        }
    }
    if tracksToReturn.count == 0 {
        return nil
    }
    return tracksToReturn
}

While not the most elegant, it gets the job done, and it's kinda quick due to the use of withTaskGroup, especially with playlists containing more than 100 songs/tracks. Regardless, I'm kinda stuck trying to do something similar with the new MusicLibraryRequest in iOS 16. The only way I can think of to get tracks from a playlist using MusicLibraryRequest, having read the new docs, is like this:

@available(iOS 16.0, *)
func getAllTracksFromHugePlaylist(id: MusicItemID) async throws -> [MusicKit.Track]? {
    do {
        var request = MusicLibraryRequest<MusicKit.Playlist>()
        request.filter(matching: \.id, equalTo: id)
        let response = try await request.response()
        if response.items.count > 0 {
            if let tracks = try await response.items[0].with(.tracks, preferredSource: .catalog).tracks {
                Logger.log(.info, "Playlist track count: \(tracks.count)")
                return tracks.compactMap { $0 }
            }
        }
    } catch {
        Logger.log(.error, "Could not: \(error)")
    }
    return nil
}

The problem with this is that .with seems to be capped at 100 songs/tracks, and I can't see any way to change that. Knowing that, I can't seem to tell MusicLibraryRequest that I want the tracks of the playlist with the initial request, where before I could use request.properties = .tracks, which I could then paginate if available. Any help setting me on the right course would be greatly appreciated.
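One avenue that may help (a hedged sketch: MusicItemCollection supports batched fetching via hasNextBatch and nextBatch(), though whether that pages past the 100-track cap of .with on library playlists is something to verify):

@available(iOS 16.0, *)
func getAllTracks(from playlist: MusicKit.Playlist) async throws -> [MusicKit.Track] {
    // Load the first page of tracks, then keep requesting the next batch
    // until the collection reports there is nothing left.
    let detailedPlaylist = try await playlist.with(.tracks)
    guard var currentBatch = detailedPlaylist.tracks else { return [] }
    var allTracks = Array(currentBatch)
    while currentBatch.hasNextBatch {
        guard let nextBatch = try await currentBatch.nextBatch() else { break }
        allTracks.append(contentsOf: nextBatch)
        currentBatch = nextBatch
    }
    return allTracks
}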
8 replies · 0 boosts · 2.4k views · Jun ’22
Grabbing arbitrary images in FxPlug 4 API
Hi, We're updating our plugins to offer native arm64 support in Final Cut Pro X and are thus porting our existing code over to the FxPlug 4 API. Here's where we're running into issues: our plugin features a custom parameter (essentially a push button) that opens a configuration window on the UI thread. This window is supposed to display the frame at the current timeline cursor position. Since the FxTemporalImageAPI, which we had been using for that purpose, has been deprecated, how could something like that be implemented? We tried adding a hidden int slider parameter that gets incremented once our selector is hit, in order to force a re-render and then grab the current frame in renderDestinationImage:... . However, the render pipeline seems to be stalled until our selector has exited, so our force-render mechanism seems to be ineffective. So how do we reliably filter out and grab the source image at the current time position in FxPlug 4? Thanks! Best, Ray
1 reply · 0 boosts · 1.1k views · Jun ’22
iOS 16: localIdentifier of PHAsset gets changed after saving to camera roll
Environment: iOS 16 beta 2, beta 3. iPhone 11 Pro, 12 mini.

Steps to reproduce:

Subscribe to Photo Library changes via PHPhotoLibraryChangeObserver, and add some logs to track inserted/deleted objects:

func photoLibraryDidChange(_ changeInstance: PHChange) {
    if let changeDetails = changeInstance.changeDetails(for: allPhotosFetchResult) {
        for insertion in changeDetails.insertedObjects {
            print("🥶 INSERTED: ", insertion.localIdentifier)
        }
        for deletion in changeDetails.removedObjects {
            print("🥶 DELETED: ", deletion.localIdentifier)
        }
    }
}

Save a photo to the camera roll with PHAssetCreationRequest.

Go to the Photo Library and delete the newly saved photo.

Come back to the app and watch the logs:

🥶 INSERTED:  903933C3-7B83-4212-8DF1-37C2AD3A923D/L0/001
🥶 DELETED:  39F673E7-C5AC-422C-8BAA-1BF865120BBF/L0/001

Expected result: the localIdentifier of the saved and deleted asset is the same string in both logs. In fact, it's different. So it appears that either the localIdentifier of an asset gets changed after successful saving, or it's a bug in the Photos framework in iOS 16. I've checked: in iOS 15 it works fine (the IDs in the logs match).
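For anyone needing to correlate the saved asset with later change notifications, capturing the placeholder's identifier at creation time may help (a sketch; the function name is mine and imageData is a stand-in for your photo data):

import Photos

func saveAndTrack(imageData: Data) {
    var createdIdentifier: String?
    PHPhotoLibrary.shared().performChanges({
        let creationRequest = PHAssetCreationRequest.forAsset()
        creationRequest.addResource(with: .photo, data: imageData, options: nil)
        // The placeholder already exposes the localIdentifier the new asset
        // is expected to get, which can be compared against the change logs.
        createdIdentifier = creationRequest.placeholderForCreatedAsset?.localIdentifier
    }, completionHandler: { success, error in
        print("Saved: \(success), identifier: \(createdIdentifier ?? "unknown"), error: \(String(describing: error))")
    })
}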
2 replies · 1 boost · 1.4k views · Jul ’22
Issue setting a queue with library and non-library items at the same time (plus a couple more MusicKit issues)
As the summer continues, I have been diving deeper and deeper into MusicKit, largely with great results. A few issues have arisen that I've outlined here, with feedbacks already filed and numbers included. All of this happens on the latest developer beta and latest Xcode beta. Thanks!

FB10967343 - Setting the queue with library and non-library items at the same time doesn't work correctly

In my app, I am working on a feature that lets a user shuffle songs from a collection of albums that may or may not be in their library. However, I've discovered an issue where the queue does not seem to work correctly when mixing these types. I've attempted to load ApplicationMusicPlayer by creating a Queue, and to load applicationQueuePlayer using an MPMusicPlayerPlayParametersQueueDescriptor, but the same issue occurs each time. The queue is able to play songs from the same source, but if it's been playing a library song and tries to move to a non-library song, the queue stops.

The first thing I do is pick random songs from each album, using a MusicLibraryRequest or a MusicCatalogResourceRequest as appropriate, then taking a randomElement() from the ensuing MusicItemCollection for the album. I append each track to an array, which I then cast to MusicItemCollection, so I've now got a MusicItemCollection consisting of the tracks I want. If I'm in MusicKit land, I simply set the queue as follows:

player.queue = ApplicationMusicPlayer.Queue(for: tracks)

It takes a bit more doing in MediaPlayer, but in theory this should also work, right?

do {
    let paramObjects = tracks.compactMap {
        $0.playParameters
    }
    let params = try paramObjects.map({ try JSONEncoder().encode($0) })
    let dicts = try params.compactMap {
        try JSONSerialization.jsonObject(with: $0, options: []) as? [String: Any]
    }
    let finalParams = dicts.compactMap {
        MPMusicPlayerPlayParameters(dictionary: $0)
    }
    let descriptor = MPMusicPlayerPlayParametersQueueDescriptor(playParametersQueue: finalParams)
    mediaPlayer.setQueue(with: descriptor)
} catch {
    print(error)
}

In either case, the following issue occurs: say that I end up with a queue made up of one library song, then one non-library song. The player will play just the first song, then it acts as if the queue has ended. Say that it has two non-library songs, then one library song. Just the two non-library songs play. Indeed, printing queue.entries shows just the number of items that were from the same source type.

FB10967076 - Publishing changes from background thread error when inserting queue items

When using the .insert method on ApplicationMusicPlayer.Queue on the latest iOS 16 and Xcode betas, it returns a "Publishing changes from background thread" error, even though the function I'm doing it in is marked as a @MainActor and the stack trace indicates it was on the main thread.

FB10967277 - song.with([.albums], preferredSource: .library) generates thousands of lines of EntityQueries in the console

I've noticed that using preferredSource: .library when requesting additional properties on a library item creates ~6,000 "EntityQuery" entries in the console, all in the span of a second. This doesn't seem to be leading to any major performance issues, but it sure seems like something isn't right.

let request = MusicLibraryRequest<Song>.init()
do {
    let response = try await request.response()
    guard let song = response.items.first else { return }
    let songWithAlbums = try await song.with([.albums], preferredSource: .library)
} catch {
    print(error)
}

generates the following output (except... 6,000 of them):

2022-07-31 13:02:07.729003-0400 MusicKitFutzing[9405:2192606] [EntityQuery] Finished fetching results in 0s
2022-07-31 13:02:07.729047-0400 MusicKitFutzing[9405:2192605] [EntityQuery] Finished executing query in 0.00100017s
2022-07-31 13:02:07.729202-0400 MusicKitFutzing[9405:2192611] [EntityQuery] Finished executing query in 0s
2022-07-31 13:02:07.729240-0400 MusicKitFutzing[9405:2192605] [EntityQuery] Finished fetching results in 0s
1 reply · 1 boost · 1.1k views · Jul ’22
NSCocoaErrorDomain code 134093, only in iOS 16
I have received a lot of crash logs, only on iOS 16. The crash occurs when I call:

[[PHImageManager defaultManager] requestImageDataForAsset:asset options:options resultHandler:resultHandler]

Here is the crash log:

Exception Type: NSInternalInconsistencyException
ExtraInfo:
Code Type: arm64
OS Version: iPhone OS 16.0 (20A5328h)
Hardware Model: iPhone14,3
Launch Time: 2022-07-30 18:43:25
Date/Time: 2022-07-30 18:49:17
*** Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: Unhandled error (NSCocoaErrorDomain, 134093) occurred during faulting and was thrown: Error Domain=NSCocoaErrorDomain Code=134093 "(null)"
Last Exception Backtrace:
0  CoreFoundation          0x00000001cf985dc4 0x1cf97c000 + 40388
1  libobjc.A.dylib         0x00000001c8ddfa68 0x1c8dc8000 + 96872
2  CoreData                0x00000001d56d2358 0x1d56cc000 + 25432
3  CoreData                0x00000001d56fa19c 0x1d56cc000 + 188828
4  CoreData                0x00000001d5755be4 0x1d56cc000 + 564196
5  CoreData                0x00000001d57b0508 0x1d56cc000 + 935176
6  PhotoLibraryServices    0x00000001df1783e0 0x1df0ed000 + 570336
7  Photos                  0x00000001df8aa88c 0x1df85d000 + 317580
8  PhotoLibraryServices    0x00000001df291de0 0x1df0ed000 + 1723872
9  CoreData                0x00000001d574e518 0x1d56cc000 + 533784
10 libdispatch.dylib       0x00000001d51fc0fc 0x1d51f8000 + 16636
11 libdispatch.dylib       0x00000001d520b634 0x1d51f8000 + 79412
12 CoreData                0x00000001d574e0a0 0x1d56cc000 + 532640
13 PhotoLibraryServices    0x00000001df291d94 0x1df0ed000 + 1723796
14 PhotoLibraryServices    0x00000001df291434 0x1df0ed000 + 1721396
15 Photos                  0x00000001df8a8380 0x1df85d000 + 308096
16 Photos                  0x00000001df89d050 0x1df85d000 + 262224
17 Photos                  0x00000001df87f62c 0x1df85d000 + 140844
18 Photos                  0x00000001df87ee94 0x1df85d000 + 138900
19 Photos                  0x00000001df87e594 0x1df85d000 + 136596
20 Photos                  0x00000001df86b5c8 0x1df85d000 + 58824
21 Photos                  0x00000001df86d938 0x1df85d000 + 67896
22 Photos                  0x00000001dfa37a64 0x1df85d000 + 1944164
23 Photos                  0x00000001dfa37d18 0x1df85d000 + 1944856
24 youavideo               -[YouaImageManager requestImageDataForAsset:options:resultHandler:] (in youavideo) (YouaImageManager.m:0) 27
25 youavideo               -[YouaAlbumTransDataController requstTransImageHandler:] (in youavideo) (YouaAlbumTransDataController.m:0) 27
26 youavideo               -[YouaAlbumTransDataController requstTransWithHandler:] (in youavideo) (YouaAlbumTransDataController.m:77) 11
27 youavideo               -[YouaUploadTransDataOperation startTrans] (in youavideo) (YouaUploadTransDataOperation.m:102) 19
28 Foundation              0x00000001c9e78038 0x1c9e3c000 + 245816
29 Foundation              0x00000001c9e7d704 0x1c9e3c000 + 268036
30 libdispatch.dylib       0x00000001d51fa5d4 0x1d51f8000 + 9684
31 libdispatch.dylib       0x00000001d51fc0fc 0x1d51f8000 + 16636
32 libdispatch.dylib       0x00000001d51ff58c 0x1d51f8000 + 30092
33 libdispatch.dylib       0x00000001d51febf4 0x1d51f8000 + 27636
34 libdispatch.dylib       0x00000001d520db2c 0x1d51f8000 + 88876
35 libdispatch.dylib       0x00000001d520e338 0x1d51f8000 + 90936
36 libsystem_pthread.dylib 0x00000002544b9dbc 0x2544b9000 + 3516
37 libsystem_pthread.dylib 0x00000002544b9b98 0x2544b9000 + 2968

I can't find the definition of error code 134093, and I don't know what's going wrong in iOS 16. Would anyone have a hint of why this could happen and how to resolve it? Thanks very much.
7 replies · 5 boosts · 2.8k views · Aug ’22
`videoChat` AVAudioSession.Mode Issues on iPhone 14 Pro Max
I work on a video conferencing application which makes use of AVAudioEngine and the videoChat AVAudioSession.Mode.

This past Friday, an internal user reported an "audio cutting in and out" issue with their new iPhone 14 Pro, and I was able to reproduce the issue later that day on my iPhone 14 Pro Max. No other iOS devices running iOS 16 are exhibiting this issue.

I have narrowed down the root cause to the videoChat AVAudioSession.Mode after changing line 53 of the ViewController.swift file in Apple's "Using Voice Processing" sample project (https://developer.apple.com/documentation/avfaudio/audio_engine/audio_units/using_voice_processing) from:

try session.setCategory(.playAndRecord, options: .defaultToSpeaker)

to:

try session.setCategory(.playAndRecord, mode: .videoChat, options: .defaultToSpeaker)

This only causes issues on my iPhone 14 Pro Max, not on my iPhone 13 Pro Max, so it seems specific to the new iPhones. I am also seeing the following logged to the console using either device, which appears to be specific to iOS 16, but I am not sure whether it is related to the videoChat issue or not:

2022-09-19 08:23:20.087578-0700 AVEchoTouch[2388:1474002] [as] ATAudioSessionPropertyManager.mm:71  Invalid input size for property 1684431725
2022-09-19 08:23:20.087605-0700 AVEchoTouch[2388:1474002] [as] ATAudioSessionPropertyManager.mm:225  Invalid input size for property 1684431725

I am assuming 1684431725 is 'dfcm', but I am not sure what Audio Session property that might be.
13 replies · 2 boosts · 4k views · Sep ’22