Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.

All subtopics
Posts under Media Technologies topic

Each post below lists its reply count, boost count, view count, and latest activity.
How to detect a song end?
I'm playing library items (MPMediaItem) and Apple Music tracks (Track) in MPMusicPlayerApplicationController.applicationQueuePlayer, but I can't use the actual queue functionality because I can't figure out how to get both media types into the same queue. If there's a way to do that, it would solve my problem, but I've given up on it. Because I can't use a queue, I need to detect when a song ends so that I can put the next song in the queue and play it. The only way I've found is to watch playbackState, and I have that mostly working, but it's ugly, because you also get a paused playbackState when a Bluetooth speaker disconnects, and in other situations. The only answer I've found online is to watch MPMusicPlayerControllerNowPlayingItemDidChange and treat a nil nowPlayingItem as the song ending, but that's not what happens: when a song ends, nowPlayingItem remains the same. There's got to be an answer to this problem, right?
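A sketch of one way to disambiguate the paused state, assuming the playbackState observation described above; the SongEndObserver name and the one-second tolerance are illustrative choices, not an official API:

```swift
import MediaPlayer

// Hypothetical helper: distinguishes a natural track end from an ordinary
// pause by checking how close playback got to the item's duration.
final class SongEndObserver {
    private let player = MPMusicPlayerController.applicationQueuePlayer
    var onSongEnd: (() -> Void)?

    init() {
        player.beginGeneratingPlaybackNotifications()
        NotificationCenter.default.addObserver(
            self,
            selector: #selector(stateChanged),
            name: .MPMusicPlayerControllerPlaybackStateDidChange,
            object: player)
    }

    @objc private func stateChanged(_ note: Notification) {
        guard player.playbackState == .paused || player.playbackState == .stopped,
              let duration = player.nowPlayingItem?.playbackDuration,
              duration > 0 else { return }
        // Treat a pause within ~1 s of the end as a completed song; a pause
        // from a Bluetooth disconnect will normally land mid-track.
        if player.currentPlaybackTime >= duration - 1.0 {
            onSongEnd?()   // enqueue and play the next item here
        }
    }

    deinit { player.endGeneratingPlaybackNotifications() }
}
```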
Replies: 14 · Boosts: 3 · Views: 5.3k · Latest activity: Jan ’25
Playing fMP4 Raw Chunks in AVPlayer on iOS
Hello Apple Community, we are working on a real-time streaming feature where we receive chunks of raw MP4 data through a custom protocol and store them in a buffer (array). Our goal is to use these chunks to play a continuous video stream in AVPlayer.
What we've tried: we implemented a custom URL scheme (customscheme://) to serve the buffered data through AVAssetResourceLoaderDelegate. The shouldWaitForLoadingOfRequestedResource method is called only during the initial allocation and doesn't get triggered when new chunks are appended to the buffer, so despite the buffer growing, AVPlayer never requests further data from the delegate.
What we need: a solution where the player continuously fetches data from the buffer as new chunks are added, playback remains smooth and uninterrupted even with real-time data being appended, and ideally it works with AVPlayer while adhering to HLS-like behavior without implementing an HLS server.
Questions:
1. Is AVAssetResourceLoaderDelegate the right approach for this use case?
2. If so, how can we ensure shouldWaitForLoadingOfRequestedResource is called whenever new data is available in the buffer?
3. Are there alternative APIs or recommended patterns for playing real-time MP4 data chunks in AVPlayer?
4. Would a custom FFmpeg-based player be necessary, or can this be achieved with AVPlayer and its APIs?
We appreciate any guidance, suggestions, or examples. Thank you!
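For reference, the usual shape of the "retain and service later" pattern with AVAssetResourceLoaderDelegate: returning true defers the request, and the delegate answers retained requests as chunks arrive. The buffer layout and appendChunk entry point are assumptions for illustration, and the unknown content length of a live stream remains the fundamental friction with this approach:

```swift
import AVFoundation

final class ChunkResourceLoader: NSObject, AVAssetResourceLoaderDelegate {
    private var buffer = Data()                                  // whole resource from offset 0
    private var pending: [AVAssetResourceLoadingRequest] = []
    private let queue = DispatchQueue(label: "chunk.loader")

    // Hypothetical entry point called by the custom protocol layer.
    func appendChunk(_ chunk: Data) {
        queue.async {
            self.buffer.append(chunk)
            self.servicePending()   // re-answer deferred requests with new data
        }
    }

    func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                        shouldWaitForLoadingOfRequestedResource
                        loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        queue.async {
            self.pending.append(loadingRequest)
            self.servicePending()
        }
        return true   // we will answer asynchronously as data arrives
    }

    private func servicePending() {
        pending.removeAll { request in
            if let info = request.contentInformationRequest {
                info.contentType = AVFileType.mp4.rawValue
                info.isByteRangeAccessSupported = false
                // contentLength is unknown for a live stream; that mismatch is a
                // large part of why AVPlayer stalls with this approach.
            }
            guard let dataRequest = request.dataRequest else { return false }
            let offset = Int(dataRequest.currentOffset)
            guard offset < buffer.count else { return false }   // nothing new yet
            dataRequest.respond(with: buffer.subdata(in: offset..<buffer.count))
            // Finish only once the full requested range has been delivered.
            if dataRequest.currentOffset >= dataRequest.requestedOffset + Int64(dataRequest.requestedLength) {
                request.finishLoading()
                return true
            }
            return false
        }
    }
}
```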
Replies: 0 · Boosts: 1 · Views: 526 · Latest activity: Jan ’25
TypeScript definitions for MusicKit JS
I take it that MusicKit JS is built with TypeScript, based on the attributions in the script: https://js-cdn.music.apple.com/musickit/v3/musickit.js The script points to https://js-cdn.music.apple.com/musickit/v1/acknowledgements.txt (I assume this should be the v3 URL for the v3 version? It returns the same content nonetheless), which contains attributions for TypeScript. Currently there's a third-party effort on DefinitelyTyped, which publishes the npm package @types/musickit-js; the latest supported version is v1, and there is no version compatible with v3. This makes it hard to use MusicKit JS v3 in a TypeScript project. Please publish the types, ideally on the CDN alongside the musickit.js file. Also consider publishing an officially Apple-supported DefinitelyTyped package, or helping to maintain the existing @types/musickit-js, to make consuming this even easier.
Replies: 3 · Boosts: 1 · Views: 139 · Latest activity: Jun ’25
ScreenCaptureKit and mixed Retina/non-Retina configuration
The two ScreenCaptureKit WWDC22 sessions show how to capture with the new framework, but the Retina scale factor is hardcoded to 2 in SCStreamConfiguration. When capturing a non-Retina display, the screen capture floats in the upper-left corner of the image buffer. There does not seem to be a simple way to retrieve the scale factor from the SCShareableContent data when configuring the capture. When processing the streaming output, the SCStreamFrameInfo attachment is supposed to have a scaleFactor property, but .scaleFactor does not return a value. I have found that the attachment dictionary contains SCStreamUpdateFrameDisplayResolution; this entry gives me the scale factor, but it is not an official SCStreamFrameInfo key, and I have to enumerate the dictionary keys to access it. What is the proper way to handle Retina scale factors with ScreenCaptureKit?
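One commonly used workaround is to match SCDisplay.displayID to its NSScreen and read backingScaleFactor from there; a sketch, assuming macOS 13+:

```swift
import AppKit
import ScreenCaptureKit

// Derive the scale factor by matching SCDisplay.displayID to the
// corresponding NSScreen, then size the stream in pixels instead of
// hardcoding a factor of 2.
func makeConfiguration(for display: SCDisplay) -> SCStreamConfiguration {
    let screen = NSScreen.screens.first { screen in
        let key = NSDeviceDescriptionKey("NSScreenNumber")
        return (screen.deviceDescription[key] as? CGDirectDisplayID) == display.displayID
    }
    let scale = screen?.backingScaleFactor ?? 1.0   // 2.0 on Retina, 1.0 otherwise

    let config = SCStreamConfiguration()
    config.width  = Int(CGFloat(display.width)  * scale)   // SCDisplay reports points
    config.height = Int(CGFloat(display.height) * scale)
    return config
}
```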
Replies: 1 · Boosts: 1 · Views: 801 · Latest activity: Jun ’25
No audio in screen recordings when using AVAudioEngine Voice Processing
Hello, we are developing a real-time speech recognition application and are using AVAudioEngine with voice processing enabled on the input node. However, we have observed that enabling this mode interferes with the built-in iOS screen recording feature: the recorded video does not capture any audio while this mode is active. Since we want users to be able to record their experience within our app, this issue significantly impacts our functionality. Is there a known workaround or recommended approach to ensure that voice processing and screen recording can function simultaneously? Any guidance would be greatly appreciated. Thank you!
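For reproducibility, a minimal sketch of the configuration being described, with typical (not prescribed) session settings:

```swift
import AVFoundation

// Voice processing enabled on AVAudioEngine's input node for speech capture.
// With this active, the built-in screen recorder reportedly records silent video.
let engine = AVAudioEngine()
do {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord, mode: .voiceChat)
    try session.setActive(true)
    try engine.inputNode.setVoiceProcessingEnabled(true)

    let format = engine.inputNode.outputFormat(forBus: 0)
    engine.inputNode.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
        // feed buffers to the speech recognizer here
    }
    try engine.start()
} catch {
    print("Audio setup failed: \(error)")
}
```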
Replies: 2 · Boosts: 1 · Views: 322 · Latest activity: 5d
_MediaPlayer_AppIntents compilation error for iOS 26
I'm getting an interesting error attempting to compile my app in Xcode 26 beta:
error: Unable to find module dependency: '_MediaPlayer_AppIntents' (in target 'icatcher' from project 'icatcher')
note: A dependency of main module 'MainModuleCrossImportOverlays' (in target 'icatcher' from project 'icatcher')
Unable to find module dependency: '_MediaPlayer_AppIntents'
I'm not sure what to try in order to fix this issue.
Replies: 1 · Boosts: 1 · Views: 118 · Latest activity: Jun ’25
iOS Audio App + AirPlay tvOS Metadata Issue
I have an audio app that can play audio on an AirPlay device. On non-Apple TV receivers (Roku, Samsung, etc.), the AirPlay target shows the now playing metadata: title, artist, and album art. However, on tvOS 18.1 no metadata is shown; the Apple TV plays the audio, but there is no now playing information or any other indicator. Other media apps show the Now Playing controls in the upper right of the tvOS home screen. Can someone point me in the right direction? I think I am missing something in my tvOS metadata implementation.
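For comparison, this is the baseline Now Playing setup; on tvOS the system generally won't surface metadata unless at least one remote command handler is registered alongside nowPlayingInfo. Whether this addresses the tvOS 18.1 behavior specifically is unconfirmed:

```swift
import MediaPlayer

// Publish metadata and register minimal remote commands so the system
// treats the app as the active Now Playing client.
func publishNowPlaying(title: String, artist: String, duration: TimeInterval) {
    let center = MPRemoteCommandCenter.shared()
    _ = center.playCommand.addTarget { _ in .success }    // resume playback here
    _ = center.pauseCommand.addTarget { _ in .success }   // pause playback here

    MPNowPlayingInfoCenter.default().nowPlayingInfo = [
        MPMediaItemPropertyTitle: title,
        MPMediaItemPropertyArtist: artist,
        MPMediaItemPropertyPlaybackDuration: duration,
        MPNowPlayingInfoPropertyPlaybackRate: 1.0
    ]
}
```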
Replies: 1 · Boosts: 1 · Views: 557 · Latest activity: Nov ’24
MPNowPlayingInfoCenter nowPlayingInfo throttled
Hello, I have been running into issues with setting nowPlayingInfo, specifically updating information for CarPlay and the CPNowPlayingTemplate. When I start playback for an item, the lock screen information updates as expected, along with the CarPlay now playing information. However, the playing items are books made up of collections of tracks. When I select a new track (chapter) within the book, I set MPMediaItemPropertyTitle to the new chapter name. This change is reflected correctly on the lock screen but almost never appears correctly in the CarPlay CPNowPlayingTemplate; the previous chapter title remains and never updates. I see "Application exceeded audio metadata throttle limit." in the debug console fairly frequently, so I figured I need to minimize updates to the nowPlayingInfo dictionary. What I did:
- I store the metadata in a local dictionary and only set values in the main nowPlayingInfo dictionary when they differ from the current value.
- I kick off the nowPlayingInfo update via a Task that initially sleeps for around 2 seconds (not a final value, just for my current testing). If a previous Task is active, it gets cancelled, so only one update can happen within that window.
Neither of these has been sufficient. I can switch between different titles entirely and the information updates (including cover art), but when I switch chapters within a title, the MPMediaItemPropertyTitle change continues to get dropped. I know the value is getting set, because it updates correctly on the lock screen. In total I update 12 keys, though with the above changes usually only 2-4 of them actually change with high frequency. I am running out of ideas for staying under the throttling thresholds while still displaying accurate metadata. I could use some advice. Thanks.
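For concreteness, a sketch of the coalescing approach described above: buffer changes locally and push one merged update after a quiet period, cancelling any in-flight push. The 2-second delay mirrors the post's test value, not a documented threshold:

```swift
import MediaPlayer

@MainActor
final class NowPlayingUpdater {
    private var pendingInfo: [String: Any] = [:]
    private var pushTask: Task<Void, Never>?

    func set(_ value: Any, for key: String) {
        pendingInfo[key] = value
        pushTask?.cancel()   // restart the quiet-period timer
        pushTask = Task {
            try? await Task.sleep(nanoseconds: 2_000_000_000)
            guard !Task.isCancelled else { return }
            // Merge all buffered changes into a single dictionary write.
            var info = MPNowPlayingInfoCenter.default().nowPlayingInfo ?? [:]
            for (key, value) in pendingInfo { info[key] = value }
            MPNowPlayingInfoCenter.default().nowPlayingInfo = info
            pendingInfo.removeAll()
        }
    }
}
```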
Replies: 4 · Boosts: 1 · Views: 134 · Latest activity: May ’25
Process to request the restricted entitlement behind “DJ with Apple Music” (tempo control / time-stretch on Apple Music streams)?
Hi, I’m an iOS developer building an app with a use case that needs advanced playback on Apple Music subscription streams, specifically:
• Real-time tempo (BPM) change during playback, i.e., time-stretch with key-lock, not just crossfade.
• Beat-matched transitions between tracks.
From what I can tell, this capability seems to exist only for approved partners and isn’t available through public MusicKit. Questions: What’s the official request path to be evaluated for that restricted partner entitlement (application form, questionnaire, NDA, or internal team/BD contact)? If the entitlement identifier is internal, how can I get my account routed to the right Apple Music team? For reference, publicly announced partners include Algoriddim djay, Serato DJ Pro, rekordbox (AlphaTheta), and Engine DJ, all of which appear to implement mixing features that imply advanced playback (tempo/beat-matching) on Apple Music content. I’d prefer not to share product details publicly for the moment and can provide specifics privately if needed. Thanks in advance!
Replies: 0 · Boosts: 1 · Views: 197 · Latest activity: 1w
Pictures won't import from iPhone to the Photos app on iMac
Since the OS was recently updated to 18.1.1 on my iPhone 15, I am no longer able to import my pictures into the Photos app on my iMac. I should mention that my iMac is pretty old, running macOS High Sierra 10.13.6, and won't let me update to a newer OS. The main error message I get is: "Some items cannot be added to your Photo library because they may be an unrecognizable file format or the file may not contain valid data". Then, for each individual photo that failed to import, the error reads: "unable to read metadata. The file may be corrupt". However, videos import just fine from my iPhone to the iMac. This was not a problem before the recent iPhone update. I tried closing the Photos app and reopening it, and I tried restarting my iPhone and iMac, but nothing seems to work. Any help would be much appreciated.
Replies: 3 · Boosts: 1 · Views: 1.8k · Latest activity: Dec ’24
AUv3 recent "Failed to find component with type..." frequent issues
I've been generating new Audio Unit Extension apps with Xcode 16 (and newer), and although they generally work initially, it is easy (although I'm not sure how to do it reliably) to get the app into a state where it can no longer instantiate the audio unit. Generally the call to AVAudioUnit.findComponent fails and SimplePlayEngine hits the fatalError("Failed to find component with type..."). In the most recent project, merely adding files to the extension (without making any use of them) caused it to go off the rails. If I "Archive" the app+plugin, there is no audio unit extension in the bundle; if I switch to the audio unit extension scheme and build it, it's fine. If I look at the build folder in Library/Developer/Xcode/ for the project, the extension_name.appex is there. Any ideas? If I can coax an unmodified audio unit extension project into exhibiting this behavior, I'll attach it here; right now what I have contains code I don't want to share.
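As a hedged diagnostic (not a fix), you can ask the system whether the extension's component is registered at all, which separates "appex missing from the bundle" from "component failed to load". The type/subtype/manufacturer codes below are placeholders for your extension's actual values:

```swift
import AVFoundation
import AudioToolbox

// Query the component registry for the extension's AudioComponentDescription.
var desc = AudioComponentDescription(
    componentType: kAudioUnitType_MusicDevice,
    componentSubType: 0x64656D6F,        // 'demo': placeholder
    componentManufacturer: 0x44656D6F,   // 'Demo': placeholder
    componentFlags: 0,
    componentFlagsMask: 0)
let matches = AVAudioUnitComponentManager.shared().components(matching: desc)
print(matches.isEmpty ? "component not registered"
                      : "registered: \(matches.map(\.name))")
```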
Replies: 4 · Boosts: 1 · Views: 678 · Latest activity: Jan ’25
AVAudioPlayer/SKAudioNode audio no longer plays after interruption
Hi 👋! We have a SpriteKit-based app where we play AVAudio sounds in three different ways:
1. Effects (incl. UI sounds) with AVAudioPlayer.
2. Long looping tracks with AVAudioPlayer.
3. Short animation effects on the timeline of SpriteKit's SKScene files (effectively SKAudioNode nodes).
We've found that when you exit the app or otherwise interrupt audio playback, future audio plays often fail. For example, there's a WebKit-based video trailer inside the app; if you play it, our looping background music track (2) stops playing and won't resume when you close the trailer (return from WebKit). This is probably due to us not manually restarting the track, so it may well be easily fixed. Periodically played AVAudioPlayer audio (1) is not affected. More concerning, however, the audio tracks on SKScene file timelines (3) will no longer play. My hypothesis is that AVAudioEngine gets interrupted and needs to be restarted for those AVAudioNode elements to regain functionality. Thing is, we don't deal with AVAudioEngine at all in the app currently, meaning it is never initiated by us to begin with. Things return to normal when you remove the app from memory and restart it, but many of our users aren't doing this, and they often report audio failing, presumably due to some past interruption, without the app ever being cleared from memory. Any idea why timeline-run SKAudioNodes would fail like this? Should the app react to backgrounding/foregrounding with regard to audio? Any help would be very much appreciated ✌️!
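A sketch of the standard interruption-recovery pattern, assuming AVAudioSession notifications; whether SpriteKit's internal engine picks back up after reactivation is exactly the open question here:

```swift
import AVFoundation

// Reactivate the audio session when an interruption ends, then restart
// whatever players need it.
NotificationCenter.default.addObserver(
    forName: AVAudioSession.interruptionNotification,
    object: AVAudioSession.sharedInstance(),
    queue: .main
) { note in
    guard let raw = note.userInfo?[AVAudioSessionInterruptionTypeKey] as? UInt,
          let type = AVAudioSession.InterruptionType(rawValue: raw),
          type == .ended else { return }
    try? AVAudioSession.sharedInstance().setActive(true)
    // Restart looping AVAudioPlayer tracks here; check the .shouldResume
    // option in userInfo before auto-resuming.
}
```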
Replies: 0 · Boosts: 1 · Views: 78 · Latest activity: May ’25
MusicKit and sorted artist and album names?
I have an app that gets data from Music.app with both the iTunesLibrary framework and MusicKit. iTunesLibrary has ITLibArtist.sortName, ITLibAlbum.sortTitle, and ITLibAlbum.sortAlbumArtist. I can't seem to find equivalents in MusicKit. How are those properties obtained using MusicKit? Thanks. FYI, I have filed FB15554956 on this. You can also see my code at https://github.com/bolsinga/itunes_json
Replies: 1 · Boosts: 1 · Views: 503 · Latest activity: Nov ’24
EXIF MakerNote not read in Ventura
I have a custom app running on a Mac Studio with Ventura that grabs a snapshot image from a network camera and then adds some extra information into the EXIF MakerNote field. However, the metadata cannot be read back out of the image when running Ventura; it can, however, be read out of the same image file on a Mac that is not running Ventura. It would appear Apple has removed support for reading MakerNote in Ventura while still supporting writing it. This code is about 7 years old, written in ObjC, and worked with no issue until Ventura came along. Calls used:
- CGImageDestinationAddImageFromSource(): writes the image to disk with the extra metadata. Works on Ventura.
- CGImageSourceCopyPropertiesAtIndex(): reads the metadata from an image. Does not return MakerNote data.
Is there a new way to read EXIF MakerNote data from image files that was introduced with Ventura?
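For reference, this is the conventional Image I/O read path, shown in Swift for brevity; the report above is that the MakerNote entry stopped appearing under Ventura, not that these calls changed:

```swift
import ImageIO
import Foundation

// Read the EXIF dictionary and pull out the MakerNote entry, which on
// pre-Ventura systems appears under kCGImagePropertyExifDictionary.
func readMakerNote(from url: URL) -> Any? {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
          let props = CGImageSourceCopyPropertiesAtIndex(source, 0, nil)
            as? [CFString: Any],
          let exif = props[kCGImagePropertyExifDictionary] as? [CFString: Any]
    else { return nil }
    return exif[kCGImagePropertyExifMakerNote]
}
```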
Replies: 2 · Boosts: 1 · Views: 921 · Latest activity: Nov ’24
iOS 17 camera capture assertions and issues
Hello, starting in iOS 17 our application began having issues publishing to our video session; specifically, video capture seems to be broken in some, but not all, sessions. What's troubling is that it fails consistently every fourth session, and it fails silently, without reporting any problem to the app: we only notice that no frames are being rendered or sent to the remote devices. Here's what shows up in the console:
<<<< FigCaptureSourceRemote >>>> Fig assert: "! storage->connectionDied" at bail (FigCaptureSourceRemote.m:235) - (err=0)
<<<< FigCaptureSourceRemote >>>> Fig assert: "err == 0 " at bail (FigCaptureSourceRemote.m:253) - (err=-16453)
Is anyone else seeing this? Any idea what could be the cause? Our sessions work perfectly on iOS 16 and below. Thanks
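The assertions above come from the capture stack's private layer, but a first step is to make the failure observable in-app; this sketch only surfaces session errors and interruptions, it does not fix them:

```swift
import AVFoundation

// Observe runtime-error and interruption notifications so silent capture
// failures at least get reported to the app.
func observeFailures(of session: AVCaptureSession) {
    let nc = NotificationCenter.default
    nc.addObserver(forName: AVCaptureSession.runtimeErrorNotification,
                   object: session, queue: .main) { note in
        let error = note.userInfo?[AVCaptureSessionErrorKey] as? NSError
        print("Capture runtime error: \(String(describing: error))")
    }
    nc.addObserver(forName: AVCaptureSession.wasInterruptedNotification,
                   object: session, queue: .main) { note in
        print("Capture interrupted: \(String(describing: note.userInfo))")
    }
}
```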
Replies: 3 · Boosts: 1 · Views: 1.3k · Latest activity: 2w