Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.

Posts under Media Technologies topic

Received the "Profile for Dolby Vision MUST be Profile 5 or 10" error for the channel that has only dolby vision profile 5
While validating a Dolby Vision Profile 5 playlist in CMAF format (with segments in MP4), the Media Stream Validator reported the following error in the MUXT-FIX-ISSUES list: However, the playlist correctly specifies Dolby Vision Profile 5 in both the EXT-X-STREAM-INF and EXT-X-I-STREAM-INF tags. Playlist: #EXTM3U #EXT-X-VERSION:8 #EXT-X-INDEPENDENT-SEGMENTS #EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="audio-ec3",LANGUAGE="und",NAME="Undetermined",AUTOSELECT=YES,CHANNELS="6",URI="var14711339/aud1257/playlist.m3u8?device_profile=hls&seg_size=6&cmaf=1" #EXT-X-STREAM-INF:BANDWIDTH=14680000,AVERAGE-BANDWIDTH=14676380,VIDEO-RANGE=PQ,CODECS="ec-3,dvh1.05.06",RESOLUTION=3840x2160,AUDIO="audio-ec3" var14711339/vid/playlist.m3u8?device_profile=hls&seg_size=6&cmaf=1 #EXT-X-I-FRAME-STREAM-INF:BANDWIDTH=1419881,URI="trk14711339/playlist.m3u8?device_profile=hls&cmaf=1",VIDEO-RANGE=PQ,CODECS="dvh1.05.06",RESOLUTION=3840x2160 Could you please review this and clarify: Why is the Media Stream Validator reporting this error, even though the playlist correctly includes Dolby Vision Profile 5 parameters in CMAF format? Why is this error not reported when using a playlist with TS segments instead of CMAF (MP4)?
Replies: 1 · Boosts: 0 · Views: 74 · Activity: Jul ’25
Play videos in WebM format on iOS devices
I would like to play WebM videos on my iPhone. I understand that it is basically impossible to play WebM video on an iPhone, but is there any way to display it? Is there an official Swift SDK or development kit released by Apple for this? If not, are there any third-party products you could point me to?
Replies: 1 · Boosts: 0 · Views: 251 · Activity: Jul ’25
Does CMIO support "hiding" the built-in camera?
Hi guys, can I use CMIO to achieve the following on macOS when a USB device (camera/mic/speaker) is connected:
When a third-party video-conferencing app is not in a meeting, ensure the app defaults to using the USB device.
When a third-party conferencing app is in a meeting, ensure the app automatically switches to the USB device.
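For what it's worth, CMIO alone can't force another app's device selection; the closest public lever I'm aware of is the system-preferred-camera machinery. A minimal sketch under that assumption (macOS 13+; the discovery code is illustrative, and only apps that follow systemPreferredCamera will pick it up):

import AVFoundation

// Hedged sketch: apps that track AVCaptureDevice.systemPreferredCamera will
// follow a change to userPreferredCamera, but non-cooperating apps will not.
let discovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.external],  // .external needs macOS 14; earlier systems use .externalUnknown
    mediaType: .video,
    position: .unspecified
)
if let usbCamera = discovery.devices.first {
    AVCaptureDevice.userPreferredCamera = usbCamera
}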
Replies: 2 · Boosts: 0 · Views: 123 · Activity: Jul ’25
Instagram video embed in WKWebView freezes on start on iOS 18.5
Hi everyone! Here's what I've observed so far:
On device it's reproducible on iOS/iPadOS 18.5 but works on iPadOS 17.7.
On the iPhone 16 iOS 18.5 simulator that I was using extensively for development, it was reproducible until I reset content and settings.
On the iPhone 16 iOS 18.4 simulator, which was also used a lot during development, it still always works, so I tend to think it's an 18.5 issue.
Setting config.websiteDataStore = .nonPersistent() doesn't help.
Cleaning WKWebsiteDataStore doesn't help.
It works fine using the direct URL from the embed code (see the code below).
Can someone provide some insight on how this could be fixed? Here's the code:

import SwiftUI
import WebKit

@main
struct IGVideoApp: App {
    var body: some Scene {
        WindowGroup {
            WebView()
        }
    }
}

private struct WebView: UIViewRepresentable {
    func makeUIView(context: Context) -> WKWebView {
        let config = WKWebViewConfiguration()
        config.allowsInlineMediaPlayback = true
        return .init(frame: .zero, configuration: config)
    }

    func updateUIView(_ uiView: WKWebView, context: Context) {
        let urlString = "https://www.instagram.com/reel/DKHFOGct3z7/?utm_source=ig_embed&utm_campaign=loading"
        /// It works when loading from the data-instgrm-permalink URL directly:
        // uiView.load(.init(url: .init(string: "\(urlString)")!))
        /// It doesn't work with embedding.
        /// Note: the embed markup (<blockquote>...</blockquote>) is taken from my
        /// Instagram post (https://www.instagram.com/p/DKHFOGct3z7/) and stripped
        /// down. The urlString was also extracted to demonstrate direct loading.
        let string = """
        <!doctype html>
        <meta charset="utf-8" />
        <meta name="viewport" content="width=device-width, initial-scale=1">
        <html>
        <head />
        <body style="background-color:black; margin:0px">
            <blockquote class="instagram-media" data-instgrm-captioned
                        data-instgrm-version="14"
                        data-instgrm-permalink="\(urlString)">
            </blockquote>
            <script async src="https://www.instagram.com/embed.js"></script>
        </body>
        </html>
        """
        uiView.loadHTMLString(string, baseURL: .init(string: "https://www.instagram.com"))
    }
}
Replies: 0 · Boosts: 0 · Views: 214 · Activity: Jul ’25
AVCaptureSession startRunning is slow
AVCaptureSession's startRunning method blocks the calling thread and seems to be slow. What is this method doing behind the scenes? For context: I'm working on Simulator camera support, and I have a 'fake' AVCaptureDevice that might be causing this. My hypothesis is that AVCaptureSession tries to connect to the device and waits for a notification to be posted back. I'd love to find a way to let my fake device message AVCaptureSession that it's connected.
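For anyone hitting the blocking behavior itself: the documented pattern is to call startRunning() on a dedicated serial queue so the main thread stays responsive. A minimal sketch (the queue label is illustrative):

import AVFoundation

let session = AVCaptureSession()
// startRunning() blocks until the capture pipeline has started (or failed),
// so keep it off the main thread.
let sessionQueue = DispatchQueue(label: "com.example.capture-session")
sessionQueue.async {
    session.startRunning()
}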
Replies: 3 · Boosts: 0 · Views: 206 · Activity: Jul ’25
Logic Pro cannot load v3 audio unit with framework compiled with Swift 6
macOS Sequoia 15.4.1 (24E263), Xcode 16.3 (16E140), Logic Pro 11.2.1.
I've been developing a complex Audio Unit for macOS that works perfectly well in its own bespoke host app and is now well into its beta-testing stage. It did take some effort to get it working well in Logic Pro, however, and all was fine until now. The AU part is an empty app extension with a framework containing its code; the framework contains Swift code for the UI and C code for the DSP parts. When the framework is compiled with the Swift 5 compiler, the AU runs in Logic with no problems (I should also mention that the AU passes the strictest auval tests). But when the framework is compiled with Swift 6, Logic Pro cannot load it: Logic displays a message saying the audio unit could not be loaded and to contact the developer. My own host app loads the Swift 6 build perfectly well, so I know there's nothing wrong with the audio unit itself. I cannot find any differences in the built output files except, of course, the actual binary code in the framework. I've worked on this for hours and cannot find a solution other than building the framework with Swift 5 (I worked hard to get all the async code updated and working with Swift 6, so I feel a little cheated!). What is happening? Is this a bug in Logic? Is this a bug in the Swift 6 compiler/linker? I'm at the hands-in-the-air, tearing-out-hair stage (once again)!
Replies: 1 · Boosts: 0 · Views: 304 · Activity: Jul ’25
I cannot acquire the entitlement named com.apple.developer.avfoundation.video-data-output-prepares-cellular-radio-for-machine-readable-code-scanning
AVCaptureVideoDataOutput.preparesCellularRadioForNetworkConnection requires the com.apple.developer.avfoundation.video-data-output-prepares-cellular-radio-for-machine-readable-code-scanning entitlement, but I cannot acquire it: I can't find it under Certificates, Identifiers & Profiles. Any solutions? Xcode reports: Provisioning profile "iOS Team Provisioning Profile: ......" doesn't include the com.apple.developer.avfoundation.video-data-output-prepares-cellular-radio-for-machine-readable-code-scanning entitlement.
Replies: 2 · Boosts: 0 · Views: 491 · Activity: Jul ’25
Will the new Automix feature in iOS 26 be available for third-party apps using MusicKit?
Hi everyone, we're currently developing a music-based app using MusicKit, and we recently noticed that the iOS 26 beta introduces a new "Automix" feature in the Apple Music app. This enables seamless, DJ-style transitions between songs, beyond the standard crossfade functionality. We're trying to understand:
Will this Automix feature be accessible to third-party apps that use MusicKit?
If it is not available in the initial iOS 26 release, is there a plan to expose it through public APIs in a future update?
Is there any technical documentation, WWDC session, or roadmap info regarding Automix support via MusicKit?
This functionality would be a significant enhancement for our app, especially for intelligent audio transitions and curated playlists. Thanks.
Replies: 1 · Boosts: 3 · Views: 463 · Activity: Jul ’25
After iOS 18.5, the AVFoundation AVCaptureSessionInterruptionReason.videoDeviceNotAvailableWithMultipleForegroundApps error has caused a significant increase in camera black-screen issues
Issue: After the iOS 18.5 release, our app is experiencing a significant increase in AVCaptureSessionInterruptionReason.videoDeviceNotAvailableWithMultipleForegroundApps errors.
Details: Our camera-related code has not been updated recently. However, we've observed that the error rate has increased significantly starting from May 2025, rising from approximately 0.02% (2 in 10,000 users) to 0.1% (1 in 1,000 users), a 5x increase. The frequency has risen noticeably since iOS 18.5, and this is affecting our app's camera functionality and user experience.
Questions:
Are there any known changes in iOS 18.5 regarding camera access management?
What are the recommended best practices to handle this interruption reason?
Are there any API changes we should be aware of?
Best, Shay
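For context on handling this reason, here is a minimal sketch of observing the interruption (assuming `session` is the app's configured AVCaptureSession; the handler body is illustrative):

import AVFoundation

let session = AVCaptureSession()
// Retain this token for the session's lifetime so the observer stays active.
let interruptionObserver = NotificationCenter.default.addObserver(
    forName: AVCaptureSession.wasInterruptedNotification,
    object: session,
    queue: .main
) { note in
    guard let raw = note.userInfo?[AVCaptureSessionInterruptionReasonKey] as? Int,
          let reason = AVCaptureSession.InterruptionReason(rawValue: raw) else { return }
    if reason == .videoDeviceNotAvailableWithMultipleForegroundApps {
        // e.g. multitasking on iPad: show a "camera paused" overlay instead of
        // a black preview, and resume when interruptionEnded fires.
    }
}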
Replies: 1 · Boosts: 0 · Views: 257 · Activity: Jul ’25
EXT-X-DISCONTINUITY misalignment
We are encountering an issue with AVPlayer when EXT-X-DISCONTINUITY is misaligned between audio and video after the insertion of gaps. The initial objective is to introduce an EXT-X-DISCONTINUITY in the audio playlist after some missing segments (EXT-X-GAP) whose durations are aligned to the video segment durations, to handle irregular audio durations. Please find below an example of the corresponding video and audio playlists:

video:
#EXTM3U
#EXT-X-VERSION:7
#EXT-X-MEDIA-SEQUENCE:872524632
#EXT-X-INDEPENDENT-SEGMENTS
#EXT-X-TARGETDURATION:2
#USP-X-TIMESTAMP-MAP:MPEGTS=7096045027,LOCAL=2025-05-09T12:38:32.369100Z
#EXT-X-MAP:URI="hls/StreamingBasic-video=979200.m4s"
#EXT-X-PROGRAM-DATE-TIME:2025-05-09T12:38:32.369111Z
#EXTINF:2.002, no desc
hls/StreamingBasic-video=979200-872524632.m4s
#EXTINF:2.002, no desc
hls/StreamingBasic-video=979200-872524633.m4s
#EXTINF:2.002, no desc
hls/StreamingBasic-video=979200-872524634.m4s
#EXTINF:2.002, no desc
hls/StreamingBasic-video=979200-872524635.m4s
#EXTINF:2.002, no desc
hls/StreamingBasic-video=979200-872524636.m4s
## Media sequence discontinuity
#EXT-X-GAP
#EXTINF:2.002, no desc
hls/StreamingBasic-video=979200-872524637.m4s
## Media sequence discontinuity
#EXT-X-GAP
#EXTINF:2.002, no desc
hls/StreamingBasic-video=979200-872524638.m4s
#EXT-X-PROGRAM-DATE-TIME:2025-05-09T12:38:46.383111Z
#EXTINF:2.002, no desc
hls/StreamingBasic-video=979200-872524639.m4s
#EXTINF:2.002, no desc
hls/StreamingBasic-video=979200-872524640.m4s

audio:
#EXTM3U
#EXT-X-VERSION:7
#EXT-X-MEDIA-SEQUENCE:872524632
#EXT-X-INDEPENDENT-SEGMENTS
#EXT-X-TARGETDURATION:2
#USP-X-TIMESTAMP-MAP:MPEGTS=7096045867,LOCAL=2025-05-09T12:38:32.378400Z
#EXT-X-MAP:URI="hls/StreamingBasic-audio_99500_eng=98800.m4s"
#EXT-X-PROGRAM-DATE-TIME:2025-05-09T12:38:32.378444Z
#EXTINF:2.0053, no desc
hls/StreamingBasic-audio_99500_eng=98800-872524632.m4s
#EXTINF:2.0053, no desc
hls/StreamingBasic-audio_99500_eng=98800-872524633.m4s
#EXTINF:2.0053, no desc
hls/StreamingBasic-audio_99500_eng=98800-872524634.m4s
#EXTINF:1.984, no desc
hls/StreamingBasic-audio_99500_eng=98800-872524635.m4s
#EXTINF:2.0053, no desc
hls/StreamingBasic-audio_99500_eng=98800-872524636.m4s
## Media sequence discontinuity
#EXT-X-GAP
#EXTINF:2.002, no desc
hls/StreamingBasic-audio_99500_eng=98800-872524637.m4s
## Media sequence discontinuity
#EXT-X-GAP
#EXTINF:2.002, no desc
hls/StreamingBasic-audio_99500_eng=98800-872524638.m4s
#EXT-X-DISCONTINUITY
#EXT-X-PROGRAM-DATE-TIME:2025-05-09T12:38:46.778444Z
#EXTINF:1.6213, no desc
hls/StreamingBasic-audio_99500_eng=98800-872524639.m4s
#EXTINF:2.0053, no desc
hls/StreamingBasic-audio_99500_eng=98800-872524640.m4s

In this case playback is broken with AVPlayer. Does this conform to the HTTP Live Streaming specification? Is it an AVPlayer bug? What are the guidelines for handling such gaps?
Replies: 0 · Boosts: 0 · Views: 119 · Activity: Jul ’25
I need a way to permanently disable Reactions from my app, ideally the universe too
So I've spent the last five years optimizing my video AI system so that it runs with less than 5% CPU while processing a 30fps video feed on a MacBook Pro M2, and everything is great, until Sonoma comes out and I find myself consuming 40% CPU for the exact same workload. So I fire up Instruments, and the "heaviest stack trace" (see screenshot) turns out to be Espresso doing some completely unasked-for and absolutely useless processing on my video frames. I turn off Reactions, but nothing helps; the CPU consumption stays at 40%. "Reactions" is nothing but a useless toy to please some WWDC keynote fanboys. I don't want it anywhere near my app or my users, and I especially do not want to take the blame for it pissing away the user's CPU cycles and battery. Now, how do I make it go away, forever? Best regards, Jacob
Replies: 10 · Boosts: 6 · Views: 1.3k · Activity: Jul ’25
iPadOS 17 external camera exposure
I'm developing an iPad app that will be mostly dedicated to a certain external camera for visually impaired people. The Linux UVC API (e.g. via guvcview) allows enabling automatic exposure for this camera. The iOS isExposureModeSupported API unfortunately returns false for all of the exposure modes. Is this a bug? Or does AVFoundation not support UVC exposure yet?
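In case it helps others reproduce this, a minimal sketch of the check being described (assuming `device` is the external camera's AVCaptureDevice):

import AVFoundation

func enableAutoExposure(on device: AVCaptureDevice) throws {
    // The poster's case: this reportedly returns false for every exposure
    // mode on the UVC camera, so configuration never gets this far.
    guard device.isExposureModeSupported(.continuousAutoExposure) else { return }
    try device.lockForConfiguration()
    device.exposureMode = .continuousAutoExposure
    device.unlockForConfiguration()
}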
Replies: 1 · Boosts: 2 · Views: 535 · Activity: Jul ’25
Apple Music iOS 26 features on Android
Since many users like me use Apple Music on Android, and the app is almost as feature-rich as on iOS, it would be fantastic if the developers could add the new iOS 26 features to the Android app, along with a minor UI refresh. I know it's challenging to implement Liquid Glass on Android hardware or design, but features like Automix, pronunciation, and translation could be added. Kindly consider this request!
Replies: 1 · Boosts: 0 · Views: 101 · Activity: Jul ’25
Final Cut Pro Workflow Extension Drag and Drop to Timeline Not Working Despite Proper FCPXML Clip Structure
Our Final Cut Pro workflow extension, built with the ProExtensionHost framework, uses an NSPasteboardItemDataProvider-based system with multi-version FCPXML support (1.9, 1.10, 1.13) and proper relative-path UIDs for Motion templates. We've implemented a clip-wrapper approach with placeholder assets and elements containing effects to enable direct timeline drag functionality. However, drag and drop from our workflow extension directly to the timeline is still not working despite a proper element structure in our FCPXML. Our implementation creates valid clip elements with effects applied, but the Final Cut Pro timeline doesn't accept them during drag operations.
Steps to reproduce:
1. Create a Final Cut Pro workflow extension using the ProExtensionHost framework with an NSPasteboardItemDataProvider implementation.
2. Generate FCPXML with a proper element structure:
Expected result: The clip is accepted by the timeline and the effect applied.
Actual result: The timeline rejects the drag operation from the ProExtensionHost-based workflow extension.
Question: Are there additional requirements or ProExtensionHost API calls needed, beyond a standard NSPasteboardItemDataProvider, for timeline drag functionality from a workflow extension?
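For readers unfamiliar with the pattern, here is a stripped-down sketch of the lazy pasteboard provider the post describes. The type string "com.example.fcpxml" is hypothetical; which pasteboard types Final Cut Pro actually accepts for timeline drops is exactly the open question:

import AppKit

final class FCPXMLProvider: NSObject, NSPasteboardItemDataProvider {
    let fcpxml: String
    init(fcpxml: String) { self.fcpxml = fcpxml }

    // Called lazily when the drop target requests this type.
    func pasteboard(_ pasteboard: NSPasteboard?,
                    item: NSPasteboardItem,
                    provideDataForType type: NSPasteboard.PasteboardType) {
        item.setString(fcpxml, forType: type)
    }
}

let item = NSPasteboardItem()
let provider = FCPXMLProvider(fcpxml: "<fcpxml version=\"1.10\">…</fcpxml>")
_ = item.setDataProvider(provider, forTypes: [NSPasteboard.PasteboardType("com.example.fcpxml")])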
Replies: 0 · Boosts: 0 · Views: 119 · Activity: Jul ’25
Apple Music / MusicKit and Simulator
It's been an ask for a few years, and I'm wondering if there are any plans, or whether the '26 SDKs/tools allow Apple Music to work in the Simulator. I develop for the Vision Pro, so the usual fix of running on the device is a bit of a hard ask. At the very least, a small sample library that works in the Simulator would be welcome (similar to how Photos works). Cheers
Replies: 1 · Boosts: 1 · Views: 125 · Activity: Jul ’25
How can third-party iOS apps obtain real-time waveform / spectrogram data for Apple Music tracks (similar to djay & other DJ apps)?
Hi everyone, I'm working on an iOS MusicKit app that overlays a metronome on top of Apple Music playback, using ApplicationMusicPlayer. To line the clicks up perfectly I'd like access to low-level audio analysis data, ideally a waveform / spectrogram or beat grid, while the track is playing. I've noticed that several approved DJ apps (e.g. djay, Serato, rekordbox) can already:
• Display detailed scrolling waveforms of Apple Music songs
• Scratch, loop, or time-stretch those tracks in real time
That implies they receive decoded PCM frames, or at least high-resolution analysis data, from Apple Music under a special entitlement. My questions:
1. Does MusicKit (or any public framework) expose real-time audio buffers, FFT bins, or beat markers for streaming Apple Music content?
2. If not, is there an Apple program or entitlement that developers can apply for, similar to the "DJ with Apple Music" initiative, to gain that deeper access?
3. Where can I find official documentation or a point of contact for this kind of request?
I've searched the docs and forums but only see standard MusicKit playback APIs, which don't appear to expose raw audio for DRM-protected songs. Any guidance, links, or insider tips on the proper application process would be hugely appreciated! Thanks in advance.
Replies: 1 · Boosts: 2 · Views: 161 · Activity: Jul ’25
AVKit - PiP with AVSampleBufferDisplayLayer Error
AVPictureInPictureControllerContentSource *contentSource =
    [[AVPictureInPictureControllerContentSource alloc]
        initWithSampleBufferDisplayLayer:self.renderView.sampleBufferDisplayLayer
                        playbackDelegate:self];
AVPictureInPictureController *pictureInPictureController =
    [[AVPictureInPictureController alloc] initWithContentSource:contentSource];
pictureInPictureController.delegate = self;

- (void)pictureInPictureController:(AVPictureInPictureController *)pictureInPictureController
    failedToStartPictureInPictureWithError:(NSError *)error {
    // error: NSError domain: @"PGPegasusErrorDomain" code: -1003 0x00000002819fe3a0
}

When I start PiP playback the first time, I get the error above ("PGPegasusErrorDomain", code -1003). Why? The second start is OK.
Replies: 0 · Boosts: 0 · Views: 116 · Activity: Jul ’25
HDR video & screen brightness
When I play an HDR video in the iPhone Photos app, I can clearly see the HDR effect. But if the HDR video plays continuously for more than 30-40 minutes, the HDR effect disappears and the brightness is compressed into the SDR range. This happens on any iPhone; depending on the model it may take 20-30 minutes, 30-40 minutes, or even just a few minutes (e.g. on an iPhone 12 mini). Similarly, if I use AVPlayer to play and preview an HDR video for more than 30-40 minutes, the HDR effect disappears and the screen brightness dims, and currentEDRHeadroom gradually decreases to 1. Note: test with an HDR video longer than 1 hour; if the video is short, loop it. My question is how to avoid losing the HDR effect after 30-40 minutes when I use CAMetalLayer to render an HDR video.
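As background for the question, a minimal sketch of the two public knobs involved: opting a CAMetalLayer into EDR and reading how much headroom the system is currently granting (which, per the post, decays toward 1.0 during long playback); the helper name is illustrative:

import UIKit
import QuartzCore

let metalLayer = CAMetalLayer()
// Ask the compositor for extended-dynamic-range compositing of this layer.
metalLayer.wantsExtendedDynamicRangeContent = true

func currentHeadroom(of view: UIView) -> CGFloat {
    // currentEDRHeadroom falls toward 1.0 when the system reins in EDR
    // (thermals, sustained HDR playback), matching the dimming described above.
    view.window?.screen.currentEDRHeadroom ?? 1.0
}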
Replies: 1 · Boosts: 0 · Views: 100 · Activity: Jul ’25