Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.

All subtopics: posts under the Media Technologies topic. Stats for each post appear below it as Replies · Boosts · Views · Last activity.

Cannot capture HDR content using ScreenCaptureKit
With the help of Claude and Codex, I've tried upgrading a screen magnification, capture, and pixel-peeping app (SnoopX, from Stanford's software repository) to be HDR-aware. For images that are truly HDR (displayed in Apple Photos, Google Photos in Chrome, or Mac Preview), and on an HDR monitor with plenty of EDR headroom (Apple XDR display), I should see pixel values well above 1.0. Apple's Digital Color Meter does. However, my app does not. I think the code is doing all the right things with ScreenCaptureKit, but it never returns values above 1.0. Has anybody gotten this to work?

Here is what Codex says about the code it helped me build in my app:

- Set captureDynamicRange: We set config.captureDynamicRange = SCCaptureDynamicRangeHDRLocalDisplay (when available). Note: Gemini's SCCaptureModeHDRLocalDisplay isn't a real symbol; the actual API is SCCaptureDynamicRange.
- Use preset: We try initWithPreset: SCStreamConfigurationPresetCaptureHDRStreamLocalDisplay, but your runtime crashes on initWithPreset: (unrecognized selector), so we fall back to manual config. That means your system SDK headers mention the preset, but the runtime doesn't implement it.
- Pixel format: We explicitly set kCVPixelFormatType_64RGBAHalf. Your logs show the stream buffer is 64RGBAHalf.
- Enable HDR on display: EDR headroom logs show current=6.40, so HDR is active.
- Color space: We set Extended Linear Display P3 for the stream, and we forced the screenshot sampler to convert into extended-linear P3 before reading. The screenshot itself reports Extended sRGB, but conversion doesn't yield values above 1.0 either.

Despite all of that, the OS is still delivering tone-mapped values (max 1.0000) for both the stream and the screenshot path. So I don't think there's a missing knob here; it looks like a system-level limitation on Tahoe for desktop capture.
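For reference, a minimal sketch of the manual configuration described above, assuming the macOS 15+ ScreenCaptureKit APIs (SCCaptureDynamicRange, colorSpaceName, and the half-float pixel format) behave as documented; this mirrors the poster's setup rather than offering a verified fix:

```swift
import ScreenCaptureKit
import CoreGraphics
import CoreVideo

// Sketch of the manual HDR stream configuration the post describes.
// The preset path, SCStreamConfiguration(preset: .captureHDRStreamLocalDisplay),
// reportedly raises "unrecognized selector" on the poster's system, so the
// equivalent knobs are set by hand here.
@available(macOS 15.0, *)
func makeHDRStreamConfiguration() -> SCStreamConfiguration {
    let config = SCStreamConfiguration()
    config.captureDynamicRange = .hdrLocalDisplay         // request HDR, not tone-mapped SDR
    config.pixelFormat = kCVPixelFormatType_64RGBAHalf    // 16-bit half-float RGBA
    config.colorSpaceName = CGColorSpace.extendedLinearDisplayP3
    return config
}
```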
Replies: 1 · Boosts: 0 · Views: 51 · Last activity: 11h
MusicKit + AirPlay
Hello, I'm working on a MusicKit-based SwiftUI app. I've integrated AirPlay using AVRoutePickerView like so:

```swift
struct UIKitAirPlayPickerView: UIViewRepresentable {
    func makeUIView(context: Context) -> AVRoutePickerView {
        let routePickerView = AVRoutePickerView()
        routePickerView.prioritizesVideoDevices = false
        return routePickerView
    }

    func updateUIView(_ uiView: AVRoutePickerView, context: Context) {}
}
```

The AirPlay menu appears as expected, and selecting an AirPlay device works: I'm currently sending audio from my app to a HomePod. However, the state of the AVRoutePickerView does not reflect the playback state. There is no cover art and it says "Not Playing". When my device is locked, my lock screen shows the album art, metadata, and AirPlay routing as expected. My app uses ApplicationMusicPlayer, but I encounter the same behavior using SystemMusicPlayer. Any guidance on how to troubleshoot this? Is there any other way to integrate the system AirPlay picker into my app, or is this my only option? Thank you for reading.
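One thing worth ruling out before digging deeper (a diagnostic sketch only; MusicKit's players normally manage the audio session themselves, so this may change nothing): the system's now-playing UI is driven by the active playback session, so it can be useful to confirm one is in place.

```swift
import AVFAudio

// Diagnostic sketch: confirm a .playback audio session is active before
// starting playback. MusicKit usually handles this; shown only to rule it out.
func activatePlaybackSession() {
    let session = AVAudioSession.sharedInstance()
    do {
        try session.setCategory(.playback, mode: .default)
        try session.setActive(true)
    } catch {
        print("Audio session setup failed: \(error)")
    }
}
```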
Replies: 1 · Boosts: 0 · Views: 184 · Last activity: 11h
UVC over MFi – Is there official support? Implementation guidance?
Hello everyone, I'm looking for more detailed information regarding UVC (USB Video Class) over MFi within the Apple ecosystem and would appreciate some clarification. I'm interested in developing (or interfacing with) an accessory that transmits video over USB using the UVC standard, and I'd like to better understand how this works within the MFi (Made for iPhone) program. Here are my main questions:

1. Do iOS devices provide native support for UVC over USB-C or Lightning within the MFi framework?
2. Are there any specific firmware or authentication requirements when the accessory is MFi-certified?
3. Does UVC support depend solely on the hardware interface (USB-C vs Lightning), or are there additional software-level requirements?
4. Is there any official documentation outlining the recommended flow for implementing UVC-based video capture accessories on iOS?

From what I understand, USB-C iPads appear to offer more direct support for standard UVC devices, but it's not entirely clear how this integrates with the MFi ecosystem on iOS, especially for commercial product development. If anyone has gone through this process or can point me to relevant technical documentation, I would greatly appreciate the guidance. Thank you!
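Not an answer to the MFi certification questions, but on the software side, recent OS releases expose standard UVC cameras through AVFoundation's external device type. A minimal discovery sketch, assuming iPadOS 17+ on a USB-C iPad (the MFi questions above are separate from this API):

```swift
import AVFoundation

// Sketch: discover external cameras (e.g. UVC devices) on iPadOS 17+ / macOS 14+.
@available(iOS 17.0, macOS 14.0, *)
func externalCameras() -> [AVCaptureDevice] {
    let discovery = AVCaptureDevice.DiscoverySession(
        deviceTypes: [.external],   // external cameras, which includes UVC devices
        mediaType: .video,
        position: .unspecified
    )
    return discovery.devices
}
```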
Replies: 0 · Boosts: 0 · Views: 25 · Last activity: 1d
Clarification on SPC Version 3 Availability and Requirements (SDK 26 Certificate Bundle)
Hello, I'm using a valid certificate bundle generated with SDK 26 (combined RSA-1024 + RSA-2048). However, all my devices currently still generate SPC v2 during playback, including my iPhone 16 under iOS 26.2. Apple staff mentioned that future iOS versions will send SPC v3 when using an SDK 26 certificate bundle. Could you please clarify:

- Which iOS/macOS versions will first support SPC v3?
- Are there any additional client-side requirements (Safari version, playback APIs, headers, etc.) to trigger SPC v3?
- Is there any way to test SPC v3 today, e.g., using beta builds?

Thank you!
Replies: 0 · Boosts: 1 · Views: 53 · Last activity: 2d
Lost 32-digit ASk
I received my approval for FairPlay Streaming (FPS) and was getting things organized when my computer crashed. So... yes, I lost the 32-digit Account Security Key (ASk) that I was repeatedly warned not to lose... I understand that I can't query Apple for the existing ASk, and I don't see where I can delete the existing cert to request another one. So I assume I'll need to start from scratch with another FPS approval process. Can someone please direct me on next steps for this boneheaded situation? Thank you
Replies: 5 · Boosts: 0 · Views: 88 · Last activity: 3d
Video in "Made for iPad" apps on macOS
I'm relatively new to Swift development (and native iOS development, for that matter). I've got an iOS app that uses the iPhone/iPad built-in cameras, and I'm looking to make it more compatible with macOS. Using the normal AVCaptureDevice.DiscoverySession I get the iPhone Continuity Camera and the built-in MacBook Pro camera, but I don't see other input devices that I see in QuickTime Player (for example), such as connected external cameras or virtual inputs provided by NDI Virtual Input and OBS. Is there a way to access these without a specific Mac build? The rest of the functionality works great, and I'd rather not diverge the codebase too much; it's easier to update one app than two!
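For what it's worth, a sketch of what a broader discovery session might look like when the binary runs on macOS; the device types shown are real, but whether an iPad app running on macOS is allowed to see them is exactly the open question here:

```swift
import AVFoundation

// Sketch: request external and Continuity cameras in addition to built-in ones.
// On macOS 14+, USB/UVC (and virtual) cameras surface via .external; the iPhone
// appears as .continuityCamera. Availability checks guard the newer types.
func discoverVideoDevices() -> [AVCaptureDevice] {
    var types: [AVCaptureDevice.DeviceType] = [.builtInWideAngleCamera]
    if #available(iOS 17.0, macOS 14.0, *) {
        types.append(.external)
    }
    if #available(iOS 16.0, macOS 13.0, *) {
        types.append(.continuityCamera)
    }
    let discovery = AVCaptureDevice.DiscoverySession(
        deviceTypes: types, mediaType: .video, position: .unspecified)
    return discovery.devices
}
```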
Replies: 0 · Boosts: 0 · Views: 29 · Last activity: 4d
AVAudioSession.outputVolume does not reflect system volume changes made while app is in background
I have a question regarding the behavior of AVAudioSession.sharedInstance().outputVolume.

Observed behavior:

1. When the app is in the foreground, I read audioSession.outputVolume (for example, 0.1).
2. The app is then moved to the background.
3. While the app is in the background, the user changes the system volume using the hardware buttons (for example, to 0.5).
4. When the app returns to the foreground, audioSession.outputVolume still reports the previous value (0.1).

From my testing, outputVolume only seems to update when the system volume is changed while the app is in the foreground. Volume changes made while the app is in the background are not reflected when the app returns to the foreground.

Apple's documentation for AVAudioSession.outputVolume describes it as "The systemwide output volume set by the user." (https://developer.apple.com/documentation/avfaudio/avaudiosession/outputvolume) However, based on our testing on iOS 18.6.2 and iOS 18.1, the observed behavior differs from this description.

Questions:

- The documentation states that outputVolume represents the system-wide volume set by the user, yet in our testing the value does not reflect volume changes made while the app is in the background. Is this the expected behavior of AVAudioSession.outputVolume?
- Is there any other recommended way in Swift to retrieve the current system volume that reflects user changes made both while the app is in the foreground and while it is in the background?

Any clarification on the intended behavior or recommended handling would be greatly appreciated.
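A workaround pattern some apps try (a sketch only; whether re-activating the session actually refreshes the stale value on iOS 18 is precisely what this post is asking) is to re-read outputVolume when the app returns to the foreground:

```swift
import AVFAudio
import UIKit

// Sketch: re-read the volume on return to foreground. Re-activating the
// session here is an experiment, not a documented refresh mechanism.
let foregroundObserver = NotificationCenter.default.addObserver(
    forName: UIApplication.didBecomeActiveNotification,
    object: nil, queue: .main
) { _ in
    let session = AVAudioSession.sharedInstance()
    try? session.setActive(true)
    print("outputVolume on foreground:", session.outputVolume)
}
```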
Replies: 1 · Boosts: 1 · Views: 161 · Last activity: 4d
AVAudioSession.outputVolume not reporting correctly in iOS 18+ devices
I'm using the shared instance of AVAudioSession. After activating it with .setActive(true), I observe outputVolume, and it correctly reports the device's volume. However, after deactivating the session using .setActive(false), changing the volume, and then reactivating the session, outputVolume returns the previous volume (from before deactivation), not the current device volume. The correct volume is only reported after the user manually changes it again using the physical buttons or Control Center, which triggers the observer.

What I need is a way to retrieve the actual current device volume immediately after reactivating the audio session, even on the second and subsequent activations. Disabling and re-enabling the audio session is essential to how my application functions.

I've tested this behavior with my colleagues, and the issue is consistently reproducible on iOS 18.0.1, 18.1, 18.3, 18.5, and 18.6.2. On devices running iOS 17.6.1 and iOS 16.0.3, outputVolume correctly reflects the current volume immediately after calling .setActive(true) multiple times.
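For completeness, the observer mentioned above is typically wired up with KVO; a minimal sketch (note this fires on user-initiated changes while observing, and does not fix the stale value reported after re-activation):

```swift
import AVFAudio

// Sketch: KVO on outputVolume. The observation must be retained for as long
// as updates are wanted.
final class VolumeWatcher {
    private var observation: NSKeyValueObservation?

    func start() throws {
        let session = AVAudioSession.sharedInstance()
        try session.setActive(true)
        observation = session.observe(\.outputVolume, options: [.initial, .new]) { _, change in
            print("outputVolume:", change.newValue ?? -1)
        }
    }
}
```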
Replies: 3 · Boosts: 1 · Views: 253 · Last activity: 4d
Unable to trigger AudioRecordingIntent from background
I am building an app where I am using AudioRecordingIntent to start audio recording from Shortcuts, the Action button, etc. Whenever I set that up, I notice that I get an error:

Unknown NSError Live Activity start failed: The operation couldn’t be completed. Target is not foreground

I explicitly try to start the Live Activity and then start the audio recording, and that's when I see this error. How can I make this work? I am unable to find any examples.
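For context, a minimal sketch of the kind of Live Activity start that produces this error when the calling process isn't considered foreground-eligible; RecordingAttributes is a hypothetical attributes type, not something from the post:

```swift
import ActivityKit

// Hypothetical attributes type for a recording Live Activity.
struct RecordingAttributes: ActivityAttributes {
    struct ContentState: Codable, Hashable {
        var elapsedSeconds: Int
    }
}

// Sketch: Activity.request throws when the calling context isn't eligible,
// which matches the "Target is not foreground" error the post describes.
func startRecordingActivity() throws -> Activity<RecordingAttributes> {
    try Activity.request(
        attributes: RecordingAttributes(),
        content: .init(state: .init(elapsedSeconds: 0), staleDate: nil)
    )
}
```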
Replies: 0 · Boosts: 0 · Views: 25 · Last activity: 4d
Blockchain question
I'm a normie: not a developer at all. My idea might be super dumb. Would it be possible to let us have a button in iPhone Photos that, when toggled, allows us to save certain chosen raw images to an iPhone blockchain, AND have Apple authenticate that they are native photos, marked the millisecond they were taken, that they are native and no AI was used on those images? That might go a long way toward restoring trust in the truth of photos again. We could also have the same thing for AI: a marked notification in the data on AI photos that can't be erased. Sorry if this is already underway and I'm just a normal person and therefore don't know it. I just want to be able to trust things again. 🤷🏽‍♀️
Replies: 0 · Boosts: 0 · Views: 16 · Last activity: 4d
ioreg AVBControllerState - AVB/EAV Mode
Hello. To determine whether the "AVB/EAV Mode" of an AV-capable network interface is turned on or off, I query the IO registry and evaluate the property "AVBControllerState". I was wondering if this is the "correct" approach and if anything is known about the values for this property? Network interfaces without AV capability may also carry this property (e.g., for my WiFi adapter the value is 1), whereas the value for interfaces with AV capability can be 0 or 3, at least as far as I could observe with my limited number of test devices at hand. Is it safe to assume that a value of 3 means this feature is turned on, 0 that it is turned off, and to ignore values of 1? Is there another approach to get the status of the "AVB/EAV Mode"? Thanks for any insight. Best regards, Ingo
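A programmatic version of the ioreg query described above, sketched with IOKit (assumptions: the property lives on the IONetworkInterface entry itself and is an integer; on some systems it may sit on a parent controller instead):

```swift
import IOKit

// Sketch: read "AVBControllerState" from each IONetworkInterface entry,
// keyed by the interface's BSD name (e.g. "en0").
func avbControllerStates() -> [String: Int] {
    var states: [String: Int] = [:]
    var iterator: io_iterator_t = 0
    let matching = IOServiceMatching("IONetworkInterface")
    guard IOServiceGetMatchingServices(kIOMainPortDefault, matching, &iterator) == KERN_SUCCESS else {
        return states
    }
    defer { IOObjectRelease(iterator) }

    var entry = IOIteratorNext(iterator)
    while entry != 0 {
        let name = IORegistryEntryCreateCFProperty(entry, "BSD Name" as CFString,
                                                   kCFAllocatorDefault, 0)?
            .takeRetainedValue() as? String
        let state = IORegistryEntryCreateCFProperty(entry, "AVBControllerState" as CFString,
                                                    kCFAllocatorDefault, 0)?
            .takeRetainedValue() as? Int
        if let name, let state {
            states[name] = state
        }
        IOObjectRelease(entry)
        entry = IOIteratorNext(iterator)
    }
    return states
}
```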
Replies: 0 · Boosts: 0 · Views: 33 · Last activity: 6d
coreaudio-api mailing list search broken
Hello, the search functionality of the coreaudio-api mailing list archive has been broken for a very long time. Several of the lower-level audio APIs have only been discussed on this mailing list, making it critical for those of us maintaining old audio code.

Steps to reproduce:

1. Open https://lists.apple.com/archives/list/coreaudio-api@lists.apple.com/ in your web browser.
2. Enter a search term in the "Search this list" field in the top-right corner of the page.
3. The search eventually times out with "502 Bad Gateway".

Can somebody please forward this information to the current maintainer? I've tried to contact developer support, but they weren't sure what to do. Thanks!
Replies: 1 · Boosts: 0 · Views: 174 · Last activity: 1w
Inquiry regarding CoreMediaErrorDomain Code=-12880 in LL-HLS playback
Hello, I am developing a custom player SDK based on AVPlayer. While testing LL-HLS streams, I intermittently encounter the following error:

Error Domain=CoreMediaErrorDomain Code=-12880

Since I cannot find documentation for this specific code, could you please clarify its meaning? Specifically, I would like to know whether this is a critical error that disrupts playback, or just a warning that can be safely ignored. Any insights would be appreciated. Thank you.
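One way to gather more signal (a sketch; streamURL is a placeholder): CoreMedia error codes frequently appear in the item's error log without playback failing, so watching both the error log and the item status helps distinguish a transient warning from a fatal error:

```swift
import AVFoundation

let streamURL = URL(string: "https://example.com/llhls/master.m3u8")! // placeholder
let item = AVPlayerItem(url: streamURL)
let player = AVPlayer(playerItem: item)

// Non-fatal entries land in the item's error log.
let logObserver = NotificationCenter.default.addObserver(
    forName: .AVPlayerItemNewErrorLogEntry, object: item, queue: .main
) { _ in
    if let event = item.errorLog()?.events.last {
        print("log:", event.errorDomain, event.errorStatusCode, event.errorComment ?? "")
    }
}

// A genuinely fatal error flips the item's status to .failed.
let statusObservation = item.observe(\.status) { item, _ in
    if item.status == .failed {
        print("fatal:", item.error ?? "unknown error")
    }
}

player.play()
```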
Replies: 1 · Boosts: 0 · Views: 85 · Last activity: 1w
Keeping PiP alive during third-party video recording (camera capture)
I'm building a teleprompter-style app that relies on Picture in Picture. PiP starts correctly on device, and everything works until another app (e.g. TikTok or Instagram) starts active video recording. When camera capture begins in the foreground app, iOS terminates my PiP session. Some teleprompter apps appear to keep PiP active while recording in other apps, so I'm trying to understand the recommended architectural pattern for this scenario. Is there a documented approach or best practice to keep PiP stable during third-party camera capture? I'm looking specifically for guidance on the correct AVKit / AVAudioSession configuration for this use case.
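On the AVAudioSession side, a configuration sketch that opts into mixing with other apps' audio (a real, documented option; whether it is sufficient to keep PiP alive once another app starts camera capture is exactly what's unconfirmed here):

```swift
import AVFAudio

// Sketch: a mixable playback session, so another app starting its own
// session (e.g. for recording) is less likely to interrupt this one.
func configureMixablePlaybackSession() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playback,
                            mode: .moviePlayback,
                            options: [.mixWithOthers])
    try session.setActive(true)
}
```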
Replies: 0 · Boosts: 0 · Views: 76 · Last activity: 1w
Camera layout within a sheet in iOS 26
When inputting data within a sheet, I allow the user to take a photo, so the camera is presented within a second sheet. However, the camera controls are centered relative to the iPhone's entire screen, cropping the top controls and not extending down to the bottom of the phone's screen. Any help on how to fix this?
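A common workaround is to present the camera with a full-screen cover rather than a second sheet, since the camera UI lays its controls out against the full device bounds. A sketch (CameraPicker is an illustrative wrapper, not from the post):

```swift
import SwiftUI
import UIKit

// Illustrative UIImagePickerController wrapper.
struct CameraPicker: UIViewControllerRepresentable {
    func makeUIViewController(context: Context) -> UIImagePickerController {
        let picker = UIImagePickerController()
        picker.sourceType = .camera
        return picker
    }
    func updateUIViewController(_ picker: UIImagePickerController, context: Context) {}
}

struct DataEntrySheet: View {
    @State private var showCamera = false

    var body: some View {
        Button("Take Photo") { showCamera = true }
            // fullScreenCover instead of a nested .sheet, so the camera's
            // controls line up with the screen they are laid out against.
            .fullScreenCover(isPresented: $showCamera) {
                CameraPicker().ignoresSafeArea()
            }
    }
}
```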
Replies: 2 · Boosts: 0 · Views: 69 · Last activity: 1w
Crash when trying to get originatingRecipient
According to the documentation (https://developer.apple.com/documentation/avfoundation/avcontentkeyrequest/originatingrecipient?changes=_3&language=objc), starting with iOS 18.4 I can get the AVContentKeyRecipient from an AVContentKeyRequest. But when I try to get it, I get a crash. What could be the issue? I want to note that I add the asset to the AVContentKeySession using the addContentKeyRecipient method (https://developer.apple.com/documentation/avfoundation/avcontentkeysession/addcontentkeyrecipient(_:)?changes=_3&language=objc).
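If the crash is an availability issue (the property existing in SDK headers but not on the running OS), a guard like the following sketch would at least isolate it; this is an assumption about the cause, not a confirmed diagnosis:

```swift
import AVFoundation

// Sketch: only touch originatingRecipient on OS versions that implement it.
func recipient(for request: AVContentKeyRequest) -> (any AVContentKeyRecipient)? {
    if #available(iOS 18.4, *) {
        return request.originatingRecipient
    }
    return nil
}
```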
Replies: 1 · Boosts: 0 · Views: 127 · Last activity: 1w
FairPlay: SDK4 vs SDK26 credentials/certificate for iOS/tvOS client apps
Hi, we're updating our KSM to support SPC v2/v3 and currently operate with both legacy SDK4 credentials (ASK + 1024-bit cert) and SDK26 credentials (certificate bundle + provisioning data + 1024/2048-bit keys). Our client apps run across a wide range of iOS/tvOS versions, so we want to follow Apple's recommended client strategy for certificate selection. The docs describe SHA-1 vs SHA-256 in the SPC header, but do not specify which OS versions should use SDK4 vs SDK26 credentials. Could you clarify:

- Is there an official minimum iOS/tvOS version where you recommend SDK26 credentials for client apps?
- For older OS versions (e.g. iOS 15), is SDK4 still the recommended choice for client apps?
- Are there any official migration guidelines for client apps moving from SDK4 to SDK26 credentials?

Thanks in advance.
Replies: 2 · Boosts: 0 · Views: 201 · Last activity: 1w
Background Upload Extension Bug on iOS 26.2
While implementing the new background backup feature introduced in iOS 26.1, I create a PHAssetResourceUploadJob in an extension. On iOS 26.1, the system successfully triggers the upload. However, on iOS 26.2, although the job is created successfully and all related configuration is set correctly, the system does not trigger the upload. Could you please help confirm the cause of this issue? Thank you.
Replies: 0 · Boosts: 0 · Views: 76 · Last activity: 1w
Cannot generate 2048-bit FairPlay Streaming certificate
Hello, I have a problem generating a 2048-bit FairPlay Streaming certificate. I tried generating an SDK v26.x certificate in two ways:

1. Use an existing certificate
2. Create a new certificate

In both cases, Apple gives me a certificate bundle containing a 1024-bit certificate (fps_certificate.bin), even though I uploaded a 2048-bit CSR when creating the certificate. Just to note, I created an SDK v4.x certificate a few years ago. Has anyone bumped into the same issue? Or am I missing something?
Replies: 5 · Boosts: 0 · Views: 865 · Last activity: 1w
🎧 Detect if headphones are the only playback device for the current session
I need to apply a headphone-specific scenario only when headphones are the sole active playback device in my iOS audio app. The problem is that there is no definitive way to establish that headphones are the sole active playback device. The portTypes of AVAudioSession.currentRoute.outputs don't guarantee headphones:

```swift
let session = AVAudioSession.sharedInstance()
let outputs = session.currentRoute.outputs
let headphonesOnly = outputs.count == 1 &&
    (outputs.first?.portType == .headphones ||
     outputs.first?.portType == .bluetoothA2DP ||
     outputs.first?.portType == .bluetoothHFP ||
     outputs.first?.portType == .bluetoothLE)
```

The issue with the code above is that the listed Bluetooth profiles (A2DP, HFP, LE) can be used by any audio device, not only headphones. Is there any public API on iOS that can:

- Distinguish Bluetooth headphones vs Bluetooth speakers when both use A2DP/LE?
- Expose the user's "Device Type" classification (headphones / speaker / car stereo, etc.) that is shown in Settings → Bluetooth → Device Type?
- Provide a more reliable way to know "this route is definitely headphones" for A2DP devices, beyond portType and portName string heuristics?
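Unrelated to the missing classification API, whatever heuristic ends up being used needs re-evaluation whenever the route changes; a minimal sketch:

```swift
import AVFAudio

// Sketch: re-run the headphones check on every route change. This keeps the
// heuristic current; it does not solve the classification problem above.
let routeObserver = NotificationCenter.default.addObserver(
    forName: AVAudioSession.routeChangeNotification,
    object: nil, queue: .main
) { _ in
    let outputs = AVAudioSession.sharedInstance().currentRoute.outputs
    let wiredHeadphonesOnly = outputs.count == 1 && outputs.first?.portType == .headphones
    print("wired headphones only:", wiredHeadphonesOnly)
}
```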
Replies: 0 · Boosts: 0 · Views: 63 · Last activity: 1w