Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.

Posts under the Media Technologies topic


Selecting an appropriate AVCaptureDeviceFormat
My app currently captures video using an AVCaptureSession configured with the AVCaptureSessionPreset1920x1080 preset. However, I'd like to update this behavior so that video can be recorded at a range of different resolutions. There isn't a preset corresponding to each desired resolution, so I thought I'd instead set the AVCaptureDeviceFormat directly. For any desired resolution, I would find the closest format that doesn't go under the desired resolution, and then crop it down as a post-processing step.

However, I've observed that a device can offer a range of formats at each resolution, with various differing settings. Presumably there is logic within AVCaptureSession that selects a reasonable default from among them, but since I am applying the format directly, I don't think I have a way to make use of that default logic, and it appears to be undocumented. Does this mean that the only way to select a format is to implement a comparison function that considers every property on AVCaptureDeviceFormat, and then sort the formats according to this comparator? If so, what happens if a new property is added to AVCaptureDeviceFormat in the future? The sort would not take this new property into account, and the function might select a format with an undesired new property.

Are there any guarantees about which types of formats will be supported on a device? For example, can I take for granted that a '420v' format will exist at each resolution? If so, I could filter the formats down to only those with this setting without risking filtering out all of the supported formats. I suspect I may be missing something obvious. Any help would be greatly appreciated!
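A minimal sketch of the selection strategy described above, assuming (not guaranteed by the docs) that a '420v' format exists at the needed sizes: filter to that pixel format, then pick the smallest format whose dimensions still cover the target resolution, minimizing the crop.

```swift
import AVFoundation

// A sketch, not a documented default: restrict to '420v' formats, then take
// the smallest format that still covers the target resolution.
func selectFormat(for device: AVCaptureDevice,
                  targetWidth: Int32,
                  targetHeight: Int32) -> AVCaptureDevice.Format? {
    let candidates = device.formats.filter { format in
        let desc = format.formatDescription
        let dims = CMVideoFormatDescriptionGetDimensions(desc)
        let pixelFormat = CMFormatDescriptionGetMediaSubType(desc)
        return pixelFormat == kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange // '420v'
            && dims.width >= targetWidth
            && dims.height >= targetHeight
    }
    return candidates.min { a, b in
        let da = CMVideoFormatDescriptionGetDimensions(a.formatDescription)
        let db = CMVideoFormatDescriptionGetDimensions(b.formatDescription)
        return da.width * da.height < db.width * db.height
    }
}
```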
3 replies · 0 boosts · 731 views · Dec ’24
Custom FairPlay DRM error handling mechanics
Hi, I have a use case where I'd like to handle and prevent automatic retries whenever certain errors occur during FairPlay content key requests. Here's the current flow:

1. The FairPlay certificate is requested and obtained from my server.
2. makeStreamingContentKeyRequestData is called on the keyRequest.
3. The license server returns a 403 along with a body containing a JSON with the detailed code and message.
4. The error is caught and handled properly by calling AVContentKeyRequest.processContentKeyResponseError.
5. The AVContentKeySession automatically retries up to 8 times by providing a new key request through public func contentKeySession(_ session: AVContentKeySession, didProvide keyRequest: AVContentKeyRequest).
6. My license server gets hit with 8 requests that will always result in a 403; these retries are useless.
7. My custom error is successfully caught later down the line through AVPlayerItem.observe(\.status), which is great.

The thing is, I'd like to catch the 403 error and prevent any retry from being made at step 5, ideally through public func contentKeySession(_ session: AVContentKeySession, contentKeyRequest keyRequest: AVContentKeyRequest, didFailWithError err: Error). I've looked for quite a while and just can't seem to find any way of achieving this. Is this not supported at all?
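One workaround idea, sketched below under the assumption that nothing in the API cancels the retries themselves: cache the key identifiers that already failed with a 403 and fail their retries immediately, so at least the license server is not hit again. requestLicense(for:completion:) is a hypothetical placeholder for the SPC/CKC exchange described above.

```swift
import AVFoundation

// Sketch only: remember key identifiers that already failed with a 403 and
// fail their automatic retries immediately, without another server round-trip.
final class ContentKeyDelegate: NSObject, AVContentKeySessionDelegate {
    private var failedKeyIDs = Set<String>()

    func contentKeySession(_ session: AVContentKeySession,
                           didProvide keyRequest: AVContentKeyRequest) {
        let keyID = keyRequest.identifier as? String ?? ""
        guard !failedKeyIDs.contains(keyID) else {
            // Short-circuit the retry with a cached failure.
            keyRequest.processContentKeyResponseError(
                NSError(domain: "LicenseServer", code: 403))
            return
        }
        requestLicense(for: keyRequest) { result in
            switch result {
            case .success(let response):
                keyRequest.processContentKeyResponse(response)
            case .failure(let error):
                self.failedKeyIDs.insert(keyID)
                keyRequest.processContentKeyResponseError(error)
            }
        }
    }

    // Hypothetical placeholder for the certificate + SPC/CKC exchange.
    private func requestLicense(for keyRequest: AVContentKeyRequest,
                                completion: @escaping (Result<AVContentKeyResponse, Error>) -> Void) {
        // makeStreamingContentKeyRequestData, POST to the license server, etc.
    }
}
```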
0 replies · 8 boosts · 711 views · Dec ’24
Increased delay when AUGraph's output device is system output
I'm using an AUGraph to mix audio from different sources for a real-time streaming application. Whenever the audio device used as the graph's output device is also the Mac's default output device, the measured latency increases by about 35 milliseconds for wired devices. Any idea why this is? Is there a way around this besides nagging the user not to use the system output in our app?
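No fix is suggested here, but a minimal sketch of detecting the condition, using the standard Core Audio default-device query, so the app can at least surface the latency trade-off to the user:

```swift
import CoreAudio

// Query the system default output device so the app can detect when the
// graph's output device matches it.
func defaultOutputDeviceID() -> AudioDeviceID? {
    var deviceID = AudioDeviceID(0)
    var size = UInt32(MemoryLayout<AudioDeviceID>.size)
    var address = AudioObjectPropertyAddress(
        mSelector: kAudioHardwarePropertyDefaultOutputDevice,
        mScope: kAudioObjectPropertyScopeGlobal,
        mElement: kAudioObjectPropertyElementMain)
    let status = AudioObjectGetPropertyData(AudioObjectID(kAudioObjectSystemObject),
                                            &address, 0, nil, &size, &deviceID)
    return status == noErr ? deviceID : nil
}
```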
0 replies · 0 boosts · 292 views · Dec ’24
Camera Preview Plugin displayed black screen after update iOS 18.1.1
Hi Team, the camera preview plugin stopped working after upgrading the phone to iOS 18.1.1. After tapping the camera icon, only a black screen is shown, while the same build works fine on iOS versions prior to 18. Is there any way to get the CameraPreview plugin working in the application again, or to resolve this issue? Error: CameraPreview plugin upgraded to iOS 18
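A black screen after an OS upgrade is often a permissions or session-interruption problem rather than a plugin bug; that is only an assumption, but a minimal diagnostic check would be:

```swift
import AVFoundation

// Assumption, not a confirmed cause: verify camera permission before
// investigating the plugin itself.
switch AVCaptureDevice.authorizationStatus(for: .video) {
case .authorized:
    print("Camera permission OK; inspect the capture session configuration instead.")
case .notDetermined:
    AVCaptureDevice.requestAccess(for: .video) { granted in
        print("Camera access granted: \(granted)")
    }
case .denied, .restricted:
    print("Camera access denied or restricted; direct the user to Settings.")
@unknown default:
    break
}
```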
1 reply · 0 boosts · 547 views · Dec ’24
How to Capture 48MP Photos with Ultra-Wide Camera During AR Session on iPhone 16 Pro?
Hello Developers, I am working on an app where I need to capture 48MP high-resolution photos using the ultra-wide camera of the iPhone 16 Pro while an AR session is running. The goal is to take these photos without interrupting or impacting the AR session, which uses the main wide-angle camera. Despite extensive testing and various approaches, we have been unable to achieve the desired functionality.

What We Have Tried So Far

1. Using AVCaptureMultiCamSession:
• We attempted to leverage AVCaptureMultiCamSession to simultaneously use the wide-angle camera for ARKit and the ultra-wide camera for photo capture.
• However, this approach resulted in resource conflicts, with errors such as Cannot Record (OSStatus error -16409) and dropped frames. Additionally, the ultra-wide camera feed would frequently freeze or stop.

2. Dedicated AVCaptureSession for the Ultra-Wide Camera:
• We separated the ultra-wide camera into its own AVCaptureSession while letting ARKit exclusively use the wide-angle camera.
• This setup showed initial promise, but the ultra-wide camera feed would still stop running after a very short time (under one second).
• Debugging logs indicated potential system-level interruptions, possibly due to resource prioritization by iOS.

3. Notification-Based Monitoring:
• We implemented monitoring for session interruptions (AVCaptureSession.wasInterruptedNotification), but this provided limited insight into the exact cause of the session stopping.
• We suspect iOS is de-prioritizing the ultra-wide camera session due to resource management policies or conflicts with ARKit.

4. Adjusting Camera Configurations:
• We attempted to simplify both the ARKit and AVCaptureSession configurations by reducing features like depth data and by using lower session presets for video capture. However, the core issue persisted.

The Core Problem
• The ultra-wide camera session frequently stops or freezes when used alongside ARKit.
• Capturing high-resolution 48MP photos during the AR session is critical to the functionality of our app.

Question
Has anyone successfully implemented a similar setup? Specifically:
• Capturing 48MP photos with the ultra-wide camera while ARKit is actively using the main camera.
• Avoiding conflicts between ARKit and AVCaptureSession for the ultra-wide camera.

Any insights, suggestions, or alternative approaches would be greatly appreciated. Thank you in advance for your help! 😊
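For reference, a minimal sketch of approach 2 above: a dedicated session for the ultra-wide camera, opted in to its largest supported photo dimensions. It shows only the configuration steps and does not resolve the ARKit resource conflict described in the post.

```swift
import AVFoundation

// Sketch of a dedicated ultra-wide photo session requesting the largest
// supported photo dimensions (48MP where the hardware offers it).
func makeUltraWidePhotoSession() -> (AVCaptureSession, AVCapturePhotoOutput)? {
    guard let device = AVCaptureDevice.default(.builtInUltraWideCamera,
                                               for: .video, position: .back),
          let input = try? AVCaptureDeviceInput(device: device) else { return nil }

    let session = AVCaptureSession()
    guard session.canAddInput(input) else { return nil }
    session.addInput(input)

    let photoOutput = AVCapturePhotoOutput()
    guard session.canAddOutput(photoOutput) else { return nil }
    session.addOutput(photoOutput)

    // Pick the largest photo dimensions the active format supports.
    if let maxDims = device.activeFormat.supportedMaxPhotoDimensions
        .max(by: { $0.width * $0.height < $1.width * $1.height }) {
        photoOutput.maxPhotoDimensions = maxDims
    }
    return (session, photoOutput)
}
```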
1 reply · 0 boosts · 636 views · Dec ’24
Delay w/ new AudioTap API when system device is a BL device
I'm capturing audio from other applications on macOS to mix it with other sources in a real-time streaming application. I noticed that audio data captured via the new tapping mechanism introduced in macOS 14.2 arrives delayed in my app when the macOS system output device is a Bluetooth headphone, e.g. Apple AirPods. Sometimes this delay is about 300-400 milliseconds, which makes it unusable for live streaming, because the audio is out of sync with the video and also with audio captured from other devices. What is confusing to me is that this also happens when my app does not even use that output device. Is this a known issue? Is there a way around this?
1 reply · 0 boosts · 384 views · Dec ’24
AudioServicesPlaySystemSound not playing through BluetoothA2DP device
Hello, we have an application that plays some sounds via the system sound APIs from the AudioToolbox framework:

AudioServicesCreateSystemSoundID(url as CFURL, &soundID)
AudioServicesPlaySystemSoundWithCompletion(soundID)

We make sure that an active audio session is available before playing the system sound. But when the device is connected to a Bluetooth A2DP device, the sounds are played through the device speaker and not through the Bluetooth A2DP device. Our audio session is configured with the following options: [.allowBluetooth, .defaultToSpeaker, .allowBluetoothA2DP]. Sounds played via AVAudioPlayer with similar code are routed to the Bluetooth A2DP device. Is this a bug in the AudioToolbox framework?
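A minimal sketch of the AVAudioPlayer path that the post reports routing correctly, using the same session options; the sound file name is hypothetical, and the caller must keep a strong reference to the returned player while it plays.

```swift
import AVFoundation

// Same session configuration as the post; "chime.mp3" is a hypothetical asset.
func playSound() throws -> AVAudioPlayer? {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord,
                            options: [.allowBluetooth, .defaultToSpeaker, .allowBluetoothA2DP])
    try session.setActive(true)

    guard let url = Bundle.main.url(forResource: "chime", withExtension: "mp3") else {
        return nil
    }
    let player = try AVAudioPlayer(contentsOf: url)
    player.play()
    return player // retain this while playback is in progress
}
```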
2 replies · 0 boosts · 545 views · Dec ’24
iPhone Video Upload Error - reason unknown
Hi everyone, I'm developing a customization tool in which our customers can upload an mp3 or mp4 file that will be scannable through our AR application. On desktop and Android this works perfectly. For some reason, however, on iPhone we're unable to load most of the video files. I've checked the clips, and they are .mov/H.264 files, which are supported by iPhone. We're currently not sure how we can fix this issue to allow customers who own an iPhone to upload clips to our website. Any tips in the right direction are more than welcome. Thanks in advance!
1 reply · 0 boosts · 498 views · Dec ’24
AVPlayer fail
When I play a local video (I downloaded it to the sandbox), KVO shows the AVPlayerItem status as AVPlayerItemStatusFailed, and the error is Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo={NSLocalizedFailureReason=An unknown error occurred (24), NSLocalizedDescription=The operation could not be completed, NSUnderlyingError=0x3004137e0 {Error Domain=NSPOSIXErrorDomain Code=24 "Too many open files"}}. Why?
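The underlying NSPOSIXErrorDomain code 24 is EMFILE: the process has exhausted its open file descriptors, which usually points to leaked file handles rather than an AVPlayer bug (an assumption worth verifying). A minimal diagnostic sketch:

```swift
import Darwin

// POSIX error 24 (EMFILE) means the process hit its file-descriptor limit.
// Print the limit to compare against actual usage, e.g. unclosed FileHandles
// or repeatedly re-created player items.
var limit = rlimit()
if getrlimit(RLIMIT_NOFILE, &limit) == 0 {
    print("File descriptor limit: soft=\(limit.rlim_cur), hard=\(limit.rlim_max)")
}
```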
2 replies · 0 boosts · 353 views · Dec ’24
AVAudioEngineConfigurationChange Clearing AVPlayerNode
Hi all, I am working on an app where I have live prompts playing, in addition to a voice channel that sometimes becomes active. Right now I am using two different AVAudioSession configurations, so that we only switch to a mic-enabled mode when we actually need input from the mic. These are defined below.

When just using the device hardware, everything works as expected: the modes change and the playback continues as needed. However, when using Bluetooth devices such as AirPods, where the switch from A2DP to HFP is needed, I am getting an AVAudioEngineConfigurationChange notification. In response I am tearing down the engine and creating a new one with the same two player nodes. This does work fine and there are no crashes, except all the audio I have scheduled on a player node has now been cleared. All the completion blocks marked with .dataPlayedBack return the second this event happens, leaving me in a state where I now have a valid engine setup again but have no idea what actually played, or was errantly marked as such.

Is this the expected behavior when getting a configuration change notification?

Adding some information below about my audio graph for context. These are the parts of the graph; I disconnect them when getting this event and do the same with the new engine:

```swift
private var inputEngine: AVAudioEngine
private var audioEngine: AVAudioEngine
private let voicePlayerNode: AVAudioPlayerNode
private let promptPlayerNode: AVAudioPlayerNode

audioEngine.attach(voicePlayerNode)
audioEngine.attach(promptPlayerNode)

audioEngine.connect(
    voicePlayerNode,
    to: audioEngine.mainMixerNode,
    format: voiceNodeFormat
)

audioEngine.connect(
    promptPlayerNode,
    to: audioEngine.mainMixerNode,
    format: nil
)
```

An example of how I am scheduling playback, where the completion fires even if the buffer didn't actually play:

```swift
private func scheduleVoicePlayback(_ id: AudioPlaybackSample.Id, buffer: AVAudioPCMBuffer) async throws {
    guard !voicePlayerQueue.samples.contains(where: { $0 == id }) else {
        return
    }
    seprateQueue.append(buffer)

    if !isVoicePlaying {
        activateAudioSession()
    }

    voicePlayerQueue.samples.append(id)

    if !voicePlayerNode.isPlaying {
        voicePlayerNode.play()
    }

    if let convertedBuffer = buffer.convert(to: voiceNodeFormat) {
        await voicePlayerNode.scheduleBuffer(convertedBuffer, completionCallbackType: .dataPlayedBack)
    } else {
        throw AudioPlaybackError.failedToConvert
    }

    voiceSampleHasBeenPlayed(id)
}
```

And lastly my audio session configuration, if it's useful:

```swift
extension AVAudioSession {
    static func setDefaultCategory() {
        do {
            try sharedInstance().setCategory(
                .playback,
                options: [
                    .duckOthers,
                    .interruptSpokenAudioAndMixWithOthers
                ]
            )
        } catch {
            print("Failed to set default category? \(error.localizedDescription)")
        }
    }

    static func setVoiceChatCategory() {
        do {
            try sharedInstance().setCategory(
                .playAndRecord,
                options: [
                    .defaultToSpeaker,
                    .allowBluetooth,
                    .allowBluetoothA2DP,
                    .duckOthers,
                    .interruptSpokenAudioAndMixWithOthers
                ]
            )
        } catch {
            print("Failed to set category? \(error.localizedDescription)")
        }
    }
}
```
1 reply · 0 boosts · 674 views · Dec ’24
M3 chip reverse video playback performance
We have developed a simple video player Swift application for macOS, which uses the AVFoundation framework. A special feature of this app is the ability to play video backward at speeds like -0.25x, -0.5x, and -1.0x. The MP4 video file is played directly from the local file system; the video codec is H.264 and the audio is AAC. The video files are huge, around 10 GB, with a length of 3 hours.

Playing video in the reverse direction works well on a MacBook Air with an M1 or M2 chip. When we run the same app with the same video on a MacBook Air with an M3 chip, the reverse playback is much worse: playback might stutter badly, especially in the latter part of the video. The same behavior also happens in Apple's QuickTime video player when playing in the reverse direction at -1x speed. What's even stranger is that at one point in time the video playback is totally smooth, but after a while the playback stutters again. For example, this morning reverse playback worked 100% smoothly; then I rebooted the Mac and tried again, and the result was stuttering. After that the Mac stayed idle for several hours, and when I tried to reverse-play the video again: smooth performance! My conclusion: M3 playback works fine if the stars in the sky are aligned correctly. :-)

So it's not only our app; QuickTime Player shows exactly the same behavior, and only with the M3 chip. The same symptom appears on another similar M3 Mac, so it can't be an isolated hardware fault. At the same time, the open-source video player iina can reverse-play the video well on the same Mac. All Macs have otherwise identical configurations: 16 GB RAM and macOS 15.1.1. Have you experienced the same problem? Any chance to solve it? I really hope that the M4-chip Macs behave better here.
4 replies · 0 boosts · 781 views · Dec ’24
AVAudioEngine - How to archive configured nodes to file?
I’m looking to add DAW-like capabilities to my macOS music app, and AVAudioEngine seems like the right tool for the job. However, I haven’t been able to find any documentation on how to save the user’s AVAudioEngine configuration—specifically the connections between nodes and the internal states of each node—to a file. Does AVAudioEngine provide any API for saving and restoring this state, or does it need to be handled manually? If it’s manual, are there any sample "DAW" apps or resources that demonstrate how this can be implemented? Any guidance would be greatly appreciated. Thanks, BD
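As far as I know there is no built-in archiving API for this, so the state has to be handled manually. A minimal sketch of one manual approach, with hypothetical type names: describe the graph in Codable value types, persist those, and rebuild the engine from them on load.

```swift
import Foundation

// Hypothetical document types for persisting a node graph description.
struct NodeDescription: Codable {
    var kind: String                  // e.g. "player", "reverb", "eq"
    var parameters: [String: Double]  // node-specific state to restore
}

struct ConnectionDescription: Codable {
    var fromNode: Int                 // indices into GraphDocument.nodes
    var toNode: Int
    var bus: Int
}

struct GraphDocument: Codable {
    var nodes: [NodeDescription]
    var connections: [ConnectionDescription]
}

func save(_ graph: GraphDocument, to url: URL) throws {
    try JSONEncoder().encode(graph).write(to: url)
}

func load(from url: URL) throws -> GraphDocument {
    try JSONDecoder().decode(GraphDocument.self, from: Data(contentsOf: url))
}
```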
1 reply · 0 boosts · 483 views · Dec ’24
Normal audio mode AirPods IOS 18.1
Hi everyone, I just upgraded my iPhone to 18.1.1 and noticed the absence of the normal mode on my AirPods. I can't stand the noise cancellation mode for too long, and the transparency one is overwhelming every time a hair brushes near my AirPods. Why is that? I don't really see the progression here. Can this be fixed in the next version? I don't want to buy another brand; I use them every day for many hours.
1 reply · 0 boosts · 407 views · Dec ’24
Error Domain=NSOSStatusErrorDomain Code=-16384, -16155, -16512
I’ve built a custom media player using AVSampleBufferAudioRenderer and AVSampleBufferRenderSynchronizer, and overall it works great! However, I’ve noticed some unusual logs popping up:

Domain: NSOSStatusErrorDomain
Error codes: -16384, -16155, -16512

The error -16512 keeps happening repeatedly for one of our users, preventing them from playing any media at all. I’ve searched around but can’t find any documentation explaining what these errors mean. Has anyone run into this issue or have any suggestions? Any help would be hugely appreciated! Thanks!
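These codes don't appear in the public headers, but observing the renderer's status at least captures the full NSError for the affected user instead of only a console log. A minimal sketch; the observation token must be retained:

```swift
import AVFoundation

// Observe the renderer's status to capture the underlying error object.
let renderer = AVSampleBufferAudioRenderer()
let observation = renderer.observe(\.status, options: [.new]) { renderer, _ in
    if renderer.status == .failed, let error = renderer.error {
        // Surfaces the full NSError behind the NSOSStatusErrorDomain log.
        print("Audio renderer failed: \(error)")
    }
}
```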
1 reply · 0 boosts · 971 views · Dec ’24
Does updating MPNowPlayingInfoPropertyElapsedPlaybackTime frequently harm performance?
Hi, I'm wondering about one of the properties in the MPNowPlayingInfoCenter: MPNowPlayingInfoPropertyElapsedPlaybackTime. The docs say that updating this property frequently is not required, because the system can automatically calculate elapsed playback time based on the infrequent values we provide. Is performance harmed by updating this property every second? Should I add some filtering/throttling to update this property infrequently? Am I overthinking this, and it doesn't matter either way? Kind regards.
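A minimal sketch of the pattern the docs describe: set the elapsed time only on playback-state transitions (play, pause, seek, rate change), together with the current rate, and let the system interpolate in between.

```swift
import MediaPlayer

// Call this from playback-state transitions only; the system extrapolates
// elapsed time from the playback rate in between updates.
func updateNowPlaying(elapsed: TimeInterval, rate: Float) {
    var info = MPNowPlayingInfoCenter.default().nowPlayingInfo ?? [:]
    info[MPNowPlayingInfoPropertyElapsedPlaybackTime] = elapsed
    info[MPNowPlayingInfoPropertyPlaybackRate] = rate
    MPNowPlayingInfoCenter.default().nowPlayingInfo = info
}
```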
1 reply · 0 boosts · 581 views · Dec ’24
Pictures won't import from iPhone to the Photo app on iMac
Since the OS was recently updated to 18.1.1 on my iPhone 15, I am no longer able to import my pictures into the Photos app on my iMac. I have to mention that my iMac is pretty old and is running OS High Sierra 10.13.6 and is not allowing me to update the OS to a newer version. Anyway, the main error message I get is: "Some items cannot be added to your Photo library because they may be an unrecognizable file format or the file may not contain valid data". Then, for each individual photo that failed to upload, the error message I get reads, "unable to read metadata. The file may be corrupt". However, videos import just fine from my iPhone to iMac. This was not a problem before the recent iPhone update. I tried closing the Photo app and reopening it. I tried restarting my iPhone and iMac but nothing seems to work. Any help would be much appreciated.
3 replies · 1 boost · 1.8k views · Dec ’24
AVAudioSession's "availableInputs" not updated in time
```swift
// Here addObserver for routeChangeNotification
func testAudioRoute() {
    // My app is a VoIP app, so I need to set "playAndRecord" and "allowBluetooth"
    try? AVAudioSession.sharedInstance().setCategory(.playAndRecord, options: [.duckOthers, .allowBluetooth, .allowBluetoothA2DP])
    NotificationCenter.default.addObserver(self, selector: #selector(currentRouteChanged(noti:)), name: AVAudioSession.routeChangeNotification, object: nil)
}

// Print the "availableInputs" once a notification arrives
@objc func currentRouteChanged(noti: Notification) {
    let availableInputs = AVAudioSession.sharedInstance().availableInputs?.compactMap({ $0.portType }) ?? []
    let currentRouteInputs = AVAudioSession.sharedInstance().currentRoute.inputs.compactMap({ $0.portType })
    let currentRouteOutputs = AVAudioSession.sharedInstance().currentRoute.outputs.compactMap({ $0.portType })
    print("willtest: \navailableInputs=\(availableInputs), \ncurrentRouteInputs=\(currentRouteInputs), \ncurrentRouteOutputs=\(currentRouteOutputs)")

    /*
    When BT (AirPods Pro 2) is CONNECTED, it prints the following when the notification comes; this is correct:
    ----------------------------------------------------------
    willtest:
    availableInputs=[__C.AVAudioSessionPort(_rawValue: MicrophoneBuiltIn), __C.AVAudioSessionPort(_rawValue: BluetoothHFP)],
    currentRouteInputs=[],
    currentRouteOutputs=[__C.AVAudioSessionPort(_rawValue: BluetoothA2DPOutput)]
    ----------------------------------------------------------
    When BT (AirPods Pro 2) is DISCONNECTED, it prints the following when the notification comes; this is wrong:
    ----------------------------------------------------------
    availableInputs=[__C.AVAudioSessionPort(_rawValue: MicrophoneBuiltIn), __C.AVAudioSessionPort(_rawValue: BluetoothHFP)],
    currentRouteInputs=[],
    currentRouteOutputs=[__C.AVAudioSessionPort(_rawValue: Speaker)]
    */
}
```

So my question here is: why does "availableInputs" still contain the __C.AVAudioSessionPort(_rawValue: BluetoothHFP) item even though I have already disconnected the BT device (put the AirPods in the case)? BTW, if I tap the "Manual" button once I've disconnected the BT device, it also prints the "wrong" value for "availableInputs", and it becomes normal after about 3-4 seconds.
4 replies · 0 boosts · 506 views · Dec ’24
MusicKit lastPlayedDate always nil
I am having trouble accessing the lastPlayedDate for any given album or track using MusicKit. The value is always nil, on numerous albums and tracks I tested. As far as I know, this is not a property that has to be fetched separately, like tracks for example. I am running this on my physical iPhone 12 on 18.1.1 with Xcode 16.1. The albums and tracks have definitely been played multiple times before, and the app has permission to the library via MusicAuthorization.request(). This post mentions the same problem but offers no solution. Thanks for any help!
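For comparison, a minimal sketch of the kind of check described above; the fetch uses MusicLibraryRequest, and the limit is arbitrary.

```swift
import MusicKit

// Fetch a few library albums and print their lastPlayedDate values.
func checkLastPlayedDates() async throws {
    guard await MusicAuthorization.request() == .authorized else { return }
    var request = MusicLibraryRequest<Album>()
    request.limit = 5
    let response = try await request.response()
    for album in response.items {
        print(album.title, album.lastPlayedDate?.description ?? "nil")
    }
}
```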
0 replies · 0 boosts · 451 views · Dec ’24
Listener for kAudioProcessPropertyIsRunningOutput
I'm trying to set up a listener for kAudioProcessPropertyIsRunningOutput but it's never triggered. I get calls for kAudioProcessPropertyIsRunning and kAudioProcessPropertyDevices, but not for kAudioProcessPropertyIsRunningInput or kAudioProcessPropertyIsRunningOutput.

```swift
class MyDelegate: PropertyListenerDelegate {
    func propertiesChanged(properties: [AudioObjectPropertyAddress]) {
        print(properties)
    }
}

var myDelegate = MyDelegate()
var processes = try AudioHardwareSystem.shared.processes
for process in processes {
    process.delegates += [myDelegate]
    try process.addListener(forProperties: [AudioObjectPropertyAddress(
        mSelector: kAudioPropertyWildcardPropertyID,
        mScope: kAudioObjectPropertyScopeWildcard,
        mElement: kAudioObjectPropertyElementWildcard)])
}
```

Xcode 16.1, macOS 15.0.1
0 replies · 0 boosts · 477 views · Dec ’24