Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.

LockedCameraCaptureManager sessionContentUpdates is sometimes not called
Within my app, I have:

    for try await update in LockedCameraCaptureManager.shared.sessionContentUpdates {

It seems that the first time my app opens from LockedCameraCapture (after enabling camera permissions, etc.), this update is never delivered and the user will not see their capture (.added or .initial). If I then try to take another picture/video through my LockedCameraCapture control, it takes the video and opens the app as before, but this time sessionContentUpdates is called twice, once for the first video and once for the second video! After that it doesn't seem to occur again and everything works perfectly!

My device: iPhone 16 Pro Max, iOS 18.2 developer beta.

Has anyone experienced this?
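For context, here is a minimal sketch of how that update loop is typically structured. importCaptures(at:) is a hypothetical helper, and the exact associated values of the update cases are written from memory, so check them against the LockedCameraCapture headers.

    import LockedCameraCapture

    func observeCaptureUpdates() async throws {
        for try await update in LockedCameraCaptureManager.shared.sessionContentUpdates {
            switch update {
            case .initial(let urls):
                // Captures taken before this loop started observing.
                importCaptures(at: urls)
            case .added(let url):
                // A capture that arrived while the loop was already running.
                importCaptures(at: [url])
            default:
                break
            }
        }
    }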
Replies: 0 · Boosts: 0 · Views: 75 · Activity: 2d

iOS audio crackling when sending audio data to a UDP server and playing it back
I am experiencing an issue while recording audio using AVAudioEngine with the installTap method. I convert the AVAudioPCMBuffer to Data and send it to a UDP server. However, when I receive the Data and play it back, there is continuous crackling noise during playback. I am sending the audio data by creating packets with the library https://github.com/mindAndroid/swift-rtp. Please help me resolve this issue. I have attached the code reference that I am currently using. Thank you. ViewController.swift
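This is not the poster's attached code, but a minimal sketch of the tap-and-serialize step being described, under the assumption of a Float32 non-interleaved input format. Two common causes of crackling are copying the buffer's full capacity instead of frameLength, and the receiver rebuilding the buffer with a mismatched sample rate or format; the UDP send itself is stubbed out here.

    import AVFoundation

    let engine = AVAudioEngine()
    let input = engine.inputNode
    let format = input.outputFormat(forBus: 0)   // e.g. 48 kHz, Float32, non-interleaved

    input.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
        // Serialize only the frames that were actually filled (frameLength, not frameCapacity),
        // taking channel 0 only for simplicity.
        guard let channel = buffer.floatChannelData?[0] else { return }
        let byteCount = Int(buffer.frameLength) * MemoryLayout<Float32>.size
        let data = Data(bytes: channel, count: byteCount)
        // send(data) over UDP here; the receiver must reconstruct an AVAudioPCMBuffer
        // with the same sample rate, channel count, and Float32 common format.
        _ = data
    }

    do {
        try engine.start()
    } catch {
        print("Failed to start engine: \(error)")
    }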
Replies: 0 · Boosts: 0 · Views: 91 · Activity: 3d

IOSurface with System Extensions
Hi All,

I'm working on a camera system extension where the main app is supposed to transfer a video stream to the camera extension using IOSurface memory sharing. I have built a sample app that contains all the logic, but without a camera extension; I'm essentially using IOSurface to render a video in one SwiftUI view and show the result in another SwiftUI view, just for testing purposes, and everything works fine so far.

Now, when moving the receiver code to the camera extension, I'm having problems accessing the IOSurface via its ID. I am sharing the IOSurface ID via UserDefaults, and I know from the logs that the ID is transferred correctly. Here is the code that uses IOSurfaceLookup to get the IOSurface, but it fails with the given message. The error message prints the surface ID, which is the correct one; I know this from the main app, where I get the ID and print it as well.

    private var surfaceId: Int = -1 {
        didSet {
            logger.info("surfaceId has changed")
            if surfaceId == -1 {
                stopReceivingFrames()
                ioSurface = nil
            } else {
                guard let surface = IOSurfaceLookup(IOSurfaceID(surfaceId)) else {
                    logger.error("failed to lookup IOSurface with ID: \(self.surfaceId)")
                    return
                }
                self.ioSurface = surface
                logger.info("surface set, now starting receiving frames")
                startReceivingFrames()
            }
        }
    }

My gut feeling says that this issue might be related to some missing entitlement or sandboxing. In general, I have a working camera extension; I'm just not able to render a video in the main app and send it over to the camera extension to overlay another webcam. Both the main app and the camera extension are in the same Xcode workspace and share the same App Group.

In short, my actual questions are:

- Is there any entitlement required for using IOSurface between an app and a camera system extension?
- Is using IOSurface actually possible in system extensions?
- Is there any specific setting/requirement that I need to handle to make this work?
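For illustration only, here is roughly what the producing side described above might look like; the App Group suite name, surface dimensions, and defaults key are placeholder assumptions, not taken from the project.

    import IOSurface
    import CoreVideo
    import Foundation

    // Create a BGRA surface in the main app and publish its ID through the shared App Group.
    let properties: [String: Any] = [
        kIOSurfaceWidth as String: 1920,
        kIOSurfaceHeight as String: 1080,
        kIOSurfaceBytesPerElement as String: 4,
        kIOSurfacePixelFormat as String: kCVPixelFormatType_32BGRA
    ]

    if let surface = IOSurfaceCreate(properties as CFDictionary) {
        let surfaceID = IOSurfaceGetID(surface)
        UserDefaults(suiteName: "group.com.example.camera")?
            .set(Int(surfaceID), forKey: "surfaceId")
    }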
Replies: 0 · Boosts: 0 · Views: 83 · Activity: 3d

App randomly terminated due to Capture Application Requirements Unmet
Hi. I encounter some random crashes of my camera app. After some investigation, I found that it's terminated by the system; a crash log was generated, but the information is not very useful. Here is the log found via the Console app.

Termination & crash log

"Camera not actively used; AVCaptureEventInteraction not installed":

    Received termination request from [osservice<com.apple.SpringBoard>:10931] on <RBSProcessPredicate <RBSProcessInstancePredicate| [app<com.juniperphoton.PhotonCam]>> with context <RBSTerminateContext| explanation:Capture Application Requirements Unmet: "Camera not actively used; AVCaptureEventInteraction not installed" reportType:CrashLog maxTerminationResistance:Interactive>

The crash log exported from the device has some common information:

It's an EXC_CRASH (SIGKILL) type with no termination reason.

    Exception Type:  EXC_CRASH (SIGKILL)
    Exception Codes: 0x0000000000000000, 0x0000000000000000
    Termination Reason: RUNNINGBOARD 0

It's triggered by the main thread, but it seems to be waiting for an event to process.

    Triggered by Thread: 0
    Thread 0 name: Dispatch queue: com.apple.main-thread
    Thread 0 Crashed:
    0   libsystem_kernel.dylib  0x1ee165788 mach_msg2_trap + 8
    1   libsystem_kernel.dylib  0x1ee168e98 mach_msg2_internal + 80
    2   libsystem_kernel.dylib  0x1ee168db0 mach_msg_overwrite + 424
    3   libsystem_kernel.dylib  0x1ee168bfc mach_msg + 24
    4   CoreFoundation          0x19cbe47f4 __CFRunLoopServiceMachPort + 160
    5   CoreFoundation          0x19cbe3ea0 __CFRunLoopRun + 1212
    6   CoreFoundation          0x19cc36274 CFRunLoopRunSpecific + 588
    7   GraphicsServices        0x1e9d6d4c0 GSEventRunModal + 164
    8   UIKitCore               0x19f783480 -[UIApplication _run] + 816
    9   UIKitCore               0x19f3a9410 UIApplicationMain + 340
    10  UIKitCore               0x19fae4bb0 0x19f394000 + 7670704
    11  PhotonCam               0x1002e7e3c 0x1002cc000 + 114236
    12  dyld                    0x1c2d5ade8 start + 2724

There is an address size fault on the main thread.

    Thread 0 crashed with ARM Thread State (64-bit):
    ...
    far: 0x0000000000000000 esr: 0x56000080  Address size fault

I once tried to reproduce this issue with the debugger attached, and it simply says: Terminated due to signal 9.

When the crash or termination happened:

- No AVCaptureSession was running.
- The app was in the foreground and users were interacting with functions like viewing or editing photos in the app. (When users exit the camera view, for example by entering the gallery or settings, the camera session is stopped.)
- Both TestFlight and Debug builds have the same issue.
- No 3rd-party crash reporter is installed (I deliberately disable it in the Debug and TestFlight builds).
- The app has adopted LockedCameraCapture, but it is currently running as the main app target (if not, my app would show an unlock button, so I can confirm this).
- Regarding memory consumption, there is no JetsamEvent around the crash time.

Device and app information

Additionally, some information about the tech stack and the current state of my device and my app:

- iPhone 16 Pro with iOS 18.2 Beta 3.
- The app is a camera-based app (it's PhotonCam, and you can find it on the App Store). Its main functionality is the camera feature, using AVFoundation + Core Image + Metal to deliver it.
- It has adopted the Camera Control, AVCaptureEventInteraction and LockedCameraCapture features.
- If I remember right, it also occurs in the iOS 18.1 release build, but currently I have no such device to confirm. In iOS 17.x the issue never happened.

Regarding this termination, the first thing that comes to mind is the "watchdog" mechanism that terminates a process launched via the LockedCameraCapture feature. However, I can confirm that the app is currently running as the main target from the home screen. Has anybody encountered this kind of issue and found a solution? Thanks in advance.
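For reference, the requirement named in the termination reason refers to the hardware capture event interaction. This is a minimal sketch of how AVCaptureEventInteraction is typically installed on a capture view controller; it is not taken from PhotonCam's code, and the capture trigger is a placeholder.

    import UIKit
    import AVKit

    final class CameraViewController: UIViewController {
        private var captureEventInteraction: AVCaptureEventInteraction?

        override func viewDidLoad() {
            super.viewDidLoad()
            // Respond to hardware capture events (Camera Control / volume buttons)
            // while the camera UI is on screen.
            let interaction = AVCaptureEventInteraction { event in
                if event.phase == .ended {
                    // Trigger the photo or video capture here.
                }
            }
            view.addInteraction(interaction)
            captureEventInteraction = interaction
        }
    }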
Replies: 1 · Boosts: 0 · Views: 165 · Activity: 4d

How to change PHLivePhoto EXIF metadata
I have an app that allows the user to change a photo's EXIF metadata. To do this, I request a content editing input, get the full size image, modify its properties, create a content editing output, write the output image to the rendered content URL, then call performChanges on the PHPhotoLibrary, creating an asset change request for that asset and setting its content editing output. This works as expected for regular photos, but Live Photos get converted to regular photos.

To address this, I'm doing something similar by changing the properties of the .photo image in the Live Photo. I detect when the content editing input has a Live Photo, create a Live Photo editing context, set a frame processor that returns the frame's image after setting its properties to the updated properties when the frame type is photo, then I create the content editing output and save the Live Photo to that output. It modifies the Live Photo successfully, but the metadata is not updated. If you get the full size image again, the properties are the original properties. If you look at the EXIF metadata using an app like Metapho, it remains unchanged. What am I doing wrong here? Thanks!

    let imageURL = contentEditingInput.fullSizeImageURL!
    let inputImage = CIImage(contentsOf: imageURL, options: [.applyOrientationProperty: true])!

    var metadata: [AnyHashable: Any] = inputImage.properties
    // Edit the metadata as desired...

    let editingContext = PHLivePhotoEditingContext(livePhotoEditingInput: contentEditingInput)!
    editingContext.frameProcessor = { frame, error -> CIImage? in
        // Edit only the still photo
        if frame.type == .photo {
            return frame.image.settingProperties(metadata)
        }
        return frame.image
    }

    let contentEditingOutput = try await withCheckedThrowingContinuation { continuation in
        let editingOutput = PHContentEditingOutput(contentEditingInput: contentEditingInput)
        editingOutput.adjustmentData = adjustmentData
        editingContext.saveLivePhoto(to: editingOutput) { success, error in
            if success {
                continuation.resume(returning: editingOutput)
            } else {
                continuation.resume(throwing: error!)
            }
        }
    }

    try await PHPhotoLibrary.shared().performChanges {
        let request = PHAssetChangeRequest(for: asset)
        request.contentEditingOutput = contentEditingOutput
    }
Replies: 0 · Boosts: 0 · Views: 80 · Activity: 3d

AVAssetReader and AVAssetWriter: ideal settings
Hello,

To create a test project, I want to understand what the video and audio settings should look like for a destination video whose content comes from a source video.

I obtained the output from the source video's audio and video tracks as follows:

    let audioSettings = [
        AVFormatIDKey: kAudioFormatLinearPCM,
        AVSampleRateKey: 44100,
        AVNumberOfChannelsKey: 2
    ] as [String: Any]
    var audioOutput = AVAssetReaderTrackOutput(track: audioTrack!, outputSettings: audioSettings)

    // Video
    let videoSettings = [
        kCVPixelBufferPixelFormatTypeKey: kCVPixelFormatType_32BGRA,
        kCVPixelBufferWidthKey: videoTrack!.naturalSize.width,
        kCVPixelBufferHeightKey: videoTrack!.naturalSize.height
    ] as [String: Any]
    var videoOutput = AVAssetReaderTrackOutput(track: videoTrack!, outputSettings: videoSettings)

With this, I'm obtaining the CMSampleBuffer using AVAssetReader.copyNextSampleBuffer. How can I add it to the destination video? Should I use a while loop, considering I already have the AVAssetWriter set up? Something like this:

    while let buffer = videoOutput.copyNextSampleBuffer() {
        if let imgBuffer = CMSampleBufferGetImageBuffer(buffer) {
            let frame = imgBuffer as CVPixelBuffer
            let time = CMSampleBufferGetPresentationTimeStamp(buffer)
            adaptor.append(frame, withPresentationTime: time)
        }
    }

Lastly, regarding the destination video: how should the AVAssetWriterInput for audio and the pixel buffer adaptor of the destination video be set up? Provide an example, something like:

    let audioSettings = […] as [String: Any]

Looking forward to your response.
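Not an authoritative answer, but a rough sketch of how the writer side is commonly configured. The codec choice, dimensions, and AAC bitrate below are placeholder assumptions, and writer, videoTrack and audioOutput refer to the objects already set up in the post.

    // Destination video: a compressed video input fed through a pixel buffer adaptor,
    // plus an AAC audio input that re-encodes the reader's LPCM sample buffers.
    let videoInput = AVAssetWriterInput(mediaType: .video, outputSettings: [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: Int(videoTrack!.naturalSize.width),
        AVVideoHeightKey: Int(videoTrack!.naturalSize.height)
    ])
    let adaptor = AVAssetWriterInputPixelBufferAdaptor(
        assetWriterInput: videoInput,
        sourcePixelBufferAttributes: [
            kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA,
            kCVPixelBufferWidthKey as String: Int(videoTrack!.naturalSize.width),
            kCVPixelBufferHeightKey as String: Int(videoTrack!.naturalSize.height)
        ])

    let audioInput = AVAssetWriterInput(mediaType: .audio, outputSettings: [
        AVFormatIDKey: kAudioFormatMPEG4AAC,
        AVSampleRateKey: 44100,
        AVNumberOfChannelsKey: 2,
        AVEncoderBitRateKey: 128_000
    ])

    writer.add(videoInput)
    writer.add(audioInput)

    // Audio can usually be appended without conversion from the reader output:
    // while let sample = audioOutput.copyNextSampleBuffer() { audioInput.append(sample) }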
Replies: 4 · Boosts: 0 · Views: 124 · Activity: 1w

DockKit tracking becomes erratic with increased zoom factor in iOS app
I'm developing an iOS app using DockKit to control a motorized stand. I've noticed that as the zoom factor of the AVCaptureDevice increases, the stand's movement becomes increasingly erratic up and down, almost like a pendulum motion. I'm not sure why this is happening or how to fix it.

Here's a simplified version of my tracking logic:

    func trackObject(_ boundingBox: CGRect, _ dockAccessory: DockAccessory) async throws {
        guard let device = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: device) else {
            fatalError("Camera not available")
        }

        let currentZoomFactor = device.videoZoomFactor
        let dimensions = device.activeFormat.formatDescription.dimensions
        let referenceDimensions = CGSize(width: CGFloat(dimensions.width), height: CGFloat(dimensions.height))
        let intrinsics = calculateIntrinsics(for: device, currentZoom: Double(currentZoomFactor))

        let deviceOrientation = UIDevice.current.orientation
        let cameraOrientation: DockAccessory.CameraOrientation = {
            switch deviceOrientation {
            case .landscapeLeft: return .landscapeLeft
            case .landscapeRight: return .landscapeRight
            case .portrait: return .portrait
            case .portraitUpsideDown: return .portraitUpsideDown
            default: return .unknown
            }
        }()

        let cameraInfo = DockAccessory.CameraInformation(
            captureDevice: input.device.deviceType,
            cameraPosition: input.device.position,
            orientation: cameraOrientation,
            cameraIntrinsics: useIntrinsics ? intrinsics : nil,
            referenceDimensions: referenceDimensions
        )

        let observation = DockAccessory.Observation(
            identifier: 0,
            type: .object,
            rect: boundingBox
        )
        let observations = [observation]

        try await dockAccessory.track(observations, cameraInformation: cameraInfo)
    }

    func calculateIntrinsics(for device: AVCaptureDevice, currentZoom: Double) -> matrix_float3x3 {
        let dimensions = CMVideoFormatDescriptionGetDimensions(device.activeFormat.formatDescription)
        let width = Float(dimensions.width)
        let height = Float(dimensions.height)

        let diagonalPixels = sqrt(width * width + height * height)
        let estimatedFocalLength = diagonalPixels * 0.8

        let fx = Float(estimatedFocalLength) * Float(currentZoom)
        let fy = fx
        let cx = width / 2.0
        let cy = height / 2.0

        return matrix_float3x3(
            SIMD3<Float>(fx, 0, cx),
            SIMD3<Float>(0, fy, cy),
            SIMD3<Float>(0, 0, 1)
        )
    }

I'm calling this function regularly (10-30 times per second) with updated bounding box information. The erratic movement seems to worsen as the zoom factor increases.

Questions:

- Why might increasing the zoom factor cause this erratic movement?
- I'm currently calculating camera intrinsics based on the current zoom factor. Is this approach correct, or should I be doing something differently?
- Are there any other factors I should consider when using DockKit with a variable zoom?
- Could the frequency of calls to trackRider (10-30 times per second) be contributing to the erratic movement? If so, what would be an optimal frequency?

Any insights or suggestions would be greatly appreciated. Thanks!
Replies: 8 · Boosts: 0 · Views: 295 · Activity: Sep ’24

How to edit gain map image to preserve HDR in Live Photo
I have an app that allows you to edit your photos. To preserve HDR, I edit both the SDR image and the gain map image, like so:

    let sdrImage = CIImage(data: data, options: [.applyOrientationProperty: true])
    let gainMapImage = CIImage(data: data, options: [.applyOrientationProperty: true, .auxiliaryHDRGainMap: true])

    // edit them...

    try CIContext().writeHEIFRepresentation(of: sdrImage, to: url, format: .RGBA8, colorSpace: colorSpace, options: [.hdrGainMapImage: gainMapImage])

I also support editing the still photo in Live Photos. To do this you create a PHLivePhotoEditingContext, set the frameProcessor block which gives you a CIImage that I edit when the frame.type is .photo, then you create a PHContentEditingOutput and call saveLivePhoto. I'm not seeing any way to preserve HDR here. Interestingly, the frame processor is called twice with the .photo frame type, but I don't see any difference between these images.

How can I edit a gain map image to preserve HDR in the still photo of a Live Photo?
Replies: 1 · Boosts: 0 · Views: 209 · Activity: 3w

Connect 2 mono nodes as L/R input for a stereo node
Hello, I'm fairly new to AVAudioEngine and I'm trying to connect 2 mono nodes as left/right input to a stereo node. I was successful in splitting the input audio into 2 mono nodes using AVAudioConnectionPoint and channelMap, but I can't figure out how to connect them back to a stereo node. I'll post the code I have so far. The use case for this is that I'm trying to process the left/right channels with separate audio units. Any ideas?

    let monoFormat = AVAudioFormat(standardFormatWithSampleRate: nativeFormat.sampleRate, channels: 1)!

    let leftInputMixer = AVAudioMixerNode()
    let rightInputMixer = AVAudioMixerNode()
    let leftOutputMixer = AVAudioMixerNode()
    let rightOutputMixer = AVAudioMixerNode()
    let channelMixer = AVAudioMixerNode()

    [leftInputMixer, rightInputMixer, leftOutputMixer, rightOutputMixer, channelMixer].forEach { engine.attach($0) }

    let leftConnectionR = AVAudioConnectionPoint(node: leftInputMixer, bus: 0)
    let rightConnectionR = AVAudioConnectionPoint(node: rightInputMixer, bus: 0)

    plugin.leftInputMixer = leftInputMixer
    plugin.rightInputMixer = rightInputMixer
    plugin.leftOutputMixer = leftOutputMixer
    plugin.rightOutputMixer = rightOutputMixer
    plugin.channelMixer = channelMixer

    leftInputMixer.auAudioUnit.channelMap = [0]
    rightInputMixer.auAudioUnit.channelMap = [1]

    engine.connect(previousNode, to: [leftConnectionR, rightConnectionR], fromBus: 0, format: monoFormat)

    // Process right channel, pass through left channel
    engine.connect(rightInputMixer, to: plugin.audioUnit, format: monoFormat)
    engine.connect(plugin.audioUnit, to: rightOutputMixer, format: monoFormat)
    engine.connect(leftInputMixer, to: leftOutputMixer, format: monoFormat)

    // Mix back to stereo?
    engine.connect(leftOutputMixer, to: channelMixer, format: stereoFormat)
    engine.connect(rightOutputMixer, to: channelMixer, format: stereoFormat)
Replies: 1 · Boosts: 0 · Views: 109 · Activity: 4d

Crash when presenting Camera via Web View in iOS 18.2 Beta - WebCore::AVVideoCaptureSource::create
We are experiencing thousands of crashes in our application when attempting to present the camera through a web view. The app crashes during this process, and the crash logs point to WebCore::AVVideoCaptureSource::create and WebCore::RealtimeMediaSourceCenter::getUserMediaDevices. This issue has only been observed in iOS 18.2 beta versions (beta 1 - 22C5109p, beta 2 - 22C5125e, beta 3 - 22C5131e). In iOS versions below 18.2 the functionality works, and we haven't identified any correlation with specific device models. The problem seems to stem from a WebCore change introduced in these 18.2 beta releases. We kindly request a review and fix for this issue in upcoming beta releases to restore functionality. Let us know if there are any workarounds or adjustments we can implement in the interim. Thank you for your attention to this matter.
Replies: 2 · Boosts: 0 · Views: 160 · Activity: 4d

Live Photo not applying as live wallpaper on iOS 17+
Hi fellow iOS developers! 👋 I've written Swift code that downloads a video from a URL and converts it into a Live Photo. The conversion process seems fine, but when I try to set the generated Live Photo as a wallpaper on iOS 17+, it shows the message "Motion not Available." Has anyone else experienced this issue or know why this might be happening? Could it be related to changes in iOS 17's Live Photo handling or the generated file structure? Any help or suggestions would be greatly appreciated! 🙏
Replies: 0 · Boosts: 0 · Views: 61 · Activity: 5d

Generate a recommendation queue after selecting a song in MusicKit
Hi, I'm developing a MusicKit integration in my iOS app, and I want to let the user select songs from their recently played items (done). The problem is that the playback queue is not auto-generated, so the user has to select another song if they want to keep listening. Is there any method to ask for similar or recommended songs based on a song the user has already selected? It would be really great :) Also, if you know: is there any publisher for the playback duration, or do I need to use a timer? Thanks. David.
Replies: 0 · Boosts: 0 · Views: 89 · Activity: 5d

HEIC format image incompatibility issue
I am a developer working on iOS apps. In a demo I planned to replace the local PNG images with HEIC images, but the actual test results showed abnormalities on this device, while the other test devices displayed them normally. The HEIC images were converted with the built-in image conversion function on the Mac. I tested multiple HEIC images, but none of them were displayed and the image information returned nil, while PNG images can be displayed normally. Device information:
Replies: 6 · Boosts: 0 · Views: 330 · Activity: Oct ’24

Why does recommendedVideoSettings sometimes return no recommended settings?
I am talking about AVCaptureVideoDataOutput.recommendedVideoSettings. I found that it sometimes returns nil. Here are my test results:

hevc .mov with activeColorSpace sRGB:
- 60 FPS -> ok
- 120 FPS -> ok

hevc .mov with activeColorSpace displayP3_HLG:
- 60 FPS -> nil
- 120 FPS -> nil

h264 .mov:
- 30 FPS -> ok
- 60 FPS -> nil
- 120 FPS -> nil

So, if no recommended settings are provided and there is no documentation for this, how is a developer supposed to use it?
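For what it's worth, here is a minimal sketch of the guarded lookup being described; the fallback values are placeholder assumptions, not Apple-recommended settings, and the output is assumed to already be attached to a configured capture session.

    import AVFoundation

    let videoOutput = AVCaptureVideoDataOutput()
    // Ask for settings matched to an HEVC QuickTime recording.
    let recommended = videoOutput.recommendedVideoSettings(forVideoCodecType: .hevc,
                                                           assetWriterOutputFileType: .mov)

    // The call can return nil (e.g. for some color space / frame rate combinations),
    // so fall back to an explicit configuration in that case.
    let writerSettings: [String: Any] = recommended ?? [
        AVVideoCodecKey: AVVideoCodecType.hevc,
        AVVideoWidthKey: 1920,
        AVVideoHeightKey: 1080
    ]
    let writerInput = AVAssetWriterInput(mediaType: .video, outputSettings: writerSettings)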
Replies: 0 · Boosts: 0 · Views: 126 · Activity: 5d

Error Domain=NSOSStatusErrorDomain Code=-16384, -16155, -16512
I've built a custom media player using AVSampleBufferAudioRenderer and AVSampleBufferRenderSynchronizer, and overall it works great! However, I've noticed some unusual logs popping up:

Domain: NSOSStatusErrorDomain
Error codes: -16384, -16155, -16512

The error -16512 keeps happening repeatedly for one of our users, preventing them from playing any media at all. I've searched around but can't find any documentation explaining what these errors mean. Has anyone run into this issue or have any suggestions? Any help would be hugely appreciated! Thanks!
Replies: 0 · Boosts: 0 · Views: 162 · Activity: 6d

Some album artwork from MPMediaItem displays as nil
Hey there, I'm trying to display all of the user's albums using the MediaPlayer library. Many albums return nil artwork, but I know the artwork exists because it shows up in the default Music app. There doesn't seem to be much rhyme or reason to what shows up and what doesn't: all downloaded albums display artwork, and some cloud album artwork displays as well. Here's the code I'm using to debug this:

    let query = MPMediaQuery.albums()
    if let albumCollections = query.collections {
        albums = albumCollections
    }

    for album in albums {
        let artwork = album.representativeItem?.artwork
        print(artwork, artwork?.image(at: CGSize(width: 100, height: 100)))
    }

Any help would be greatly appreciated. Thanks!
Replies: 2 · Boosts: 1 · Views: 593 · Activity: Jan ’24

writeImageAtIndex:1012: ⭕️ ERROR: 'App' is trying to save an opaque image (5712x4284) with 'AlphaLast'.
I have an app that edits photos in your library. When I call

    try CIContext().writeHEIFRepresentation(of: editedImage, to: fileURL, format: .RGBA8, colorSpace: originalImage.colorSpace!)

the following is logged to the console:

    writeImageAtIndex:1012: ⭕️ ERROR: 'App' is trying to save an opaque image (5712x4284) with 'AlphaLast'. This would unnecessarily increase the file size and will double (!!!) the required memory when decoding the image --> ignoring alpha.

What does that mean and how can I resolve it?

Xcode Version 16.0 (16A242d), iOS 18.1 (22B82)
Replies: 3 · Boosts: 6 · Views: 436 · Activity: 4w

EXIF creation date of ICCameraFile always nil?
I am using ImageCaptureCore to access and (sometimes) download media files from a digital camera connected via USB, either to a Mac or to an iOS device with the Apple Lightning to USB 3 camera adapter. This works very well in general, but what puzzles me is that the ICCameraFile's EXIF creation/modification date is always nil. I can access the ICCameraItem's creation/modification date instead, which, as the documentation says, is "usually the same as its EXIF creation date", but, well, not always. Generally the EXIF tags are more reliable than the file dates; the modification date in particular is easily messed up when copying files. My cameras show the stable EXIF date on their displays, so for consistency I would prefer to use the same date in my app. Is there a way to get it without downloading the image from the camera and reading it from the file? Does it possibly depend on the brand of camera (I mostly have Canon) whether ICCameraFile.exifCreationDate is ever populated or always nil? For a thumb drive with a DCIM folder, which is treated just like a camera, it is also nil.
Replies: 2 · Boosts: 0 · Views: 180 · Activity: 1w

MPRemoteCommand play conflicting with .playPause gesture
We develop a video playback app on Apple TV which has the following two features:

- Its content browsing screen installs a gesture recognizer for presses on the Play/Pause Siri Remote button in order to directly launch playback. The gesture recognizer is attached to the content browsing UIViewController's view.
- It presents its own custom playback UI with an AVPlayerLayer for the video and supports MPNowPlayingSession in order to publish current playback information and respond to remote commands. It also supports switching between fullscreen and Picture in Picture playback.

Both features work fine, i.e. playback is launched when pressing the Play/Pause Siri Remote button and, during playback, the playback info is properly advertised on other devices and remote commands are triggered as expected.

However, when pressing the Play/Pause Siri Remote button while the video is playing in PiP, the "pause" remote command is sometimes triggered instead of the .playPause gesture recognizer. The issue may not occur the first time, but it does for subsequent Play/Pause presses. Navigating a bit in the app UI seems to help prevent the issue from occurring. Finally, the issue only occurs if the video is playing; if the video is paused, the Play/Pause Siri Remote button gesture is always recognized instead of the remote command.

Please note that, before using MPNowPlayingSession (and the corresponding MPRemoteCommandCenter), the app was using the default MPRemoteCommandCenter to support remote commands and the issue did not occur.

We don't reproduce this issue with the Apple TV app, so there's probably something we are not doing right. Does anyone have any clue?
Replies: 0 · Boosts: 0 · Views: 175 · Activity: 1w

MPNowPlayingInfoCenter without playing music
Hello! I am working on an app connected to an external streamer. I would like to display the currently playing song on the Lock Screen. I tried to update the information in MPNowPlayingInfoCenter, but I need to play a sound on my iPhone for the controls to be displayed. Is there a way to do it without playing a sound? If not, is playing a silent sound the only solution, and would that be validated by Apple? :-/ Thank you, Frederic
Replies: 2 · Boosts: 0 · Views: 361 · Activity: Jul ’24