AVFoundation


Work with audiovisual assets, control device cameras, process audio, and configure system audio interactions using AVFoundation.


Posts under AVFoundation tag

362 Posts
Post not yet marked as solved
0 Replies
24 Views
I am implementing pan and zoom features for an app using a custom USB camera device on iPadOS. I am using an update function (shown below) to apply transforms for scale and translation, but they are not working. By re-enabling the animation I can see that the scale transform seems to take effect initially, but then the image animates back to its original scale. This all happens in a fraction of a second, but I can see it. The translation transform seems to have no effect at all. Printing out the value of AVCaptureVideoPreviewLayer.transform before and after does show that my values have been applied.

```swift
private func updateTransform() {
#if false
    // Disable default animation.
    CATransaction.begin()
    CATransaction.setDisableActions(true)
    defer { CATransaction.commit() }
#endif

    // Apply the transform.
    logger.debug("\(String(describing: self.videoPreviewLayer.transform))")
    let transform = CATransform3DIdentity
    let translate = CATransform3DTranslate(transform, translationX, translationY, 0)
    let scale = CATransform3DScale(transform, scale, scale, 1)
    videoPreviewLayer.transform = CATransform3DConcat(translate, scale)
    logger.debug("\(String(describing: self.videoPreviewLayer.transform))")
}
```

My question is this: how can I properly implement pan/zoom for an AVCaptureVideoPreviewLayer? Or, even better, if you see a problem with my current approach or understand why the transforms I am applying do not work, please share that information.
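For reference, a minimal sketch of one alternative approach, assuming the capture device supports zoom factors (external USB cameras may not, so that is an assumption to verify): drive zoom through the device's videoZoomFactor instead of the layer transform, and keep implicit layer animations disabled whenever the transform is touched. Pan is often handled by transforming a container view that hosts the preview layer rather than the preview layer itself.

```swift
import AVFoundation

// Sketch: zoom via the capture device rather than the preview layer.
// `device` is assumed to be the AVCaptureDevice feeding the session.
func setZoom(_ factor: CGFloat, on device: AVCaptureDevice) {
    do {
        try device.lockForConfiguration()
        defer { device.unlockForConfiguration() }
        // Clamp to the range the active format actually supports.
        let maxZoom = device.activeFormat.videoMaxZoomFactor
        device.videoZoomFactor = min(max(factor, 1.0), maxZoom)
    } catch {
        print("Zoom configuration failed: \(error.localizedDescription)")
    }
}
```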
Post not yet marked as solved
1 Reply
30 Views
I'm trying to read meta information from MXF files, without success: I get an empty AVAsset array. I saw that there are mentions of "MTRegisterProfessionalVideoWorkflowFormatReaders", but there is absolutely no documentation, and I don't know where to look. Has anyone encountered this? Please help with any information.
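For reference, a minimal sketch of the usual suggestion around this symbol, assuming a macOS target (availability elsewhere is unverified): MTRegisterProfessionalVideoWorkflowFormatReaders() is a MediaToolbox call that registers Apple's professional-format readers, MXF among them, and is meant to be invoked once before any MXF-backed AVAsset is created. The file path below is a placeholder.

```swift
import AVFoundation
import MediaToolbox

// Register the professional video workflow format readers once at startup.
MTRegisterProfessionalVideoWorkflowFormatReaders()

let asset = AVURLAsset(url: URL(fileURLWithPath: "/path/to/clip.mxf"))
Task {
    // Load tracks and metadata asynchronously (iOS 15 / macOS 12+ API).
    let tracks = try await asset.load(.tracks)
    let metadata = try await asset.load(.metadata)
    print("tracks: \(tracks.count), metadata items: \(metadata.count)")
}
```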
Post not yet marked as solved
1 Reply
55 Views
I have built a camera application which uses an AVCaptureSession with the AVCaptureDevice set to .builtInDualWideCamera and isVirtualDeviceConstituentPhotoDeliveryEnabled=true to enable delivery of "simultaneous" photos (AVCapturePhoto) for a single capture request. I am using the hd1920x1080 preset, but both the wide and ultra-wide photos are being delivered in the highest possible resolution (4224x2376). I've tried to disable any setting that suggests it should be using that 4k resolution rather than 1080p on the AVCapturePhotoOutput, AVCapturePhotoSettings, and AVCaptureDevice, but nothing has worked.

Some debugging that I've done:

When I turn off constituent photo delivery by commenting out the line of code below, I end up getting a single photo delivered with the 1080p resolution, as you'd expect.

```swift
// photoSettings.virtualDeviceConstituentPhotoDeliveryEnabledDevices = captureDevice.constituentDevices
```

I tried constituent photo delivery with .builtInDualCamera and got only 4k results (same as described above).

I tried using an AVCaptureMultiCamSession with .builtInDualWideCamera and also only got 4k imagery.

I inspected the resolved settings on photo.resolvedSettings.photoDimensions, and the dimensions suggest the imagery should be 1080p, but when I inspect the UIImage, it is always 4k.

```swift
guard let imageData = photo.fileDataRepresentation() else { return }
guard let capturedImage = UIImage(data: imageData) else { return }

print("photo.resolvedSettings.photoDimensions", photo.resolvedSettings.photoDimensions) // 1920x1080
print("capturedImage.size", capturedImage.size) // 4224x2376
```

Any help here would be greatly appreciated, because I've run out of things to try and documentation to follow 🙏
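For reference, a minimal sketch of one setting worth checking, assuming iOS 16 or later: photo resolution is governed by maxPhotoDimensions on the output and on the per-capture settings, and constituent delivery may be honoring those rather than the session preset. The names photoOutput, captureDevice, and delegate are assumptions carried over from the post.

```swift
import AVFoundation

// Sketch: explicitly request 1920x1080 stills (iOS 16+).
// The requested size must match one of activeFormat.supportedMaxPhotoDimensions.
let desired = CMVideoDimensions(width: 1920, height: 1080)
photoOutput.maxPhotoDimensions = desired

let photoSettings = AVCapturePhotoSettings()
photoSettings.maxPhotoDimensions = desired
photoSettings.virtualDeviceConstituentPhotoDeliveryEnabledDevices = captureDevice.constituentDevices
photoOutput.capturePhoto(with: photoSettings, delegate: delegate)
```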
Post not yet marked as solved
1 Reply
129 Views
In this code, I aim to enable users to select an image from their phone gallery and display it with reduced opacity on top of the z-index. The selected image should appear on top of the user's phone camera feed, allowing them to see the canvas on which they are drawing as well as the low-opacity image. The app's purpose is to enable users to trace an image on the canvas while simultaneously seeing the camera feed.

CameraView.swift

```swift
import SwiftUI
import AVFoundation

struct CameraView: View {
    let selectedImage: UIImage

    var body: some View {
        ZStack {
            CameraPreview()
            Image(uiImage: selectedImage)
                .resizable()
                .aspectRatio(contentMode: .fill)
                .opacity(0.5) // Adjust the opacity as needed
                .edgesIgnoringSafeArea(.all)
        }
    }
}

struct CameraPreview: UIViewRepresentable {
    func makeUIView(context: Context) -> UIView {
        let cameraPreview = CameraPreviewView()
        return cameraPreview
    }

    func updateUIView(_ uiView: UIView, context: Context) {}
}

class CameraPreviewView: UIView {
    private let captureSession = AVCaptureSession()

    override init(frame: CGRect) {
        super.init(frame: frame)
        setupCamera()
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    private func setupCamera() {
        guard let backCamera = AVCaptureDevice.default(for: .video) else {
            print("Unable to access camera")
            return
        }
        do {
            let input = try AVCaptureDeviceInput(device: backCamera)
            if captureSession.canAddInput(input) {
                captureSession.addInput(input)
                let previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
                previewLayer.videoGravity = .resizeAspectFill
                previewLayer.frame = bounds
                layer.addSublayer(previewLayer)
                captureSession.startRunning()
            }
        } catch {
            print("Error setting up camera input:", error.localizedDescription)
        }
    }
}
```

Thanks for helping and your time.
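For reference, a minimal sketch of two adjustments that are commonly needed in a preview view like this one, assuming the rest of the setup stays as posted: the view's bounds are zero when setupCamera() runs, so the layer should be sized in layoutSubviews, and startRunning() blocks, so it should be called off the main thread.

```swift
import UIKit
import AVFoundation

class CameraPreviewView: UIView {
    private let captureSession = AVCaptureSession()
    // setupCamera() (as in the post) would assign this after creating the layer.
    private var previewLayer: AVCaptureVideoPreviewLayer?

    // Keep the preview layer sized to the view; bounds are zero at init time.
    override func layoutSubviews() {
        super.layoutSubviews()
        previewLayer?.frame = bounds
    }

    private func startSession() {
        // startRunning() blocks while the session starts, so keep it off the main thread.
        DispatchQueue.global(qos: .userInitiated).async { [weak self] in
            self?.captureSession.startRunning()
        }
    }
}
```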
Post not yet marked as solved
1 Reply
92 Views
In my app I play HLS streams via AVPlayer. It works well! However, when I try to download those same HLS URLs via makeAssetDownloadTask, I regularly come across this error:

```
Download error for identifier 21222: Error Domain=CoreMediaErrorDomain Code=-12938 "HTTP 404: File Not Found" UserInfo={NSDescription=HTTP 404: File Not Found, _NSURLErrorRelatedURLSessionTaskErrorKey=(
    "BackgroundAVAssetDownloadTask <CE9B10ED-E749-49FF-9942-3F8728210B20>.<1>"
), _NSURLErrorFailingURLSessionTaskErrorKey=BackgroundAVAssetDownloadTask <CE9B10ED-E749-49FF-9942-3F8728210B20>.<1>}
```

I have a feeling that AVPlayer has a way to resolve this that makeAssetDownloadTask lacks. I am wondering if any of you have come across this or have insight. Thank you!

BTW, this is using Xcode Version 15.3 (15E204a) and developing for visionOS 1.0.1.
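For context, a minimal sketch of the download path being described, assuming the standard AVAssetDownloadURLSession flow; the session identifier, title, and delegate here are placeholders, and the URL stands in for the failing HLS playlist.

```swift
import AVFoundation

let configuration = URLSessionConfiguration.background(withIdentifier: "hls-downloads")
let downloadSession = AVAssetDownloadURLSession(configuration: configuration,
                                                assetDownloadDelegate: delegate,
                                                delegateQueue: .main)

let asset = AVURLAsset(url: URL(string: "https://example.com/playlist.m3u8")!)
let task = downloadSession.makeAssetDownloadTask(asset: asset,
                                                 assetTitle: "Example",
                                                 assetArtworkData: nil,
                                                 options: nil)
task?.resume()
```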
Post not yet marked as solved
0 Replies
141 Views
After upgrading to iOS 17, Thread Performance Checker is complaining of a priority inversion when converting a CVPixelBuffer to a UIImage through a CIImage instance. Might this be a false positive, or is it a real issue?

```objc
- (UIImage *)imageForSampleBuffer:(CMSampleBufferRef)sampleBuffer andOrientation:(UIImageOrientation)orientation {
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    UIImage *uiImage = [UIImage imageWithCIImage:ciImage];
    NSData *data = UIImageJPEGRepresentation(uiImage, 90);
}
```

The code snippet above, when run on a thread set to the default priority, results in the message below:

```
Thread Performance Checker: Thread running at User-interactive quality-of-service class waiting on a lower QoS thread running at Default quality-of-service class. Investigate ways to avoid priority inversions
PID: 1188, TID: 723209
Backtrace
=================================================================
3   AGXMetalG14               0x0000000235c77cc8 1FEF1F89-B467-37B0-86F8-E05BC8A2A629 + 2927816
4   AGXMetalG14               0x0000000235ccd784 1FEF1F89-B467-37B0-86F8-E05BC8A2A629 + 3278724
5   AGXMetalG14               0x0000000235ccf6a4 1FEF1F89-B467-37B0-86F8-E05BC8A2A629 + 3286692
6   MetalTools                0x000000022f758b68 E712D983-01AD-3FE5-AB66-E00ABF76CD7F + 568168
7   CoreImage                 0x00000001a7c0e580 3D2AC243-0880-3BA9-BBF3-A214454875E0 + 267648
8   CoreImage                 0x00000001a7d0cc08 3D2AC243-0880-3BA9-BBF3-A214454875E0 + 1309704
9   CoreImage                 0x00000001a7c0e2e0 3D2AC243-0880-3BA9-BBF3-A214454875E0 + 266976
10  CoreImage                 0x00000001a7c0e1d0 3D2AC243-0880-3BA9-BBF3-A214454875E0 + 266704
11  libdispatch.dylib         0x0000000105e4a7bc _dispatch_client_callout + 20
12  libdispatch.dylib         0x0000000105e5be24 _dispatch_lane_barrier_sync_invoke_and_complete + 176
13  CoreImage                 0x00000001a7c0a784 3D2AC243-0880-3BA9-BBF3-A214454875E0 + 251780
14  CoreImage                 0x00000001a7c0a46c 3D2AC243-0880-3BA9-BBF3-A214454875E0 + 250988
15  libdispatch.dylib         0x0000000105e5b764 _dispatch_block_async_invoke2 + 148
16  libdispatch.dylib         0x0000000105e4a7bc _dispatch_client_callout + 20
17  libdispatch.dylib         0x0000000105e5266c _dispatch_lane_serial_drain + 832
18  libdispatch.dylib         0x0000000105e5343c _dispatch_lane_invoke + 460
19  libdispatch.dylib         0x0000000105e524a4 _dispatch_lane_serial_drain + 376
20  libdispatch.dylib         0x0000000105e5343c _dispatch_lane_invoke + 460
21  libdispatch.dylib         0x0000000105e60404 _dispatch_root_queue_drain_deferred_wlh + 328
22  libdispatch.dylib         0x0000000105e5fa38 _dispatch_workloop_worker_thread + 444
23  libsystem_pthread.dylib   0x00000001f35a4f20 _pthread_wqthread + 288
24  libsystem_pthread.dylib   0x00000001f35a4fc0 start_wqthread + 8
```
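For reference, a minimal sketch of one workaround that is often suggested for this pattern, assuming the goal is JPEG data: render through an explicit CIContext instead of letting UIImageJPEGRepresentation trigger Core Image's deferred render on its own internal queue. Whether this silences the checker in this particular case is an assumption to verify.

```swift
import CoreImage
import CoreMedia

// Create once and reuse; CIContext creation is expensive.
let ciContext = CIContext()

func jpegData(for sampleBuffer: CMSampleBuffer) -> Data? {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer),
          let colorSpace = CGColorSpace(name: CGColorSpace.sRGB) else { return nil }
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
    // Render eagerly on the calling thread instead of deferring to UIKit.
    return ciContext.jpegRepresentation(of: ciImage, colorSpace: colorSpace)
}
```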
Post not yet marked as solved
0 Replies
173 Views
Dear Apple Developer Forum, we have customers complaining that they can no longer play live streams (HLS FairPlay) with our application since upgrading their phones to iOS 17.4.1. We can't reproduce this problem in-house, but the error code sent to our analytics platform is CoreMediaErrorDomain error -12852. Would it be possible to get more information on this error, especially its potential cause, and, if the app is not responsible, how we can help our customers? Kind regards, Cédric
Post not yet marked as solved
0 Replies
99 Views
Is it possible to find an IDR frame (CMSampleBuffer) in an AVAsset H.264 video file?
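For reference, a minimal sketch of one approach, assuming the samples are read from the file with AVAssetReader: sync samples (which for H.264 are the IDR frames) can be identified through the sample-attachment dictionary, where the absence of the kCMSampleAttachmentKey_NotSync flag marks a sync sample.

```swift
import CoreMedia

// Returns true when the sample is a sync sample (an IDR frame for H.264).
func isSyncSample(_ sample: CMSampleBuffer) -> Bool {
    guard let attachments = CMSampleBufferGetSampleAttachmentsArray(sample, createIfNecessary: false)
            as? [[CFString: Any]],
          let first = attachments.first else {
        // No attachments at all means every sample is a sync sample.
        return true
    }
    let notSync = first[kCMSampleAttachmentKey_NotSync] as? Bool ?? false
    return !notSync
}
```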
Post not yet marked as solved
0 Replies
103 Views
I have a camera application which aims to take images as close to simultaneously as possible from the wide and ultra-wide cameras. The AVCaptureMultiCamSession is set up with manual connections. Note: we are not using builtInDualWideCamera with constituent photo delivery enabled, since some features we use are not supported in that mode.

At the moment we are manually trying to synchronize frames between the two cameras, but we would like to use the AVCaptureDataOutputSynchronizer to improve our results. Is it possible to synchronize the wide and ultra-wide video outputs? All the examples and docs I've found show synchronization of video with depth, metadata, or audio, but not of two video outputs. From my testing, I've found that the dataOutputSynchronizer fires with either the wide video output or the ultra-wide video output, but never both (at least one is always nil), suggesting that they are not being synchronized.

```swift
self.outputSync = AVCaptureDataOutputSynchronizer(dataOutputs: [wideCameraOutput, ultraCameraOutput])
outputSync.setDelegate(self, queue: .main)

...

func dataOutputSynchronizer(_ synchronizer: AVCaptureDataOutputSynchronizer,
                            didOutput synchronizedDataCollection: AVCaptureSynchronizedDataCollection) {
    guard let syncedWideData = synchronizedDataCollection.synchronizedData(for: self.wideCameraOutput) as? AVCaptureSynchronizedSampleBufferData,
          let syncedUltraData = synchronizedDataCollection.synchronizedData(for: self.ultraCameraOutput) as? AVCaptureSynchronizedSampleBufferData else {
        return
    }
    // Either syncedWideData or syncedUltraData is always nil, so the guard condition never passes.
}
```
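For reference, a minimal diagnostic sketch under the same setup (wideCameraOutput and ultraCameraOutput are names carried over from the post): inspect each output independently instead of guarding on both at once, and check the wasDropped/droppedReason fields on whichever side is present, which usually reveals why the synchronizer never delivers a matched pair.

```swift
func dataOutputSynchronizer(_ synchronizer: AVCaptureDataOutputSynchronizer,
                            didOutput collection: AVCaptureSynchronizedDataCollection) {
    for output in [wideCameraOutput, ultraCameraOutput] {
        guard let data = collection.synchronizedData(for: output)
                as? AVCaptureSynchronizedSampleBufferData else {
            print("no data for \(output) in this collection")
            continue
        }
        if data.sampleBufferWasDropped {
            // droppedReason distinguishes late arrival from running out of buffers, etc.
            print("dropped sample, reason: \(data.droppedReason.rawValue)")
        }
    }
}
```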
Post not yet marked as solved
0 Replies
119 Views
We found that crashes occur on some specific devices, but we don't know the root cause. The crash only appears on the user side and cannot be reproduced on our local devices. From the stack, the crash occurs inside AVFCapture after calling discoverySessionWithDeviceTypes:

```objc
NSArray<AVCaptureDevice *> *GetVideoCaptureDevices() {
    NSArray *captureDeviceType = @[
        AVCaptureDeviceTypeBuiltInWideAngleCamera,
        AVCaptureDeviceTypeExternalUnknown
    ];

    AVCaptureDeviceDiscoverySession *deviceDiscoverySession =
        [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:captureDeviceType
                                                               mediaType:AVMediaTypeVideo
                                                                position:AVCaptureDevicePositionUnspecified];
    return deviceDiscoverySession.devices;
}
```

The following is the crash call stack:

```
OS Version: macOS 13.5 (22G74)
Report Version: 104
Crashed Thread: 10301

Application Specific Information:
Fatal Error: EXC_BAD_INSTRUCTION / EXC_I386_INVOP / 0x7ff8194b3522

Thread 10301 Crashed:
0   AppKit               0x7ff8194b3522 -[NSApplication _crashOnException:]
1   AppKit               0x7ff8194b32b3 -[NSApplication reportException:]
2   AppKit               0x7ff819569efa NSApplicationUncaughtExceptionHandler
3   CoreFoundation       0x7ff8161c010a <unknown>
4   libobjc.A.dylib      0x7ff815c597c8 <unknown>
5   libc++abi.dylib      0x7ff815f926da std::__terminate
6   libc++abi.dylib      0x7ff815f92695 std::terminate
7   libobjc.A.dylib      0x7ff815c65929 <unknown>
8   libdispatch.dylib    0x7ff815e38046 _dispatch_client_callout
9   libdispatch.dylib    0x7ff815e39266 _dispatch_once_callout
10  AVFCapture           0x7ff8328cafb6 +[AVCaptureDALDevice devices]
11  AVFCapture           0x7ff832996410 +[AVCaptureDevice_Tundra _devicesWithAllowIOSMacEnvironment:]
12  AVFCapture           0x7ff83299652b +[AVCaptureDevice_Tundra _devicesWithDeviceTypes:mediaType:position:allowIOSMacEnvironment:]
13  AVFCapture           0x7ff83299e8c0 -[AVCaptureDeviceDiscoverySession_Tundra _initWithDeviceTypes:mediaType:position:allowIOSMacEnvironment:prefersUnsuspendedAndAllowsAnyPosition:]
14  AVFCapture           0x7ff83299e7a4 +[AVCaptureDeviceDiscoverySession_Tundra discoverySessionWithDeviceTypes:mediaType:position:]
15  Electron Framework   0x119453784 media::GetVideoCaptureDevices (video_capture_device_avfoundation_helpers.mm:22)
```

I want to know the root cause of this crash, and how I should simulate and fix it. Any suggestions would be highly appreciated. Thank you.
Post not yet marked as solved
0 Replies
107 Views
For whoever needs to hear this... Say you have an AVURLAsset:

```swift
let asset = AVURLAsset(url: URL(string: "https://www.example.com/playlist.m3u8")!)
```

Then say you load that asset into an AVPlayerItem, and would like it to automatically load certain asset keys you're interested in ahead of time:

```swift
let playerItem = AVPlayerItem(
    asset: asset,
    automaticallyLoadedAssetKeys: [
        "metadata",
        "commonMetadata",
        "availableMetadataFormats",
        "allMediaSelections",
        "hasProtectedContent",
        "overallDurationHint"])
```

Among those keys, do not use "tracks", even though it's one of the available options. That will break AirPlay across all platforms (the user chooses an AirPlay destination and the AVPlayerItem's status instantly switches to failed). Took me far too long to track this down; just wanted to get it out there to save anybody else some time if they ever run into it.
Post not yet marked as solved
1 Reply
182 Views
I am using LiDAR to measure the distance between a target point and an iPhone Pro. I am getting the correct distance only when I am more than 70 cm away from the target point. I need that value to be accurate for distances below 70 cm as well. Is there an issue at the coding level, or is this a limitation of the LiDAR hardware?
Post not yet marked as solved
1 Reply
275 Views
We are facing weird behaviour when implementing the AirPlay functionality of our iOS app. When we test our app on Apple TV devices, everything works fine. On some smart TVs with a specific AirPlay receiver version (more details below), the stream gets stuck in the buffering state immediately after switching to AirPlay mode. On other smart TVs, with a different AirPlay receiver version, everything works as expected. The interesting part is that other free or DRM-protected streams work fine on all devices.

Smart TVs where AirPlay works fine:
AirPlay Version -> 25.06 (19.9.9)

Smart TVs where AirPlay gets stuck in the buffering state:
AirPlayReceiverSDKVersion -> 3.3.0.54
AirPlayReceiverAppVersion -> 53.122.0

You can reproduce this issue using the following stream URL:

https://tr.vod.cdn.cosmotetvott.gr/v1/310/668/1674288197219/1674288197219.ism/.m3u8?qual=a&ios=1&hdnts=st=1713194669~exp=1713237899~acl=*/310/668/1674288197219/1674288197219.ism/*~id=cab757e3-9922-48a5-988b-3a4f5da368b6~data=de9bbd0100a8926c0311b7dbe5389f7d91e94a199d73b6dc75ea46a4579769d7~hmac=77b648539b8f3a823a7d398d69e5dc7060632c29

If this link expires, notify me and I will send a new one for testing. Could you please give us any specific suggestions as to what causes this issue on those specific streams?
Post not yet marked as solved
0 Replies
189 Views
I'm trying to get an experience similar to Apple TV's immersive videos, but I cannot figure out how to present the AVPlayerViewController controls detached from the video. I am able to use the same AVPlayer in a window and projected on a VideoMaterial, but I can't figure out how to present just the controls while displaying the video only in the 3D entity, without a 2D projection in any view. Is this even possible?
Post not yet marked as solved
0 Replies
154 Views
I'm doing random-access sampling from an AVAsset of a local H.264 video file:

```swift
let track = asset.tracks(withMediaType: .video)[0]
let assetReader = try! AVAssetReader(asset: asset)
let trackOutput = AVAssetReaderTrackOutput(track: track, outputSettings: nil)
trackOutput.supportsRandomAccess = true
assetReader.add(trackOutput)
assetReader.startReading()

...

let targetFrameDTS = CMTime(value: 56, timescale: 30)
let timeRange = CMTimeRange(
    start: CMTimeAdd(targetFrameDTS, CMTime(value: -1, timescale: 30)),
    duration: CMTime(value: 2, timescale: 30)
)

// Reset the output to read near the target frame's decoding time.
trackOutput.reset(forReadingTimeRanges: [NSValue(timeRange: timeRange)])

while assetReader.status == .reading {
    guard let sample = trackOutput.copyNextSampleBuffer() else { break }
    let dts = CMSampleBufferGetDecodeTimeStamp(sample)
    print("\(dts.value)/\(dts.timescale)")
}
```

For some reason, with some values of targetFrameDTS, the asset reader's copyNextSampleBuffer will skip samples. In my particular case the output is:

```
...
47/30
48/30
50/30
51/30
54/30
55/30
57/30
```

Why is it so?
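For reference, a minimal diagnostic sketch reusing the same assetReader and trackOutput from the post: printing the presentation timestamp next to the decode timestamp usually shows whether the "missing" decode times are an artifact of frame reordering (B-frames delivered in decode order) rather than genuinely skipped samples.

```swift
while assetReader.status == .reading {
    guard let sample = trackOutput.copyNextSampleBuffer() else { break }
    let dts = CMSampleBufferGetDecodeTimeStamp(sample)
    let pts = CMSampleBufferGetPresentationTimeStamp(sample)
    // With outputSettings: nil, samples arrive in decode order; PTS may jump around.
    print("dts \(dts.value)/\(dts.timescale)  pts \(pts.value)/\(pts.timescale)")
}
```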
Post not yet marked as solved
2 Replies
158 Views
As the title describes, I get a crash when calling [AVCaptureSession stopRunning] on macOS 14.4.1. The crash stack is below:

```
Process:               Nebula for Mac [31922]
Path:                  /Applications/Nebula for Mac.app/Contents/MacOS/Nebula for Mac
Identifier:            ai.nreal.nebula.mac
Version:               0.8.0.1098 (0.8.0)
Code Type:             ARM-64 (Native)
Parent Process:        launchd [1]
User ID:               501

Date/Time:             2024-04-11 14:12:34.6474 +0800
OS Version:            macOS 14.4.1 (23E224)
Report Version:        12
Anonymous UUID:        C438684A-95E7-7DA1-D063-81E1A5FBF5DC
Sleep/Wake UUID:       3EB85031-82AC-4BDB-8F28-FAF4CBD28CA1

Time Awake Since Boot: 110000 seconds
Time Since Wake:       1108 seconds

System Integrity Protection: enabled

Crashed Thread:        0  Dispatch queue: com.apple.avfoundation.proprietarydefaults.singleton.source_queue.0x202f8b460

Exception Type:        EXC_CRASH (SIGTRAP)
Exception Codes:       0x0000000000000000, 0x0000000000000000

Termination Reason:    Namespace SIGNAL, Code 5 Trace/BPT trap: 5
Terminating Process:   Nebula for Mac [31922]

Thread 0 Crashed:: Dispatch queue: com.apple.avfoundation.proprietarydefaults.singleton.source_queue.0x202f8b460
0   libsystem_kernel.dylib    0x19c1a61f4 mach_msg2_trap + 8
1   libsystem_kernel.dylib    0x19c1b8b24 mach_msg2_internal + 80
2   libsystem_kernel.dylib    0x19c1aee34 mach_msg_overwrite + 476
3   libsystem_kernel.dylib    0x19c1a6578 mach_msg + 24
4   libdispatch.dylib         0x19c0513b0 _dispatch_mach_send_and_wait_for_reply + 544
5   libdispatch.dylib         0x19c051740 dispatch_mach_send_with_result_and_wait_for_reply + 60
6   libxpc.dylib              0x19bef2af0 xpc_connection_send_message_with_reply_sync + 288
7   AVFCapture                0x1b9e5565c -[CMIOProprietaryDefaultsSource setObject:forKey:] + 140
8   AVFCapture                0x1b9e57044 __58-[AVCaptureProprietaryDefaultsSingleton setObject:forKey:]_block_invoke + 36
9   libdispatch.dylib         0x19c0363e8 _dispatch_client_callout + 20
10  libdispatch.dylib         0x19c0458d8 _dispatch_lane_barrier_sync_invoke_and_complete + 56
11  AVFCapture                0x1b9e597e0 -[AVCaptureProprietaryDefaultsSingleton _runBlockOnProprietaryDefaultsSourceQueueSync:] + 136
12  AVFCapture                0x1b9e56fbc -[AVCaptureProprietaryDefaultsSingleton setObject:forKey:] + 180
13  AVFCapture                0x1b9e776a0 -[AVCaptureDALDevice _refreshCenterStageUnavailableReasons] + 400
14  AVFCapture                0x1b9e7d0fc -[AVCaptureDALDevice updateActivelyProvidingInputCountForActiveUseState:] + 488
15  AVFCapture                0x1b9e33474 -[AVCaptureSession_Tundra _updateNewActiveUseState:forConnection:] + 196
16  AVFCapture                0x1b9e32e4c -[AVCaptureSession_Tundra _setRunning:] + 428
17  AVFCapture                0x1b9e32a28 -[AVCaptureSession_Tundra stopRunning] + 432
18  libnr_api.dylib           0x1446d7514 0x144478000 + 2487572
19  libnr_api.dylib           0x14468a690 0x144478000 + 2172560
20  libnr_api.dylib           0x14468bcb0 0x144478000 + 2178224
21  libnr_api.dylib           0x1444d0268 0x144478000 + 361064
22  libnr_api.dylib           0x1444ecb00 0x144478000 + 477952
23  libnr_api.dylib           0x1444ec724 0x144478000 + 476964
24  libnr_api.dylib           0x144541bcc 0x144478000 + 826316
25  libnr_api.dylib           0x144543e00 0x144478000 + 835072
26  libnr_api.dylib           0x144543f88 0x144478000 + 835464
27  libnr_api.dylib           0x144542ca8 0x144478000 + 830632
28  GameAssembly.dylib        0x12117d4c4 0x120000000 + 18339012
29  GameAssembly.dylib        0x1211894e0 0x120000000 + 18388192
30  GameAssembly.dylib        0x121165fe4 0x120000000 + 18243556
31  GameAssembly.dylib        0x1202e4248 0x120000000 + 3031624
32  GameAssembly.dylib        0x12116931c 0x120000000 + 18256668
33  GameAssembly.dylib        0x1201dcdf0 0x120000000 + 1953264
34  GameAssembly.dylib        0x1201dcd2c 0x120000000 + 1953068
35  UnityPlayer.dylib         0x10428dc60 0x103c38000 + 6642784
36  UnityPlayer.dylib         0x104295170 0x103c38000 + 6672752
37  UnityPlayer.dylib         0x1042b1620 0x103c38000 + 6788640
38  UnityPlayer.dylib         0x103f788d0 0x103c38000 + 3410128
39  UnityPlayer.dylib         0x1040d8c4c 0x103c38000 + 4852812
40  UnityPlayer.dylib         0x1040d8c98 0x103c38000 + 4852888
41  UnityPlayer.dylib         0x1040d8f2c 0x103c38000 + 4853548
42  UnityPlayer.dylib         0x104b104b8 0x103c38000 + 15566008
43  UnityPlayer.dylib         0x104b10304 0x103c38000 + 15565572
44  Foundation                0x19d430224 __NSFireTimer + 104
45  CoreFoundation            0x19c2e1f90 CFRUNLOOP_IS_CALLING_OUT_TO_A_TIMER_CALLBACK_FUNCTION + 32
46  CoreFoundation            0x19c2e1c34 __CFRunLoopDoTimer + 972
47  CoreFoundation            0x19c2e176c __CFRunLoopDoTimers + 356
48  CoreFoundation            0x19c2c4ba4 __CFRunLoopRun + 1856
49  CoreFoundation            0x19c2c3e0c CFRunLoopRunSpecific + 608
50  HIToolbox                 0x1a6a5f000 RunCurrentEventLoopInMode + 292
51  HIToolbox                 0x1a6a5ec90 ReceiveNextEventCommon + 220
52  HIToolbox                 0x1a6a5eb94 _BlockUntilNextEventMatchingListInModeWithFilter + 76
53  AppKit                    0x19fb1c970 _DPSNextEvent + 660
54  AppKit                    0x1a030edec -[NSApplication(NSEventRouting) _nextEventMatchingEventMask:untilDate:inMode:dequeue:] + 700
55  AppKit                    0x19fb0fcb8 -[NSApplication run] + 476
56  AppKit                    0x19fae6f54 NSApplicationMain + 880
57  UnityPlayer.dylib         0x104b0ffe4 PlayerMain(int, char const**) + 944
58  dyld                      0x19be5e0e0 start + 2360
```
Post not yet marked as solved
0 Replies
149 Views
Hi guys, I'm designing a customized camera based on AVFoundation. I can output Live Photos from an AVCaptureDeviceInput now. I expect to take still and Live Photos with different aspect ratios, just like Apple's Camera app does (1:1, 4:3, 16:9). I didn't find any useful info in the docs; any suggestions?
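For reference, a minimal sketch of the approach that is generally assumed to match the built-in Camera app: the sensor captures 4:3, and the other ratios are crops applied to the delivered photo. The CIImage-based crop below is an illustration under that assumption, not a confirmed description of Apple's pipeline.

```swift
import CoreImage

// Crop a captured image to a target aspect ratio (e.g. 16.0/9.0), centered.
func crop(_ image: CIImage, toAspect ratio: CGFloat) -> CIImage {
    let extent = image.extent
    let current = extent.width / extent.height
    var cropRect = extent
    if current > ratio {
        // Too wide: trim the sides.
        cropRect.size.width = extent.height * ratio
        cropRect.origin.x += (extent.width - cropRect.width) / 2
    } else {
        // Too tall: trim the top and bottom.
        cropRect.size.height = extent.width / ratio
        cropRect.origin.y += (extent.height - cropRect.height) / 2
    }
    return image.cropped(to: cropRect)
}
```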
Post not yet marked as solved
0 Replies
141 Views
On a Vision Pro I load an HDR video served over HLS using AVPlayer. Per FFmpeg, the video has:

pixel format: yuv420p10le
color space / ycbcr matrix: bt2020nc
color primaries: bt2020
transfer function: smpte2084

I wanted to try letting AVFoundation do all of the color conversion instead of writing my own YUV -> RGB shader. To display a 10-bit texture in a drawable queue, the destination Metal texture format must be MTLPixelFormat.rgba16Float (no other formats above 8-bit are supported), so the pixel format I am capturing in is kCVPixelFormatType_64RGBAHalf, since it's pretty close. It's worth noting that the AVAsset shows no track information; presumably because it's HLS?

I am using AVPlayerItemVideoOutput to get pixel buffers:

```swift
AVPlayerItemVideoOutput(outputSettings: [
    AVVideoColorPropertiesKey: [
        AVVideoColorPrimariesKey: AVVideoColorPrimaries_ITU_R_2020,
        AVVideoTransferFunctionKey: AVVideoTransferFunction_SMPTE_ST_2084_PQ,
        AVVideoYCbCrMatrixKey: AVVideoYCbCrMatrix_ITU_R_2020
    ],
    kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_64RGBAHalf),
    kCVPixelBufferMetalCompatibilityKey as String: true
])
```

I can change these settings in real time and see that they have an effect on my drawable queue. The BT.2020 primaries do not look correct to me; the image is very bright and washed out. When I switch to BT.709 it looks closer to the output of the AVPlayer. The AVPlayer by itself doesn't look terrible, just a little dark maybe. When I leave out the outputSettings and let the AVPlayerItemVideoOutput choose its own color settings, it appears to choose BT.2020 as well.

Is it enough to put in these outputSettings and expect an RGB pixel buffer that perfectly matches those settings? Or do I have to capture in YUV and do all of the conversion manually? Am I misunderstanding something related to color settings here? I am definitely not an expert. Thanks
Post not yet marked as solved
0 Replies
172 Views
https://developer.apple.com/videos/play/wwdc2023/10235/

In this WWDC session, at 3:19, Apple introduced the "other audio ducking" feature. In iOS 17, we can control the amount of 'other audio' ducking through AVAudioEngine. Is this also possible with AVAudioSession?

We are using an AVAudioSession for a VoIP call while concurrently attempting to play a video through an AVPlayer. However, the volume of the AVPlayer is considerably low. Does anyone have any ideas on how to achieve the level of control that AVAudioEngine offers?
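For reference, a minimal sketch of the AVAudioEngine API that the session introduces, assuming iOS 17 and a voice-processing-enabled input node; whether an equivalent control exists on AVAudioSession is exactly the open question here.

```swift
import AVFAudio

let engine = AVAudioEngine()

do {
    // The ducking configuration only applies when voice processing is enabled.
    try engine.inputNode.setVoiceProcessingEnabled(true)
} catch {
    print("Voice processing unavailable: \(error)")
}

// Reduce how much 'other audio' (e.g. an AVPlayer) is ducked during the call.
engine.inputNode.voiceProcessingOtherAudioDuckingConfiguration =
    AVAudioVoiceProcessingOtherAudioDuckingConfiguration(
        enableAdvancedDucking: true,
        duckingLevel: .min)
```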