What’s new in camera capture


Discuss the WWDC21 session What’s new in camera capture.


Posts under wwdc21-10047 tag

14 Posts
Post not yet marked as solved
1 Reply
204 Views
All apps need to use AUVoiceIO in order to use Mic Modes, but what is AUVoiceIO? I searched the Apple Developer documentation, but there was no description of AUVoiceIO anywhere. Why doesn't the documentation include a description of AUVoiceIO? And how is it possible to use AUVoiceIO without one?
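"AUVoiceIO" in the session refers to the Voice-Processing I/O audio unit (`kAudioUnitSubType_VoiceProcessingIO`), which is documented under Audio Unit types rather than under that nickname. A minimal sketch of opting into it from the higher-level AVAudioEngine API, assuming an already-configured `.playAndRecord` audio session:

```swift
import AVFoundation

// Sketch: enabling voice processing on AVAudioEngine's input node routes
// capture through the Voice-Processing I/O unit (AUVoiceIO), which provides
// echo cancellation and makes the app eligible for system Mic Modes.
let engine = AVAudioEngine()
do {
    try engine.inputNode.setVoiceProcessingEnabled(true)
    try engine.start()
} catch {
    print("Failed to enable voice processing: \(error)")
}
```

`setVoiceProcessingEnabled(_:)` must be called before the engine is started; enabling it on the input node enables it for the paired output node as well.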
Post marked as solved
1 Reply
747 Views
I want to record the TrueDepth or Dual camera's depth data output while recording the video data. I have already managed to get the AVCaptureDepthDataOutput object and display it in real time, but I also need the depth to be recorded as an individual AVMediaTypeVideo or AVMediaTypeMetadata track in the movie, and to read it back for post-processing. Instead of using AVCaptureMovieFileOutput, I use an AVAssetWriter with an AVAssetWriterInputPixelBufferAdaptor to append pixel buffers. I have tried to append the streaming depth to a normal AVAssetWriterInput with AVVideoCodecTypeH264, but it failed. Is it possible to append depth data buffers the same way as video data, or is there another way of doing it?
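One likely reason the H.264 append fails is that depth maps arrive as `DepthFloat16`/`DepthFloat32` pixel buffers, which video encoders do not accept. A hedged sketch of the append step, assuming a hypothetical `depthAdaptor` attached to a second writer input and a caller-supplied timestamp:

```swift
import AVFoundation

// Sketch: appending AVDepthData frames through an adaptor on a separate
// AVAssetWriterInput. Writer/adaptor setup and session timing are assumed
// to exist elsewhere; `depthAdaptor` and `presentationTime` are hypothetical.
func append(depthData: AVDepthData,
            to depthAdaptor: AVAssetWriterInputPixelBufferAdaptor,
            at presentationTime: CMTime) {
    // Normalize to 32-bit float so the values are in a known format...
    let converted = depthData.converting(toDepthDataType: kCVPixelFormatType_DepthFloat32)
    let depthMap = converted.depthDataMap
    // ...then append. For an H.264/HEVC track you would instead render this
    // float map into a BGRA or grayscale buffer before appending, since
    // encoders reject float depth formats.
    if depthAdaptor.assetWriterInput.isReadyForMoreMediaData {
        depthAdaptor.append(depthMap, withPresentationTime: presentationTime)
    }
}
```

Rendering the map into an encoder-friendly format loses precision, so many apps keep the raw depth in a sidecar file or a timed-metadata track instead.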
Post not yet marked as solved
0 Replies
199 Views
Hi, I searched all over the documentation, but I didn't find anything about this. Is there a way to save a frame from an AVCaptureSession as a UIImage? Basically, what I'm trying to do is save an image as a preview of a capture session's output. Thanks in advance!
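One common approach is to attach an AVCaptureVideoDataOutput to the session and convert a delivered sample buffer to a UIImage. A minimal sketch, assuming the session and output wiring exist elsewhere:

```swift
import AVFoundation
import UIKit

// Sketch: converting frames from AVCaptureVideoDataOutput into UIImages.
// Add an instance of this class as the output's sample buffer delegate.
final class FrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let context = CIContext()   // reuse; creating one per frame is costly
    var latestImage: UIImage?

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
        if let cgImage = context.createCGImage(ciImage, from: ciImage.extent) {
            latestImage = UIImage(cgImage: cgImage)
        }
    }
}
```

For a one-shot still at full quality, AVCapturePhotoOutput is the more idiomatic choice; the video-data route is better when you want to grab an arbitrary preview frame.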
Post not yet marked as solved
0 Replies
394 Views
I am a new iOS developer trying to use the media picker and default camera to record slow-motion video. I am able to record video, but it's 30 fps, not the 240 fps I want. Why doesn't the "Slo-mo" option show up in the UI when the picker is displayed? How do I get the "Slo-mo" option to show up as it does in the standard Camera app? Is this how it is supposed to work, or is there a code-based way to configure slow-motion recording?
Posted by ruwill.
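UIImagePickerController does not surface the Camera app's Slo-mo mode, so high-frame-rate capture has to go through AVCaptureSession by picking a device format that supports 240 fps. A sketch, assuming `device` is the already-selected back camera:

```swift
import AVFoundation

// Sketch: selecting a 240 fps capture format on an AVCaptureDevice.
// "Slow motion" is just high-frame-rate capture played back at normal speed.
func configureForSlowMotion(device: AVCaptureDevice) throws {
    // Find a format whose supported frame-rate ranges reach 240 fps.
    guard let format = device.formats.first(where: { fmt in
        fmt.videoSupportedFrameRateRanges.contains { $0.maxFrameRate >= 240 }
    }) else { return }

    try device.lockForConfiguration()
    device.activeFormat = format
    let frameDuration = CMTime(value: 1, timescale: 240)
    device.activeVideoMinFrameDuration = frameDuration
    device.activeVideoMaxFrameDuration = frameDuration
    device.unlockForConfiguration()
}
```

The recorded movie then plays at normal speed; the slow-motion effect comes from retiming the high-frame-rate track at playback or export.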
Post not yet marked as solved
0 Replies
295 Views
I am trying to show "Mic Mode" while capturing audio in my app, but it is not shown. Is there example code, or what option do I need to include in Info.plist to opt in to "Mic Mode"? I can only see "Camera - Opt-in for Portrait Effect"; there are no options for "Mic Mode".
Posted by Andykkt74.
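As best I can tell from the session, there is no Info.plist key for Mic Mode; the system offers it in Control Center only while the app is capturing audio through the voice-processing path. A hedged sketch of opting in and reading the user's selection:

```swift
import AVFoundation

// Sketch: Mic Modes become available once capture goes through the
// voice-processing audio path; here via AVAudioEngine.
let engine = AVAudioEngine()
do {
    try engine.inputNode.setVoiceProcessingEnabled(true)
    try engine.start()
} catch {
    print("Failed to start voice-processed capture: \(error)")
}

// The user's Control Center choice is exposed as class properties (iOS 15+):
print("Active mic mode:", AVCaptureDevice.activeMicrophoneMode.rawValue)
print("Preferred mic mode:", AVCaptureDevice.preferredMicrophoneMode.rawValue)
```

The "Portrait Effect" opt-in you see in Xcode is a separate, camera-side feature; the microphone side is gated on the audio path rather than a plist key.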
Post not yet marked as solved
0 Replies
296 Views
@AVFoundationEngineers I am trying to observe the isCenterStageEnabled property as follows: AVCaptureDevice.self.addObserver(self, forKeyPath: "isCenterStageEnabled", options: [.initial, .new], context: &CapturePipeline.centerStageContext) I have set centerStageControlMode to .cooperative. The KVO fires only when I change AVCaptureDevice.isCenterStageEnabled in my code; it does NOT fire when the user toggles Center Stage from Control Center. Is this a bug?
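One thing worth checking before calling it a bug: the Objective-C property name, and therefore the KVO key path, is "centerStageEnabled" ("isCenterStageEnabled" is the Swift getter spelling, not a valid key path). A sketch of observing the class object with the ObjC-style key path; whether Control Center toggles are delivered through it is an assumption to verify on device:

```swift
import AVFoundation

// Sketch: string-based KVO on the AVCaptureDevice class object for the
// Center Stage class property, using the ObjC key path "centerStageEnabled".
final class CenterStageObserver: NSObject {
    private static var context = 0

    override init() {
        super.init()
        AVCaptureDevice.self.addObserver(self,
                                         forKeyPath: "centerStageEnabled",
                                         options: [.initial, .new],
                                         context: &Self.context)
    }

    override func observeValue(forKeyPath keyPath: String?,
                               of object: Any?,
                               change: [NSKeyValueChangeKey: Any]?,
                               context: UnsafeMutableRawPointer?) {
        guard context == &Self.context else { return }
        print("Center Stage enabled:", AVCaptureDevice.isCenterStageEnabled)
    }

    deinit {
        AVCaptureDevice.self.removeObserver(self, forKeyPath: "centerStageEnabled")
    }
}
```

If the correct key path still never fires on a Control Center toggle, filing a Feedback with a minimal project is the usual next step.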
Post not yet marked as solved
1 Reply
413 Views
In WWDC 2021 session 10047, it was mentioned to look for the availability of a lossless CVPixelBuffer format and fall back to the normal BGRA32 format if it is not available. But in the updated AVMultiCamPiP sample code, it first looks for the lossy format rather than the lossless one. Why is that, and what exact difference does selecting lossy vs. lossless make?
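The preference-ordered fallback described above can be sketched as follows; the ordering (lossy first) reflects the sample's apparent priority of memory bandwidth over bit-exactness, which is an interpretation rather than a documented rationale:

```swift
import AVFoundation

// Sketch: pick the most bandwidth-efficient pixel format the output supports,
// falling back to uncompressed BGRA. Lossy compression is visually lossless
// but not bit-exact; lossless compression is bit-exact but saves less memory.
func preferredPixelFormat(for output: AVCaptureVideoDataOutput) -> OSType {
    let candidates: [OSType] = [
        kCVPixelFormatType_Lossy_32BGRA,     // smallest memory footprint
        kCVPixelFormatType_Lossless_32BGRA,  // compressed, bit-exact
        kCVPixelFormatType_32BGRA            // uncompressed fallback
    ]
    let available = output.availableVideoPixelFormatTypes
    return candidates.first(where: available.contains) ?? kCVPixelFormatType_32BGRA
}
```

For on-screen preview the two compressed variants are indistinguishable; lossless matters mainly when the pixels feed further processing that must be deterministic.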
Post not yet marked as solved
0 Replies
419 Views
Can Center Stage work with any app that has video? I am building an app that uses video as a presentation, which can be live, recorded, or live-streamed. Not quite a videoconferencing app, but close. Thanks, Chris
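Center Stage is not restricted to conferencing apps; any app capturing from a supported front camera can opt in through the AVCaptureDevice class properties. A minimal sketch:

```swift
import AVFoundation

// Sketch: opting a capture app into Center Stage. In .cooperative mode both
// the app and the user (via Control Center) can toggle the feature; setting
// isCenterStageEnabled directly requires .app or .cooperative control mode.
AVCaptureDevice.centerStageControlMode = .cooperative
AVCaptureDevice.isCenterStageEnabled = true
```

Whether framing actually engages still depends on the device and on the active format supporting Center Stage.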
Post not yet marked as solved
0 Replies
896 Views
I am seeing crashes on Firebase in CameraUI's -[CAMPriorityNotificationCenter _removeObserver:fromObserversByName:]. Here is the stack trace:

Fatal Exception: NSGenericException
*** Collection <__NSArrayM: 0x28024a0a0> was mutated while being enumerated.
0   CoreFoundation            __exceptionPreprocess
2   CoreFoundation            -[__NSSingleObjectEnumerator initWithObject:collection:]
3   CameraUI                  -[CAMPriorityNotificationCenter _removeObserver:fromObserversByName:]
4   CameraUI                  -[CAMPriorityNotificationCenter removeObserver:]
13  libsystem_pthread.dylib   start_wqthread

Crashed: com.google.firebase.crashlytics.ios.exception
SIGABRT ABORT 0x00000001c1eb7334
0   FirebaseCrashlytics   FIRCLSProcess.c - Line 393    FIRCLSProcessRecordAllThreads + 393
1   FirebaseCrashlytics   FIRCLSProcess.c - Line 424    FIRCLSProcessRecordAllThreads + 424
2   FirebaseCrashlytics   FIRCLSHandler.m - Line 34     FIRCLSHandler + 34
3   FirebaseCrashlytics   FIRCLSException.mm - Line 218 __FIRCLSExceptionRecord_block_invoke + 218
4   libdispatch.dylib     _dispatch_client_callout + 20
5   libdispatch.dylib     _dispatch_lane_barrier_sync_invoke_and_complete + 60
6   FirebaseCrashlytics   FIRCLSException.mm - Line 225 FIRCLSExceptionRecord + 225
7   FirebaseCrashlytics   FIRCLSException.mm - Line 111 FIRCLSExceptionRecordNSException + 111
8   FirebaseCrashlytics   FIRCLSException.mm - Line 279 FIRCLSTerminateHandler() + 279
9   libc++abi.dylib       std::__terminate(void (*)()) + 20
24  libsystem_pthread.dylib   start_wqthread + 8

This started happening in 14.4 and above; it was working fine in previous versions, and I am recently seeing an increase in this crash. Is this an OS issue, or am I doing something wrong?