Post not yet marked as solved
All apps need to use AUVoiceIO in order to use Mic Modes, but what exactly is AUVoiceIO?
I searched the Apple Developer documentation, but there is no description of AUVoiceIO anywhere.
Why doesn't the Apple Developer documentation include a description of AUVoiceIO?
How is it possible to use AUVoiceIO without it?
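For what it's worth, "AUVoiceIO" appears to refer to the Voice-Processing I/O audio unit (kAudioUnitSubType_VoiceProcessingIO). If that reading is right, one way to opt into it from a higher-level API is through AVAudioEngine's input node, which is what surfaces the system Mic Mode UI on iOS 15 and later. A minimal sketch, assuming an AVAudioEngine-based capture pipeline:

```swift
import AVFoundation

let engine = AVAudioEngine()

do {
    // Enables the Voice-Processing I/O unit (AUVoiceIO) on the input node.
    // Must be called before the engine is started.
    try engine.inputNode.setVoiceProcessingEnabled(true)
} catch {
    print("Could not enable voice processing: \(error)")
}

// Mic audio delivered here has passed through AUVoiceIO.
let format = engine.inputNode.outputFormat(forBus: 0)
engine.inputNode.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
    // Process or record the voice-processed audio.
}

try? engine.start()
```

Apps that capture audio this way should see the Mic Mode option appear in Control Center while recording.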
I want to record the TrueDepth or Dual camera's depth data output while recording the video data. I have already managed to get the AVCaptureDepthDataOutput object and display it in real time, but I also need the depth to be recorded as an individual AVMediaTypeVideo or AVMediaTypeMetadata track in the movie, and to read it back for post-processing.
Instead of using AVCaptureMovieFileOutput, I use AVAssetWriter and an AVAssetWriterInputPixelBufferAdaptor to append pixel buffers. I have tried appending the streaming depth to a normal AVAssetWriterInput with AVVideoCodecTypeH264, but that failed.
Is it possible to append depth data buffers the same way as video data, or is there another way of doing it?
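One likely reason the H.264 append fails is the pixel format: AVDepthData maps are typically DepthFloat16/32, which the video encoders don't accept directly. A hedged sketch of one possible approach, converting the depth map to a single consistent format before appending it through the adaptor (the writer input and adaptor setup are assumed to exist elsewhere in the pipeline):

```swift
import AVFoundation

// Hypothetical helper: append one streaming AVDepthData frame to an
// AVAssetWriter track via a pixel buffer adaptor. Assumes the adaptor
// was created with a sourcePixelBufferAttributes entry of
// kCVPixelFormatType_DepthFloat16.
func append(_ depthData: AVDepthData,
            at time: CMTime,
            to adaptor: AVAssetWriterInputPixelBufferAdaptor) {
    // Normalize every frame to the same depth data type first;
    // the camera can deliver mixed formats.
    let converted = depthData.converting(
        toDepthDataType: kCVPixelFormatType_DepthFloat16)
    let depthMap: CVPixelBuffer = converted.depthDataMap

    if adaptor.assetWriterInput.isReadyForMoreMediaData {
        adaptor.append(depthMap, withPresentationTime: time)
    }
}
```

If the encoder still rejects the float depth map, an alternative is rendering it to an 8-bit grayscale buffer before appending, or writing the depth as a timed metadata track via AVAssetWriterInputMetadataAdaptor instead of a video track.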
Hi everyone,
I'm working on a camera app, but there is an error and I don't know how to fix it. Can anyone help me?
Thanks,
Robby Flockman
Hi,
I searched all over the documentation but didn't find anything about this. Is there a method to save a frame as a UIImage from an AVCaptureSession?
Basically, what I'm trying to do is save an image as a preview of a capture session's output.
Thank you in advance!
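There's no one-call API for this, but one common approach is to attach an AVCaptureVideoDataOutput to the session and convert a delivered sample buffer to a UIImage. A sketch, assuming the output has been added to the session and this object set as its delegate:

```swift
import AVFoundation
import UIKit

// Grabs frames from an AVCaptureVideoDataOutput and keeps the most
// recent one as a UIImage, usable as a session preview snapshot.
final class FrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let context = CIContext()
    var latestImage: UIImage?

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Pull the raw pixel buffer out of the sample buffer.
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
        // Render to a CGImage, then wrap it as a UIImage.
        guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else { return }
        latestImage = UIImage(cgImage: cgImage)
    }
}
```

If you only need a single still rather than continuous frames, AVCapturePhotoOutput is the other option: capture one photo and use its fileDataRepresentation() to build the UIImage.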
I am a new iOS developer trying to use the media picker and the default camera to record slow-motion video. I am able to record video, but at 30 fps, not the 240 fps I want. Why doesn't the Slow-Mo option show up in the UI when the picker is displayed? How do I get the Slow-Mo option to appear like it does in the standard Camera app? Is that how this is supposed to work, or is there a code-based approach to configuration that will allow slow-motion video to be recorded?
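As far as I know, the system picker (UIImagePickerController) doesn't expose the Camera app's Slow-Mo mode at all; to capture at 240 fps you generally build your own AVCaptureSession and select a high-frame-rate format on the device. A sketch under that assumption:

```swift
import AVFoundation

// Locks the capture device onto the first format that supports
// at least 240 fps and pins the frame duration to 1/240 s.
func configureForSlowMotion(_ device: AVCaptureDevice) throws {
    for format in device.formats {
        let supports240 = format.videoSupportedFrameRateRanges
            .contains { $0.maxFrameRate >= 240 }
        guard supports240 else { continue }

        try device.lockForConfiguration()
        device.activeFormat = format
        let frameDuration = CMTime(value: 1, timescale: 240)
        device.activeVideoMinFrameDuration = frameDuration
        device.activeVideoMaxFrameDuration = frameDuration
        device.unlockForConfiguration()
        return
    }
    // No 240 fps format found on this device.
}
```

The session then records at 240 fps through an AVCaptureMovieFileOutput or AVAssetWriter; the "slow motion" look comes from playing those frames back at normal speed.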
I am trying to show "Mic Mode" while I am capturing audio in my app, but it is not shown.
Is there example code, or is there an option I need to include in Info.plist to opt in to "Mic Mode"?
I can only see "Camera - Opt-in for Portrait Effect"; there is no option for "Mic Mode".
@AVFoundationEngineers
I am trying to observe the isCenterStageEnabled property as follows:
AVCaptureDevice.self.addObserver(self, forKeyPath: "isCenterStageEnabled", options: [.initial, .new], context: &CapturePipeline.centerStageContext)
I have set centerStageControlMode to .cooperative.
The KVO fires only when I change the AVCaptureDevice.isCenterStageEnabled property in my own code. It is NOT fired when the user toggles Center Stage from Control Center. Is this a bug?
WWDC 2021 session 10047 recommends observing changes to AVCaptureDevice.isCenterStageEnabled, which is a class property. But how exactly do we observe a class property in Swift?
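One way this can be done is by registering the observer on the class object itself (`AVCaptureDevice.self`). A sketch, assuming the Objective-C key path for the Swift property `isCenterStageEnabled` is "centerStageEnabled":

```swift
import AVFoundation

// Observes the AVCaptureDevice *class* property isCenterStageEnabled
// by adding a KVO observer on the class object.
final class CenterStageObserver: NSObject {
    private var centerStageContext = 0

    override init() {
        super.init()
        AVCaptureDevice.self.addObserver(self,
                                         forKeyPath: "centerStageEnabled",
                                         options: [.initial, .new],
                                         context: &centerStageContext)
    }

    override func observeValue(forKeyPath keyPath: String?,
                               of object: Any?,
                               change: [NSKeyValueChangeKey: Any]?,
                               context: UnsafeMutableRawPointer?) {
        guard context == &centerStageContext else {
            super.observeValue(forKeyPath: keyPath, of: object,
                               change: change, context: context)
            return
        }
        print("Center Stage enabled: \(AVCaptureDevice.isCenterStageEnabled)")
    }

    deinit {
        AVCaptureDevice.self.removeObserver(self,
                                            forKeyPath: "centerStageEnabled",
                                            context: &centerStageContext)
    }
}
```

Whether this also fires for Control Center toggles (the issue in the previous post) I can't confirm; the sketch only shows the class-property observation mechanics.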
In the WWDC 2021 video 10047, it was mentioned to check for availability of the lossless CVPixelBuffer format and fall back to the normal BGRA32 format if it is not available. But in the updated AVMultiCamPiP sample code, it first looks for the lossy format, then the lossless one. Why is that, and what exact difference does it make whether we select lossy or lossless?
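For reference, the preference order described for the updated sample can be sketched like this (using the 32BGRA variants as an assumed example; the sample may use the biplanar YCbCr equivalents):

```swift
import AVFoundation

// Picks a pixel format in the order the updated AVMultiCamPiP sample
// reportedly uses: lossy compressed first, then lossless compressed,
// then plain uncompressed 32BGRA as the fallback.
func preferredPixelFormat(for output: AVCaptureVideoDataOutput) -> OSType {
    let available = output.availableVideoPixelFormatTypes
    if available.contains(kCVPixelFormatType_Lossy_32BGRA) {
        return kCVPixelFormatType_Lossy_32BGRA
    }
    if available.contains(kCVPixelFormatType_Lossless_32BGRA) {
        return kCVPixelFormatType_Lossless_32BGRA
    }
    return kCVPixelFormatType_32BGRA
}
```

My understanding is that both compressed variants reduce memory bandwidth; the lossy one reduces it further at the cost of not being bit-exact, which is why a sample tuned for multi-cam performance might prefer it. That is an inference, not something the session states outright.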
After calling captureTextFromCamera: on a UIResponder object multiple times, the Live Text interface can no longer be dismissed.
Portrait mode does not work when I use the AVCam sample. I wonder if there are any examples of this feature?
Calling AVFoundation's setPreferredPolarPattern with AVAudioSessionPolarPatternStereo returns no audio on iOS 15. AVAudioSessionPolarPatternOmnidirectional works as expected.
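Stereo capture needs more setup than the other polar patterns, which may explain the silence: the pattern is set on a specific data source of the built-in mic, and the input orientation should be set too. A sketch of the full sequence, under those assumptions:

```swift
import AVFoundation

// Configures the audio session for stereo capture on iOS 14+/15.
func enableStereoCapture() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord, mode: .default)

    // Find the built-in microphone input.
    guard let builtInMic = session.availableInputs?
        .first(where: { $0.portType == .builtInMic }) else { return }

    // Find a data source on that mic that actually supports stereo.
    guard let stereoSource = builtInMic.dataSources?
        .first(where: { $0.supportedPolarPatterns?.contains(.stereo) == true })
    else { return }

    // The polar pattern is set on the data source, not the session.
    try stereoSource.setPreferredPolarPattern(.stereo)
    try builtInMic.setPreferredDataSource(stereoSource)
    try session.setPreferredInput(builtInMic)

    // Stereo needs to know how the device is held.
    try session.setPreferredInputOrientation(.portrait)

    try session.setActive(true)
}
```

If any of those steps is skipped (particularly selecting a stereo-capable data source first), setting the stereo pattern alone could plausibly yield no audio.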
Can Center Stage work with any app that has video? I am building an app that uses video for presentations, which can be live, recorded, or live-streamed. Not quite a videoconferencing app, but close.
Thanks,
Chris
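As far as I can tell, Center Stage isn't limited to conferencing apps; any app capturing from a supported front camera can opt in through the AVCaptureDevice class properties. A minimal sketch:

```swift
import AVFoundation

// Opt the app into Center Stage. With .cooperative, both the app and
// the user (via Control Center) can control it.
AVCaptureDevice.centerStageControlMode = .cooperative
AVCaptureDevice.isCenterStageEnabled = true

if let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                        for: .video,
                                        position: .front) {
    // True only while the device's active format supports Center Stage.
    print("Center Stage active: \(device.isCenterStageActive)")
}
```

Whether it engages also depends on hardware (a Center Stage-capable camera) and the selected capture format, so a presentation app on supported devices should be able to use it like any other.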
I am seeing crashes in CameraUI at -[CAMPriorityNotificationCenter _removeObserver:fromObserversByName:] reported on Firebase. Here is the stack trace:
Fatal Exception: NSGenericException
*** Collection <__NSArrayM: 0x28024a0a0> was mutated while being enumerated.
0   CoreFoundation           __exceptionPreprocess
2   CoreFoundation           -[__NSSingleObjectEnumerator initWithObject:collection:]
3   CameraUI                 -[CAMPriorityNotificationCenter _removeObserver:fromObserversByName:]
4   CameraUI                 -[CAMPriorityNotificationCenter removeObserver:]
13  libsystem_pthread.dylib  start_wqthread

Crashed: com.google.firebase.crashlytics.ios.exception
SIGABRT ABORT 0x00000001c1eb7334
0   FirebaseCrashlytics      FIRCLSProcessRecordAllThreads + 393 (FIRCLSProcess.c:393)
1   FirebaseCrashlytics      FIRCLSProcessRecordAllThreads + 424 (FIRCLSProcess.c:424)
2   FirebaseCrashlytics      FIRCLSHandler + 34 (FIRCLSHandler.m:34)
3   FirebaseCrashlytics      __FIRCLSExceptionRecord_block_invoke + 218 (FIRCLSException.mm:218)
4   libdispatch.dylib        _dispatch_client_callout + 20
5   libdispatch.dylib        _dispatch_lane_barrier_sync_invoke_and_complete + 60
6   FirebaseCrashlytics      FIRCLSExceptionRecord + 225 (FIRCLSException.mm:225)
7   FirebaseCrashlytics      FIRCLSExceptionRecordNSException + 111 (FIRCLSException.mm:111)
8   FirebaseCrashlytics      FIRCLSTerminateHandler() + 279 (FIRCLSException.mm:279)
9   libc++abi.dylib          std::__terminate(void (*)()) + 20
24  libsystem_pthread.dylib  start_wqthread + 8
This started happening in 14.4 and above; it was working fine in previous versions, and I have recently seen an increase in this crash. Is it an OS issue, or am I doing something wrong?