My iPhone 12 is unable to open apps. It can still use Apple services, but I can't swipe the screen. I've tried to restart the phone, but that doesn't work either. What's wrong with my phone?
Currently I am getting depth data from the delegate. I even converted it to a CIImage to check its output, and it is grayscale as expected, but I cannot append that pixel buffer to an AVAssetWriterInputPixelBufferAdaptor, because when I try to save the result to the photo gallery I get the error below.
Error:
The operation couldn’t be completed. (PHPhotosErrorDomain error 3302.)
Setup:
private let videoDeviceDiscoverySession = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.builtInWideAngleCamera, .builtInDualCamera, .builtInTrueDepthCamera, .builtInDualWideCamera],
    mediaType: .video,
    position: .back)
I tried both video pixel formats:
videoDataOutput!.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_DepthFloat16]
videoDataOutput!.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String: kCMPixelFormat_422YpCbCr8]
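One possible cause (an assumption, not confirmed by the error alone): AVAssetWriter's H.264 encoder cannot consume kCVPixelFormatType_DepthFloat16 buffers directly, so the written movie is invalid and Photos rejects it with error 3302. A sketch of one workaround is to render the depth map into a BGRA buffer from the adaptor's pool before appending; `appendDepth` is a hypothetical helper name.

```swift
import AVFoundation
import CoreImage

// Hypothetical helper: renders a DepthFloat16 depth map into a 32BGRA pixel
// buffer drawn from the adaptor's pool, a format the asset writer can encode.
// Assumes the adaptor was created with kCVPixelFormatType_32BGRA in its
// sourcePixelBufferAttributes.
func appendDepth(_ depthData: AVDepthData,
                 at time: CMTime,
                 using adaptor: AVAssetWriterInputPixelBufferAdaptor,
                 context: CIContext) {
    guard adaptor.assetWriterInput.isReadyForMoreMediaData,
          let pool = adaptor.pixelBufferPool else { return }

    var renderTarget: CVPixelBuffer?
    CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, pool, &renderTarget)
    guard let output = renderTarget else { return }

    // Draw the grayscale depth image into the BGRA buffer, then append it.
    let depthImage = CIImage(cvPixelBuffer: depthData.depthDataMap)
    context.render(depthImage, to: output)
    adaptor.append(output, withPresentationTime: time)
}
```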
Hi,
Is it possible to create a 3D model by ingesting a video, the way we can with pictures?
Is there any API provided by Apple for this?
Any help is appreciated.
Thanks
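As far as I know, PhotogrammetrySession only accepts a folder of images (or a custom sample sequence), not a video, so the usual workaround is to extract still frames from the video first. A sketch, assuming a macOS machine that supports Object Capture; the function name and frame count are illustrative:

```swift
import AVFoundation
import ImageIO
import UniformTypeIdentifiers

// Extract evenly spaced frames from a video into a folder of JPEGs,
// which can then be fed to PhotogrammetrySession as its image directory.
func extractFrames(from videoURL: URL, into folder: URL, count: Int) throws {
    let asset = AVAsset(url: videoURL)
    let generator = AVAssetImageGenerator(asset: asset)
    generator.appliesPreferredTrackTransform = true
    let duration = CMTimeGetSeconds(asset.duration)
    for i in 0..<count {
        let time = CMTime(seconds: duration * Double(i) / Double(count),
                          preferredTimescale: 600)
        let cgImage = try generator.copyCGImage(at: time, actualTime: nil)
        let fileURL = folder.appendingPathComponent(String(format: "frame_%04d.jpg", i))
        guard let dest = CGImageDestinationCreateWithURL(
            fileURL as CFURL, UTType.jpeg.identifier as CFString, 1, nil) else { continue }
        CGImageDestinationAddImage(dest, cgImage, nil)
        CGImageDestinationFinalize(dest)
    }
}

// The folder can then be used as the session input:
// let session = try PhotogrammetrySession(input: folder)
```

Note that photogrammetry quality depends on sharp, overlapping views, so frames with motion blur may need to be filtered out.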
Is the API from "Creating 3D models with Object Capture" currently only working in console apps?
It works perfectly in a console app, but not in a macOS GUI app.
I couldn't find anything about this in the documentation either.
In my GUI app, I get the following error: "Process got error: invalidRequest(RealityFoundation.PhotogrammetrySession.Request.modelFile(url: file:///Users/s***ik/Desktop/Pot_usdz/sam.usdz, detail: RealityFoundation.PhotogrammetrySession.Request.Detail.preview, geometry: nil), ".modelFile directory path file:///Users/snayvik/Desktop/Pot_usdz/ cannot be written to!")"
Any help is appreciated.
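A likely explanation (an assumption based on the "cannot be written to" message): unlike a console app, a macOS GUI app is usually sandboxed and has no write access to ~/Desktop. A sketch of one fix is to let the user pick the output folder, so the app receives a security-scoped URL it may write to; `pickWritableOutputFolder` is an illustrative name:

```swift
import AppKit

// Present an open panel so the user grants the sandboxed app write access
// to a folder of their choosing; .modelFile output can then target this URL.
func pickWritableOutputFolder() -> URL? {
    let panel = NSOpenPanel()
    panel.canChooseDirectories = true
    panel.canChooseFiles = false
    panel.canCreateDirectories = true
    panel.prompt = "Choose Output Folder"
    guard panel.runModal() == .OK, let url = panel.url else { return nil }
    // To keep access across launches, save a security-scoped bookmark and
    // call startAccessingSecurityScopedResource() before writing.
    return url
}
```

Alternatively, disabling the App Sandbox entitlement (for local testing only) would also confirm whether sandboxing is the cause.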
I want to record the TrueDepth or Dual camera's depth data output while recording the video data. I have already managed to get the AVCaptureDepthDataOutput object and display it in real time, but I also need the depth to be recorded as an individual AVMediaTypeVideo or AVMediaTypeMetadata track in the movie, and to read it back for post-processing.
Instead of AVCaptureMovieFileOutput, I use an AVAssetWriter with an AVAssetWriterInputPixelBufferAdaptor to append pixel buffers. I have tried to append the streaming depth to a normal AVAssetWriterInput with AVVideoCodecTypeH264, but it failed.
Is it possible to append the depth data buffer the same way as video data, or is there another way of doing it?
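A sketch of one approach, under the assumption that the H.264 append fails because DepthFloat16 is not an encodable pixel format: add a second video input to the same AVAssetWriter for depth, and only append buffers already converted to a supported format such as 32BGRA. The helper name is illustrative:

```swift
import AVFoundation

// Create a second video track on the writer dedicated to the depth stream.
// The adaptor's pool vends 32BGRA buffers; the DepthFloat16 map must be
// converted into one of these (e.g. via CIContext) before appending.
func makeDepthTrack(for writer: AVAssetWriter, width: Int, height: Int)
    -> (AVAssetWriterInput, AVAssetWriterInputPixelBufferAdaptor) {
    let settings: [String: Any] = [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: width,
        AVVideoHeightKey: height
    ]
    let input = AVAssetWriterInput(mediaType: .video, outputSettings: settings)
    input.expectsMediaDataInRealTime = true

    let attrs: [String: Any] = [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA,
        kCVPixelBufferWidthKey as String: width,
        kCVPixelBufferHeightKey as String: height
    ]
    let adaptor = AVAssetWriterInputPixelBufferAdaptor(
        assetWriterInput: input, sourcePixelBufferAttributes: attrs)
    writer.add(input)
    return (input, adaptor)
}
```

One caveat: converting to 8-bit BGRA quantizes the depth values, so if full float precision must survive round-tripping, a timed-metadata track or a sidecar file may be a better fit than a compressed video track.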
Can the HelloPhotogrammetry app export files as .glb?