Discuss using the camera on Apple devices.

Posts under Camera tag

126 Posts
Post not yet marked as solved
1 Reply
179 Views
Hello! I am having trouble calculating accurate real-world distances using the camera's returned intrinsic matrix and pixel coordinates/depths captured from the iPhone's LiDAR. For example, in the image below, I set a mug 0.5 m from the phone. The mug is 8.5 cm wide. The intrinsic matrix returned from the phone's AVCameraCalibrationData class has focalx = 1464.9269, focaly = 1464.9269, cx = 960.94916, and cy = 686.3547. Selecting the two pixel locations denoted in the image below, I calculated each one's xyz coordinates using the formula:

x = d * (u - cx) / focalx
y = d * (v - cy) / focaly
z = d

where d is the depth from the corresponding pixel in the depth map; I've verified that both depths were 0.5 m. I then calculate the distance between the two points to get the mug width. This gives me a calculated width of 0.0357 m, or about 3.6 cm, instead of the 8.5 cm I was expecting. What could account for this discrepancy? Thank you so much for your help!
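For reference, the math from the post as a minimal sketch (plain Python standing in for the on-device Swift; the intrinsics are the values quoted above). One common cause of exactly this kind of scale error, offered here as an assumption to check rather than a confirmed diagnosis: AVCameraCalibrationData's intrinsics are expressed relative to intrinsicMatrixReferenceDimensions, so pixel coordinates picked in a lower-resolution depth map must be rescaled to those reference dimensions before back-projecting.

```python
import math

def back_project(u, v, depth, fx, fy, cx, cy):
    """Pinhole back-projection of pixel (u, v) at `depth` metres into camera space."""
    x = depth * (u - cx) / fx
    y = depth * (v - cy) / fy
    return (x, y, depth)

def dist(a, b):
    """Euclidean distance between two 3-D points."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

def scale_to_reference(u, v, map_width, ref_width):
    """Rescale depth-map pixel coordinates to the intrinsics' reference dimensions."""
    s = ref_width / map_width
    return (u * s, v * s)

# Intrinsics quoted in the post.
FX = FY = 1464.9269
CX, CY = 960.94916, 686.3547

# At 0.5 m with fx ~ 1465, an 8.5 cm mug should span roughly
# fx * 0.085 / 0.5 ~ 249 pixels at the reference resolution.
p1 = back_project(900.0, CY, 0.5, FX, FY, CX, CY)
p2 = back_project(1149.0, CY, 0.5, FX, FY, CX, CY)
print(round(dist(p1, p2), 4))  # 0.085
```

If the two pixels were selected in, say, a 768-wide depth map while the intrinsics reference a 1920-wide image, the pixel gap (and hence the computed width) shrinks by exactly that ratio, which is the shape of the discrepancy described above.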
Posted by mingm. Last updated.
Post not yet marked as solved
0 Replies
153 Views
At around the 5-minute mark of "Discover advancements in iOS camera capture: Depth, focus, and multitasking", you state that TrueDepth delivers relative depth. This appears to contradict the official documentation: in Capturing Photos with Depth, you state explicitly that the TrueDepth camera measures depth directly, with absolute accuracy. Why the change?
Posted by kaccie14. Last updated.
Post not yet marked as solved
20 Replies
13k Views
Hi! I recently bought the new iPhone 12 Pro Max. I have noticed that when I shoot videos in the dark (with the lights on in the house), some kind of flickering is visible in the video. Apparently very fast flickering of lights can make slow-mo videos show flickering that you cannot see with the naked eye; however, I have this problem with normal videos as well. I have compared it with videos on my iPhone X, and it is definitely worse in my iPhone 12 videos. I noticed that this happens while recording HD (or 4K) video at 60 FPS; if you switch to 30 FPS, it doesn't happen. Anyone else have this problem? It happens on iOS 14.2.1 and iOS 14.3 Beta 2. Thanks!
Posted. Last updated.
Post not yet marked as solved
5 Replies
177 Views
_streamSinkIn = [[CMIOExtensionStream alloc] initWithLocalizedName:localizedName
                                                          streamID:streamInputID
                                                         direction:CMIOExtensionStreamDirectionSink
                                                         clockType:CMIOExtensionStreamClockTypeHostTime
                                                            source:self];

I'm attempting to publish a CMIOExtensionStream with the 'sink' direction (i.e. print-to-tape), as alluded to in Brad Ford's presentation. Any attempt to create such a stream yields an invalid-argument exception, and if you examine the header files, all the init methods are described as returning stream instances that source data (i.e. camera publishers).

*** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: 'Invalid argument'
Posted by DrXibber. Last updated.
Post not yet marked as solved
1 Reply
110 Views
Hello folks! How can I get a real-world measurement between the device (iPad Pro, 5th gen) and an object using the LiDAR? Let's say I have a reticle in the middle of my camera view and want to measure precisely from my position to the point I'm aiming at, almost like Apple's Measure app. sceneDepth doesn't give me anything, and I also looked into the sample code "Capturing Depth Using the LiDAR Camera". Any ideas how to do that? A push in the right direction would also be very helpful. Thanks in advance!
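A hedged sketch of the geometry involved (Python standing in for on-device Swift, and a stand-in array standing in for the AVDepthData pixel buffer): sample the depth map at the pixel under the reticle, back-project it with the camera intrinsics, and take the Euclidean norm of the resulting 3-D point. When the reticle sits exactly at the principal point, the norm collapses to the raw depth value.

```python
import math

def distance_to_pixel(u, v, depth, fx, fy, cx, cy):
    """Euclidean distance from the camera origin to the 3-D point under pixel (u, v)."""
    x = depth * (u - cx) / fx
    y = depth * (v - cy) / fy
    return math.sqrt(x * x + y * y + depth * depth)

# Stand-in 5x5 depth map (metres); on-device this would come from the
# LiDAR depth buffer (e.g. AVDepthData's pixel buffer).
depth_map = [[1.2] * 5 for _ in range(5)]
h, w = len(depth_map), len(depth_map[0])
cu, cv = w // 2, h // 2          # reticle at the image centre
d = depth_map[cv][cu]

# With the reticle at the principal point, distance == depth exactly.
print(distance_to_pixel(cu, cv, d, fx=700.0, fy=700.0, cx=float(cu), cy=float(cv)))  # 1.2
```

The intrinsic values here (fx = fy = 700) are placeholders; in a real app they would come from the calibration data delivered with the depth frame. Note also that the depth map is usually much lower resolution than the video frame, so the reticle's screen position must be mapped into depth-map coordinates first.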
Posted by Erik13. Last updated.
Post not yet marked as solved
6 Replies
1.2k Views
I installed the iOS 16 beta to test the new features, and everything works well except the camera: when I access it, the screen is completely black. Initially I upgraded from iOS 15; after restarting and turning the phone off and on, I tried to restore the iPhone in different ways (from the Mac and from the iPhone) without solving the problem. I also tried restoring to iOS 15 to rule out a hardware problem, and there the camera worked correctly, so I upgraded back to iOS 16 and it stopped working again. The camera does not work in any application: not in Camera, iMessage, Halide, Instagram, WhatsApp, etc. The controls and buttons of the Camera app do work, but it doesn't take the photo or show anything.
Posted by xtianp87. Last updated.
Post not yet marked as solved
1 Reply
101 Views
How can I make sure my app on the iOS App Store shows compatibility only for devices that support AVCaptureMultiCamSession? I assume I need to add a key under "Required Device Capabilities" (UIRequiredDeviceCapabilities) in the Info.plist file, but which key? I didn't find one in the documentation: https://developer.apple.com/library/archive/documentation/General/Reference/InfoPlistKeyReference/Articles/iPhoneOSKeys.html#//apple_ref/doc/uid/TP40009252-SW3
Posted by LaXa. Last updated.
Post not yet marked as solved
1 Reply
243 Views
While trying to re-create the CIFilterCam demo shown in the WWDC session, I hit a roadblock when trying to access a hardware camera from inside my extension. Can I simply use an AVCaptureSession + AVCaptureDeviceInput + AVCaptureVideoDataOutput to get frames from an actual hardware camera and pass them to the extension's stream? If yes, when should I ask for camera access permissions? It seems the extension code is run as soon as I install the extension, but I never get prompted for access permission. Do I need to set up the capture session lazily? What's the best practice for this use case?
Posted. Last updated.
Post not yet marked as solved
0 Replies
66 Views
We use ARFaceTrackingConfiguration and ARKit for a magic-mirror-like experience in our apps, augmenting users' faces with digital content. On the iPad Pro (5th gen), customers are complaining that the camera image is too wide; I'm assuming that is because of the new wide-angle camera needed for Apple's Center Stage FaceTime calls. I have looked through Tracking and Visualizing Faces and the WWDC 2021 videos, but I must have missed any API that would let us disable the wide-angle behavior on the new iPads programmatically.
Posted by Bersaelor. Last updated.
Post not yet marked as solved
0 Replies
72 Views
Currently I'm using UIImagePickerController to allow our users to take photos within the app, like so:

UIImagePickerController *picker = [UIImagePickerController new];
picker.sourceType = UIImagePickerControllerSourceTypeCamera;
picker.mediaTypes = @[(NSString *)kUTTypeImage];
picker.delegate = self;
[self presentViewController:picker animated:YES completion:nil];

And I use the delegate method to get the image out and do what is needed:

-(void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary<NSString *,id> *)info {
    // Do stuff.
}

This works fine for 99.9% of our users, but for some reason we occasionally get an odd info dictionary with no image in it. When I print the info dictionary, it looks like this every time:

{
    UIImagePickerControllerMediaMetadata = {
        "{MakerApple}" = {
            25 = 0;
        };
    };
    UIImagePickerControllerMediaType = "public.image";
}

As you can see, there is no UIImagePickerControllerEditedImage or UIImagePickerControllerOriginalImage in that dictionary. Anyone have any ideas on what this is, and what I might be able to do to fix it?
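Whatever the root cause, the delegate shouldn't assume the image keys are present. A defensive lookup, sketched here in plain Python with string keys and a plain dict standing in for UIImagePickerController's info dictionary, avoids crashing and makes the fallback path explicit:

```python
def extract_image(info):
    """Return the edited image if present, else the original, else None.
    The keys mirror UIImagePickerControllerEditedImage / ...OriginalImage."""
    for key in ("UIImagePickerControllerEditedImage",
                "UIImagePickerControllerOriginalImage"):
        image = info.get(key)
        if image is not None:
            return image
    return None

# The malformed dictionary from the post: metadata and media type, no image.
bad_info = {
    "UIImagePickerControllerMediaMetadata": {"{MakerApple}": {"25": 0}},
    "UIImagePickerControllerMediaType": "public.image",
}
print(extract_image(bad_info))  # None -> fall back (e.g. ask the user to retake)
```

In the Objective-C delegate, the equivalent is checking both image keys with a plain nil test before casting, and treating the nil case as a recoverable error rather than force-unwrapping.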
Posted by Moff. Last updated.
Post not yet marked as solved
0 Replies
98 Views
Hi everyone, I'm trying to use builtinDualWideCamera. However, I'm seeing the "macro camera issue" when shooting closer objects: the lens is switched automatically. I see users can work around it with the "Macro Control" setting for the system camera. Is there a similar API that developers can use to disable the automatic lens switching? Thanks!
Posted by ios99. Last updated.
Post not yet marked as solved
1 Reply
224 Views
My macOS app (targeting Catalina only) uses the camera, mic, and screen recording. While developing the app, the system asks me for permission every time I rebuild and run it. This does not happen on iOS. Is there any way to prevent this? Secondly, when I distribute the app to other Macs, every build needs the consent re-affirmed. This doesn't seem like the way it should be. What could I be doing wrong?
Posted. Last updated.
Post not yet marked as solved
0 Replies
165 Views
Issue: after an update to macOS 12.4, my AVCaptureSession is using between 70-90% CPU. The code is just a simple capture session in a Mac Catalyst app using the Mac's webcam. The session calls an empty func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection), and nothing more happens in the test app I tried; there is no rendering of the stream either. My test code ran at 10-15% CPU a week ago, before I updated to macOS 12.4 today (with rendering into a Metal view, maybe 20-25%).

To reproduce: take the AVCam project (https://developer.apple.com/documentation/avfoundation/capture_setup/avcam_building_a_camera_app), set the checkbox for Mac Catalyst support under General, and run it on the Mac. The original Apple sample code also runs with the same high CPU usage. Even my own AVCaptureSession code ran a week ago at about 15%, which might already be too much. I have tested this on a 2019 Intel MacBook Pro and also on an iMac running macOS 12.4, with the same high CPU usage in all the test apps. Updating to macOS 12.5 Beta 3 did not help either. But I know it ran better with macOS < 12.4, and almost 100% CPU (out of 800% available overall) is too much for just camera capture.

Profiling in Instruments: the heaviest stack trace shows a lot of VN (Vision) calls doing object detection. Is this the camera autofocusing? Do I need to set more options on the capture device? I don't call anything from the Vision framework; is this happening automatically? It feels like a lot of work for just the Mac webcam. What should I do? Is this a problem with macOS 12.4? Do you have a capture session that runs better on 12.4, and if so, what is needed to achieve that? Could this be a Catalyst problem? Is this a bug that needs a bug report with Apple? I can't really go back to a previous version, and it would be neat if the code also worked on macOS 12.4.
Posted by bennibeef. Last updated.
Post marked as solved
1 Reply
120 Views
iOS 16 developer issue: the camera shows as a third-party replacement, not as original equipment. The phone is an iPhone 13 Pro Max, bought brand new and never repaired. The issue started as a black screen in Photo mode or QR scanning with the back camera; all other modes had a picture. After rebooting, the camera now works in all modes but shows as an unsupported replacement.
Posted. Last updated.
Post marked as solved
1 Reply
259 Views
Here's the basic code of my ToDo app.

Data:

struct ToDo: Codable {
    var title: String
    var isCompleted: Bool
    var dateCreated: Date
    var notes: String

    static let DocumentsDirectory = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first!
    static let ArchiveURL = DocumentsDirectory.appendingPathComponent("todos").appendingPathExtension("plist")

    static func loadToDos() -> [ToDo]? {
        guard let codedToDos = try? Data(contentsOf: ArchiveURL) else { return nil }
        let propertyListDecoder = PropertyListDecoder()
        return try? propertyListDecoder.decode(Array<ToDo>.self, from: codedToDos)
    }

    static func saveToDos(_ todos: [ToDo]) {
        let propertyListEncoder = PropertyListEncoder()
        let codedToDos = try? propertyListEncoder.encode(todos)
        try? codedToDos?.write(to: ArchiveURL, options: .noFileProtection)
    }
}

I have a table view controller to display the data's detail; there's a noteTextView (text view) for editing/adding the todo's notes. There's a camera button that allows the user to insert images into the text view's notes using NSTextAttachment:

internal func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey : Any]) {
    let attachment = NSTextAttachment()
    let image = info[.originalImage] as! UIImage
    attachment.image = image
    // Resize photo to fit in noteTextView: calculate a new size, leaving a little space to the right of the image
    let newImageWidth = (noteTextView.bounds.size.width - 20)
    let scale = newImageWidth / image.size.width
    let newImageHeight = image.size.height * scale
    attachment.bounds = CGRect.init(x: 0, y: 0, width: newImageWidth, height: newImageHeight)
    let imageString = NSAttributedString(attachment: attachment)
    // Add this attributed string at the cursor position
    noteTextView.textStorage.insert(imageString, at: noteTextView.selectedRange.location)
    picker.dismiss(animated: true, completion: nil)
}

The code works well. Now, how do I save it (with images) to the documents directory, and how do I load it back? I can save/load without images. Thanks.
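One workable approach, given as a hedged sketch rather than a confirmed solution: persist the attributed note as opaque data instead of a plain String. On-device, that could mean archiving the NSAttributedString (images included) into a Data property of the Codable struct and unarchiving it in loadToDos(). The round-trip shape is illustrated below with Python's plistlib standing in for PropertyListEncoder/PropertyListDecoder, and placeholder bytes standing in for the archived attributed string:

```python
import plistlib

def save_todos(todos, path):
    """Write the to-do array as a plist; note_data carries the archived attributed note."""
    with open(path, "wb") as f:
        plistlib.dump(todos, f)

def load_todos(path):
    """Read the to-do array back; bytes fields round-trip as plist <data>."""
    with open(path, "rb") as f:
        return plistlib.load(f)

todos = [{
    "title": "Buy mug",
    "isCompleted": False,
    # Placeholder bytes: on-device this would be the archive of the
    # NSAttributedString, NSTextAttachment images included.
    "note_data": b"...archived attributed string...",
}]

save_todos(todos, "todos.plist")
assert load_todos("todos.plist") == todos  # round-trips byte-for-byte
```

The key design point is that the plist layer never needs to understand the images: it only stores bytes, and the attributed-string archiving/unarchiving happens at the edges when the note is displayed or edited.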
Posted. Last updated.
Post not yet marked as solved
0 Replies
125 Views
It'd be nice if the extension could inherit the permissions the user granted to the main app, in my case screen capture. Alternatively, is there a way to send data to the extension from the app, or vice versa? Especially image or video data. Thanks! Laurent
Posted by ldenoue. Last updated.
Post not yet marked as solved
0 Replies
104 Views
Apologies if this has been asked. I was reviewing the transcript of the iOS 16 camera improvements as they relate to depth and depth maps. It's my understanding that models with LiDAR scanners play a big role in the depth maps captured when taking an image, an improvement over models that rely on the TrueDepth camera. But how does this play into the new Lock Screen setup? Is the depth effect created solely through software, or does the presence of LiDAR-derived depth maps influence the result when it comes to pulling subjects in front of the clock while the background stays behind it? Thanks so much in advance!
Posted by kesen. Last updated.
Post marked as solved
1 Reply
193 Views
Continuity Camera is a way to stream raw video and metadata from an iPhone to a Mac. Is it possible for a local iPhone recording app to use Continuity Camera to stream a preview from the iPhone to a Mac? And can Continuity Camera be made available on iPad, so that one can stream video/metadata to the iPad screen (the use case being a need for a better camera when the user does not have a MacBook)?
Posted. Last updated.