Discuss using the camera on Apple devices.

Posts under Camera tag

127 Posts
Post marked as solved
1 Reply
262 Views
Here's the basic code of my ToDo app.

The data model:

```swift
struct ToDo: Codable {
    var title: String
    var isCompleted: Bool
    var dateCreated: Date
    var notes: String

    static let DocumentsDirectory = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first!
    static let ArchiveURL = DocumentsDirectory.appendingPathComponent("todos").appendingPathExtension("plist")

    static func loadToDos() -> [ToDo]? {
        guard let codedToDos = try? Data(contentsOf: ArchiveURL) else { return nil }
        let propertyListDecoder = PropertyListDecoder()
        return try? propertyListDecoder.decode(Array<ToDo>.self, from: codedToDos)
    }

    static func saveToDos(_ todos: [ToDo]) {
        let propertyListEncoder = PropertyListEncoder()
        let codedToDos = try? propertyListEncoder.encode(todos)
        try? codedToDos?.write(to: ArchiveURL, options: .noFileProtection)
    }
}
```

I have a table view controller to display a to-do's detail, with a noteTextView (UITextView) for editing the to-do's notes. A camera button lets the user insert images into noteTextView using NSTextAttachment:

```swift
internal func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
    let attachment = NSTextAttachment()
    let image = info[.originalImage] as! UIImage
    attachment.image = image
    // Resize the photo to fit in noteTextView: calculate a new size,
    // leaving a little space on the right of the image.
    let newImageWidth = noteTextView.bounds.size.width - 20
    let scale = newImageWidth / image.size.width
    let newImageHeight = image.size.height * scale
    attachment.bounds = CGRect(x: 0, y: 0, width: newImageWidth, height: newImageHeight)
    let imageString = NSAttributedString(attachment: attachment)
    // Insert the attributed string at the cursor position.
    noteTextView.textStorage.insert(imageString, at: noteTextView.selectedRange.location)
    picker.dismiss(animated: true, completion: nil)
}
```

This code works well. Now, how do I save the notes (with images) to the documents directory, and how do I load them back? I can already save and load without images. Thanks.
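One possible approach (a sketch, not the poster's code): since the NSTextAttachment images live inside the attributed string, the Codable model can carry the attributed string's serialized bytes instead of a plain String. On iOS, `noteTextView.attributedText` can be serialized with NSAttributedString's `data(from:documentAttributes:)` using the `.rtfd` document type, which preserves image attachments; the resulting `Data` then round-trips through the existing property-list save/load unchanged. The UIKit conversion is omitted here so the sketch stays self-contained; the field and function names are hypothetical:

```swift
import Foundation

// Hypothetical variant of the poster's model: the note is stored as
// serialized attributed-string data (e.g. RTFD), so embedded images
// survive the round trip through the property list.
struct ToDo: Codable, Equatable {
    var title: String
    var isCompleted: Bool
    var dateCreated: Date
    var noteData: Data   // serialized attributed string, images included
}

/// Encode the to-dos as a property list and write them to `url`.
func saveToDos(_ todos: [ToDo], to url: URL) throws {
    let data = try PropertyListEncoder().encode(todos)
    try data.write(to: url)
}

/// Read the property list at `url` and decode the to-dos.
func loadToDos(from url: URL) throws -> [ToDo] {
    let data = try Data(contentsOf: url)
    return try PropertyListDecoder().decode([ToDo].self, from: data)
}
```

On load, assuming the RTFD route, the stored bytes would be turned back into an attributed string with `NSAttributedString(data:options:documentAttributes:)` using `[.documentType: NSAttributedString.DocumentType.rtfd]` and assigned to `noteTextView.attributedText`.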
Posted
by
Post not yet marked as solved
0 Replies
143 Views
We are using the Chrome browser (a customer request) to scan QR codes, and each time we are asked for camera permission. Safari does not do that. I saw in some places that it can be set per web page by accessing Chrome's settings directly, rather than the Apple settings for Chrome, but I never found the camera setting in Chrome. Thank you very much for the help, even if it is not possible to do this at this point. Regards, Christian
Posted
by
Post not yet marked as solved
0 Replies
149 Views
First, thanks for your time! I'm trying to code my own app; it has to link two iPhones over Bluetooth. The first iPhone has to send a live camera feed to the other over the Bluetooth link when the app is running in sender mode. The second iPhone has to show the live feed and offer the possibility of taking instant photos when the app is running in receiver mode. I don't want to install an existing app, because I want to include other special actions in the future. First question: should I use Swift or C? If you need more details, don't hesitate! Arthur
Posted
by
Post not yet marked as solved
6 Replies
344 Views
The problem is that I have added the Privacy - Camera Usage Description string to the Info.plist file, but the permission prompt does not appear when using the camera. What could be happening? I have restarted Xcode several times and run the app in the simulator several times, and it still does not appear. If someone can help, thank you very much in advance.
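For reference, the Xcode display name "Privacy - Camera Usage Description" corresponds to the raw Info.plist key `NSCameraUsageDescription`. A minimal fragment (the description string here is just a placeholder):

```xml
<key>NSCameraUsageDescription</key>
<string>This app needs camera access to take photos.</string>
```

Two things worth checking: the prompt only appears the first time the app actually requests camera access on a real device (the simulator has no camera, so running there will never show it), and the key must be in the Info.plist of the target actually being built.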
Posted
by
Post not yet marked as solved
0 Replies
228 Views
Hi everyone, I am new here. I was trying to figure out why they say the app crashes; after checking, it is the camera not working on iPad, but why? In the Unity editor on the Mac, Vuforia works smoothly, but when I build with Xcode and run it, the camera does not work. I have tried some solutions from websites, such as ticking the ARKit selection in the player settings, selecting both landscape left and right, setting the iOS minimum to 12, adding a camera usage description, and allowing camera permission. But it still does not work. Why? Is anyone seeing the same thing? I need help. Unity version: 2021.3.0f1. Vuforia version: 10.6.3. My macOS: Monterey 12.2.1. Xcode version: 13.3.1. Thank you. "My last question here" (https://developer.apple.com/forums/thread/705225)
Posted
by
Post not yet marked as solved
1 Reply
183 Views
Hello! I am having trouble calculating accurate real-world distances using the camera's returned intrinsic matrix and pixel coordinates/depths captured from the iPhone's LiDAR. For example, in the image below, I set a mug 0.5 m from the phone. The mug is 8.5 cm wide. The intrinsic matrix returned from the phone's AVCameraCalibrationData class has focalx = 1464.9269, focaly = 1464.9269, cx = 960.94916, and cy = 686.3547. Selecting the two pixel locations denoted in the image below, I calculated each one's xyz coordinates using the formulas:

x = d * (u - cx) / focalx
y = d * (v - cy) / focaly
z = d

where I get the depth d from the appropriate pixel in the depth map; I've verified that both depths were 0.5 m. I then calculate the distance between the two points to get the mug width. This gives me a calculated width of 0.0357, or 3.5 cm, instead of the 8.5 cm I was expecting. What could be accounting for this discrepancy? Thank you so much for your help!
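A minimal, self-contained sketch of that back-projection (the intrinsics are the ones quoted above; the two pixel coordinates, 249 px apart, are hypothetical stand-ins, since the actual selected pixels aren't given):

```swift
import Foundation

// Camera intrinsics as quoted in the post.
struct Intrinsics {
    let fx: Double, fy: Double, cx: Double, cy: Double
}

/// Back-project a pixel (u, v) with depth d (metres) into camera space,
/// using the pinhole model from the post.
func backProject(u: Double, v: Double, depth d: Double, k: Intrinsics) -> (x: Double, y: Double, z: Double) {
    let x = d * (u - k.cx) / k.fx
    let y = d * (v - k.cy) / k.fy
    return (x, y, d)
}

/// Euclidean distance between two camera-space points.
func distance(_ a: (x: Double, y: Double, z: Double), _ b: (x: Double, y: Double, z: Double)) -> Double {
    let dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z
    return (dx * dx + dy * dy + dz * dz).squareRoot()
}

let k = Intrinsics(fx: 1464.9269, fy: 1464.9269, cx: 960.94916, cy: 686.3547)
// Two hypothetical pixels 249 px apart horizontally at 0.5 m depth:
// 249 px * 0.5 m / 1464.9269 ≈ 0.085 m, i.e. an 8.5 cm wide object.
let p1 = backProject(u: 836, v: 686, depth: 0.5, k: k)
let p2 = backProject(u: 1085, v: 686, depth: 0.5, k: k)
let width = distance(p1, p2)
```

One possible cause of the discrepancy, offered as a guess: `AVCameraCalibrationData.intrinsicMatrix` is expressed relative to `intrinsicMatrixReferenceDimensions`. If the pixel coordinates were read from a lower-resolution depth map, the focal lengths and principal point need to be scaled by depthMapWidth / referenceWidth before use; otherwise the computed width comes out too small by exactly that ratio.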
Posted
by
Post not yet marked as solved
0 Replies
179 Views
Can anyone advise whether they have attempted to have the camera on with dynamic text overlaid, depending on what is identified in the view? E.g., point the camera at a fruit, and I should be able to identify the fruit and display text over the camera feed. The moment I move to the next object, it should ask me whether I want to save this or discard it and move on to the new object.
Posted
by
Post not yet marked as solved
1 Reply
152 Views
My barely six-month-old iPhone 13 Pro has a colossal camera issue. I think it happened after the latest update. Being a professional dancer, I am constantly recording videos (the sole reason I bought this super expensive phone), and thanks to Apple and its horrible product, I am unable to do that. Nobody seems to be talking about it, and I am amazed.
Posted
by
Post not yet marked as solved
1 Reply
182 Views
I am trying to make a simple camera app, but when I run the code in the iOS simulator, the iPhone shows only a black image and nothing appears. I wrote it with the code below and added two permissions, Privacy - Camera Usage Description and Privacy - Photo Library Additions Usage Description. I need some help solving this.

```swift
import UIKit

class ViewController: UIViewController, UIImagePickerControllerDelegate, UINavigationControllerDelegate {
    @IBOutlet weak var imageView: UIImageView!

    @IBAction func launchCamera(_ sender: UIBarButtonItem) {
        let camera = UIImagePickerController.SourceType.camera
        if UIImagePickerController.isSourceTypeAvailable(camera) {
            let picker = UIImagePickerController()
            picker.sourceType = camera
            picker.delegate = self
            self.present(picker, animated: true)
        }
    }

    func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
        let image = info[UIImagePickerController.InfoKey.originalImage] as! UIImage
        self.imageView.image = image
        UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
        self.dismiss(animated: true)
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view.
    }
}
```
Posted
by
Post not yet marked as solved
0 Replies
151 Views
I believe I know the answer to this, but I will ask anyway: is it possible to limit camera stabilization to only one axis, instead of both? I want to use my app with an anamorphic lens, and like Panasonic's implementation of its anamorphic-mode stabilization on the GH5/GH6, I'd like to stabilize only one axis. Any chance this is doable?
Posted
by
Post not yet marked as solved
1 Reply
205 Views
All apps need to use AUVoiceIO in order to use Mic Modes, but what is AUVoiceIO? I searched the Apple developer documentation, but there was no description of AUVoiceIO anywhere. Why doesn't the documentation include a description of AUVoiceIO? Does that mean it isn't possible to use it?
Posted
by
Post not yet marked as solved
1 Reply
194 Views
Hi there, I wonder whether there are APIs I can use to achieve the system Camera app's zooming effect with the front-facing camera (pressing the button to get wider frames). I can only find the wide-angle camera and the TrueDepth camera for the front position, and they don't support this operation.
Posted
by
Post not yet marked as solved
0 Replies
293 Views
UVCAssistant crashes on an M1 Mac running Monterey at 10G speed. At 5G speed it is OK, and macOS versions before Monterey are OK too. The camera device is recognized by the system, but QuickTime Player can't display the video stream. Here is the UVCAssistant crash log; hope it's helpful. crash log
Posted
by
Post not yet marked as solved
0 Replies
128 Views
Hi, the infrared image taken by the TrueDepth camera seems not to include heat data; am I correct to say that? It looks like it covers only the depth of an image.
Posted
by
Post not yet marked as solved
0 Replies
316 Views
Currently I am getting depth data from the delegate, and I even converted it to a CIImage to check its output (it's grayscale), but I cannot append that pixel buffer to AVAssetWriterInputPixelBufferAdaptor, because when I tried to save to the photo gallery I got the error below.

Error: The operation couldn't be completed. (PHPhotosErrorDomain error 3302.)

Setup:

```swift
private let videoDeviceDiscoverySession = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.builtInWideAngleCamera, .builtInDualCamera, .builtInTrueDepthCamera, .builtInDualWideCamera],
    mediaType: .video,
    position: .back)
```

I tried both video pixel formats:

```swift
videoDataOutput!.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_DepthFloat16]
videoDataOutput!.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String: kCMPixelFormat_422YpCbCr8]
```
Posted
by