Discuss using the camera on Apple devices.

Posts under Camera tag

171 Posts
Post | Replies | Boosts | Views | Activity

Adding AVCameraCapture & AVFoundation to AppDelegate file
Hello! I think I found the code that will help with the camera capture function in my app, but I don't know how to add it correctly in Xcode. https://developer.apple.com/documentation/avfoundation/capture_setup/choosing_a_capture_device https://fek.io/blog/why-you-need-to-add-an-app-delegate-to-your-swift-ui-app/ If any of you have worked with this code, could you help? Thank you :)
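In case it helps, here is a minimal sketch of the device-selection pattern from the linked "Choosing a Capture Device" article; the function name and device-type priorities are illustrative, and this part is independent of the AppDelegate question:

```swift
import AVFoundation

// Hedged sketch: pick a back camera via a discovery session, falling back
// to the system default. Adjust deviceTypes to whatever your app needs.
func preferredCamera() -> AVCaptureDevice? {
    let discovery = AVCaptureDevice.DiscoverySession(
        deviceTypes: [.builtInDualCamera, .builtInWideAngleCamera],
        mediaType: .video,
        position: .back)
    return discovery.devices.first ?? AVCaptureDevice.default(for: .video)
}
```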
0
0
429
Aug ’23
How do I detect whether an iPad's camera location is landscape or portrait?
The 10th-generation iPad differs from its predecessors by having a camera located on its landscape edge. This is a headache for me, since my app needs to know the rough camera location for a given device orientation for AR purposes. I can find out whether the device is a tablet, but I can't find out whether it's a 10th-generation iPad. Are there any direct or indirect ways to determine whether a camera is placed for portrait or landscape use?
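Absent a direct API for camera placement, one indirect workaround is to read the hardware model identifier and special-case known models. This is a hedged sketch: "iPad13,18"/"iPad13,19" are the commonly reported identifiers for the 10th-generation iPad, and the list would have to be maintained by hand for newer hardware:

```swift
import Foundation

// Read the hardware model identifier (e.g. "iPad13,18") via uname().
func modelIdentifier() -> String {
    var systemInfo = utsname()
    uname(&systemInfo)
    return withUnsafeBytes(of: &systemInfo.machine) { buffer in
        String(decoding: buffer.prefix(while: { $0 != 0 }), as: UTF8.self)
    }
}

// Assumption: these identifiers correspond to the 10th-generation iPad,
// whose front camera sits on the landscape edge. Verify before relying on it.
func frontCameraIsOnLandscapeEdge() -> Bool {
    ["iPad13,18", "iPad13,19"].contains(modelIdentifier())
}
```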
0
0
495
Aug ’23
AVCam vs Apple Photo App for stabilized video recording
I have built a video recording app based on https://developer.apple.com/documentation/avfoundation/capture_setup/avcam_building_a_camera_app I have extended the sample to use stabilization, but I am nowhere near getting results as good as those from the Apple Photos app. The only thing I seem to be able to control from AVFoundation is https://developer.apple.com/documentation/avfoundation/avcaptureconnection/1620484-preferredvideostabilizationmode Does Apple use different APIs than what is available from AVFoundation? Best, Thomas Hagen
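For anyone comparing notes, a minimal sketch of the only knob the poster mentions, assuming a movie file output configured as in the AVCam sample:

```swift
import AVFoundation

// Hedged sketch: request the strongest stabilization mode; the system
// applies it only when the active format supports it.
func enableStabilization(on movieOutput: AVCaptureMovieFileOutput) {
    guard let connection = movieOutput.connection(with: .video),
          connection.isVideoStabilizationSupported else { return }
    connection.preferredVideoStabilizationMode = .cinematicExtended
}
```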
0
0
322
Aug ’23
AVCam vs Apple Photo App for stabilized video recording
Why does Apple’s built-in Photos app produce videos with better image stabilization than what I can get when I enable cinematicExtended as the preferred stabilization mode using the AVCapture API? Ref: https://developer.apple.com/documentation/avfoundation/avcapturevideostabilizationmode/cinematicextended I have based my work on https://developer.apple.com/documentation/avfoundation/capture_setup/avcam_building_a_camera_app Does the Apple Photos app use another API, or some proprietary Apple algorithms, to achieve a better result than what’s available through the AVFoundation API? Best, Thomas Hagen
0
0
334
Aug ’23
Camera extension not uninstalling on version mismatch
Hi there - I’m developing a camera extension for macOS. Whenever I bump the build number, say from 31 to 33 for version 1.0, and install, I’m then unable to uninstall the previous version's extension when the app is deleted. Should I be reinstalling the extension on every update even if it hasn't changed, or should I not be bumping the extension's version? Is this something that only happens in development, or should I expect to deal with it in production too? I have frustrated users who are having to disable SIP to uninstall the extension. Thanks for your help!
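One pattern that may help, sketched under the assumption that the extension is activated through the SystemExtensions framework: submit an activation request on every launch and approve replacement of the older build in the delegate, so the system swaps versions instead of leaving the old one behind. The bundle identifier below is a placeholder:

```swift
import SystemExtensions

final class ExtensionInstaller: NSObject, OSSystemExtensionRequestDelegate {
    let extensionID = "com.example.app.camera-extension" // placeholder

    // Re-submitting on every launch lets the system replace older builds.
    func activate() {
        let request = OSSystemExtensionRequest.activationRequest(
            forExtensionWithIdentifier: extensionID, queue: .main)
        request.delegate = self
        OSSystemExtensionManager.shared.submitRequest(request)
    }

    // Approve replacing an existing (older) version of the extension.
    func request(_ request: OSSystemExtensionRequest,
                 actionForReplacingExtension existing: OSSystemExtensionProperties,
                 withExtension ext: OSSystemExtensionProperties)
            -> OSSystemExtensionRequest.ReplacementAction {
        .replace
    }

    func requestNeedsUserApproval(_ request: OSSystemExtensionRequest) {}
    func request(_ request: OSSystemExtensionRequest,
                 didFinishWithResult result: OSSystemExtensionRequest.Result) {}
    func request(_ request: OSSystemExtensionRequest, didFailWithError error: Error) {}
}
```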
0
0
437
Aug ’23
setExposureTargetBias handler problem
Hello everybody! I use a video session, and in the delegate method didOutputSampleBuffer I get a CMSampleBuffer. That all works. But at some point I want to change the exposure bias to -2 or +2 and make images from the buffer. I use the setExposureTargetBias method and the value changes successfully, but there is one problem: I need to know exactly when a buffer with the changed value (-2.0, +2.0, etc.) will arrive. The documentation says to use the CMTime syncTime passed to the setExposureTargetBias completion handler, which should be the timestamp of the first buffer with the change applied, but I see a complete mismatch: when I compare that syncTime with the timestamps of buffers from didOutputSampleBuffer, the exposure is still in the process of changing. How can I determine that a given buffer (by its timestamp) is one to which the exposure bias change has already been applied?
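For comparison with what the poster describes, a minimal sketch of the documented pattern; it assumes a configured video data output whose delegate is this class, and simply drops frames stamped before the handler's sync time:

```swift
import AVFoundation

final class ExposureCoordinator: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private var biasAppliedTime = CMTime.invalid

    func applyBias(_ bias: Float, on device: AVCaptureDevice) {
        do { try device.lockForConfiguration() } catch { return }
        device.setExposureTargetBias(bias) { [weak self] syncTime in
            // Per the docs, syncTime is the timestamp of the first frame
            // that reflects the new bias.
            self?.biasAppliedTime = syncTime
        }
        device.unlockForConfiguration()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        let pts = sampleBuffer.presentationTimeStamp
        guard biasAppliedTime.isValid, pts >= biasAppliedTime else { return }
        // This frame should carry the new bias; as an extra check, poll
        // device.exposureTargetOffset and wait for it to settle near zero.
    }
}
```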
0
0
315
Aug ’23
Camera problems on iPhone 7, dark photos
I have a controller whose job is to open the front camera, show instructions in the view with a guide for the user to position their face, and draw two buttons: one to capture the selfie and one to cancel the process. When the photo is captured, a function in a controller called 'Service Manager' is called to send the photo and update the flow's lifecycle. The problem occurs on devices like the iPhone 7: when tapping the "CAPTURE SELFIE" button, the photo comes out too dark or too bright. Another problem with this controller, on all iOS devices, is that if the user minimizes the app and reopens it, the camera freezes, and the user has to minimize and reopen more than once before the camera resumes and the selfie can be captured correctly; the instructions are also redrawn each time the app is minimized and reopened, so the camera preview shows black. What could be causing this? I've attached part of the code.

func setUpAVCapture() {
    session.sessionPreset = AVCaptureSession.Preset.hd1280x720
    guard let device = AVCaptureDevice.default(AVCaptureDevice.DeviceType.builtInWideAngleCamera,
                                               for: .video,
                                               position: AVCaptureDevice.Position.front) else {
        return
    }
}

func stopCamera() {
    session.stopRunning()
}

@objc func captureButtonTapped() {
    guard let captureSession = captureSession else { return }
    self.animacion?.isHidden = true
    blurView?.addBlurToView()

    // Build the activity indicator overlay on first use.
    if self.activityView == nil {
        activityView = UIView(frame: CGRect(x: (self.view.frame.width / 2) - 50,
                                            y: (self.view.frame.height / 2) - 50,
                                            width: 100, height: 100))
        activityView!.backgroundColor = UIColor(red: 0, green: 0, blue: 0, alpha: 0.8)
        activityView!.layer.cornerRadius = 4
        activityIndicator = UIActivityIndicatorView(frame: CGRect(x: (activityView!.frame.width / 2) - 15,
                                                                  y: (activityView!.frame.height / 2) - 15,
                                                                  width: 30, height: 30))
        if #available(iOS 13.0, *) {
            activityIndicator!.style = .large
            activityIndicator?.color = .white
        } else {
            activityIndicator?.style = .whiteLarge
        }
        activityView!.addSubview(activityIndicator!)
        self.view.addSubview(activityView!)
    }

    if intentosRealizados == 1 {
        if captureSession.canAddOutput(self.photoOutput) {
            captureSession.addOutput(self.photoOutput)
        }
    }

    let settings = AVCapturePhotoSettings()
    if let connection = photoOutput.connection(with: .video) {
        connection.videoOrientation = .portrait
    }

    // Remove any previous selfie before capturing a new one.
    let paths = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)
    let fileUrl = paths[0].appendingPathComponent("selfie.png")
    try? FileManager.default.removeItem(at: fileUrl)

    photoOutput.capturePhoto(with: settings, delegate: self)
    DispatchQueue.main.asyncAfter(deadline: .now() + 6.0) {
        self.performSelector(onMainThread: #selector(self.stopSelfie), with: nil, waitUntilDone: false)
    }
    activityIndicator?.startAnimating()
    activityView?.isHidden = false
}

func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
    self.maskImage?.image = UIImage(named: "face_yellow")
    if let imageData = photo.fileDataRepresentation(), let image = UIImage(data: imageData) {
        let compressedData = image.jpegData(compressionQuality: 0.5)
        if let compressedImage = UIImage(data: compressedData ?? Data()) {
            if let compressedData = compressImage(image: compressedImage, maxSizeInBytes: 1 * 1024 * 1024) {
                let rotatedImage = UIImage(data: compressedData)?.rotate(radians: .pi / 0.5)
                let pngData = rotatedImage?.jpegData(compressionQuality: 1.0)
                let documentDirectory = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first!
                let fileURL = documentDirectory.appendingPathComponent("selfie.png")
                if let data = pngData {
                    do {
                        try data.write(to: fileURL)
                    } catch { }
                }
            }
        }
    }
}

func compressImage(image: UIImage, maxSizeInBytes: Int) -> Data? {
    let maxHeight: CGFloat = 1024.0
    let maxWidth: CGFloat = 1024.0
    let actualWidth = image.size.width
    let actualHeight = image.size.height
    if actualWidth <= maxWidth && actualHeight <= maxHeight {
        return image.jpegData(compressionQuality: 1.0)
    }
    let imgRatio = actualWidth / actualHeight
    var newWidth: CGFloat
    var newHeight: CGFloat
    if imgRatio > maxWidth / maxHeight {
        newWidth = maxWidth
        newHeight = maxWidth / imgRatio
    } else {
        newHeight = maxHeight
        newWidth = maxHeight * imgRatio
    }
    let renderRect = CGRect(x: 0, y: 0, width: newWidth, height: newHeight)
    let renderer = UIGraphicsImageRenderer(size: renderRect.size)
    let compressedImage = renderer.image { _ in
        image.draw(in: renderRect)
    }
    // Step the JPEG quality down until the data fits the size budget.
    var compressionQuality: CGFloat = 1.0
    var finalImageData = compressedImage.jpegData(compressionQuality: compressionQuality)
    while (finalImageData?.count ?? 0) > maxSizeInBytes && compressionQuality > 0.1 {
        compressionQuality -= 0.1
        finalImageData = compressedImage.jpegData(compressionQuality: compressionQuality)
    }
    return finalImageData
}

Camera use in the application needs to be compatible with all devices from the iPhone 6+ on. The intent of this controller is to take a selfie, save it as a PNG, and keep the file size under 5 MB. Thank you
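On the freeze after backgrounding, a common mitigation, sketched here under the assumption that the controller owns an AVCaptureSession, is to restart the session when the app returns to the foreground instead of redrawing the instruction views:

```swift
import AVFoundation
import UIKit

// Hedged sketch: restart the capture session on foreground rather than
// rebuilding the view hierarchy. startRunning() must stay off the main thread.
final class CameraLifecycleObserver: NSObject {
    private let session: AVCaptureSession
    private let sessionQueue = DispatchQueue(label: "camera.session")

    init(session: AVCaptureSession) {
        self.session = session
        super.init()
        NotificationCenter.default.addObserver(
            self, selector: #selector(appWillEnterForeground),
            name: UIApplication.willEnterForegroundNotification, object: nil)
    }

    @objc private func appWillEnterForeground() {
        sessionQueue.async { [weak self] in
            guard let session = self?.session else { return }
            if !session.isRunning { session.startRunning() }
        }
    }
}
```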
0
1
347
Sep ’23
AVFoundation low-light mode to match native camera's
I'm trying to get AVFoundation to provide the same low-light boost as the native Camera app. I have tried:
- isLowLightBoostSupported (which always returns false, regardless of my configuration)
- setExposureTargetBias set to its maximum
- various AVCaptureSession.Preset options, without success
Is automaticallyEnablesLowLightBoostWhenAvailable outdated? The documentation says a "capture device that supports this feature may only engage boost mode for certain source formats or resolutions", but does not explain which formats or resolutions. I can't tell whether the option is no longer supported, or whether there is some configuration permutation I haven't yet found.
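For completeness, a minimal sketch of the documented opt-in, assuming a locked device configuration; on devices or formats where isLowLightBoostSupported stays false this simply does nothing, which matches what the poster reports:

```swift
import AVFoundation

// Hedged sketch: low-light boost must be requested inside
// lockForConfiguration(), and only when the device reports support.
func enableLowLightBoost(on device: AVCaptureDevice) throws {
    try device.lockForConfiguration()
    defer { device.unlockForConfiguration() }
    if device.isLowLightBoostSupported {
        device.automaticallyEnablesLowLightBoostWhenAvailable = true
    }
}
```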
1
0
403
Sep ’23
iPhone 14 Pro Max produces corrupted images
My iPhone produces corrupted images under certain conditions. If I shoot the same scene (with a slightly varying angle) in the same lighting conditions, I receive a corrupted photo almost every time; it contains a magenta copy of the image and a green rectangle. If I add other objects to the scene, changing its overall brightness, the chance of the bug drops significantly.

Device info: iPhone 14 Pro Max (iOS 17 RC), iPhone 14 Pro (iOS 17 beta 6)

Images with the issue:
- f/1.664, 1/25s, ISO 500, digitalZoom=1.205, brightness=-0.568
- f/1.664, 1/25s, ISO 500, digitalZoom=1.205, brightness=-0.514
- f/1.664, 1/25s, ISO 640, digitalZoom=1.205, brightness=-0.641
- f/1.664, 1/25s, ISO 500, digitalZoom=1.205, brightness=-0.448
- f/1.664, 1/25s, ISO 500, digitalZoom=1.205, brightness=-0.132

Images without the issue:
- f/1.664, 1/25s, ISO 500, digitalZoom=1.205, brightness=-0.456
- f/1.664, 1/20s, ISO 640, digitalZoom=1.205, brightness=-1.666
- f/1.664, 1/100s, ISO 50, digitalZoom=1.205, brightness=4.840
- f/1.664, 1/25s, ISO 640, digitalZoom=1.205, brightness=-0.774

I'm using builtInWideAngleCamera with continuousAutoExposure, continuousAutoFocus and a slight videoZoomFactor.

func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
    if let error = error {
        capturePhotoCallback?(.failure(.internalError(error.localizedDescription)))
        return
    }
    guard let data = photo.fileDataRepresentation() else {
        capturePhotoCallback?(.failure(.internalError("Can not get data representation.")))
        return
    }
    guard let image = UIImage(data: data) else {
        capturePhotoCallback?(.failure(.internalError("Can not get image from data representation.")))
        return
    }
    capturePhotoCallback?(.success(image))
}
0
0
506
Sep ’23
How to prevent camera from adjusting brightness in manual mode?
When I have locked white balance and a custom exposure, introducing a new object into the view against a black background makes both objects become brighter. How do I turn off this behavior, or compensate for the change in a performant way? This is how I configure the session; note that I'm setting a video format that supports at least 180 fps, which is required for my needs.

private func configureSession() {
    self.sessionQueue.async { [self] in
        // MARK: Init session
        guard let session = try? validSession() else {
            fatalError("Session is unexpectedly nil")
        }
        session.beginConfiguration()
        guard let device = AVCaptureDevice.default(AVCaptureDevice.DeviceType.builtInWideAngleCamera,
                                                   for: AVMediaType.video,
                                                   position: .back) else {
            fatalError("Video Device is unexpectedly nil")
        }
        guard let videoDeviceInput: AVCaptureDeviceInput = try? AVCaptureDeviceInput(device: device) else {
            fatalError("videoDeviceInput is unexpectedly nil")
        }
        guard session.canAddInput(videoDeviceInput) else {
            fatalError("videoDeviceInput could not be added")
        }
        session.addInput(videoDeviceInput)
        self.videoDeviceInput = videoDeviceInput
        self.videoDevice = device

        // MARK: Connect session IO
        let dataOutput = AVCaptureVideoDataOutput()
        dataOutput.setSampleBufferDelegate(self, queue: sampleBufferQueue)
        session.automaticallyConfiguresCaptureDeviceForWideColor = false
        guard session.canAddOutput(dataOutput) else {
            fatalError("Could not add video data output")
        }
        session.addOutput(dataOutput)
        dataOutput.alwaysDiscardsLateVideoFrames = true
        dataOutput.videoSettings = [
            String(kCVPixelBufferPixelFormatTypeKey): pixelFormat.rawValue
        ]
        if let captureConnection = dataOutput.connection(with: .video) {
            captureConnection.preferredVideoStabilizationMode = .off
            captureConnection.isEnabled = true
        } else {
            fatalError("No Capture Connection for the session")
        }

        // MARK: Configure AVCaptureDevice
        do {
            try device.lockForConfiguration()
        } catch {
            fatalError(error.localizedDescription)
        }
        if let format = format(fps: fps, minWidth: minWidth, format: pixelFormat) {
            // 180 FPS, YUV layout
            device.activeFormat = format
            device.activeVideoMinFrameDuration = CMTime(value: 1, timescale: CMTimeScale(fps))
            device.activeVideoMaxFrameDuration = CMTime(value: 1, timescale: CMTimeScale(fps))
        } else {
            fatalError("Compatible format not found")
        }
        device.activeColorSpace = .sRGB
        device.isGlobalToneMappingEnabled = false
        device.automaticallyAdjustsVideoHDREnabled = false
        device.automaticallyAdjustsFaceDrivenAutoExposureEnabled = false
        device.isFaceDrivenAutoExposureEnabled = false
        device.setFocusModeLocked(lensPosition: 0.4)
        device.isSubjectAreaChangeMonitoringEnabled = false
        device.exposureMode = AVCaptureDevice.ExposureMode.custom
        let exp = CMTime(value: Int64(40), timescale: 100_000)
        let isoValue = min(max(40, device.activeFormat.minISO), device.activeFormat.maxISO)
        device.setExposureModeCustom(duration: exp, iso: isoValue) { t in }
        device.setWhiteBalanceModeLocked(with: AVCaptureDevice.WhiteBalanceGains(redGain: 1.0,
                                                                                 greenGain: 1.0,
                                                                                 blueGain: 1.0)) { (timestamp: CMTime) -> Void in }
        device.unlockForConfiguration()
        session.commitConfiguration()
        onAVSessionReady()
    }
}

This post (https://stackoverflow.com/questions/34511431/ios-avfoundation-different-photo-brightness-with-the-same-manual-exposure-set) suggests that the effect can be mitigated by setting the camera exposure mode to .locked right after calling device.setExposureModeCustom(). This only works properly when used with the async API, and still does not eliminate the effect.

Async approach:

private func onAVSessionReady() {
    guard let device = device() else {
        fatalError("Device is unexpectedly nil")
    }
    guard let sesh = try? validSession() else {
        fatalError("Device is unexpectedly nil")
    }
    MCamSession.shared.activeFormat = device.activeFormat
    MCamSession.shared.currentDevice = device
    self.observer = SPSDeviceKVO(device: device, session: sesh)
    self.start()
    Task {
        await lockCamera(device)
    }
}

private func lockCamera(_ device: AVCaptureDevice) async {
    do {
        try device.lockForConfiguration()
    } catch {
        fatalError(error.localizedDescription)
    }
    _ = await device.setFocusModeLocked(lensPosition: 0.4)
    let exp = CMTime(value: Int64(40), timescale: 100_000)
    let isoValue = min(max(40, device.activeFormat.minISO), device.activeFormat.maxISO)
    _ = await device.setExposureModeCustom(duration: exp, iso: isoValue)
    _ = await device.setWhiteBalanceModeLocked(with: AVCaptureDevice.WhiteBalanceGains(redGain: 1.0,
                                                                                       greenGain: 1.0,
                                                                                       blueGain: 1.0))
    device.exposureMode = AVCaptureDevice.ExposureMode.locked
    device.unlockForConfiguration()
}

private func configureSession() {
    // same session init as before
    ...
    onAVSessionReady()
}
1
0
768
Nov ’23
Looking for hints on a SwiftUI camera app
Hello. I am a beginner developer. I am creating a camera app using SwiftUI. I want to build an app that lets you take pictures with the camera, apply filters, and save them. I'm practicing by following Apple's sample tutorial: https://developer.apple.com/tutorials/sample-apps/capturingphotos-camerapreview I was able to apply a CIFilter to the image displayed in the viewfinder, but I don't know how to apply the filter to the saved photo. I'd like some hints on this.
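One approach, sketched with an arbitrary sepia filter standing in for whatever the viewfinder applies: run the same CIFilter over the full-resolution photo data (for example from photoOutput(_:didFinishProcessingPhoto:)) before writing it out:

```swift
import CoreImage
import CoreImage.CIFilterBuiltins
import UIKit

// Hedged sketch: filter the captured photo's data, then re-encode it.
// sepiaTone() is a placeholder for the filter used in the viewfinder.
func filteredJPEGData(from photoData: Data) -> Data? {
    guard let inputImage = CIImage(data: photoData) else { return nil }
    let filter = CIFilter.sepiaTone()
    filter.inputImage = inputImage
    filter.intensity = 0.8
    let context = CIContext()
    guard let output = filter.outputImage,
          let cgImage = context.createCGImage(output, from: output.extent) else { return nil }
    return UIImage(cgImage: cgImage).jpegData(compressionQuality: 0.9)
}
```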
1
0
477
Sep ’23
App rejected for Camera/Photo access permission
Hi, my app was rejected by App Review under Guideline 5.1.1: "we still find the permission request for camera and photos is not sufficient. To resolve this issue, it would be appropriate to revise these and ensure that they sufficiently explain why the app needs this access." It was already my second time, and I am frustrated. I asked App Review for suggestions and they declined to provide any. Can anyone tell me whether my revised messages below would be sufficient?
"AppName" would like to access your Camera. Allow "AppName" to take your photo as a profile picture. This photo will be used for in-app messaging only.
"AppName" would like to access your Photo Library. Allow "AppName" to pick an image from your Photo Library. This image will be used as your profile picture and for in-app messaging only.
Thanks for your help in advance
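For reference, the strings App Review evaluates live in Info.plist under the standard usage-description keys; a sketch, with wording adapted from the poster's draft:

```xml
<!-- Info.plist usage descriptions; wording is illustrative. -->
<key>NSCameraUsageDescription</key>
<string>AppName uses the camera to take your profile photo, which is used only for in-app messaging.</string>
<key>NSPhotoLibraryUsageDescription</key>
<string>AppName accesses your photo library so you can choose a profile picture, used only for in-app messaging.</string>
```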
0
0
261
Sep ’23