Hello!
I think I found the code that will help with the camera capture function in my app, but I don't know how to add it correctly in Xcode.
https://developer.apple.com/documentation/avfoundation/capture_setup/choosing_a_capture_device
https://fek.io/blog/why-you-need-to-add-an-app-delegate-to-your-swift-ui-app/
If any of y'all have worked with this code, would you help? Thank ya :)
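For the first link, a minimal sketch of what "choosing a capture device" looks like in code may help; drop it into any Swift file in your Xcode target. The device-type priority order below is an assumption, just one reasonable choice, not the only one.

```swift
import AVFoundation

// A minimal sketch based on the "Choosing a Capture Device" article above.
// The deviceTypes priority order is an assumption; adjust for your app.
func bestBackCamera() -> AVCaptureDevice? {
    let discovery = AVCaptureDevice.DiscoverySession(
        deviceTypes: [.builtInDualCamera,       // prefer a multi-lens module
                      .builtInWideAngleCamera], // present on every device
        mediaType: .video,
        position: .back)
    // Devices come back in the order the deviceTypes were listed,
    // so .first is the most-preferred available camera.
    return discovery.devices.first
}
```

You also need an NSCameraUsageDescription entry in your target's Info.plist, or the app will crash the first time the capture session starts.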
Camera
Discuss using the camera on Apple devices.
Posts under Camera tag: 171 Posts
The 10th-Gen iPad differs from its predecessors by having a camera that's located at the top of its landscape orientation. This is a headache for me since my app needs to know the rough camera location given the device's orientation for AR purposes.
I can find out whether the device is a tablet or not, but I can't find out whether it's an iPad 10. Are there any direct or indirect ways for me to find out whether a camera is placed for portrait or landscape use?
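There is no direct API for this, so one indirect, hedged workaround is to branch on the hardware model identifier. A sketch (the identifiers "iPad13,18" / "iPad13,19" for the 10th-generation iPad come from community device lists; verify them before shipping, and expect to extend the set for future landscape-camera iPads):

```swift
import Foundation

// No public API reports which edge the front camera sits on, so branch on
// the hardware model identifier. The identifiers below are taken from
// community device lists and are an assumption to verify before shipping.
func frontCameraIsOnLandscapeEdge(modelIdentifier: String) -> Bool {
    let landscapeCameraIdentifiers: Set<String> = ["iPad13,18", "iPad13,19"]
    return landscapeCameraIdentifiers.contains(modelIdentifier)
}

#if canImport(Darwin)
// Reading the model identifier at runtime (e.g. "iPad13,18").
func currentModelIdentifier() -> String {
    var systemInfo = utsname()
    uname(&systemInfo)
    return Mirror(reflecting: systemInfo.machine).children.reduce(into: "") { result, element in
        if let value = element.value as? Int8, value != 0 {
            result.append(Character(UnicodeScalar(UInt8(bitPattern: value))))
        }
    }
}
#endif
```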
We found a high probability of a CameraSessionWasInterrupted exception in iOS 17 beta 5.
Does anybody know why?
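One way to find out why: observe the session's interruption notification and log the reason code, which names the trigger (phone call, multitasking, the camera being used by another client, and so on). A diagnostic sketch, assuming an existing session:

```swift
import AVFoundation

// Hedged diagnostic sketch: log why the capture session was interrupted.
final class InterruptionLogger {
    init(session: AVCaptureSession) {
        NotificationCenter.default.addObserver(
            forName: .AVCaptureSessionWasInterrupted,
            object: session, queue: .main) { note in
            // The reason enum explains the interruption, e.g.
            // .audioDeviceInUseByAnotherClient or
            // .videoDeviceNotAvailableWithMultipleForegroundApps.
            if let value = note.userInfo?[AVCaptureSessionInterruptionReasonKey] as? Int,
               let reason = AVCaptureSession.InterruptionReason(rawValue: value) {
                print("Capture session interrupted, reason: \(reason)")
            }
        }
    }
}
```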
Hi, is there a reason why I can't access the ultra-wide back camera on iOS 16.2 and below when I try to use mediaDevices? I'm using the html5-qrcode library to get cameras and scan QR codes. I'm asking this question because the camera on the iPhone 13 and later has a hard time focusing on small QR codes. Thank you.
Hi developers, I am creating a camera app project. In this camera app, I need to implement floating buttons, for example buttons for aspect ratio and timer settings.
I have built a video recording app based on https://developer.apple.com/documentation/avfoundation/capture_setup/avcam_building_a_camera_app
I have extended the sample to use stabilization, but I am nowhere near getting results as good as those from the Apple Photos app.
The only thing I seem to be able to control from AVFoundation is https://developer.apple.com/documentation/avfoundation/avcaptureconnection/1620484-preferredvideostabilizationmode
Does Apple use different APIs than what is available from AVFoundation?
Best, Thomas Hagen
Why does Apple's built-in Photos app produce videos with better image stabilization than what I can get when I enable cinematicExtended as the preferred stabilization mode using the AVCapture API?
Ref: https://developer.apple.com/documentation/avfoundation/avcapturevideostabilizationmode/cinematicextended
I have based my work on https://developer.apple.com/documentation/avfoundation/capture_setup/avcam_building_a_camera_app
Does the Apple Photos app use another API, or some Apple proprietary algorithms, to achieve a better result than what's available through the AVFoundation API?
Best, Thomas Hagen
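For reference, the stabilization control AVFoundation exposes amounts to the sketch below (checking format support before setting the mode). Photos may additionally use processing that isn't public API, which would explain the remaining gap; that is speculation, not something the documentation confirms.

```swift
import AVFoundation

// Sketch: request the strongest stabilization mode the active format
// supports. The ranked preference list is an assumption; the system may
// still fall back at runtime.
func enableBestStabilization(for connection: AVCaptureConnection,
                             device: AVCaptureDevice) {
    guard connection.isVideoStabilizationSupported else { return }
    let preferred: [AVCaptureVideoStabilizationMode] = [.cinematicExtended, .cinematic, .standard]
    for mode in preferred where device.activeFormat.isVideoStabilizationModeSupported(mode) {
        connection.preferredVideoStabilizationMode = mode
        break
    }
}
```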
I've got the following error on Mac Catalyst:
TypeError: undefined is not an object (evaluating 'navigator.mediaDevices.getUserMedia')
macOS 13.5.1 (22G90)
Safari 17.0 (18616.1.24.11.5, 18616)
UserAgent "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/17.0 Safari/605.1.15"
The same app works well on iOS.
Is there an SDK call to retrieve the physical pixel size of the sensor for the device's cameras? Something that would give me, for example, 2.44 microns for the iPhone 14 Pro camera in 12MP mode?
Thank you,
Rick
Hi there, I'm developing a camera extension for macOS. Whenever I bump the build number (say from 31 to 33 for 1.0) and install, I'm then unable to uninstall the previous version's extension when the app is deleted. Should I be reinstalling the extension somehow on every update even if it hasn't changed, or should I not be bumping the extension's version? Is this something that only happens in development, or should I expect to deal with it in production too? I have frustrated users who are having to disable SIP to uninstall the extension.
Thanks for your help!
Hello everybody! I use a video session, and in the delegate method didOutputSampleBuffer I get a CMSampleBuffer. That all works. But at some point I want to change the exposure bias to -2 or +2 and make images from the buffer. I use the setExposureTargetBias method and everything changes successfully, but there is one problem: I need to know exactly when a buffer with the changed value (-2.0, +2.0, etc.) will arrive. The documentation says to use the CMTime syncTime from the setExposureTargetBias completion handler, which should be the future timestamp of the first buffer with the changes applied, but there is a complete mismatch: when I compare that syncTime with the timestamp of the buffer from didOutputSampleBuffer, the exposure is still in the process of changing.
How can I determine which buffer (by its timestamp) is the first one to which the exposure bias change has already been applied?
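As a point of comparison, the documented contract is that syncTime marks the first buffer with the change applied, so the usual pattern is to drop buffers whose presentation timestamp precedes it; since the post reports that this alone is not enough, a small extra frame-skip margin may help. A sketch of both rules on plain seconds (the helper names and the settle-time idea are mine, not from the API):

```swift
import Foundation

// Pure model of the comparison: a buffer reflects the new bias only once
// its presentation timestamp is at or after the syncTime returned by the
// setExposureTargetBias completion handler. In the real delegate, compare
// the CMTime values with CMTimeCompare, or convert both with
// CMTimeGetSeconds as modelled here.
func biasApplied(bufferTimeSeconds: Double, syncTimeSeconds: Double) -> Bool {
    return bufferTimeSeconds >= syncTimeSeconds
}

// If buffers at or after syncTime still look mid-transition (as described
// in the post), a pragmatic workaround is to skip a few extra frames; the
// settle time is a guess to tune empirically.
func framesToSkip(frameDuration: Double, settleTime: Double) -> Int {
    return max(0, Int((settleTime / frameDuration).rounded(.up)))
}
```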
I have a view controller whose functions are to open the front camera, show instructions in the view with a guide for the user to place their face, and draw two buttons: one to capture the selfie photo and one to cancel the process.
When the photo is captured with that controller, a function is called in a controller called 'Service Manager' to send the photo and update the flow's lifecycle.
The problem occurs on devices like the iPhone 7: when tapping the "CAPTURE SELFIE" button, the photo turns out too dark or too bright. Another problem with this controller, on all iOS devices, is that if the user minimizes the app and reopens it, the camera freezes, and they have to minimize and reopen more than once to get the camera to resume so the selfie can be captured correctly; meanwhile the instructions are redrawn each time the app returns to the foreground, so the camera preview shows black.
What could be causing this?
I've attached part of the code.
func setUpAVCapture() {
    session.sessionPreset = AVCaptureSession.Preset.hd1280x720
    guard let device = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .front) else { return }
}

func stopCamera() {
    session.stopRunning()
}

@objc func captureButtonTapped() {
    guard let captureSession = captureSession else { return }
    self.animacion?.isHidden = true
    blurView?.addBlurToView()
    // Lazily build the activity overlay shown while the photo is processed.
    if self.activityView == nil {
        activityView = UIView(frame: CGRect(x: (self.view.frame.width / 2) - 50, y: (self.view.frame.height / 2) - 50, width: 100, height: 100))
        activityView!.backgroundColor = UIColor(red: 0, green: 0, blue: 0, alpha: 0.8)
        activityView!.layer.cornerRadius = 4
        activityIndicator = UIActivityIndicatorView(frame: CGRect(x: (activityView!.frame.width / 2) - 15, y: (activityView!.frame.height / 2) - 15, width: 30, height: 30))
        if #available(iOS 13.0, *) {
            activityIndicator!.style = .large
            activityIndicator?.color = .white
        } else {
            activityIndicator?.style = .whiteLarge
        }
        activityView!.addSubview(activityIndicator!)
        self.view.addSubview(activityView!)
    }
    if intentosRealizados == 1 {
        if captureSession.canAddOutput(self.photoOutput) {
            captureSession.addOutput(self.photoOutput)
        }
    }
    let settings = AVCapturePhotoSettings()
    if let connection = photoOutput.connection(with: .video) {
        connection.videoOrientation = .portrait
    }
    // Remove any previous selfie before capturing a new one.
    let paths = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)
    let fileUrl = paths[0].appendingPathComponent("selfie.png")
    try? FileManager.default.removeItem(at: fileUrl)
    photoOutput.capturePhoto(with: settings, delegate: self)
    DispatchQueue.main.asyncAfter(deadline: .now() + 6.0) {
        self.performSelector(onMainThread: #selector(self.stopSelfie), with: nil, waitUntilDone: false)
    }
    activityIndicator?.startAnimating()
    activityView?.isHidden = false
}

func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
    self.maskImage?.image = UIImage(named: "face_yellow")
    if let imageData = photo.fileDataRepresentation(), let image = UIImage(data: imageData) {
        let compressedData = image.jpegData(compressionQuality: 0.5)
        if let compressedImage = UIImage(data: compressedData ?? Data()) {
            if let compressedData = compressImage(image: compressedImage, maxSizeInBytes: 1 * 1024 * 1024) {
                // Was ".pi / 0.5", i.e. a full 2π turn (a no-op); 90° was
                // presumably intended.
                let rotatedImage = UIImage(data: compressedData)?.rotate(radians: .pi / 2)
                // Note: this writes JPEG data to a .png file name.
                let pngData = rotatedImage?.jpegData(compressionQuality: 1.0)
                let documentDirectory = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first!
                let fileURL = documentDirectory.appendingPathComponent("selfie.png")
                if let data = pngData {
                    do {
                        try data.write(to: fileURL)
                    } catch {
                        print("Failed to write selfie: \(error)")
                    }
                }
            }
        }
    }
}

func compressImage(image: UIImage, maxSizeInBytes: Int) -> Data? {
    let maxHeight: CGFloat = 1024.0
    let maxWidth: CGFloat = 1024.0
    let actualWidth = image.size.width
    let actualHeight = image.size.height
    if actualWidth <= maxWidth && actualHeight <= maxHeight {
        return image.jpegData(compressionQuality: 1.0)
    }
    // Scale down, preserving aspect ratio, so the longest side fits 1024.
    let imgRatio = actualWidth / actualHeight
    var newWidth: CGFloat
    var newHeight: CGFloat
    if imgRatio > maxWidth / maxHeight {
        newWidth = maxWidth
        newHeight = maxWidth / imgRatio
    } else {
        newHeight = maxHeight
        newWidth = maxHeight * imgRatio
    }
    let renderRect = CGRect(x: 0, y: 0, width: newWidth, height: newHeight)
    let renderer = UIGraphicsImageRenderer(size: renderRect.size)
    let compressedImage = renderer.image { _ in
        image.draw(in: renderRect)
    }
    // Step the JPEG quality down until the data fits the size budget.
    var compressionQuality: CGFloat = 1.0
    var finalImageData = compressedImage.jpegData(compressionQuality: compressionQuality)
    while (finalImageData?.count ?? 0) > maxSizeInBytes && compressionQuality > 0.1 {
        compressionQuality -= 0.1
        finalImageData = compressedImage.jpegData(compressionQuality: compressionQuality)
    }
    return finalImageData
}
The use of the camera in the application must be compatible with all devices from the iPhone 6+ up.
The intention of this controller is for the application to take a selfie photo, save it as a PNG, and keep the file size under 5MB.
Thank you
I'm trying to get AVFoundation to provide the same low-light boost provided in the native camera app.
I have tried:
isLowLightBoostSupported (which always returns false, regardless of my configuration)
setExposureTargetBias set to max
AVCaptureSessionPreset set to various options without success
Is automaticallyEnablesLowLightBoostWhenAvailable outdated? The documentation says a "capture device that supports this feature may only engage boost mode for certain source formats or resolutions", but gives no explanation of which formats or resolutions. I can't tell whether the option is no longer supported, or whether there is some configuration permutation I haven't yet found.
My iPhone produces corrupted images under certain conditions. If I shoot the same scene (with a slightly varying angle) in the same lighting conditions, I almost always receive a corrupted photo, which contains a magenta copy of the image and a green rectangle. If I add other objects to the scene, changing the overall brightness, the chance of the bug drops significantly.
Device info: iPhone 14 Pro Max (iOS 17 RC), iPhone 14 Pro (iOS 17 beta 6)
Images with issue:
f/1.664, 1/25s, ISO 500, digitalZoom=1.205, brightness=-0.568
f/1.664, 1/25s, ISO 500, digitalZoom=1.205, brightness=-0.514
f/1.664, 1/25s, ISO 640, digitalZoom=1.205, brightness=-0.641
f/1.664, 1/25s, ISO 500, digitalZoom=1.205, brightness=-0.448
f/1.664, 1/25s, ISO 500, digitalZoom=1.205, brightness=-0.132
Images without issue:
f/1.664, 1/25s, ISO 500, digitalZoom=1.205, brightness=-0.456
f/1.664, 1/20s, ISO 640, digitalZoom=1.205, brightness=-1.666
f/1.664, 1/100s, ISO 50, digitalZoom=1.205, brightness=4.840
f/1.664, 1/25s, ISO 640, digitalZoom=1.205, brightness=-0.774
I'm using builtInWideAngleCamera with continuousAutoExposure, continuousAutoFocus, and a slight videoZoomFactor.
func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
    if let error = error {
        capturePhotoCallback?(.failure(.internalError(error.localizedDescription)))
        return
    }
    guard let data = photo.fileDataRepresentation() else {
        capturePhotoCallback?(.failure(.internalError("Cannot get data representation.")))
        return
    }
    guard let image = UIImage(data: data) else {
        capturePhotoCallback?(.failure(.internalError("Cannot get image from data representation.")))
        return
    }
    capturePhotoCallback?(.success(image))
}
When I have locked white balance and custom exposure, on a black background, both objects become brighter when I introduce a new object into the view. How can I turn off this behavior, or compensate for the change in a performant way?
This is how I configure the session; note that I'm setting a video format which supports at least 180 fps, which is required for my needs.
private func configureSession() {
    self.sessionQueue.async { [self] in
        // MARK: Init session
        guard let session = try? validSession() else { fatalError("Session is unexpectedly nil") }
        session.beginConfiguration()
        guard let device = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back) else { fatalError("Video device is unexpectedly nil") }
        guard let videoDeviceInput = try? AVCaptureDeviceInput(device: device) else { fatalError("videoDeviceInput is unexpectedly nil") }
        guard session.canAddInput(videoDeviceInput) else { fatalError("videoDeviceInput could not be added") }
        session.addInput(videoDeviceInput)
        self.videoDeviceInput = videoDeviceInput
        self.videoDevice = device

        // MARK: Connect session IO
        let dataOutput = AVCaptureVideoDataOutput()
        dataOutput.setSampleBufferDelegate(self, queue: sampleBufferQueue)
        session.automaticallyConfiguresCaptureDeviceForWideColor = false
        guard session.canAddOutput(dataOutput) else { fatalError("Could not add video data output") }
        session.addOutput(dataOutput)
        dataOutput.alwaysDiscardsLateVideoFrames = true
        dataOutput.videoSettings = [
            String(kCVPixelBufferPixelFormatTypeKey): pixelFormat.rawValue
        ]
        if let captureConnection = dataOutput.connection(with: .video) {
            captureConnection.preferredVideoStabilizationMode = .off
            captureConnection.isEnabled = true
        } else {
            fatalError("No capture connection for the session")
        }

        // MARK: Configure AVCaptureDevice
        do { try device.lockForConfiguration() } catch { fatalError(error.localizedDescription) }
        if let format = format(fps: fps, minWidth: minWidth, format: pixelFormat) { // 180 FPS, YUV layout
            device.activeFormat = format
            device.activeVideoMinFrameDuration = CMTime(value: 1, timescale: CMTimeScale(fps))
            device.activeVideoMaxFrameDuration = CMTime(value: 1, timescale: CMTimeScale(fps))
        } else {
            fatalError("Compatible format not found")
        }
        device.activeColorSpace = .sRGB
        device.isGlobalToneMappingEnabled = false
        device.automaticallyAdjustsVideoHDREnabled = false
        device.automaticallyAdjustsFaceDrivenAutoExposureEnabled = false
        device.isFaceDrivenAutoExposureEnabled = false
        device.setFocusModeLocked(lensPosition: 0.4)
        device.isSubjectAreaChangeMonitoringEnabled = false
        device.exposureMode = .custom
        let exp = CMTime(value: Int64(40), timescale: 100_000)
        let isoValue = min(max(40, device.activeFormat.minISO), device.activeFormat.maxISO)
        device.setExposureModeCustom(duration: exp, iso: isoValue) { _ in }
        device.setWhiteBalanceModeLocked(with: AVCaptureDevice.WhiteBalanceGains(redGain: 1.0, greenGain: 1.0, blueGain: 1.0)) { (_: CMTime) in }
        device.unlockForConfiguration()
        session.commitConfiguration()
        onAVSessionReady()
    }
}
This post (https://stackoverflow.com/questions/34511431/ios-avfoundation-different-photo-brightness-with-the-same-manual-exposure-set) suggests that the effect can be mitigated by setting the camera exposure mode to .locked right after calling device.setExposureModeCustom(). This works properly only when used with the async API, and it still does not eliminate the effect.
Async approach:
private func onAVSessionReady() {
    guard let device = device() else { fatalError("Device is unexpectedly nil") }
    guard let sesh = try? validSession() else { fatalError("Session is unexpectedly nil") }
    MCamSession.shared.activeFormat = device.activeFormat
    MCamSession.shared.currentDevice = device
    self.observer = SPSDeviceKVO(device: device, session: sesh)
    self.start()
    Task {
        await lockCamera(device)
    }
}

private func lockCamera(_ device: AVCaptureDevice) async {
    do { try device.lockForConfiguration() } catch { fatalError(error.localizedDescription) }
    _ = await device.setFocusModeLocked(lensPosition: 0.4)
    let exp = CMTime(value: Int64(40), timescale: 100_000)
    let isoValue = min(max(40, device.activeFormat.minISO), device.activeFormat.maxISO)
    _ = await device.setExposureModeCustom(duration: exp, iso: isoValue)
    _ = await device.setWhiteBalanceModeLocked(with: AVCaptureDevice.WhiteBalanceGains(redGain: 1.0, greenGain: 1.0, blueGain: 1.0))
    device.exposureMode = .locked
    device.unlockForConfiguration()
}

private func configureSession() {
    // same session init as before
    ...
    onAVSessionReady()
}
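The `format(fps:minWidth:format:)` helper isn't shown; assuming it picks the first device format meeting the frame-rate, width, and pixel-format targets, its selection rule can be sketched in pure Swift (the struct and names below are mine; real code would walk AVCaptureDevice.formats, reading videoSupportedFrameRateRanges and the formatDescription dimensions):

```swift
import Foundation

// A simplified, pure model of the format(fps:minWidth:format:) lookup used
// above, capturing just the selection rule.
struct FormatInfo {
    let maxFPS: Double
    let width: Int
    let pixelFormat: UInt32
}

func selectFormat(_ formats: [FormatInfo], fps: Double, minWidth: Int, pixelFormat: UInt32) -> FormatInfo? {
    return formats.first { $0.maxFPS >= fps && $0.width >= minWidth && $0.pixelFormat == pixelFormat }
}
```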
Hello. I am a beginner developer.
I am creating a camera app using SwiftUI.
I want to create an app that allows you to take pictures with the camera, apply filters, and save them.
I'm practicing by following Apple's sample tutorial:
https://developer.apple.com/tutorials/sample-apps/capturingphotos-camerapreview
I was able to apply a CIFilter to the image displayed in the viewfinder, but I don't know how to apply the filter to the saved photo.
I'd like to get some hints on this.
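One common approach (a sketch, not taken from the tutorial): in the AVCapturePhotoCaptureDelegate callback, run the same CIFilter over the captured photo data before writing it out. The filter name and JPEG quality below are placeholder choices.

```swift
import CoreImage
import UIKit

// Reuse one CIContext; creating one per photo is expensive.
let sharedContext = CIContext()

// Hedged sketch: apply a CIFilter to captured photo data before saving.
// Call this from photoOutput(_:didFinishProcessingPhoto:error:) with
// photo.fileDataRepresentation(). Note: rendering through CGImage drops
// the original orientation metadata, so reapply it if needed.
func filteredJPEGData(from photoData: Data, filterName: String = "CISepiaTone") -> Data? {
    guard let inputImage = CIImage(data: photoData),
          let filter = CIFilter(name: filterName) else { return nil }
    filter.setValue(inputImage, forKey: kCIInputImageKey)
    guard let output = filter.outputImage,
          let cgImage = sharedContext.createCGImage(output, from: output.extent) else { return nil }
    return UIImage(cgImage: cgImage).jpegData(compressionQuality: 0.9)
}
```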
I'm working with an NTSC to USB camera converter
Hi,
My app was rejected by App Review, based on Guideline 5.1.1:
"we still find the permission request for camera and photos is not sufficient. To resolve this issue, it would be appropriate to revise these and ensure that they sufficiently explain why the app needs this access."
It was already my second time, and I am frustrated. I asked App Review for suggestions and they refused.
Can anyone tell me whether these revised messages would work?
"AppName" would like to access your Camera.
Allow "AppName" to take your photo as your profile picture. This photo will be used in in-app messaging only.
"AppName" would like to access your Photos Library.
Allow "AppName" to pick an image from your Photo Library. This image will be used as your profile picture and for in-app messaging only.
Thanks for your help in advance.
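For reference, these strings live in the Info.plist usage-description keys, which is what reviewers read. A sketch of the two entries (the wording is only a suggestion; the keys themselves are the real API):

```xml
<key>NSCameraUsageDescription</key>
<string>AppName uses the camera to take your profile picture, which is shown only in in-app messaging.</string>
<key>NSPhotoLibraryUsageDescription</key>
<string>AppName accesses your photo library so you can choose an existing photo as your profile picture for in-app messaging.</string>
```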
Hello, we are having issues with the camera function. On iOS 16 it works well without any problems, but on devices with iOS 17, every time the app accesses the camera it force closes. Could anyone help with this? Any suggestions are highly welcome and appreciated. Thank you in advance.
Hello everyone,
I couldn't find an answer to this question. Is it possible for developers to capture 24-megapixel photos with AVFoundation on an iPhone 15 Pro, or is it only an iPhone Camera app feature?
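On 48MP-sensor devices, larger photo outputs are requested through the photo-dimensions API added in iOS 16. A hedged sketch of checking for and opting into a ~24MP dimension (the 5712x4284 figure for the iPhone 15 Pro main camera is an assumption; verify against the current docs):

```swift
import AVFoundation

// Sketch: check whether the active format offers a ~24MP photo dimension
// and opt the photo output into it. iOS 16+ API surface.
func enable24MP(device: AVCaptureDevice, output: AVCapturePhotoOutput) -> Bool {
    guard let dims = device.activeFormat.supportedMaxPhotoDimensions
            .first(where: { $0.width * $0.height >= 24_000_000 }) else {
        return false // this format does not expose a 24MP-class dimension
    }
    output.maxPhotoDimensions = dims
    // Also set maxPhotoDimensions on each AVCapturePhotoSettings you
    // capture with, or the output falls back to a smaller size.
    return true
}
```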