PLATFORM AND VERSION: iOS 18.5
I wanted to bring to your attention a critical issue some of our production users are experiencing with the CoinOut app. Specifically, users are encountering a problem when attempting to capture photos of receipts using the app's customized camera feature. The camera, which utilizes AVCaptureVideoPreviewLayer and AVCaptureDevice, occasionally fails to load the preview, resulting in a black screen instead of the expected camera view.
This camera blackout issue is significantly impacting the user experience as it prevents them from snapping photos of their receipts, which is a core functionality of the CoinOut app.
Any help or suggestions on this issue would be greatly appreciated.
STEPS TO REPRODUCE
Open the app and tap the camera icon.
The camera view is displayed so the user can capture a photo.
For a few production users, the camera preview shows only a black screen.
import UIKit
import AVFoundation
import AudioToolbox

class ViewController: UIViewController {
    @IBOutlet private weak var captureButton: UIButton!
    private var fillLayer: CAShapeLayer!
    private var previewLayer: AVCaptureVideoPreviewLayer!
    private var output: AVCapturePhotoOutput!
    private var device: AVCaptureDevice!
    private var session: AVCaptureSession!
    private var highResolutionEnabled: Bool = false
    private let sessionQueue = DispatchQueue(label: "session queue")

    override func viewDidLoad() {
        super.viewDidLoad()
        setupCamera()
        customiseUI()
    }

    @IBAction func startCamera(sender: UIButton) {
        didTapTakePhoto()
    }

    private func setupCamera() {
        let session = AVCaptureSession()
        session.sessionPreset = .high
        previewLayer = AVCaptureVideoPreviewLayer(session: session)
        output = AVCapturePhotoOutput()
        device = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back)
        if let device = self.device {
            do {
                let input = try AVCaptureDeviceInput(device: device)
                if session.canAddInput(input) { session.addInput(input) }
                else { print("\(#fileID):\(#function):\(#line) : Session input addition failed") }
                if session.canAddOutput(output) {
                    output.isHighResolutionCaptureEnabled = self.highResolutionEnabled
                    session.addOutput(output)
                } else { print("\(#fileID):\(#function):\(#line) : Session output addition failed") }
                previewLayer.videoGravity = .resizeAspectFill
                previewLayer.session = session
                sessionQueue.async { session.startRunning() }
                self.session = session
                self.session.accessibilityElementIsFocused()   // note: returns a Bool and has no effect here
                try device.lockForConfiguration()
                if device.isWhiteBalanceModeSupported(.autoWhiteBalance) {
                    device.whiteBalanceMode = .autoWhiteBalance
                } else { print("\(#fileID):\(#function):\(#line) : autoWhiteBalance not supported") }
                if device.isWhiteBalanceModeSupported(.continuousAutoWhiteBalance) {
                    device.whiteBalanceMode = .continuousAutoWhiteBalance
                } else { print("\(#fileID):\(#function):\(#line) : continuousAutoWhiteBalance not supported") }
                if device.isFocusModeSupported(.continuousAutoFocus) { device.focusMode = .continuousAutoFocus }
                else if device.isFocusModeSupported(.autoFocus) { device.focusMode = .autoFocus }
                device.unlockForConfiguration()
            } catch { print("\(#fileID):\(#function):\(#line) : \(error.localizedDescription)") }
        } else { print("\(#fileID):\(#function):\(#line) : Device found as nil") }
    }

    private func customiseUI() {
        let path = UIBezierPath(roundedRect: CGRect(x: 0, y: 0, width: view.bounds.width, height: view.bounds.height), cornerRadius: 0)
        let rectangleWidth = view.frame.width - (view.frame.width * 0.16)
        let x = (view.frame.width - rectangleWidth) / 2
        let rectangleHeight = view.frame.height - (view.frame.height * 0.16)
        let y = (view.frame.height - rectangleHeight) / 2
        let roundRect = UIBezierPath(roundedRect: CGRect(x: x, y: y, width: rectangleWidth, height: rectangleHeight), byRoundingCorners: .allCorners, cornerRadii: CGSize(width: 0, height: 0))
        roundRect.move(to: CGPoint(x: view.center.x, y: view.center.y))
        path.append(roundRect)
        path.usesEvenOddFillRule = true
        fillLayer = CAShapeLayer()
        fillLayer.path = path.cgPath
        fillLayer.fillRule = .evenOdd
        fillLayer.opacity = 0.4
        previewLayer.addSublayer(fillLayer)
        previewLayer.frame = view.bounds
        view.layer.addSublayer(previewLayer)
        view.bringSubviewToFront(captureButton)
    }

    private func didTapTakePhoto() {
        let settings = self.getSettings(camera: self.device)
        if device.isAdjustingFocus {
            do {
                try device.lockForConfiguration()
                device.focusMode = .continuousAutoFocus
                device.unlockForConfiguration()
                device.addObserver(self, forKeyPath: "adjustingFocus", options: [.new], context: nil)
            } catch { print(error) }
        } else {
            output.capturePhoto(with: settings, delegate: self)
        }
    }

    func getSettings(camera: AVCaptureDevice) -> AVCapturePhotoSettings {
        var settings = AVCapturePhotoSettings()
        if let rawFormat = output.availableRawPhotoPixelFormatTypes.first {
            settings = AVCapturePhotoSettings(rawPixelFormatType: OSType(rawFormat))
        }
        settings.isHighResolutionPhotoEnabled = self.highResolutionEnabled
        let previewPixelType = settings.availablePreviewPhotoPixelFormatTypes.first!
        let previewFormat = [kCVPixelBufferPixelFormatTypeKey as String: previewPixelType] as [String: Any]
        settings.previewPhotoFormat = previewFormat
        return settings
    }
}

extension ViewController: AVCapturePhotoCaptureDelegate {
    func photoOutput(_ output: AVCapturePhotoOutput, willCapturePhotoFor resolvedSettings: AVCaptureResolvedPhotoSettings) {
        AudioServicesDisposeSystemSoundID(1108)
    }

    func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
        guard let data = photo.fileDataRepresentation() else { return }
        let image = UIImage(data: data)!
        showImage(cropped: image)
    }

    func showImage(cropped: UIImage) {
        let vc = self.storyboard?.instantiateViewController(withIdentifier: "ImagePreviewViewController") as? ImagePreviewViewController
        vc?.captured = cropped
        self.present(vc!, animated: true)
    }
}
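For context, here is a minimal pre-flight sketch of the authorization check we are looking at adding (assuming the same session and session queue as above); a missing or denied camera permission is one common cause of a black preview:

import AVFoundation

// Sketch: gate startRunning() on camera authorization and keep it on the session queue.
func startSessionIfAuthorized(_ session: AVCaptureSession, on queue: DispatchQueue) {
    switch AVCaptureDevice.authorizationStatus(for: .video) {
    case .authorized:
        queue.async { session.startRunning() }
    case .notDetermined:
        AVCaptureDevice.requestAccess(for: .video) { granted in
            guard granted else { return }          // without permission the preview stays black
            queue.async { session.startRunning() }
        }
    case .denied, .restricted:
        print("Camera access denied or restricted; the preview will stay black")
    @unknown default:
        break
    }
}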
Photos & Camera
Explore technical aspects of capturing high-quality photos and videos, including exposure control, focus modes, and RAW capture options.
Hi,
I’m building a PPG-based heart rate feature where the user places their finger over the rear telephoto camera. On iPhone 16 Pro Max, I'm explicitly selecting the telephoto lens like this:
videoDevice = AVCaptureDevice.default(.builtInTelephotoCamera, for: .video, position: .back)
And trying to lock it:
if #available(iOS 15.0, *),
device.activePrimaryConstituentDeviceSwitchingBehavior != .unsupported {
try? device.lockForConfiguration()
device.setPrimaryConstituentDeviceSwitchingBehavior(.locked, restrictedSwitchingBehaviorConditions: [])
device.unlockForConfiguration()
}
I also lock everything else to prevent dynamic changes:
try device.lockForConfiguration()
device.focusMode = .locked
device.exposureMode = .locked
device.whiteBalanceMode = .locked
device.videoZoomFactor = 1.0
device.automaticallyEnablesLowLightBoostWhenAvailable = false
device.automaticallyAdjustsVideoHDREnabled = false
device.unlockForConfiguration()
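For reference, here is the same configuration consolidated into a single lockForConfiguration() block (a sketch of what runs; the supported-mode guards are additions for clarity):

import AVFoundation

// Sketch: apply the PPG capture settings in one configuration lock.
func lockDeviceForPPG(_ device: AVCaptureDevice) {
    do {
        try device.lockForConfiguration()
        if #available(iOS 15.0, *),
           device.activePrimaryConstituentDeviceSwitchingBehavior != .unsupported {
            // Ask a virtual device not to switch its active constituent camera.
            device.setPrimaryConstituentDeviceSwitchingBehavior(.locked, restrictedSwitchingBehaviorConditions: [])
        }
        if device.isFocusModeSupported(.locked) { device.focusMode = .locked }
        if device.isExposureModeSupported(.locked) { device.exposureMode = .locked }
        if device.isWhiteBalanceModeSupported(.locked) { device.whiteBalanceMode = .locked }
        device.videoZoomFactor = 1.0
        device.automaticallyEnablesLowLightBoostWhenAvailable = false
        device.automaticallyAdjustsVideoHDREnabled = false
        device.unlockForConfiguration()
    } catch {
        print("lockForConfiguration failed: \(error)")
    }
}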
Despite this, the camera still switches to another lens, especially under different lighting, even though the user’s finger fully covers the lens.
Questions:
How can I completely prevent lens switching in this scenario?
Would using videoZoomFactor = 3.0 or 5.0 better enforce use of the telephoto lens?
Thanks!
Gal
Before you post "Camera doesn't work on the Simulator": that's no longer true. I've built a solution that makes the Simulator believe there's an actual hardware device connected, allowing users to stream the macOS camera to the iOS Simulator (for more info, see RocketSim's documentation: https://docs.rocketsim.app/features/hzQMSrSga7BGWvxdNVdwYs/simulator-camera-support/58tQ5jvevLNSnyUEA7VgAv)
Now, it works for VNDocumentCameraViewController, but when I try opening DataScannerViewController, I directly run into:
Failed to start scanning: The operation couldn’t be completed. (VisionKit.DataScannerViewController.ScanningUnavailable error 0.)
My question:
How does this view controller determine whether scanning is available?
Is there perhaps a certain capability the available AVCaptureDevices need to support?
Any direction would be helpful so I can make this work for developers and help them build apps faster!
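For context, the only public pre-flight checks I'm aware of are the two class properties below; what they inspect under the hood is exactly my question (a sketch):

import UIKit
import VisionKit

// Sketch: the two availability gates before presenting the scanner.
@available(iOS 16.0, *)
@MainActor
func presentScannerIfPossible(from presenter: UIViewController) {
    guard DataScannerViewController.isSupported else {
        print("Scanning not supported on this device")          // hardware/OS capability gate
        return
    }
    guard DataScannerViewController.isAvailable else {
        print("Scanning unavailable (camera access or restrictions?)")
        return
    }
    let scanner = DataScannerViewController(recognizedDataTypes: [.barcode()],
                                            isHighlightingEnabled: true)
    presenter.present(scanner, animated: true)
    try? scanner.startScanning()   // could also be called from the present completion handler
}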
I'm developing an iPad app that will be mostly dedicated to a certain external camera for visually impaired people.
The Linux UVC API (e.g. using guvcview) allows enabling automatic exposure for the camera. The iOS API isExposureModeSupported unfortunately returns false for all of the exposure modes.
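For reference, this is roughly the check that fails for me (a sketch, assuming the device comes from an external-camera discovery session):

import AVFoundation

// Sketch: query and apply continuous auto exposure on an external (UVC) camera, if supported.
func enableAutoExposureIfPossible(on device: AVCaptureDevice) {
    guard device.isExposureModeSupported(.continuousAutoExposure) else {
        print("continuousAutoExposure not supported on \(device.localizedName)")   // what I observe for UVC
        return
    }
    do {
        try device.lockForConfiguration()
        device.exposureMode = .continuousAutoExposure
        device.unlockForConfiguration()
    } catch {
        print("lockForConfiguration failed: \(error)")
    }
}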
Is it a bug? Or perhaps AVFoundation doesn't support UVC exposure yet?
So I've spent the last five years optimizing my video AI system so that it runs with less than 5% CPU while processing a 30fps video feed on a MacBook Pro M2, and everything is great, until Sonoma comes out and I find myself consuming 40% CPU for the exact same workload.
So I fire up Instruments, and the "heaviest stack trace" (see screenshot) turns out to be Espresso doing some completely unasked-for and absolutely useless processing on my video frames. I turn off Reactions, but nothing helps; the CPU consumption stays at 40%.
"Reactions" is nothing but a useless toy to please some WWDC keynote fanboys, I don't want it anywhere near my app or my users, and I especially do not want to take the blame for it pissing away the user's CPU cycles and battery.
Now, how do I make it go away, for ever?
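For what it's worth, the only related API I've found so far are the read-only class properties below, which just mirror the user's Video Effects setting; a diagnostic sketch, not a fix:

import AVFoundation

// Diagnostic sketch: log the system-level Reactions state seen by this app.
func logReactionEffectsState() {
    if #available(macOS 14.0, iOS 17.0, *) {
        print("reactionEffectsEnabled: \(AVCaptureDevice.reactionEffectsEnabled)")
        print("reactionEffectsGesturesEnabled: \(AVCaptureDevice.reactionEffectsGesturesEnabled)")
    }
}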
Best regards
Jacob
We have a React website built to scan QR codes. The website works properly on Android devices, but on iPhone we see a camera glitch that causes an unexpected delay in scanning.
Website URL : https://react-qr-code-scanner-app.vercel.app/
AVCaptureVideoDataOutput.preparesCellularRadioForNetworkConnection requires the com.apple.developer.avfoundation.video-data-output-prepares-cellular-radio-for-machine-readable-code-scanning entitlement, but I cannot acquire it: I can't find it on 'Certificates, Identifiers & Profiles'. Any solutions?
Provisioning profile "iOS Team Provisioning Profile: ......" doesn't include the com.apple.developer.avfoundation.video-data-output-prepares-cellular-radio-for-machine-readable-code-scanning entitlement.
AVCaptureSession's startRunning method is thread blocking and seems to be slow. What is this method doing behind the scenes?
For context: I'm working on Simulator Camera support and I have a 'fake' AVCaptureDevice that might be causing this. My hypothesis is that AVCaptureSession tries to connect to the device and waits for a notification to be posted back.
I'd love to find a way to let my fake device message AVCaptureSession that it's connected.
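For context, startRunning() is commonly dispatched onto a dedicated serial queue so its blocking behavior doesn't stall the UI (a sketch):

import AVFoundation

// Sketch: keep the blocking startRunning() call off the main thread.
final class CaptureController {
    let session = AVCaptureSession()
    private let sessionQueue = DispatchQueue(label: "session.queue")

    func start() {
        sessionQueue.async {
            // Blocks this background queue (not the UI) until the session's inputs connect or fail.
            self.session.startRunning()
            print("isRunning: \(self.session.isRunning)")
        }
    }
}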
Hi guys,
Can I use CMIO to achieve the following feature on macOS when a USB device (Camera/Mic/Speaker) is connected:
When a third-party video conferencing app is not in a meeting, ensure the app defaults to using the USB device (Camera/Mic/Speaker).
When a third-party conferencing app is in a meeting, ensure the app automatically switches to the USB device (Camera/Mic/Speaker).
It uses about 300 MB of memory, which causes a memory peak.
Hey,
Quick question. I noticed that Adobe's new app, Project Indigo, allows you to open the app using the Camera Control button. However, when your device is locked it just shows this screen:
Would this normally be approved by the App Store review process? I ask because I would like to do something similar with my camera app.
I know that this is not the best user experience, but my app's UI is not built in Swift and I don't have the resources to rebuild the UI. At least this way the user experience would be improved from what it is now, where users cannot even launch the app. I get many requests per week about this feature and would love to improve the UX for my users, even if it's not the best possible.
Thanks, Alex
I'm developing a photo backup app.
To detect newly added or edited photos since the app launched, I keep a local dictionary in the format [localIdentifier: modification_date].
However, PHAsset.modificationDate is not reliable.
It often changes unexpectedly, possibly due to system operations like iCloud metadata updates.
Is there a more reliable way to detect whether a photo has been modified by the user since the last app launch?
I'm thinking about using content hash instead, but I'm not sure how heavy this operation is in terms of performance.
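For reference, the content-hash idea would look roughly like this (a sketch; it fetches the full-size image data via PHImageManager and hashes it with CryptoKit, which can be expensive for large or iCloud-only assets):

import Photos
import CryptoKit

// Sketch: compute a SHA-256 digest of an asset's image data to detect content changes.
func contentHash(for asset: PHAsset, completion: @escaping (String?) -> Void) {
    let options = PHImageRequestOptions()
    options.isNetworkAccessAllowed = true      // may trigger a download from iCloud
    options.version = .current                 // includes user edits
    PHImageManager.default().requestImageDataAndOrientation(for: asset, options: options) { data, _, _, _ in
        guard let data else { completion(nil); return }
        let digest = SHA256.hash(data: data)
        completion(digest.map { String(format: "%02x", $0) }.joined())
    }
}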
I'm using Picture-in-Picture (PiP) mode in my native iOS application, which is similar to Google Meet, using the VideoSDK.live Native iOS SDK. The SDK has built-in support for PiP and it's working fine for the most part.
However, I'm running into a problem:
When the app enters PiP mode, the local camera (self-video) of the participant freezes or stops. I want to fix this and achieve the same smooth behavior seen in apps like Google Meet and WhatsApp, where the local camera keeps working even in PiP mode.
I've been trying to find documentation or examples on how to achieve this but haven't had much luck. I came across a few mentions that using the camera in the background may require special entitlements from Apple (like in the entitlements file). Most of the official documentation says background camera access is restricted due to Apple’s privacy policies.
So my questions are:
Has anyone here successfully implemented background camera access in PiP mode on iOS?
If yes, what permissions or entitlements are required?
How do apps like WhatsApp and Google Meet achieve this functionality on iPhones?
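For reference, the only related API I've found so far is the multitasking camera access flag on AVCaptureSession, which reportedly goes together with the com.apple.developer.avfoundation.multitasking-camera-access entitlement (a sketch; whether that entitlement can be granted for this use case is exactly what I'm asking):

import AVFoundation

// Sketch: opt the session into multitasking (PiP / Split View) camera access where supported.
func configureMultitaskingCameraAccess(for session: AVCaptureSession) {
    if #available(iOS 16.0, *), session.isMultitaskingCameraAccessSupported {
        session.beginConfiguration()
        session.isMultitaskingCameraAccessEnabled = true   // keeps capture running outside full screen
        session.commitConfiguration()
    } else {
        print("Multitasking camera access not supported for this device/app")
    }
}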
Any help, advice, or pointers would be really appreciated!
After Picture-in-Picture is opened on the camera interface, the custom UI cannot be displayed. Is there any way to make the custom UI visible? If custom UI cannot be displayed, how do teleprompter-type apps in the store manage to show custom teleprompter text within the camera?
Hello,
I'm developing an app that displays a photo library using UICollectionView and PHCachingImageManager. I'd like to achieve a user experience similar to the native iOS Photos app, where low-quality images are shown quickly while scrolling, and higher-quality images are loaded for visible cells once scrolling stops.
I'm currently using the following approach:
While Scrolling: I'm using the UICollectionViewDataSourcePrefetching protocol. In the prefetchItemsAt method, I call startCachingImages with low-quality options to cache images in advance.
After Scrolling Stops: In the scrollViewDidEndDecelerating method, I intend to load high-quality images for the currently visible cells.
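Concretely, this is roughly what I mean (a sketch, assuming a cell-sized targetSize and a caching manager owned by the data source):

import Photos
import UIKit

let cachingManager = PHCachingImageManager()
let thumbnailSize = CGSize(width: 200, height: 200)   // assumed cell size in pixels

// Sketch: cheap caching while scrolling.
func prefetch(assets: [PHAsset]) {
    let fast = PHImageRequestOptions()
    fast.deliveryMode = .fastFormat
    fast.resizeMode = .fast
    cachingManager.startCachingImages(for: assets, targetSize: thumbnailSize,
                                      contentMode: .aspectFill, options: fast)
}

// Sketch: upgrade visible cells once scrolling stops.
func loadHighQuality(for asset: PHAsset, into imageView: UIImageView) {
    let high = PHImageRequestOptions()
    high.deliveryMode = .highQualityFormat   // or .opportunistic for a quick degraded pass first
    high.isNetworkAccessAllowed = true
    cachingManager.requestImage(for: asset, targetSize: thumbnailSize,
                                contentMode: .aspectFill, options: high) { image, _ in
        if let image { imageView.image = image }
    }
}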
I have a few questions regarding this approach:
What is the best practice for managing both low-quality and high-quality images efficiently with PHCachingImageManager? Is it correct to call startCachingImages with fastFormat options and then call it again with highQualityFormat in scrollViewDidEndDecelerating?
How can I minimize the delay when a low-quality image is replaced by a high-quality one? Are there any additional strategies to help pre-load high-quality images more effectively?
Some users reported that their images are not loading correctly in our app. After a lot of debugging we identified the following:
This only happens when the app is built for Mac Catalyst. Not on iOS, iPadOS, or "real" macOS (AppKit).
The images in question have unusual color spaces. We observed the issue for uRGB and eciRGB v2.
Those images are rendered correctly in Photos and Preview on all platforms.
When displaying the image inside of a UIImageView or in a SwiftUI Image, they render correctly.
The issue only occurs when loading the image via Core Image.
When comparing the different Core Image render graphs between AppKit (working) and Catalyst (faulty) builds, they look identical—except for the result.
Mac (AppKit) vs. Catalyst render graphs: (screenshots omitted)
Something seems to be off when Core Image tries to load an image with foreign color space in Catalyst.
We identified a workaround: By using a CGImageDestination to transcode the image using the kCGImageDestinationOptimizeColorForSharing option, Image I/O will convert the image to sRGB (or similar) and Core Image is able to load the image correctly. However, one potentially loses fidelity this way.
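For completeness, the workaround currently looks roughly like this (a sketch; error handling simplified):

import CoreImage
import ImageIO

// Sketch of the workaround: re-encode via Image I/O with the "optimize color for sharing"
// option so the unusual color space is converted, then load the result with Core Image.
func loadImageOptimizedForSharing(from url: URL) -> CIImage? {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
          let type = CGImageSourceGetType(source) else { return nil }
    let data = NSMutableData()
    guard let destination = CGImageDestinationCreateWithData(data as CFMutableData, type, 1, nil) else { return nil }
    let options = [kCGImageDestinationOptimizeColorForSharing: true] as CFDictionary
    CGImageDestinationAddImageFromSource(destination, source, 0, options)
    guard CGImageDestinationFinalize(destination) else { return nil }
    return CIImage(data: data as Data)
}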
Or might there be a better workaround?
Tags: Image I/O, Photos and Imaging, Core Image, Core Graphics
I'm creating Live Photos programmatically in my app using the Photos and AVFoundation frameworks. While the Live Photos work perfectly in the Photos app (long press shows motion), users cannot set them as motion wallpapers. The system shows "Motion not available" message.
Here's my approach for creating Live Photos:
// 1. Create video with required metadata
let writer = try AVAssetWriter(outputURL: videoURL, fileType: .mov)
let contentIdentifier = AVMutableMetadataItem()
contentIdentifier.identifier = .quickTimeMetadataContentIdentifier
contentIdentifier.value = assetIdentifier as NSString
writer.metadata = [contentIdentifier]
// Video settings: 882x1920, H.264, 30fps, 2 seconds
// Added still-image-time metadata at middle frame
// 2. Create HEIC image with asset identifier
var makerAppleDict: [String: Any] = [:]
makerAppleDict["17"] = assetIdentifier // Required key for Live Photo
metadata[kCGImagePropertyMakerAppleDictionary as String] = makerAppleDict
// 3. Generate Live Photo
PHLivePhoto.request(
    withResourceFileURLs: [photoURL, videoURL],
    placeholderImage: nil,
    targetSize: .zero,
    contentMode: .aspectFit
) { livePhoto, info in
    // Success - Live Photo created
}
// 4. Save to Photos library (both resources must be added to the same creation request,
//    inside a photo library change block)
let creationRequest = PHAssetCreationRequest.forAsset()
creationRequest.addResource(with: .photo, fileURL: photoURL, options: nil)
creationRequest.addResource(with: .pairedVideo, fileURL: videoURL, options: nil)
What I've Tried
Matching exact video specifications from Camera app (882x1920, H.264, 30fps)
Adding all documented metadata (content identifier, still-image-time)
Testing various video durations (1.5s, 2s, 3s)
Different image formats (HEIC, JPEG)
Comparing with exiftool against working Live Photos
Expected Behavior
Live Photos created programmatically should be eligible for motion wallpapers, just like those from the Camera app.
Actual Behavior
System shows "Motion not available" and only allows setting as static wallpaper.
Any insights or workarounds would be greatly appreciated. This is affecting our users who want to use their created content as wallpapers.
Questions
Are there additional undocumented requirements for Live Photos to be wallpaper-eligible?
Is this a deliberate restriction for third-party apps, or a bug?
Has anyone successfully created Live Photos that work as motion wallpapers?
Environment
iOS 17.0 - 18.1
Xcode 16.0
Tested on iPhone 16 Pro
Tags: LivePhotosKit JS, PhotoKit, Core Image, AVFoundation
I have a feature requirement: switch the writer used for file recording every 5 minutes, then quickly merge the last two files. How can I ensure that the merged file plays back seamlessly and that the audio and video stay synchronized? Currently the merged video has glitches and the audio is out of sync. If there are experts who can provide solutions in this area, I would be extremely grateful.
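A sketch of the composition-based merge I would try (assuming both segments are QuickTime files): inserting each asset's full time range back to back into an AVMutableComposition normally keeps its audio and video tracks aligned.

import AVFoundation

// Sketch: append two recorded segments into one composition, then export it.
func merge(_ firstURL: URL, _ secondURL: URL, to outputURL: URL) async throws {
    let composition = AVMutableComposition()
    var cursor = CMTime.zero
    for url in [firstURL, secondURL] {
        let asset = AVURLAsset(url: url)
        let duration = try await asset.load(.duration)
        // Inserting the whole asset keeps its audio and video tracks in sync with each other.
        try composition.insertTimeRange(CMTimeRange(start: .zero, duration: duration), of: asset, at: cursor)
        cursor = CMTimeAdd(cursor, duration)
    }
    guard let exporter = AVAssetExportSession(asset: composition,
                                              presetName: AVAssetExportPresetPassthrough) else { return }
    exporter.outputURL = outputURL
    exporter.outputFileType = .mov
    await exporter.export()
}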
When changing a camera's exposure, AVFoundation provides a callback which offers the timestamp of the first frame captured with the new exposure duration: AVCaptureDevice.setExposureModeCustom(duration:, iso:, completionHandler:).
I want to get a similar callback when changing frame duration.
After setting AVCaptureDevice.activeVideoMinFrameDuration or AVCaptureDevice.activeVideoMaxFrameDuration to a new value, how can I compute the index or the timestamp of the first camera frame that was captured using the newly set frame duration?
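Lacking a dedicated callback, one approach is to watch the output itself and compare the spacing of successive presentation timestamps against the newly set frame duration (a sketch; the tolerance value is an assumption):

import AVFoundation

// Sketch: detect the first frame delivered at the new frame duration from a video data output.
final class FrameDurationObserver: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    var targetDuration = CMTime(value: 1, timescale: 30)   // the newly set frame duration
    private var lastPTS: CMTime?

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
        defer { lastPTS = pts }
        guard let last = lastPTS else { return }
        let delta = CMTimeSubtract(pts, last)
        // First frame whose spacing matches the requested duration (small tolerance for jitter).
        if abs(CMTimeGetSeconds(delta) - CMTimeGetSeconds(targetDuration)) < 0.001 {
            print("New frame duration in effect at PTS \(CMTimeGetSeconds(pts))")
        }
    }
}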
Hello,
Does anyone have a recipe on how to raycast VNFaceLandmarkRegion2D points obtained from a frame's capturedImage?
More specifically, how to construct the "from" parameter of the frame's raycastQuery from a VNFaceLandmarkRegion2D point?
Do the points need to be flipped vertically? Is there any other transformation that needs to be performed on the points prior to passing them to raycastQuery?