Discuss using the camera on Apple devices.

Posts under Camera tag

127 Posts
Post not yet marked as solved
0 Replies
611 Views
Where can I find it? What's the content of this dictionary? Whenever I open the UIImagePickerController with source type .camera I see an error: Failed to read exposureBiasesByMode dictionary. The camera seems to work fine, though.

2021-08-16 06:51:59.608807+0200 myProject[19537:4428708] [Camera] Failed to read exposureBiasesByMode dictionary: Error Domain=NSCocoaErrorDomain Code=4864 "*** -[NSKeyedUnarchiver _initForReadingFromData:error:throwLegacyExceptions:]: data is NULL" UserInfo={NSDebugDescription=*** -[NSKeyedUnarchiver _initForReadingFromData:error:throwLegacyExceptions:]: data is NULL}

Thank you for any advice.
Posted by
Post not yet marked as solved
0 Replies
334 Views
Using the following code, RAW/DNG images lose their location data when exported to a different device or computer.

[[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
    PHAssetResourceCreationOptions *options = [[PHAssetResourceCreationOptions alloc] init];
    options.shouldMoveFile = YES;

    PHAssetCreationRequest *creationRequest = [PHAssetCreationRequest creationRequestForAsset];
    creationRequest.location = locManager.location;
    if (rawEmbedsJPEGOnOff == 0) {
        [creationRequest addResourceWithType:PHAssetResourceTypePhoto data:photoData options:nil];
        [creationRequest addResourceWithType:PHAssetResourceTypeAlternatePhoto fileURL:temporaryFormatFileURL options:options]; // Add move (not copy) option
    }
    else if (rawEmbedsJPEGOnOff == 1) {
        [creationRequest addResourceWithType:PHAssetResourceTypePhoto fileURL:temporaryFormatFileURL options:options];
    }
} completionHandler:^(BOOL success, NSError *error) {
    if (!success) {
        NSLog(@"Error occurred while saving raw photo to photo library: %@", error);
    }
    else {
        NSLog(@"Raw photo was saved to photo library");
    }
}];
Posted by
Post not yet marked as solved
0 Replies
233 Views
When I run my camera app and use Instruments to check memory, the Leaks instrument reports a memory leak during the initialization of AVCapturePhotoOutput. The device is an iPhone 6s running iOS 10.
Posted by
Post not yet marked as solved
0 Replies
335 Views
Hello. App Review told us: "We noticed that your app requests the user's consent to access the camera, but doesn't sufficiently explain the use of the camera in the purpose string. The current modal alert just says "test" for the camera." We use Expo, and Transporter to upload the application. The problem we are having is that TestFlight keeps showing the placeholder 'test' camera permission message. We don't know how we can submit the application directly to Apple so it won't go through TestFlight. Apple is telling us to go through EU but not TestFlight. How? Thank you in advance.
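For reference, the purpose string App Review is referring to lives under the NSCameraUsageDescription key in the app's Info.plist; the rejection is about the placeholder text "test", not the delivery mechanism. A sketch of a descriptive entry (the wording is illustrative, not prescribed):

```xml
<key>NSCameraUsageDescription</key>
<string>This app uses the camera so you can capture and attach photos to your posts.</string>
```

With Expo, the same value can normally be supplied through the ios.infoPlist section of app.json so it survives the build, rather than editing the generated Info.plist by hand.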
Posted by
Post not yet marked as solved
1 Reply
408 Views
Users are reporting unrecoverable app crashes when using the AVCaptureDevice camera in our app. At minimum we want to fail gracefully. How can we capture this error? Is there an update or a different AVCaptureDevice camera that would prevent the problem?

CameraUI -[CAMPriorityNotificationCenter _removeObserver:fromObserversByName:]
Fatal Exception: NSGenericException
*** Collection <__NSArrayM: 0x2838a3e10> was mutated while being enumerated.

This rolls all the way back up to main.m, so we cannot handle the error in the thread where it occurs.

Fatal Exception: NSGenericException
0  CoreFoundation             0x18a967754 __exceptionPreprocess
1  libobjc.A.dylib            0x19f42e7a8 objc_exception_throw
2  CoreFoundation             0x18a967058 -[__NSSingleObjectEnumerator initWithObject:collection:]
3  CameraUI                   0x1b4bcb1b0 -[CAMPriorityNotificationCenter _removeObserver:fromObserversByName:]
4  CameraUI                   0x1b4bcb4f4 -[CAMPriorityNotificationCenter removeObserver:]
5  CameraUI                   0x1b4b8b474 -[CAMViewfinderViewController dealloc]
6  CameraUI                   0x1b4cdd030 -[CAMCameraViewController .cxx_destruct]
7  libobjc.A.dylib            0x19f40bcd8 object_cxxDestructFromClass(objc_object*, objc_class*)
8  libobjc.A.dylib            0x19f423148 objc_destructInstance
9  libobjc.A.dylib            0x19f42a5c4 _objc_rootDealloc
10 UIKitCore                  0x18d38b96c -[UIResponder dealloc]
11 UIKitCore                  0x18cc28f84 -[UIViewController dealloc]
12 CameraUI                   0x1b4cd4114 -[CAMCameraViewController dealloc]
13 libobjc.A.dylib            0x19f42d57c AutoreleasePoolPage::releaseUntil(objc_object**)
14 libobjc.A.dylib            0x19f42d41c objc_autoreleasePoolPop
15 CoreFoundation             0x18a973f3c _CFAutoreleasePoolPop
16 CoreFoundation             0x18a8e061c __CFRunLoopPerCalloutARPEnd
17 CoreFoundation             0x18a8db2b4 __CFRunLoopRun
18 CoreFoundation             0x18a8da360 CFRunLoopRunSpecific
19 GraphicsServices           0x1a1f18734 GSEventRunModal
20 UIKitCore                  0x18d355584 -[UIApplication _run]
21 UIKitCore                  0x18d35adf4 UIApplicationMain
22 [OUR APP DISPLAY NAME]     0x104314454 main + 18 (main.m:18)
23 libdyld.dylib              0x18a596cf8 start
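Since the exception is thrown while an autorelease pool drains on the main run loop, there is likely no place in app code to catch it. What an app can do is install an uncaught-exception handler to record the failure before the process dies; a minimal sketch (the log file path is an assumption, not a required location):

```swift
import Foundation

// Install early, e.g. in application(_:didFinishLaunchingWithOptions:).
// This cannot prevent the crash; it only runs before the process is
// terminated, giving a chance to record the exception for next launch.
NSSetUncaughtExceptionHandler { exception in
    let report = """
    Uncaught exception: \(exception.name.rawValue)
    Reason: \(exception.reason ?? "unknown")
    Stack:
    \(exception.callStackSymbols.joined(separator: "\n"))
    """
    // Hypothetical destination; substitute your own crash-log path.
    try? report.write(toFile: NSTemporaryDirectory() + "last_crash.txt",
                      atomically: true, encoding: .utf8)
}
```

On the next launch the app can check for that file and surface or upload the report; actually preventing the crash would require Apple to fix the CameraUI teardown.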
Posted by
Post not yet marked as solved
0 Replies
479 Views
Apple presented how to create audio drivers with DriverKit at WWDC 2021. Video presentation: https://developer.apple.com/videos/play/wwdc2021/10190 Code sample: https://developer.apple.com/documentation/audiodriverkit/creating_an_audio_device_driver We need a similar approach for cameras. The audio driver mentioned above can be compiled using the new Xcode 13 beta, so that effort is in progress. We need to develop a custom driver for a camera. Is there a solution in DriverKit for cameras? Is one planned? Should we develop a driver from scratch using USBDriverKit? Any suggestions are appreciated.
Posted by
Post not yet marked as solved
0 Replies
476 Views
The problem is, I have a video file which is about 111 MB with resolution 1216x2160, and I can't save it on my iPhone even though I have plenty of space 😭 I tried to send it via AirDrop and it shows a pop-up with an error and asks me if I want to save it in my documents (I tried this as well, and there's no way to save it to my gallery from the app). I tried to send the file via Telegram and also got the same error. What should I do? I can't believe that I can shoot in 4K but can't save a video with a higher resolution on my iPhone.
Posted by
Post marked as solved
28 Replies
6.5k Views
Hey guys, facing the issue that scanned documents on my iPhone 12 Pro Max with the Files app are pretty bad quality. I guess it started with iOS 15 beta 3. Unfortunately, the issue still persists with the current non-beta iOS 15 release. It's the same on iPadOS 15. When I launch 'Scan with iPhone' using the Preview app on macOS, quality is good as always. Hence it looks like the issue is related to the Files app or PDF processing on iPhone. Has anybody else seen the same? Thanks and cheers, Flory
Posted by
Post not yet marked as solved
0 Replies
259 Views
Is there a simpler way to determine whether a device's telephoto camera has a 2.0x, 2.5x or the new 3.0x lens, rather than having to check that the device is an iPhone 12 Pro Max or a 13 Pro? Many thanks!
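One possible approach (a sketch, not a documented mapping): on the triple-camera virtual device, virtualDeviceSwitchOverVideoZoomFactors reports the zoom factors at which the virtual camera switches constituent lenses, and the last of those corresponds to the wide-to-telephoto transition:

```swift
import AVFoundation

// Returns the zoom factor (relative to the ultra-wide at 1.0) at which
// the triple camera hands off to the telephoto lens, or nil when no
// triple camera is present. Dividing by the first switch-over factor
// would express it relative to the wide (1x) camera instead.
func telephotoSwitchOverFactor() -> CGFloat? {
    guard let device = AVCaptureDevice.default(.builtInTripleCamera,
                                               for: .video,
                                               position: .back) else { return nil }
    return device.virtualDeviceSwitchOverVideoZoomFactors.last
        .map { CGFloat(truncating: $0) }
}
```

This avoids hardcoding device models, though it only works on devices that expose the triple-camera virtual device.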
Posted by
Post not yet marked as solved
0 Replies
380 Views
(original question on Stack Overflow) Safari requires that a user gesture occur before any audio plays. However, the user's response to getUserMedia does not appear to count as a user gesture. Or perhaps I have that wrong; maybe there is some way to trigger playback? This question ("Why can't JavaScript .play() audio files on iPhone safari?") details the many attempts to work around the need for a user gesture, but it seems like Apple has closed most of the loopholes. For whatever reason, Safari does not consider the iOS acceptance of the camera/mic usage dialog a user gesture, and there's no way to make camera capture count as one. Is there something I'm missing? Is it impossible to play an audio file after capturing the camera, or is there some way to respond to the camera being captured by playing an audio file?
Posted by
Post not yet marked as solved
0 Replies
229 Views
We apologize for asking this question via Google Translate. I'm currently developing a photography app with Xcode. What code should I write to add the ability to automatically release the shutter after a set time? If anyone knows, could you please tell me how?
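A minimal sketch of a delayed shutter release, assuming the app already has a running AVCaptureSession with an AVCapturePhotoOutput attached and an object conforming to AVCapturePhotoCaptureDelegate (both names below are placeholders for the app's own objects):

```swift
import AVFoundation

// Fires the shutter after `seconds`. The dispatch to the main queue is
// just one simple way to delay; a Timer works equally well and can be
// invalidated if the user cancels.
func capturePhoto(afterDelay seconds: TimeInterval,
                  using photoOutput: AVCapturePhotoOutput,
                  delegate: AVCapturePhotoCaptureDelegate) {
    DispatchQueue.main.asyncAfter(deadline: .now() + seconds) {
        photoOutput.capturePhoto(with: AVCapturePhotoSettings(),
                                 delegate: delegate)
    }
}
```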
Posted by
Post not yet marked as solved
0 Replies
218 Views
Is there a programmatic means of getting an iPhone camera lens' magnification level? I'm referring to the .5x, 1x and 2.x/2.5x/3.x values for the ultra wide, wide and telephoto lenses available on iPhones of the last few years. I tried accessing the AVCaptureDevice videoZoomFactor property, but that returns the same value for both of my iPhone's cameras.
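videoZoomFactor is per-device (each camera starts at 1.0), which is why it reads the same on every lens. A sketch of one way to recover the familiar 0.5x/1x/2.5x-style labels, assuming a triple-camera device; the normalization step is my own inference, not a documented API contract:

```swift
import AVFoundation

// The triple-camera virtual device reports the zoom factors at which it
// switches constituent cameras, relative to the ultra-wide at 1.0
// (e.g. [2, 6]). Prepending 1.0 and dividing through by the wide
// camera's switch-over factor yields the marketing-style values
// (e.g. [0.5, 1.0, 3.0]).
func displayZoomFactors() -> [CGFloat] {
    guard let triple = AVCaptureDevice.default(.builtInTripleCamera,
                                               for: .video,
                                               position: .back) else { return [] }
    let switchOvers = triple.virtualDeviceSwitchOverVideoZoomFactors
        .map { CGFloat(truncating: $0) }
    let factors = [1.0] + switchOvers          // ultra-wide starts at 1.0
    guard let wide = switchOvers.first else { return factors }
    return factors.map { $0 / wide }           // normalize so wide = 1x
}
```

On dual-camera devices the same idea applies with .builtInDualWideCamera or .builtInDualCamera, with correspondingly fewer entries.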
Posted by
Post not yet marked as solved
0 Replies
321 Views
Hello! I have recently been making an app for iOS which takes a picture and does some image recognition on it. The problem is that I need some technical details of the camera, specifically the focal length and the height of the sensor, in order to achieve this. I have previously got these when I capture the photo. I would take the photo using AVCapturePhotoOutput and extract the EXIF tags FocalLength and FocalLenIn35mmFilm from the metadata. I would then do some calculations to find the sensor height. The problem is that I have since moved to using a frame by frame capture using AVCaptureVideoDataOutput, which still means I can get the metadata from the CMSampleBuffer, but now the FocalLenIn35mmFilm has a value that is over two times what it should be. So, in short, I have been trying to find a way to retrieve the focal length, and height of the sensor, of the iPhone's camera. Any help would be very appreciated! Thanks in advance!
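For what it's worth, the EXIF attachment can still be read off each frame from AVCaptureVideoDataOutput; whether FocalLenIn35mmFilm is trustworthy there is exactly what the post questions, so treat the returned value with care. A sketch:

```swift
import AVFoundation
import ImageIO

// Pulls the EXIF dictionary attached to a video sample buffer and
// extracts the physical and 35mm-equivalent focal lengths, when present.
func focalLengths(from sampleBuffer: CMSampleBuffer) -> (mm: Double?, equiv35mm: Double?) {
    guard let exif = CMGetAttachment(sampleBuffer,
                                     key: kCGImagePropertyExifDictionary,
                                     attachmentModeOut: nil) as? [String: Any] else {
        return (nil, nil)
    }
    let focal = exif[kCGImagePropertyExifFocalLength as String] as? Double
    let equiv = exif[kCGImagePropertyExifFocalLenIn35mmFilm as String] as? Double
    return (focal, equiv)
}
```

One thing to check before relying on the 35mm value: the video pipeline may crop the sensor (e.g. for stabilization or a 16:9 field of view), which changes the effective 35mm-equivalent focal length relative to a still capture and could explain the roughly 2x discrepancy.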
Posted by
Post not yet marked as solved
0 Replies
363 Views
So I have an app that mostly works with local files, but if a user navigates to an area for photo taking, the app crashes. I don't really need camera access for most of my app; that's why I never bothered to add the request, because I didn't feel it was needed.
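If any screen touches the camera, two things are needed regardless of how rarely it's used: an NSCameraUsageDescription entry in Info.plist (iOS terminates the app on camera access without it), and an authorization check before starting capture. A sketch of the latter:

```swift
import AVFoundation

// Guards camera-dependent code behind an authorization check so a
// denied or not-yet-determined state falls back instead of crashing.
func withCameraAccess(_ body: @escaping () -> Void) {
    switch AVCaptureDevice.authorizationStatus(for: .video) {
    case .authorized:
        body()
    case .notDetermined:
        AVCaptureDevice.requestAccess(for: .video) { granted in
            if granted { DispatchQueue.main.async(execute: body) }
        }
    default:
        break // denied or restricted: show a fallback UI instead
    }
}
```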
Posted by
Post not yet marked as solved
0 Replies
183 Views
I am developing a camera app using Xcode 12.5 and SwiftUI (for iPhone 8). The app captures a photo with the front camera and saves the photos to the Camera Roll. I'd like to add functions to realize (1) delayed shooting with (2) a display of the remaining time. Could you tell me the code for (1) and (2) above? Thank you for your kind support in advance.
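A sketch of both pieces in SwiftUI: a one-second Timer drives the countdown display, and the capture fires when it reaches zero. takePhoto is a placeholder for the app's own capture call (e.g. a wrapper around AVCapturePhotoOutput.capturePhoto):

```swift
import SwiftUI

// (1) delayed shooting and (2) remaining-time display.
struct TimerShutterView: View {
    @State private var remaining = 0
    let takePhoto: () -> Void   // supplied by the app's camera layer

    var body: some View {
        VStack {
            if remaining > 0 {
                Text("\(remaining)").font(.largeTitle)
            }
            Button("Shoot in 5 s") { start(seconds: 5) }
        }
    }

    private func start(seconds: Int) {
        remaining = seconds
        Timer.scheduledTimer(withTimeInterval: 1, repeats: true) { timer in
            remaining -= 1          // @State updates refresh the Text
            if remaining <= 0 {
                timer.invalidate()
                takePhoto()
            }
        }
    }
}
```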
Posted by
Post not yet marked as solved
1 Reply
379 Views
Hi everyone, I'm making a broadcast app. In this app I have a UIView and 3 buttons: 1 button for the ultra wide camera, 1 button for the wide camera, and 1 button for the telephoto lens. How can I display the camera view in the UIView when I press one of the buttons? Thanks, Robby Flockman
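One way to structure this (a sketch; class and method names are illustrative): keep a single AVCaptureSession with a preview layer in the UIView, and have each button action swap the session's camera input:

```swift
import UIKit
import AVFoundation

final class CameraSwitcher {
    let session = AVCaptureSession()
    private var currentInput: AVCaptureDeviceInput?

    // Adds a preview layer filling the given view and starts the session.
    // (In production, startRunning() belongs off the main thread.)
    func attachPreview(to view: UIView) {
        let layer = AVCaptureVideoPreviewLayer(session: session)
        layer.frame = view.bounds
        layer.videoGravity = .resizeAspectFill
        view.layer.addSublayer(layer)
        session.startRunning()
    }

    // Call with .builtInUltraWideCamera, .builtInWideAngleCamera, or
    // .builtInTelephotoCamera from the three button actions.
    func switchTo(_ type: AVCaptureDevice.DeviceType) {
        guard let device = AVCaptureDevice.default(type, for: .video, position: .back),
              let input = try? AVCaptureDeviceInput(device: device) else { return }
        session.beginConfiguration()
        if let old = currentInput { session.removeInput(old) }
        if session.canAddInput(input) {
            session.addInput(input)
            currentInput = input
        }
        session.commitConfiguration()
    }
}
```

Because the input swap happens inside beginConfiguration/commitConfiguration, the preview layer keeps displaying and simply cuts over to the new lens.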
Posted by
Post not yet marked as solved
2 Replies
414 Views
I ran into a strange problem in a camera app using AVFoundation. I use the following code:

captureDevice = AVCaptureDevice.default(AVCaptureDevice.DeviceType.builtInUltraWideCamera, for: AVMediaType.video, position: .back)

then:

let isAutoFocusSupported = captureDevice.isFocusModeSupported(.autoFocus)

"isAutoFocusSupported" should be "true". For the iPhone 13 Pro, it is "true". But for the 13 / 13 mini, it is "false". Why?
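A likely explanation (my inference, not from the post): on some models the ultra-wide module is fixed-focus hardware, so isFocusModeSupported(.autoFocus) correctly returns false there. Rather than assuming autofocus exists, code can feature-check and degrade; a sketch:

```swift
import AVFoundation

// Picks the best focus mode the device actually supports; on a
// fixed-focus module neither branch applies and the mode is left alone.
func applyBestFocusMode(to device: AVCaptureDevice) throws {
    try device.lockForConfiguration()
    defer { device.unlockForConfiguration() }
    if device.isFocusModeSupported(.continuousAutoFocus) {
        device.focusMode = .continuousAutoFocus
    } else if device.isFocusModeSupported(.autoFocus) {
        device.focusMode = .autoFocus
    }
}
```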
Post not yet marked as solved
0 Replies
370 Views
I want to develop an app that uses the camera to scan documents into PDF, with some processing on the document to make it as clear as possible. Given that using the camera requires permission from the user, of course, is there any special arrangement or agreement with Apple required, or can I just develop the application directly using the built-in frameworks?
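No special agreement is needed beyond the normal camera permission (an NSCameraUsageDescription entry in Info.plist). VisionKit even ships the same document scanner the system uses, which handles capture, edge detection, and perspective correction; a sketch of feeding its output into a PDF via PDFKit:

```swift
import UIKit
import VisionKit
import PDFKit

final class ScanCoordinator: NSObject, VNDocumentCameraViewControllerDelegate {
    // Presents the system document scanner from an existing controller.
    func present(from presenter: UIViewController) {
        let scanner = VNDocumentCameraViewController()
        scanner.delegate = self
        presenter.present(scanner, animated: true)
    }

    // Assembles the scanned pages into a PDFDocument.
    func documentCameraViewController(_ controller: VNDocumentCameraViewController,
                                      didFinishWith scan: VNDocumentCameraScan) {
        let pdf = PDFDocument()
        for i in 0..<scan.pageCount {
            if let page = PDFPage(image: scan.image(ofPage: i)) {
                pdf.insert(page, at: pdf.pageCount)
            }
        }
        // pdf.dataRepresentation() can then be written wherever needed.
        controller.dismiss(animated: true)
    }
}
```

Custom image cleanup beyond what VisionKit does can be layered on top with Core Image or the Vision framework before building the PDF pages.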
Posted by