Discuss using the camera on Apple devices.

Posts under Camera tag

171 Posts

Post · Replies · Boosts · Views · Activity

iOS flashlight specification
We are developing an iOS application for disaster prevention in Japan, where disasters now occur in every season. We supplied it to local governments in Japan; the general public is using it and it works well. One local government asked us to also support deaf people when a disaster occurs. The local government already has a disaster-prevention broadcast, but it uses loudspeakers, so when it rains heavily the general public cannot hear it. Our application takes the disaster-prevention broadcast and forwards it to iOS smartphones, which is very helpful to the general public.

We are now making a new version that shows not only the broadcast text but also flashes the iPhone's flashlight. After building it, though, the flashlight lights only while our application is open, as in the link below. A deaf user may not notice that; our application already has a vibration function, but if a deaf user keeps the smartphone in a bag, I think he would never notice a heavy-rain or tsunami warning. Here is our application: https://apps.apple.com/jp/app/cosmocast/id1247774270?mt=8 Here is the rule for the flashlight: https://stackoverflow.com/questions/32136442/iphone-flashlight-not-working-while-app-is-in-background/32137434 We want to help deaf people and also senior citizens during heavy rain and tsunami with our application, and we'd like to turn on the flashlight at the same time a push notification arrives (see the torch sketch just below this post). Does anyone know a good way? Thank you and best regards, Tomita.
1 reply · 0 boosts · 699 views · Oct ’23
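For reference, a minimal sketch of the standard torch API; the helper name is mine. As the StackOverflow link in the post explains, the torch can only be driven while the app is running in the foreground, so this alone will not light the flash for a push notification received in the background.

import AVFoundation

// Hypothetical helper: turns the torch on or off while the app is active in the foreground.
func setTorch(_ on: Bool) {
    guard let device = AVCaptureDevice.default(for: .video), device.hasTorch else { return }
    do {
        try device.lockForConfiguration()
        device.torchMode = on ? .on : .off
        device.unlockForConfiguration()
    } catch {
        print("Torch configuration failed: \(error)")
    }
}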
iPad Pro External Camera
Hi, I was trying to set people up with the iPad as their primary system. They got the Apple keyboard, a mouse, and a 36-inch external monitor. However, an external camera connected via USB or Bluetooth cannot be used for FaceTime, Webex, Zoom, Teams, etc.; these apps always default to the built-in Apple camera. Is there a way to get apps to recognize an external camera as the primary camera on an iPad? Thanks
1 reply · 1 boost · 1.2k views · Sep ’23
Universal Links: Camera does not pass URL to the .onOpenURL(perform:) method (SwiftUI, Xcode)
Hello, I set up Universal Links between the website and the application. Everything was done according to the official documentation and works fine: when the site is opened, Safari offers to launch the application, and there is also a banner above the page. In SwiftUI I listen for the URL via .onOpenURL(perform:), and everything works. But if the link is encoded in a QR code and scanned with the Camera app, a banner appears prompting you to go to the application; the app opens, but the URL is not passed to .onOpenURL(perform:). What could be the problem? (One thing to try is sketched below this post.)
3 replies · 4 boosts · 3.8k views · Aug ’23
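One thing worth trying, sketched here under the assumption (mine, not a confirmed diagnosis) that the QR-code launch delivers the link as a browsing NSUserActivity rather than a plain URL open: listen for that activity alongside .onOpenURL and read its webpageURL. The app name, view, and handle(_:) helper are placeholders.

import Foundation
import SwiftUI

@main
struct LinksApp: App {   // placeholder app name
    var body: some Scene {
        WindowGroup {
            ContentView()
                .onOpenURL { url in
                    handle(url)                        // direct URL opens
                }
                .onContinueUserActivity(NSUserActivityTypeBrowsingWeb) { activity in
                    // Universal Links may arrive as a web-browsing user activity.
                    if let url = activity.webpageURL {
                        handle(url)
                    }
                }
        }
    }
}

struct ContentView: View {
    var body: some View { Text("Waiting for a link") }
}

func handle(_ url: URL) {
    print("Incoming link: \(url)")
}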
iPhone 13 / 13 mini builtInUltraWideCamera trouble
I ran into a strange problem. In a camera app using AVFoundation, I use the following code:
captureDevice = AVCaptureDevice.default(AVCaptureDevice.DeviceType.builtInUltraWideCamera, for: AVMediaType.video, position: .back)
then:
let isAutoFocusSupported = captureDevice.isFocusModeSupported(.autoFocus)
I expect isAutoFocusSupported to be true. On iPhone 13 Pro it is true, but on iPhone 13 / 13 mini it is false. Why? (A guarded-configuration sketch follows below.)
4 replies · 0 boosts · 1.6k views · Sep ’23
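A minimal sketch of guarding the focus configuration on whatever the device reports, since not every camera module supports autofocus; whether the 13 / 13 mini ultra-wide is fixed-focus is something Apple would need to confirm, and the helper name is mine.

import AVFoundation

func configureUltraWideFocus() {
    guard let device = AVCaptureDevice.default(.builtInUltraWideCamera,
                                               for: .video,
                                               position: .back) else { return }
    do {
        try device.lockForConfiguration()
        if device.isFocusModeSupported(.autoFocus) {
            device.focusMode = .autoFocus
        }
        // Otherwise leave the focus mode alone and rely on the module's default (fixed) focus.
        device.unlockForConfiguration()
    } catch {
        print("Could not lock device for configuration: \(error)")
    }
}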
TrueDepth camera on iPhone 13 products - lower accuracy of depth data?
Hi, using the Apple demo app for TrueDepth images on different devices, I just found that there are significant differences in the quality of the data provided. Data captured on iPhones before the iPhone 13 lineup gives quite smooth surfaces, as you may know from one of the many 3D scanner apps that display data from the front-facing TrueDepth camera. Data captured on, e.g., an iPhone 13 Pro now shows some kind of wavy overlaid structure and has, visually, much lower accuracy compared to older iPhones. iPhone 12 Pro: data as point cloud, object about 25 cm from the phone. iPhone 13 Pro: data as point cloud, same setup. I tried it on different iPhone 13 devices with the same result, all running the latest iOS, and the images were captured with the same code. Capturing with some of the standard 3D scanner apps for the TrueDepth camera produces similarly lower-quality images or point clouds. Is this due to changed hardware (a smaller TrueDepth module) in the new iPhone release, or a software issue within iOS, e.g. in the driver part of the TrueDepth camera? Are there any improvements or solutions already announced by Apple? Best, Holger
10 replies · 2 boosts · 6.4k views · Oct ’23
AVCaptureVideoDataOutput video-range values exceed the range
CVPixelBuffer.h defines:
kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange = '420v', /* Bi-Planar Component Y'CbCr 8-bit 4:2:0, video-range (luma=[16,235] chroma=[16,240]). baseAddr points to a big-endian CVPlanarPixelBufferInfo_YCbCrBiPlanar struct */
kCVPixelFormatType_420YpCbCr10BiPlanarVideoRange = 'x420', /* 2 plane YCbCr10 4:2:0, each 10 bits in the MSBs of 16bits, video-range (luma=[64,940] chroma=[64,960]) */
But when I set the camera output to the formats above, I find that the output pixel buffer's values exceed those ranges: I see [0, 255] for 420YpCbCr8BiPlanarVideoRange and [0, 1023] for 420YpCbCr10BiPlanarVideoRange. Is this a bug, or is something wrong with my output? If not, how can I choose the correct matrix to convert the YUV data to RGB? (A sketch of reading the buffer's color attachments is below.)
1 reply · 0 boosts · 1.1k views · Feb ’24
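A minimal sketch (helper name mine) of inspecting a frame's own color attachments rather than trusting the nominal range; the attachments report which YCbCr matrix and transfer function the capture pipeline actually tagged the buffers with, which is what a YUV-to-RGB conversion should use.

import AVFoundation
import CoreVideo

func logColorAttachments(of sampleBuffer: CMSampleBuffer) {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

    // YCbCr matrix the producer tagged the buffer with (e.g. ITU-R 601-4, 709-2, 2020).
    let matrix = CVBufferGetAttachment(pixelBuffer, kCVImageBufferYCbCrMatrixKey, nil)?
        .takeUnretainedValue()
    // Transfer function, also relevant to a full conversion pipeline.
    let transfer = CVBufferGetAttachment(pixelBuffer, kCVImageBufferTransferFunctionKey, nil)?
        .takeUnretainedValue()
    let format = CVPixelBufferGetPixelFormatType(pixelBuffer)

    print("format: \(format), matrix: \(String(describing: matrix)), transfer: \(String(describing: transfer))")
}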
iOS 14+ GPS EXIF data from CAMERA
I am building an iOS app that uses the phone's GPS EXIF data from both the camera and the image library. My problem is that while I am able to get GPS data from images in the phone's library, I have not been able to get any GPS data when using the camera within my app.

I first built this app about a year ago, and at that time I was able to get GPS data from both the library and the camera from within the app. I believe I was still building for iOS 12 at that point. I believe the new security features that came with iOS 13 or 14 now disallow my app's access to the GPS data when using the camera; the issue is new as of iOS 13 or 14, and the code I had was working fine with earlier versions of iOS.

I have no issues getting GPS from the EXIF of device-library images, and images taken with the native iOS camera app are saved to the library with full GPS data. However, I am not able to get GPS data directly from the camera image's EXIF when using the camera from within my app; when I save an image taken by the camera from within my app, it is saved to the library with no GPS data. I am able, at any time, to ask the device for the current GPS coordinates. As far as I can tell, device settings are all correct and location services are available at all times. My feeling is that iOS is stripping the GPS data from the image EXIF before handing the image data to my app.

I have searched the Apple developer forums, Apple documentation, Stack Exchange, and so on for several weeks now, and I seem no closer to knowing whether the camera API even returns this data, or whether I will need to talk to location services and add the GPS data myself (which is what I am working on now, as I have about given up on getting it from the camera).

Info.plist keys I am currently setting: LSRequiresIPhoneOS, NSCameraUsageDescription, NSLocationAlwaysUsageDescription, NSLocationWhenInUseUsageDescription, NSMicrophoneUsageDescription, NSPhotoLibraryUsageDescription, NSPhotoLibraryAddUsageDescription. Am I missing a required plist key? I have been searching for the name of a key I might be missing but have found nothing other than people trying to hack some post-capture device-location merging. This has been very frustrating; any insight is appreciated.

Is it currently possible to get GPS data directly from the camera's EXIF output any more? Do I need to ask the device for the current GPS values and insert the GPS data into the image EXIF on my own? Is there any example code for getting GPS data from the camera? Is there any example code for inserting GPS data into the EXIF before saving the file to the device? (A sketch of the insert-it-yourself approach follows this post.)

Sample Swift code which processes the camera image:
func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey : Any]) {
    let pickedImage = info[UIImagePickerController.InfoKey.originalImage] as? UIImage
    // let pickedImage = info[UIImagePickerController.InfoKey.editedImage] as? UIImage
    // Using editedImage vs. originalImage has no effect on the availability of the GPS data.
    userImage.image = pickedImage
    picker.dismiss(animated: true, completion: nil)
}
4 replies · 2 boosts · 1.9k views · Dec ’23
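If it does turn out that the GPS data has to be added manually, here is a minimal sketch of one common approach (my assumption, not a confirmed answer about the iOS 13/14 behaviour): take a current CLLocation, build a GPS dictionary, and write it into the JPEG with Image I/O before saving. The function name is a placeholder.

import CoreLocation
import ImageIO
import UniformTypeIdentifiers
import UIKit

// Returns JPEG data for `image` with a GPS EXIF dictionary built from `location`.
func jpegData(from image: UIImage, taggedWith location: CLLocation) -> Data? {
    guard let source = image.jpegData(compressionQuality: 0.9),
          let imageSource = CGImageSourceCreateWithData(source as CFData, nil),
          let destinationData = CFDataCreateMutable(nil, 0),
          let destination = CGImageDestinationCreateWithData(destinationData,
                                                             UTType.jpeg.identifier as CFString,
                                                             1, nil) else { return nil }

    let gps: [String: Any] = [
        kCGImagePropertyGPSLatitude as String: abs(location.coordinate.latitude),
        kCGImagePropertyGPSLatitudeRef as String: location.coordinate.latitude >= 0 ? "N" : "S",
        kCGImagePropertyGPSLongitude as String: abs(location.coordinate.longitude),
        kCGImagePropertyGPSLongitudeRef as String: location.coordinate.longitude >= 0 ? "E" : "W",
        kCGImagePropertyGPSAltitude as String: location.altitude
    ]
    let properties: [String: Any] = [kCGImagePropertyGPSDictionary as String: gps]

    // Copy the image into the destination while merging in the GPS metadata.
    CGImageDestinationAddImageFromSource(destination, imageSource, 0, properties as CFDictionary)
    guard CGImageDestinationFinalize(destination) else { return nil }
    return destinationData as Data
}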
Get distance from uvd and intrinsic matrix?
Hello! I am having trouble calculating accurate real-world distances using the camera's returned intrinsic matrix and pixel coordinates/depths captured from the iPhone's LiDAR. For example, in the image below, I set a mug 0.5 m from the phone; the mug is 8.5 cm wide. The intrinsic matrix returned from the phone's AVCameraCalibrationData class has focalx = 1464.9269, focaly = 1464.9269, cx = 960.94916, and cy = 686.3547. Selecting the two pixel locations denoted in the image below, I calculated each one's xyz coordinates using the formulas:
x = d * (u - cx) / focalx
y = d * (v - cy) / focaly
z = d
where I get the depth from the corresponding pixel in the depth map; I've verified that both depths were 0.5 m. I then calculate the distance between the two points to get the mug width. This gives me a calculated width of 0.0357, or 3.5 cm, instead of the 8.5 cm I was expecting. What could account for this discrepancy? (One common pitfall is sketched below this post.) Thank you so much for your help!
5 replies · 0 boosts · 1.8k views · Jun ’24
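One frequent source of this kind of scale error (an assumption on my part, not a diagnosis of this exact case) is that the intrinsic matrix is expressed for AVCameraCalibrationData's intrinsicMatrixReferenceDimensions, while the pixel coordinates come from the lower-resolution depth map, so fx, fy, cx, and cy need to be rescaled before unprojecting. A minimal sketch, with a hypothetical unproject helper and made-up pixel coordinates in the usage comment:

import AVFoundation
import simd

// Unprojects a pixel (u, v) with depth d (metres) into camera space,
// rescaling the intrinsics from their reference dimensions to the depth map size.
func unproject(u: Float, v: Float, depth d: Float,
               calibration: AVCameraCalibrationData,
               depthMapWidth: Float, depthMapHeight: Float) -> SIMD3<Float> {
    let K = calibration.intrinsicMatrix                    // column-major 3x3
    let ref = calibration.intrinsicMatrixReferenceDimensions

    let sx = depthMapWidth / Float(ref.width)
    let sy = depthMapHeight / Float(ref.height)

    let fx = K.columns.0.x * sx
    let fy = K.columns.1.y * sy
    let cx = K.columns.2.x * sx
    let cy = K.columns.2.y * sy

    let x = d * (u - cx) / fx
    let y = d * (v - cy) / fy
    return SIMD3<Float>(x, y, d)
}

// Usage (hypothetical pixel coordinates and depth-map size):
// let p1 = unproject(u: 210, v: 240, depth: 0.5, calibration: calib, depthMapWidth: 640, depthMapHeight: 480)
// let p2 = unproject(u: 320, v: 240, depth: 0.5, calibration: calib, depthMapWidth: 640, depthMapHeight: 480)
// let width = simd_distance(p1, p2)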
kIOGPUCommandBufferCallbackErrorSubmissionsIgnored in the Sample Code
The sample code Capturing Depth Using the LiDAR Camera works fine on my phone with iOS 15.6. But after upgrading the phone to iOS 16 and Xcode to 14.0, the sample code can only run for a few minutes on the phone before it starts to show the following error: LiDARDepth[1454:256343] Execution of the command buffer was aborted due to an error during execution. Ignored (for causing prior/excessive GPU errors) (00000004:kIOGPUCommandBufferCallbackErrorSubmissionsIgnored) Did anyone get the same error after upgrading to Xcode 14 or iOS 16? Can an expert take a look at the sample code and point out a possible solution? Thanks!
2 replies · 1 boost · 1k views · Jul ’23
Should an AVPlayer work in a Camera Extension?
My goal is to implement a moving background in a virtual camera, implemented as a Camera Extension, on macOS 13 and later. The moving background is available to the extension as an H.264 file in its bundle. I thought I could create an AVAsset from the movie's URL, make an AVPlayerItem from the asset, attach an AVQueuePlayer to the item, then attach an AVPlayerLooper to the queue player. I make an AVPlayerVideoOutput, add it to each of the looper's items, and set a delegate on the video output. This works in a normal app, which I use as a convenient environment to debug my extension code: in my camera video rendering loop I check self.videoOutput.hasNewPixelBuffer, it returns true at regular intervals, and I can fetch video frames with the video output's copyPixelBuffer and composite those frames with the camera frames.

However, it doesn't work in an extension: hasNewPixelBuffer is never true. The looping player reports 'failed', with an error that simply says "the operation could not be completed". I've tried simplifying things by removing the AVPlayerLooper and using an AVPlayer instead of an AVQueuePlayer, so the movie would only play once through, but I still never get any frames in the extension. Could this be a sandbox thing, because an AVPlayer usually renders to a user interface and camera extensions don't have UIs?

My fallback solution is to use an AVAssetImageGenerator, which I attempt to drive by firing off a Task for each frame; each time I render one, I ask for another frame to keep the pipeline full. Unfortunately the Tasks don't finish in the same order they are started, so I have to build frame-reordering logic into the frame buffer (something which a player would fix for me). I'm also not sure whether the AVAssetImageGenerator takes advantage of any hardware acceleration, and it seems inefficient because each Task is for one frame only and cannot maintain any state from previous frames. Perhaps there's a much simpler way to do this and I'm just missing it? Anyone? (The player pipeline I'm describing is sketched below this post.)
2 replies · 0 boosts · 1.2k views · Aug ’23
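For reference, a minimal sketch of the simplified variant the post describes (a single AVPlayer, no looper), written against AVPlayerItemVideoOutput; the MoviePixelSource class name is a placeholder, and whether this pipeline behaves differently inside a camera extension's sandbox is exactly the open question here.

import AVFoundation

final class MoviePixelSource {
    private let player: AVPlayer
    private let output: AVPlayerItemVideoOutput

    init(movieURL: URL) {
        let item = AVPlayerItem(asset: AVAsset(url: movieURL))
        output = AVPlayerItemVideoOutput(pixelBufferAttributes: [
            kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
        ])
        item.add(output)
        player = AVPlayer(playerItem: item)
        player.play()
    }

    // Call from the camera render loop; returns the movie frame for `hostTime`, if one is ready.
    func copyFrame(forHostTime hostTime: CFTimeInterval) -> CVPixelBuffer? {
        let itemTime = output.itemTime(forHostTime: hostTime)
        guard output.hasNewPixelBuffer(forItemTime: itemTime) else { return nil }
        return output.copyPixelBuffer(forItemTime: itemTime, itemTimeForDisplay: nil)
    }
}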
extrinsicMatrix inside AVCameraCalibrationData
Regarding the extrinsicMatrix attribute of the AVCameraCalibrationData class, the description provided is: "A matrix relating a camera's position and orientation to a world or scene coordinate system." I'm trying to build an app for 3D reconstruction/scanning that only uses the AVFoundation framework (not ARKit). I'm able to extract RGB frames, depth maps, and the camera's intrinsic matrix; however, the extrinsicMatrix is always an identity matrix. The documentation also mentions: "The camera's pose is expressed with respect to a reference camera (camera-to-world view). If the rotation matrix is an identity matrix, then this camera is the reference camera." My questions are: Does the extrinsicMatrix parameter refer to a global coordinate system at all, and if so, which coordinate system is it? Are there settings to configure that would make the extrinsicMatrix change according to camera movement? If the extrinsicMatrix can't be used in this manner, can you recommend another way to estimate camera motion between frames to provide accurate 3D reconstruction? (A sketch of reading the matrix is below this post.) Thanks in advance, and I'd be happy to provide more info if needed. I'm using an iPhone 14 Pro and the .builtInDualWideCamera as the AVCaptureDevice.
1 reply · 1 boost · 772 views · Oct ’23
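For reference, a minimal sketch of splitting extrinsicMatrix into its rotation and translation parts. My reading of the documentation quoted above is that this relates the camera to a reference camera on the same device rather than to a scene coordinate frame that moves with the phone, so it would not be expected to change as the phone moves; treat that as an interpretation, not an official answer. The decompose helper is a placeholder.

import AVFoundation
import simd

// Extracts rotation (3x3) and translation (3-vector) from a calibration extrinsic matrix.
// The matrix has 4 columns and 3 rows: [R | t], expressed relative to the reference camera.
func decompose(_ extrinsics: matrix_float4x3) -> (rotation: matrix_float3x3, translation: SIMD3<Float>) {
    let rotation = matrix_float3x3(columns: (extrinsics.columns.0,
                                             extrinsics.columns.1,
                                             extrinsics.columns.2))
    let translation = extrinsics.columns.3
    return (rotation, translation)
}

// Usage, assuming `calibration` is an AVCameraCalibrationData delivered with a frame:
// let (R, t) = decompose(calibration.extrinsicMatrix)
// print("rotation: \(R), translation: \(t)")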
Unable to stream from DJI Action Cam 3
I was hopeful that I would be able to record video from an external action camera such as the DJI Action Cam 3. The DJI action cam has a webcam mode that is enabled as soon as one plugs it into a device such as a MacBook or an iPad. On the MacBook, the DJI shows up as an external camera for use in FaceTime or QuickTime. But when I plug it into the iPad and use the AVCam sample from here, I notice that the camera preview comes up when the AVCam app is in photo mode, but as soon as I switch to video mode the preview layer hangs, with no error message or AVCaptureSession errors that I could see. The same code works when using the Apple Studio Display as an external camera. Curious if anyone has had any luck figuring this out? So near yet so far.
2 replies · 0 boosts · 837 views · Sep ’23
WebView: photo library, camera, choose files
I have a Safari/WebKit web view inside my mobile app which renders a webpage. The webpage has a file-upload option; when I tap it, three options are shown, as in the screenshot. I am trying to make the web view allow only camera capture and hide the option to upload already existing files. Is there any Safari permission I can remove from the configuration to hide the option to upload from Files?
2 replies · 0 boosts · 1.2k views · Jul ’23
CoreMediaIO camera extension not available immediately after activation
We activate our camera extension from the host application and wait for the user to allow access to it in System Settings. Once our host application receives the notification that the camera extension is ready to be used, we want to communicate with the extension. But when we enumerate AVCaptureDevices, or try to find the newly added device using CMIOObjectGetPropertyData for the property kCMIOHardwarePropertyDevices, our camera extension is not shown. Once we stop and restart the host application, the camera extension shows up as expected; the issue only happens right after activating the extension. It looks like capture devices are not refreshed for the host application after the camera extension is activated and approved. Is there a way to force the system to refresh the cameras? Or any other ideas to make the extension immediately visible to the host application without relaunching it? (A notification-based sketch is below this post.)
3 replies · 1 boost · 487 views · Aug ’23
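One avenue worth sketching (an idea, not a verified fix for this refresh problem): observe AVCaptureDevice's connected notification and re-enumerate once the system reports the new device. The watcher class name is a placeholder.

import AVFoundation

final class ExtensionDeviceWatcher {
    private var observer: NSObjectProtocol?

    // Calls `handler` whenever a capture device is connected, e.g. after a
    // camera extension is approved in System Settings.
    func start(handler: @escaping (AVCaptureDevice) -> Void) {
        observer = NotificationCenter.default.addObserver(
            forName: .AVCaptureDeviceWasConnected,
            object: nil,
            queue: .main
        ) { notification in
            if let device = notification.object as? AVCaptureDevice {
                handler(device)
            }
        }
    }

    deinit {
        if let observer = observer { NotificationCenter.default.removeObserver(observer) }
    }
}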
When using ARKit, why can’t you get the front-facing and back-facing camera feeds at once?
I'd like to use ARKit world tracking and display both the back camera feed and the front camera feed, using the front feed as a PIP. This would work great for an internet streaming use case. However, it's impossible: as soon as ARKit is told to use one mode, the camera on the other side freezes/doesn't work. This page also says you have to pick one camera to show: https://developer.apple.com/documentation/arkit/arkit_in_ios/choosing_which_camera_feed_to_augment?language=objc A question for the developers: why is this limitation in place? Are there any workarounds for the use case of ARKit world tracking + displaying the back camera feed + displaying the front camera feed as an overlay? It's possible to do this with plain camera initialization without ARKit (there's an official example), but with ARKit it no longer works. It's strange that I cannot access the front feed via one of the other frameworks, but I guess ARKit blocks that.
1 reply · 0 boosts · 656 views · Aug ’23
External Camera Not Recognized
Configuration: iPad Pro (10.5-inch), iPadOS 17 beta 4, NexiGo N940P 2K USB (UVC) camera, Apple Lightning to USB 3 Camera Adapter. Attaching the adapter to the iPad caused it to update driver software; so far OK. Next I attached the USB camera to the adapter, and the iPad complained "Not Enough Power ...". After attaching power to the Lightning port of the adapter, the warning cleared. FaceTime didn't recognize the external camera, and I went through the usual unplug routine with no luck. Xcode sees the iPad, so I ran the Objective-C version of AVCam from the presentation and added a breakpoint at:
AVCaptureDeviceDiscoverySession *externalVideoDeviceDiscoverySession = [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:@[AVCaptureDeviceTypeExternal] mediaType:AVMediaTypeVideo position:AVCaptureDevicePositionUnspecified];
The externalVideoDeviceDiscoverySession shows no attached camera. Does anyone have a suggestion for debugging this, or can you share your working setup? (A Swift discovery sketch is below this post.) Thanks!
1 reply · 0 boosts · 865 views · Aug ’23
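For comparison, a Swift sketch of the same external-camera discovery on iPadOS 17; the function name is mine, and it only confirms what the app can see, not why FaceTime ignores the camera.

import AVFoundation

// Lists external (UVC) cameras visible to the app on iPadOS 17 or later.
func listExternalCameras() {
    let discovery = AVCaptureDevice.DiscoverySession(
        deviceTypes: [.external],
        mediaType: .video,
        position: .unspecified
    )
    if discovery.devices.isEmpty {
        print("No external cameras found")
    } else {
        for device in discovery.devices {
            print("Found external camera: \(device.localizedName) (\(device.uniqueID))")
        }
    }
}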