Photos & Camera


Explore technical aspects of capturing high-quality photos and videos, including exposure control, focus modes, and RAW capture options.

Posts under Photos & Camera subtopic

AVCapturePhotoOutput crashes at delegate callback on macOS 13.7.5
A functioning Multiplatform app, which includes use of Continuity Camera on an M1 Mac mini running Sequoia 15.5, works correctly capturing photos with AVCapturePhotoOutput. However, that app (and a test app just for Continuity Camera) crashes at the delegate callback when run on a 2017 MacBook Pro under macOS 13.7.5. The app was created with Xcode 16 (various releases) and Swift 6 (but I also tried Swift 5). Compiling and running the test app with Xcode 15.2 on the 13.7.5 machine also crashes at the delegate callback. The iPhone 15 Continuity Camera gets detected and set up correctly, and preview video works correctly. It's when the capture-photo code runs that the crash occurs.

The relevant capture code is:

    func capturePhoto() {
        let captureSettings = AVCapturePhotoSettings()
        captureSettings.flashMode = .auto
        photoOutput.maxPhotoQualityPrioritization = .quality
        photoOutput.capturePhoto(with: captureSettings, delegate: PhotoDelegate.shared)
        print("**** CameraManager: capturePhoto")
    }

and the delegate callbacks are:

    class PhotoDelegate: NSObject, AVCapturePhotoCaptureDelegate {
        nonisolated(unsafe) static let shared = PhotoDelegate()

        // MARK: - Delegate callbacks
        func photoOutput(
            _ output: AVCapturePhotoOutput,
            didFinishProcessingPhoto photo: AVCapturePhoto,
            error: (any Error)?
        ) {
            print("**** CameraManager: didFinishProcessingPhoto")
            guard let pData = photo.fileDataRepresentation() else {
                print("**** photoOutput is empty")
                return
            }
            print("**** photoOutput data is \(pData.count) bytes")
        }

        func photoOutput(
            _ output: AVCapturePhotoOutput,
            willBeginCaptureFor resolvedSettings: AVCaptureResolvedPhotoSettings
        ) {
            print("**** CameraManager: willBeginCaptureFor")
        }

        func photoOutput(
            _ output: AVCapturePhotoOutput,
            willCapturePhotoFor resolvedSettings: AVCaptureResolvedPhotoSettings
        ) {
            print("**** CameraManager: willCapturePhotoFor")
        }
    }

The significant parts of the crash report are:

    Crashed Thread:      3  Dispatch queue: com.apple.cmio.CMIOExtensionProviderHostContext
    Exception Type:      EXC_BAD_ACCESS (SIGSEGV)
    Exception Codes:     KERN_INVALID_ADDRESS at 0x0000000000000000
    Exception Codes:     0x0000000000000001, 0x0000000000000000
    Termination Reason:  Namespace SIGNAL, Code 11 Segmentation fault: 11
    Terminating Process: exc handler [30850]

    VM Region Info: 0 is not in any region.  Bytes before following region: 4296495104
          REGION TYPE               START - END        [ VSIZE] PRT/MAX SHRMOD  REGION DETAIL
          UNUSED SPACE AT START
    --->  __TEXT                 100175000-10017f000   [   40K] r-x/r-x SM=COW  ...tinuityCamera

    Thread 0:: Dispatch queue: com.apple.main-thread
    0   libsystem_kernel.dylib   0x7ff803aed552 mach_msg2_trap + 10
    1   libsystem_kernel.dylib   0x7ff803afb6cd mach_msg2_internal + 78
    2   libsystem_kernel.dylib   0x7ff803af4584 mach_msg_overwrite + 692
    3   libsystem_kernel.dylib   0x7ff803aed83a mach_msg + 19
    4   CoreFoundation           0x7ff803c07f8f __CFRunLoopServiceMachPort + 145
    5   CoreFoundation           0x7ff803c06a10 __CFRunLoopRun + 1365
    6   CoreFoundation           0x7ff803c05e51 CFRunLoopRunSpecific + 560
    7   HIToolbox                0x7ff80d694f3d RunCurrentEventLoopInMode + 292
    8   HIToolbox                0x7ff80d694d4e ReceiveNextEventCommon + 657
    9   HIToolbox                0x7ff80d694aa8 _BlockUntilNextEventMatchingListInModeWithFilter + 64
    10  AppKit                   0x7ff806ca59d8 _DPSNextEvent + 858
    11  AppKit                   0x7ff806ca4882 -[NSApplication(NSEvent) _nextEventMatchingEventMask:untilDate:inMode:dequeue:] + 1214
    12  AppKit                   0x7ff806c96ef7 -[NSApplication run] + 586
    13  AppKit                   0x7ff806c6b111 NSApplicationMain + 817
    14  SwiftUI                  0x7ff90e03a9fb 0x7ff90dfb4000 + 551419
    15  SwiftUI                  0x7ff90f0778b4 0x7ff90dfb4000 + 17578164
    16  SwiftUI                  0x7ff90e9906cf 0x7ff90dfb4000 + 10340047
    17  ContinuityCamera         0x10017b49e 0x100175000 + 25758
    18  dyld                     0x7ff8037d1418 start + 1896

    Thread 1:
    0   libsystem_pthread.dylib  0x7ff803b27bb0 start_wqthread + 0

    Thread 2:
    0   libsystem_pthread.dylib  0x7ff803b27bb0 start_wqthread + 0

    Thread 3 Crashed:: Dispatch queue: com.apple.cmio.CMIOExtensionProviderHostContext
    0   ???                      0x0 ???
    1   AVFCapture               0x7ff82045996c StreamAsyncStillCaptureCallback + 61
    2   CoreMediaIO              0x7ff813a4358f __94-[CMIOExtensionProviderHostContext captureAsyncStillImageWithStreamID:uniqueID:options:reply:]_block_invoke + 498
    3   libxpc.dylib             0x7ff803875b33 _xpc_connection_reply_callout + 36
    4   libxpc.dylib             0x7ff803875ab2 _xpc_connection_call_reply_async + 69
    5   libdispatch.dylib        0x7ff80398b099 _dispatch_client_callout3 + 8
    6   libdispatch.dylib        0x7ff8039a6795 _dispatch_mach_msg_async_reply_invoke + 387
    7   libdispatch.dylib        0x7ff803991088 _dispatch_lane_serial_drain + 393
    8   libdispatch.dylib        0x7ff803991d6c _dispatch_lane_invoke + 417
    9   libdispatch.dylib        0x7ff80399c3fc _dispatch_workloop_worker_thread + 765
    10  libsystem_pthread.dylib  0x7ff803b28c55 _pthread_wqthread + 327
    11  libsystem_pthread.dylib  0x7ff803b27bbf start_wqthread + 15

Of course, the MacBook Pro is an old device, but Continuity Camera works with the installed Photo Booth app, so it should be possible. Any thoughts on solving this situation would be appreciated. Regards, Michaela
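One way to narrow this down on the macOS 13 machine is to confirm that the Continuity Camera connection and the photo output are actually in a usable state right before calling capturePhoto. A minimal diagnostic sketch (session and photoOutput refer to the poster's existing objects; this is only a logging aid, not a confirmed fix for the CMIO crash):

    import AVFoundation

    func logCaptureState(session: AVCaptureSession, photoOutput: AVCapturePhotoOutput) {
        // Is the photo output attached to a running session with a live video connection?
        print("session running:", session.isRunning)
        if let connection = photoOutput.connection(with: .video) {
            print("video connection active:", connection.isActive, "enabled:", connection.isEnabled)
        } else {
            print("photoOutput has no video connection")
        }
        // Codecs the output claims to support in the current configuration.
        print("available codecs:", photoOutput.availablePhotoCodecTypes)
        // Which device is actually feeding the session (Continuity Camera vs. built-in)?
        for input in session.inputs.compactMap({ $0 as? AVCaptureDeviceInput }) {
            print("input device:", input.device.localizedName, input.device.deviceType.rawValue)
        }
    }

Comparing this output between the Sequoia machine and the macOS 13.7.5 machine may show whether the older system ever reports a fully active connection before the capture call.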
Replies: 1 · Boosts: 0 · Views: 414 · Activity: 1w
Impact on iOS Due to Image Policy Changes with Android Target SDK 34
As the image access policy has changed with Android targeting SDK 34, I’m planning to update the way our app accesses photos. We are using the react-native-image-picker library to access images. On Android, the system no longer prompts the user for image access permissions, but on iOS, permission requests still appear. Since Android no longer requires explicit permissions, I’ve removed the permission request logic for Android. In this case, is it also safe to remove the permission request for iOS? In our app, photo access is only used for changing the user profile picture and attaching images when writing a post on the bulletin board. Are there any limitations or considerations for this kind of usage?
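For context on the iOS side: whether a permission prompt is needed depends on which API the picker library uses. The system photo picker (PHPickerViewController, or SwiftUI's PhotosPicker) runs out of process and does not require photo-library authorization for simple selection, whereas direct PHPhotoLibrary access does. A minimal sketch of the out-of-process path; whether react-native-image-picker uses it under the hood depends on the library version, so treat that as an assumption to verify:

    import PhotosUI
    import UIKit

    final class ProfilePhotoPicker: NSObject, PHPickerViewControllerDelegate {
        func present(from presenter: UIViewController) {
            var config = PHPickerConfiguration()   // no PHPhotoLibrary authorization required
            config.filter = .images
            config.selectionLimit = 1
            let picker = PHPickerViewController(configuration: config)
            picker.delegate = self
            presenter.present(picker, animated: true)
        }

        func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
            picker.dismiss(animated: true)
            guard let provider = results.first?.itemProvider,
                  provider.canLoadObject(ofClass: UIImage.self) else { return }
            provider.loadObject(ofClass: UIImage.self) { image, error in
                // Use the UIImage for the profile picture or the post attachment.
                print("picked image:", image ?? "none", error ?? "")
            }
        }
    }

If the library instead reads the library directly (PHAsset fetches), removing the iOS permission request would break those code paths.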
Replies: 1 · Boosts: 0 · Views: 89 · Activity: Apr ’25
[iOS 18.0] PHImageManager request image crash
I am seeing a crash on iOS 18.0 in my app due to PHImageManager.default().requestImage. The same crash is seen even while using requestImageDataAndOrientation. Not sure why this is happening only on iOS 18.0. Any help on this would be appreciated.

    Exception Type:     EXC_BAD_ACCESS (SIGSEGV)
    Exception Subtype:  KERN_INVALID_ADDRESS at 0x0000000000000010
    Exception Codes:    0x0000000000000001, 0x0000000000000010
    VM Region Info: 0x10 is not in any region.  Bytes before following region: 4310777840
          REGION TYPE               START - END        [ VSIZE] PRT/MAX SHRMOD  REGION DETAIL
          UNUSED SPACE AT START
    --->
    Termination Reason:  SIGNAL 11 Segmentation fault: 11
    Terminating Process: exc handler [695]
    Triggered by Thread: 14

    Thread 14 name: Dispatch queue: */file=33) .Generic
    Thread 14 Crashed:
    0   libobjc.A.dylib          0x1944d7020 objc_msgSend + 32
    1   PhotoLibraryServices     0x1b080262c +[PLUniformTypeIdentifier utiWithCompactRepresentation:conformanceHint:] + 40
    2   Photos                   0x1b0081354 _resourceInfoFromResultDict + 1048
    3   Photos                   0x1b0080a7c fetchResourcesForChoosing + 584
    4   Photos                   0x1b0080714 ___fetchNonHintResources_block_invoke.227 + 152
    5   PhotoLibraryServices     0x1b026b3c4 __53-[PLManagedObjectContext _directPerformBlockAndWait:]_block_invoke + 48
    6   CoreData                 0x19f171b00 developerSubmittedBlockToNSManagedObjectContextPerform + 228
    7   libdispatch.dylib        0x19eeff584 _dispatch_client_callout + 16
    8   libdispatch.dylib        0x19eef5728 _dispatch_lane_barrier_sync_invoke_and_complete + 56
    9   CoreData                 0x19f1c2a0c -[NSManagedObjectContext performBlockAndWait:] + 308
    10  PhotoLibraryServices     0x1b026cea4 -[PLManagedObjectContext _directPerformBlockAndWait:] + 144
    11  PhotoLibraryServices     0x1b026cdf8 -[PLManagedObjectContext performBlockAndWait:] + 196
    12  Photos                   0x1b00804a4 _fetchNonHintResources + 292
    13  Photos                   0x1afeedb38 PHChooserListContinueEnumerating + 144
    14  Photos                   0x1afeed9f8 -[PHImageResourceChooser presentNextQualifyingResource] + 412
    15  Photos                   0x1afeed1d0 -[PHImageRequest startRequest] + 2424
    16  libdispatch.dylib        0x19eee5aac _dispatch_call_block_and_release + 32
    17  libdispatch.dylib        0x19eeff584 _dispatch_client_callout + 16
    18  libdispatch.dylib        0x19eeee2d0 _dispatch_lane_serial_drain + 740
    19  libdispatch.dylib        0x19eeeede0 _dispatch_lane_invoke + 440
    20  libdispatch.dylib        0x19eef91dc _dispatch_root_queue_drain_deferred_wlh + 292
    21  libdispatch.dylib        0x19eef8a60 _dispatch_workloop_worker_thread + 540
    22  libsystem_pthread.dylib  0x2214d5660 _pthread_wqthread + 292
    23  libsystem_pthread.dylib  0x2214d29f8 start_wqthread + 8
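For reference, the calling pattern under discussion presumably looks something like the sketch below; the asset fetch, target size, and option values are assumptions rather than details taken from the original post:

    import Photos
    import UIKit

    func loadThumbnail(for asset: PHAsset, completion: @escaping (UIImage?) -> Void) {
        let options = PHImageRequestOptions()
        options.isNetworkAccessAllowed = true       // allow fetching originals from iCloud
        options.deliveryMode = .highQualityFormat
        PHImageManager.default().requestImage(
            for: asset,
            targetSize: CGSize(width: 300, height: 300),
            contentMode: .aspectFill,
            options: options
        ) { image, _ in
            completion(image)
        }
    }

Since the crash is inside the Photos framework's own fetch path, logging which asset (local identifier, media subtype) triggers it may help isolate whether a specific library item is involved.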
Replies: 6 · Boosts: 0 · Views: 133 · Activity: May ’25
PhotosPicker returns an empty result for a selected spatial video
In an Apple Vision Pro project, a picker is presented with the intent of filtering to spatial videos only. When one of the spatial videos is selected, the selection result comes back empty. How can we obtain the video the user selected and get the path and URL of the file? The code is as follows:

    PhotosPicker(selection: $selectedItem, matching: .videos) {
        Text("Choose a spatial photo or video")
    }

    func loadTransferable(from imageSelection: PhotosPickerItem) -> Progress {
        return imageSelection.loadTransferable(type: URL.self) { result in
            DispatchQueue.main.async {
                // guard imageSelection == self.imageSelection else { return }
                print("Loaded selection: \(result)")
                switch result {
                case .success(let url?):
                    self.selectSpatialVideoURL = url
                    print("Video URL: \(url)")
                case .success(nil):
                    break // Handle the success case with an empty value.
                case .failure(let error):
                    print("Spatial video error: \(error)")
                    // Handle the failure case with the provided error.
                }
            }
        }
    }
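A pattern that often works better for videos than loadTransferable(type: URL.self) is receiving the item as a file representation and copying it somewhere the app owns before the picker's temporary file disappears. A sketch under assumptions (the Movie type, the copy destination, and the .movie content type are illustrative, not taken from the original post):

    import SwiftUI
    import PhotosUI
    import CoreTransferable

    struct Movie: Transferable {
        let url: URL
        static var transferRepresentation: some TransferRepresentation {
            FileRepresentation(contentType: .movie) { movie in
                SentTransferredFile(movie.url)
            } importing: { received in
                // Copy the picker's temporary file into the app container.
                let destination = URL.documentsDirectory.appending(path: received.file.lastPathComponent)
                try? FileManager.default.removeItem(at: destination)
                try FileManager.default.copyItem(at: received.file, to: destination)
                return Movie(url: destination)
            }
        }
    }

    func loadMovie(from item: PhotosPickerItem) async throws -> URL? {
        try await item.loadTransferable(type: Movie.self)?.url
    }

Whether the spatial-video filtering itself is possible is a separate question; the sketch only addresses getting a usable file URL out of the selection.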
Replies: 4 · Boosts: 0 · Views: 117 · Activity: May ’25
How to Get Device Orientation in Background (PiP) Mode?
My app is a camera app that supports Picture-in-Picture (PiP) mode. Normally, when the device rotates, I get the device orientation from iOS and use it to rotate the camera feed so that the preview stays correctly aligned. However, when the app enters PiP mode, it is considered to be in the background, and I can no longer receive orientation updates from the system. As a result, I can’t apply rotation corrections to the camera video in PiP mode. Is there any way to retrieve device orientation while the app is in the background (specifically during PiP mode)? Any guidance would be greatly appreciated. Thank you!
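One avenue that is sometimes suggested is deriving orientation from CoreMotion's gravity vector instead of UIDevice orientation notifications. Whether motion updates keep arriving while the app is backgrounded in PiP is something to verify on-device, so treat this as a sketch rather than a confirmed solution:

    import CoreMotion
    import UIKit

    final class GravityOrientationTracker {
        private let motion = CMMotionManager()

        func start(_ handler: @escaping (UIDeviceOrientation) -> Void) {
            guard motion.isDeviceMotionAvailable else { return }
            motion.deviceMotionUpdateInterval = 0.2
            motion.startDeviceMotionUpdates(to: .main) { data, _ in
                guard let g = data?.gravity else { return }
                // Pick the dominant gravity axis to approximate device orientation.
                // The exact landscape mapping may need flipping depending on convention.
                let orientation: UIDeviceOrientation
                if abs(g.x) > abs(g.y) {
                    orientation = g.x > 0 ? .landscapeRight : .landscapeLeft
                } else {
                    orientation = g.y > 0 ? .portraitUpsideDown : .portrait
                }
                handler(orientation)
            }
        }

        func stop() { motion.stopDeviceMotionUpdates() }
    }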
Replies: 0 · Boosts: 0 · Views: 55 · Activity: May ’25
ImageIO fails to encode HEICS on macOS 15.5
ImageIO encoding to HEICS fails in macOS 15.5.

Log:

    writeImageAtIndex:1246: *** CMPhotoCompressionSessionAddImageToSequence: err = kCMPhotoError_UnsupportedOperation [-16994] (codec: 'hvc1')

This seems to be related to https://github.com/SDWebImage/SDWebImage/issues/3732

Affected versions: iOS 18.4 (simulator and device), macOS 15.5
Unaffected versions: iOS 18.3 (simulator and device), macOS 15.3
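For reference, a HEICS (HEIF image sequence) encode typically goes through CGImageDestination; a minimal sketch of the kind of call path that produces the error above, under assumptions (the "public.heics" identifier and the missing per-frame properties are illustrative; this is not a workaround):

    import ImageIO
    import UniformTypeIdentifiers

    func encodeHEICS(frames: [CGImage], to url: URL) -> Bool {
        // "public.heics" is commonly used as the HEIF image-sequence identifier (assumption to verify).
        guard let type = UTType("public.heics"),
              let destination = CGImageDestinationCreateWithURL(url as CFURL,
                                                                type.identifier as CFString,
                                                                frames.count,
                                                                nil) else { return false }
        for frame in frames {
            CGImageDestinationAddImage(destination, frame, nil)
        }
        // Finalize reports failure if the system encoder rejects the sequence.
        return CGImageDestinationFinalize(destination)
    }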
Replies: 1 · Boosts: 0 · Views: 105 · Activity: Jun ’25
iOS Camera access issues in Developer mode on real device - PermissionStatus.permanentlyDenied
Xcode Version 16.3 (16E140), app developed in Flutter (Flutter 3.29.3). Test device: iPhone 16 Pro running iOS 18.5.

I have an app that requires camera access. This used to work with iOS 18.4.x. I have dumbed the app down to just requesting camera permission, and even then it fails:

    flutter: Camera permission: PermissionStatus.denied
    flutter: Photos permission: PermissionStatus.denied
    flutter: Microphone permission: PermissionStatus.denied
    flutter: --- End Debug Info ---
    flutter: Loaded translations from asset for en_US
    container_create_or_lookup_app_group_path_by_app_group_identifier: client is not entitled
    (the "client is not entitled" line repeats six times)
    flutter: CAMERA PERMISSION STATUS: PermissionStatus.permanentlyDenied

Camera permissions don't show up in my app's settings or under Settings -> Privacy & Security -> Camera, and I am at a loss to understand why this is happening.
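To separate a Flutter-plugin issue from a system-level one, it can help to query the camera authorization state directly from native code. A minimal sketch (the prompt only appears when the status is .notDetermined and NSCameraUsageDescription is present in the built app's Info.plist):

    import AVFoundation

    func checkCameraAuthorization() {
        switch AVCaptureDevice.authorizationStatus(for: .video) {
        case .notDetermined:
            AVCaptureDevice.requestAccess(for: .video) { granted in
                print("camera access granted:", granted)
            }
        case .authorized:
            print("camera already authorized")
        case .denied, .restricted:
            // iOS will not prompt again in this state; the toggle should appear in Settings.
            print("camera access denied or restricted")
        @unknown default:
            break
        }
    }

If the native status is .notDetermined while the Flutter plugin reports permanentlyDenied, the problem is on the plugin side; if the native status is already .denied without a prompt ever appearing, it points at the built app's Info.plist or a profile/parental restriction.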
Replies: 1 · Boosts: 0 · Views: 133 · Activity: May ’25
Is Phase Detection Autofocus degrading video stabilization, and can I disable it?
I'm developing a video capture app using AVFoundation, designed specifically for use on a boat pylon to record slalom water skiing. This setup involves considerable vibration.

As you may know, the OIS that Apple began adding to lenses with the iPhone 7 is actually very problematic in high-vibration circumstances, ironically creating very shaky video, whereas lenses without OIS produce perfectly stable video. Because of this, up until iPhone 14 the solution for my app was simply to use the selfie lens, which did not have OIS.

Starting with iPhone 14 through iPhone 16 (non-Pro models), technical specs suggest the selfie lens still does not include OIS. However, I'm still seeing the same kind of shaky video behavior I see on OIS-equipped lenses. The one hardware change I see in this camera module is the addition of PDAF (Phase Detection Autofocus), so that is my best guess as to what is causing the unstable video.

1. Does that make any sense, that in high-vibration settings PDAF could create unstable video in the same way that OIS does? Or could it be something else that changed between the iPhone 13 and 14 selfie lens?

Thinking that the issue was PDAF, I figured that if I enabled my app to set a manual focus level, that ought to circumvent PDAF (expecting that if a lens is manually focusing, it can't also be autofocusing via PDAF). However, even with manual focus locked via AVCaptureDevice in my app, on the selfie lens of an iPhone 16 the video still comes out very shaky, basically unusable. I also tested with the built-in Apple Camera app (using press-and-hold to lock focus and exposure) and another third-party camera app that locks focus, all with the same results, so it's not that my app just isn't doing manual focus correctly.

So I'm stuck with these questions:

2. Does the selfie camera on iPhones 14–16 use PDAF even when focus is set to locked/manual mode?
3. Is there any way in AVFoundation to disable or suppress PDAF during video recording (e.g., a flag, device format setting, or private API)?
4. Is PDAF behavior or suppression documented or controllable via AVCaptureDevice or any related class?
5. If no control of PDAF is available, are there any best practices for stabilizing or smoothing this effect programmatically?

Note that I have also set my app to use the most aggressive form of stabilization available, so it defaults to .cinematicExtendedEnhanced; if that's not available, then .cinematicExtended, and so on. On the 16 selfie lens it is using .cinematicExtended.

As an additional question:

6. Would those be the most appropriate stabilization settings for a high-vibration environment, and if not, what would be best?
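For reference, the focus lock and stabilization selection described above presumably look something like the sketch below; the device and connection names are illustrative, and nothing here disables PDAF, which as far as I know has no public switch:

    import AVFoundation

    func configure(device: AVCaptureDevice, connection: AVCaptureConnection) throws {
        try device.lockForConfiguration()
        if device.isLockingFocusWithCustomLensPositionSupported {
            // Lock focus at a fixed lens position (0.0 = nearest, 1.0 = furthest).
            device.setFocusModeLocked(lensPosition: 1.0, completionHandler: nil)
        } else if device.isFocusModeSupported(.locked) {
            device.focusMode = .locked
        }
        device.unlockForConfiguration()

        // Pick the most aggressive stabilization mode the active format supports.
        let preferred: [AVCaptureVideoStabilizationMode] =
            [.cinematicExtendedEnhanced, .cinematicExtended, .cinematic, .standard]
        if let mode = preferred.first(where: { device.activeFormat.isVideoStabilizationModeSupported($0) }) {
            connection.preferredVideoStabilizationMode = mode
        }
    }

Note that all the cinematic modes are software stabilization that crops and warps frames; in a high-vibration rig it may be worth comparing against .standard or .off as a baseline to see which artifact (sensor-level shake versus software warping) dominates.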
Replies: 0 · Boosts: 0 · Views: 176 · Activity: May ’25
Telephoto Lens Keeps Switching to Other Lenses on iPhone 16 Pro Max During PPG (Finger on Camera)
Hi, I'm building a PPG-based heart rate feature where the user places their finger over the rear telephoto camera. On iPhone 16 Pro Max, I'm explicitly selecting the telephoto lens like this:

    videoDevice = AVCaptureDevice.default(.builtInTelephotoCamera, for: .video, position: .back)

And trying to lock it:

    if #available(iOS 15.0, *),
       device.activePrimaryConstituentDeviceSwitchingBehavior != .unsupported {
        try? device.lockForConfiguration()
        device.setPrimaryConstituentDeviceSwitchingBehavior(.locked, restrictedSwitchingBehaviorConditions: [])
        device.unlockForConfiguration()
    }

I also lock everything else to prevent dynamic changes:

    try device.lockForConfiguration()
    device.focusMode = .locked
    device.exposureMode = .locked
    device.whiteBalanceMode = .locked
    device.videoZoomFactor = 1.0
    device.automaticallyEnablesLowLightBoostWhenAvailable = false
    device.automaticallyAdjustsVideoHDREnabled = false
    device.unlockForConfiguration()

Despite this, the camera still switches to another lens, especially under different lighting, even though the user's finger fully covers the lens.

Questions:
How can I completely prevent lens switching in this scenario?
Would using videoZoomFactor = 3.0 or 5.0 better enforce use of the telephoto lens?

Thanks! Gal
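One thing worth checking is whether the capture input is actually the physical telephoto device or a virtual multi-camera device, since constituent-device switching behavior applies to virtual devices. A small diagnostic sketch (the session name is illustrative):

    import AVFoundation

    func logActiveCameraInfo(session: AVCaptureSession) {
        for input in session.inputs.compactMap({ $0 as? AVCaptureDeviceInput }) {
            let device = input.device
            print("device:", device.localizedName, device.deviceType.rawValue)
            print("isVirtualDevice:", device.isVirtualDevice)
            // For a virtual device, this lists the physical cameras it can switch between.
            print("constituentDevices:", device.constituentDevices.map { $0.deviceType.rawValue })
        }
    }

If this logs a virtual device rather than .builtInTelephotoCamera, the switching behavior would be expected; if it really is the physical telephoto device, the "switching" seen in the output may be coming from somewhere else in the pipeline.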
Replies: 3 · Boosts: 0 · Views: 140 · Activity: Jul ’25
Any app shows a black square instead of the camera picture
Hi! I am making an app for Apple Vision Pro (visionOS 2.5) that scans the surroundings and recognises all the text around you. I tried to use AVCaptureSession, but when I run the app from Xcode on the real AVP device, the camera is not accessible. I enabled camera access in my Info.plist:

    NSCameraUsageDescription: Used for live text recognition

and I checked the camera settings on the AVP; there are no restrictions. However, I always get a black square with a crossed-out camera icon displayed instead of the image from the camera. I tried a couple of different apps from GitHub using AVCaptureSession and they all display the black square instead of the picture. What can be wrong with the camera?
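A quick way to confirm whether any capture device is exposed to the app at all is to enumerate devices and check authorization; a small sketch below, assuming the AVFoundation discovery APIs behave the same way on visionOS. (As far as I understand, third-party visionOS apps cannot read the main cameras through AVCaptureSession without the enterprise camera-access entitlement, which would explain the crossed-out icon, but verify that against current documentation.)

    import AVFoundation

    func logVisionOSCameraAvailability() {
        print("authorization:", AVCaptureDevice.authorizationStatus(for: .video).rawValue)
        let discovery = AVCaptureDevice.DiscoverySession(
            deviceTypes: [.builtInWideAngleCamera],
            mediaType: .video,
            position: .unspecified
        )
        // An empty list means no camera is exposed to this app on this platform/entitlement combination.
        print("devices:", discovery.devices.map { $0.localizedName })
    }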
Replies: 2 · Boosts: 0 · Views: 116 · Activity: Jun ’25
RealityKit/ARKit Memory Not Fully Released After AR Session Cleanup
Hi, I'm developing a SwiftUI app using RealityKit and ARKit for an AR measuring feature. I've noticed that after navigating away from my AR view and performing extensive cleanup (including removing all anchors/entities, pausing the ARSession, and nil-ing out all references), memory usage remains elevated and sometimes grows with repeated AR sessions. Each time I enter and exit the AR view, memory increases. The memory does not return to the baseline after cleanup, even though all custom objects are deallocated. Are there best practices beyond what I've described to ensure all ARKit/RealityKit resources are released after an AR session?
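For comparison, a teardown sequence that is often suggested for ARView-based screens is sketched below; arView stands for the RealityKit ARView being dismissed, and even with all of this some framework-level caches may stay resident, so it is not guaranteed to return memory to baseline:

    import RealityKit
    import ARKit

    func tearDown(arView: ARView) {
        arView.session.pause()              // stop tracking before releasing the view
        arView.session.delegate = nil
        arView.scene.anchors.removeAll()    // drop all anchors and their entity trees
        arView.removeFromSuperview()        // detach from the view hierarchy
        // Any remaining strong references to the ARView (coordinators, closures,
        // SwiftUI @State/@StateObject properties) must also be cleared.
    }

It can also help to distinguish real leaks from cached allocations by checking whether the growth is unbounded across many enter/exit cycles or plateaus after the first few.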
Replies: 0 · Boosts: 0 · Views: 86 · Activity: Jun ’25
How to toggle a USB device
When I use IOKit/usb/IOUSBLib to toggle the built-in camera, I get an error: ret IOReturn -536870210. How can I resolve it? Can I use IOUSBLib to disable or hide the built-in camera?

My environment:
Model Name: MacBook Pro
ProductVersion: 15.5
Model Identifier: MacBookPro15,2
Processor Name: Quad-Core Intel Core i5
Processor Speed: 2.4 GHz
Number of Processors: 1

    // Disable/enable a USB device
    bool toggleUSBDevice(uint16_t vendorID, uint16_t productID, bool enable) {
        std::cout << (enable ? "Enabling" : "Disabling") << " USB device with VID: 0x" << std::hex << vendorID
                  << ", PID: 0x" << productID << std::endl;

        // Create a matching dictionary to find the USB device with the given VID/PID
        CFMutableDictionaryRef matchingDict = IOServiceMatching(kIOUSBDeviceClassName);
        if (!matchingDict) {
            std::cerr << "Failed to create USB device matching dictionary." << std::endl;
            return false;
        }

        // Set the VID/PID matching criteria
        CFNumberRef vendorIDRef = CFNumberCreate(kCFAllocatorDefault, kCFNumberSInt16Type, &vendorID);
        CFNumberRef productIDRef = CFNumberCreate(kCFAllocatorDefault, kCFNumberSInt16Type, &productID);
        CFDictionarySetValue(matchingDict, CFSTR(kUSBVendorID), vendorIDRef);
        CFDictionarySetValue(matchingDict, CFSTR(kUSBProductID), productIDRef);
        CFRelease(vendorIDRef);
        CFRelease(productIDRef);

        // Get an iterator over the matching devices
        io_iterator_t deviceIterator;
        if (IOServiceGetMatchingServices(kIOMainPortDefault, matchingDict, &deviceIterator) != KERN_SUCCESS) {
            std::cerr << "Failed to get USB device iterator." << std::endl;
            CFRelease(matchingDict);
            return false;
        }

        io_service_t usbDevice;
        bool result = false;
        int deviceCount = 0;

        // Iterate over all matching devices
        while ((usbDevice = IOIteratorNext(deviceIterator)) != IO_OBJECT_NULL) {
            deviceCount++;

            // Get the device path
            char path[1024];
            if (IORegistryEntryGetPath(usbDevice, kIOServicePlane, path) == KERN_SUCCESS) {
                std::cout << "Found device at path: " << path << std::endl;
            }

            // Open the device
            IOCFPlugInInterface** plugInInterface = NULL;
            IOUSBDeviceInterface** deviceInterface = NULL;
            SInt32 score;
            IOReturn ret = IOCreatePlugInInterfaceForService(usbDevice,
                                                             kIOUSBDeviceUserClientTypeID,
                                                             kIOCFPlugInInterfaceID,
                                                             &plugInInterface,
                                                             &score);
            if (ret == kIOReturnSuccess && plugInInterface) {
                ret = (*plugInInterface)->QueryInterface(plugInInterface,
                                                         CFUUIDGetUUIDBytes(kIOUSBDeviceInterfaceID),
                                                         (LPVOID*)&deviceInterface);
                (*plugInInterface)->Release(plugInInterface);
            }
            if (ret != kIOReturnSuccess) {
                std::cerr << "Failed to open USB device interface. Error:" << ret << std::endl;
                IOObjectRelease(usbDevice);
                continue;
            }

            // Disable/enable the device
            if (enable) {
                // Enable the device by re-enumerating it
                ret = (*deviceInterface)->USBDeviceReEnumerate(deviceInterface, 0);
                if (ret == kIOReturnSuccess) {
                    std::cout << "Device enabled successfully." << std::endl;
                    result = true;
                } else {
                    std::cerr << "Failed to enable device. Error: " << ret << std::endl;
                }
            } else {
                // Disable the device by closing the connection to it
                ret = (*deviceInterface)->USBDeviceClose(deviceInterface);
                if (ret == kIOReturnSuccess) {
                    std::cout << "Device disabled successfully." << std::endl;
                    result = true;
                } else {
                    std::cerr << "Failed to disable device. Error: " << ret << std::endl;
                }
            }

            // Release the device interface
            (*deviceInterface)->Release(deviceInterface);
            IOObjectRelease(usbDevice);
        }

        IOObjectRelease(deviceIterator);

        if (deviceCount == 0) {
            std::cerr << "No device found with specified VID/PID." << std::endl;
            return false;
        }
        return result;
    }
Replies: 0 · Boosts: 0 · Views: 127 · Activity: Jun ’25
How to override the default USB Video Class extension
According to the doc, I did a simple demo to verify. My environment:

ProductName: macOS
ProductVersion: 15.5
BuildVersion: 24F74
2.4 GHz quad-core Intel Core i5

Info.plist:

    <?xml version="1.0" encoding="UTF-8"?>
    <!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
    <plist version="1.0">
    <dict>
        <key>IOKitPersonalities</key>
        <dict>
            <key>UVCamera</key>
            <dict>
                <key>CFBundleIdentifierKernel</key>
                <string>com.apple.kpi.iokit</string>
                <key>IOClass</key>
                <string>IOUserService</string>
                <key>IOMatchCategory</key>
                <string>$(PRODUCT_BUNDLE_IDENTIFIER)</string>
                <key>IOProviderClass</key>
                <string>IOUserResources</string>
                <key>IOResourceMatch</key>
                <string>IOKit</string>
                <key>IOUserClass</key>
                <string>UVCamera</string>
                <key>IOUserServerName</key>
                <string>$(PRODUCT_BUNDLE_IDENTIFIER)</string>
                <key>IOProbeScore</key>
                <integer>100000</integer>
                <key>idVendor</key>
                <integer>1452</integer>
                <key>idProduct</key>
                <integer>34068</integer>
            </dict>
        </dict>
        <key>OSBundleUsageDescription</key>
        <string></string>
    </dict>
    </plist>

UVCamera.cpp:

    //
    //  UVCamera.cpp
    //  UVCamera
    //
    //  Created by DTEN on 2025/6/12.
    //
    #include <os/log.h>
    #include <DriverKit/IOUserServer.h>
    #include <DriverKit/IOLib.h>
    #include "UVCamera.h"

    kern_return_t IMPL(UVCamera, Start) {
        kern_return_t ret;
        ret = Start(provider, SUPERDISPATCH);
        os_log(OS_LOG_DEFAULT, "Hello World");
        return ret;
    }

UVCamera.iig:

    //
    //  UVCamera.iig
    //  UVCamera
    //
    //  Created by DTEN on 2025/6/12.
    //
    #ifndef UVCamera_h
    #define UVCamera_h

    #include <Availability.h>
    #include <DriverKit/IOService.iig>

    class UVCamera: public IOService {
    public:
        virtual kern_return_t Start(IOService * provider) override;
    };

    #endif /* UVCamera_h */

Then I build it with Xcode and move it to /Library/DriverExtensions:

    sudo mv com.lqs.MyVirtualCam.UVCamera.dext /Library/DriverExtensions
    sudo kmutil install -R / -r /Library/DriverExtensions
    kmutil rebuild done

However, the dext can't be loaded:

    kmutil showloaded --list-only | grep UVCamera
    No variant specified, falling back to release

What's the problem? Can anyone help me?
Replies: 0 · Boosts: 0 · Views: 98 · Activity: Jun ’25
Why does a DriverKit extension need a CMIO extension?
I developed a DriverKit extension based on overriding-the-default-usb-video-class-extension, but the link doesn't give the details of the implementation. I asked DTS, who gave two tips:
1. Do you also have a CMIO extension to load in place of the default overriding-the-default-usb-video-class-extension?
2. Your DriverKit extension's Info.plist is also missing the CameraAssistantBundleID.
I want to know why a DriverKit extension needs a CMIO extension, and what the data and control flow between them is.
Replies: 2 · Boosts: 0 · Views: 112 · Activity: Jun ’25
Capture session interrupted randomly
We are facing a strange issue where a small portion of our large user base cannot start the capture session in our app, as it gets interrupted with the following reason:

    AVCaptureSessionInterruptionReasonVideoDeviceNotAvailableWithMultipleForegroundApps

Our users are all on iPhones; no one is using an iPad. Just to be sure, we have set session.isMultitaskingCameraAccessEnabled = true, but it does not seem to make any difference.

Another weird scenario we are seeing on an even smaller number of users is that the following call returns nil:

    AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back)

A quick look at our error reports shows this happening on iPhone XR, 13 and 14 models. They should all support this device type. Any help investigating these issues would be greatly appreciated!
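To pin down which interruption is actually delivered on the affected devices (and whether it ever ends), it can help to log the interruption notifications; a small sketch where session refers to the app's AVCaptureSession:

    import AVFoundation

    func observeInterruptions(for session: AVCaptureSession) {
        NotificationCenter.default.addObserver(
            forName: .AVCaptureSessionWasInterrupted, object: session, queue: .main) { note in
            if let value = note.userInfo?[AVCaptureSessionInterruptionReasonKey] as? Int,
               let reason = AVCaptureSession.InterruptionReason(rawValue: value) {
                print("capture session interrupted, reason:", reason.rawValue)
            }
        }
        NotificationCenter.default.addObserver(
            forName: .AVCaptureSessionInterruptionEnded, object: session, queue: .main) { _ in
            print("capture session interruption ended")
        }
    }

Attaching these logs to the error reports should at least show whether the interruption is transient (ends shortly after start) or persists for the whole session attempt.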
Replies: 0 · Boosts: 0 · Views: 100 · Activity: Jun ’25
Capture session interrupted randomly
We are facing a strange issue where a small portion of our large userbase can not start the capture session in our app, as it gets interrupted with the following reason: AVCaptureSessionInterruptionReasonVideoDeviceNotAvailableWithMultipleForegroundApps Our users are all from iPhones, no one is using an iPad. Just to be sure we have set session.isMultitaskingCameraAccessEnabled = true but it does not seem to make any difference. Another weird interruption we are seeing
Replies: 0 · Boosts: 0 · Views: 107 · Activity: Jun ’25
Is a Locked Capture Extension allowed to just "open the app" when the device is unlocked?
Hey, quick question. I noticed that Adobe's new app, Project Indigo, allows you to open the app using the Camera Control button. However, when your device is locked it just shows this screen: Would this normally be approved by the App Store review process? I ask because I would like to do something similar with my camera app. I know that this is not the best user experience, but my app's UI is not built in Swift and I don't have the resources to build the UI again. At least this way the user experience would be improved from what it is now, where users cannot even launch the app. I get many requests per week about this feature and would love to improve the UX for my users, even if it's not the best possible. Thanks, Alex
Replies: 0 · Boosts: 0 · Views: 191 · Activity: Jun ’25
When deleting photos, encountered PHPhotosError.operationInterrupted (3301).
Hi, I've developed a photo app that includes a photo deletion feature. Some users have reported encountering PHPhotosError.operationInterrupted (3301) when attempting to delete photos. Initially, I suspected that some of the assets might have a sourceType of typeiTunesSynced, since the documentation notes that iTunes-synced assets cannot be edited or deleted. However, after checking the logs, all of the assets involved are of typeUserLibrary. Additionally, the user mentioned that some photos in the iPhone Photos app do not show a delete button. I'm unsure whether the absence of the delete button is related to the 3301 error.

I'd like to confirm the following:
Under what conditions does PHPhotosError.operationInterrupted (3301) occur, and how should it be handled?
Why do some photos in the iPhone Photos app not show a delete button?

The code for deleting photos is as follows:

    PHPhotoLibrary *library = [PHPhotoLibrary sharedPhotoLibrary];
    [library performChanges:^{
        PHFetchResult *assetsToBeDeleted = [PHAsset fetchAssetsWithLocalIdentifiers:delUrls options:nil];
        if (assetsToBeDeleted) {
            [PHAssetChangeRequest deleteAssets:assetsToBeDeleted];
        }
    } completionHandler:^(BOOL success, NSError *error) {
        // handle success / error
    }];
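As a diagnostic, it may help to log each affected asset's state before attempting the delete: whether it reports itself as deletable and what the current library authorization level is. Whether either of these correlates with the 3301 error is something to verify; a sketch in Swift:

    import Photos

    func logDeletability(localIdentifiers: [String]) {
        let status = PHPhotoLibrary.authorizationStatus(for: .readWrite)
        print("photo library authorization:", status.rawValue)   // .limited vs .authorized matters for changes

        let assets = PHAsset.fetchAssets(withLocalIdentifiers: localIdentifiers, options: nil)
        assets.enumerateObjects { asset, _, _ in
            print(asset.localIdentifier,
                  "sourceType:", asset.sourceType.rawValue,
                  "canDelete:", asset.canPerform(.delete))
        }
    }

Assets that report canPerform(.delete) == false would be worth comparing against the ones missing the delete button in the Photos app.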
Replies: 0 · Boosts: 0 · Views: 99 · Activity: Jun ’25
Memory leak when performing DetectHumanBodyPose3DRequest request
Hi, I'm developing an application for macOS and iOS that has to run the DetectHumanBodyPose3DRequest model in real time to retrieve the 3D skeleton from the camera. I'm experiencing a memory leak every time the model is used (when I comment that line out, the memory stays constant). After a minute it uses about 1 GB of RAM running with Mac Catalyst. I attached a minimal project that has this problem.

CameraView:

    import SwiftUI
    import Combine
    import Vision

    struct CameraView: View {
        @StateObject private var viewModel = CameraViewModel()

        var body: some View {
            HStack {
                ZStack {
                    GeometryReader { geometry in
                        if let image = viewModel.currentFrame {
                            Image(decorative: image, scale: 1)
                                .resizable()
                                .scaledToFill()
                                .frame(width: geometry.size.width, height: geometry.size.height)
                                .clipped()
                        } else {
                            ProgressView()
                        }
                    }
                }
            }
        }
    }

    class CameraViewModel: ObservableObject {
        @Published var currentFrame: CGImage?
        @Published var frameRate: Double = 0
        @Published var currentVisionBodyPose: HumanBodyPose3DObservation? // Store current body pose
        @Published var currentImageSize: CGSize?                          // Store current image size

        private var cameraManager: CameraManager?
        private var humanBodyPose = HumanBodyPose3DDetector()
        private var lastClassificationTime = Date()
        private var frameCount = 0
        private var lastFrameTime = Date()
        private let classificationThrottleInterval: TimeInterval = 1.0
        private var lastPoseSendTime: Date = .distantPast

        init() {
            cameraManager = CameraManager()
            startPreview()
            startClassification()
        }

        private func startPreview() {
            Task {
                guard let previewStream = cameraManager?.previewStream else { return }
                for await frame in previewStream {
                    let size = CGSize(width: frame.width, height: frame.height)
                    Task { @MainActor in
                        self.currentFrame = frame
                        self.currentImageSize = size
                        self.updateFrameRate()
                    }
                }
            }
        }

        private func startClassification() {
            Task {
                guard let classificationStream = cameraManager?.classificationStream else { return }
                for await pixelBuffer in classificationStream {
                    self.classifyFrame(pixelBuffer: pixelBuffer)
                }
            }
        }

        private func classifyFrame(pixelBuffer: CVPixelBuffer) {
            humanBodyPose.runHumanBodyPose3DRequestOnImage(pixelBuffer: pixelBuffer) { [weak self] observation in
                guard let self = self else { return }
                DispatchQueue.main.async {
                    if let observation = observation {
                        self.currentVisionBodyPose = observation
                        print(observation)
                    } else {
                        self.currentVisionBodyPose = nil
                    }
                }
            }
        }

        private func updateFrameRate() {
            frameCount += 1
            let now = Date()
            let elapsed = now.timeIntervalSince(lastFrameTime)
            if elapsed >= 1.0 {
                frameRate = Double(frameCount) / elapsed
                frameCount = 0
                lastFrameTime = now
            }
        }
    }

HumanBodyPose3DDetector:

    import Foundation
    import Vision

    class HumanBodyPose3DDetector: NSObject, ObservableObject {
        @Published var humanObservation: HumanBodyPose3DObservation? = nil

        private let queue = DispatchQueue(label: "humanbodypose.queue")
        private let request = DetectHumanBodyPose3DRequest()

        private struct SendablePixelBuffer: @unchecked Sendable {
            let buffer: CVPixelBuffer
        }

        public func runHumanBodyPose3DRequestOnImage(pixelBuffer: CVPixelBuffer,
                                                     completion: @escaping (HumanBodyPose3DObservation?) -> Void) {
            let sendableBuffer = SendablePixelBuffer(buffer: pixelBuffer)
            queue.async { [weak self] in
                Task { [weak self, sendableBuffer] in
                    do {
                        guard let self = self else { return }
                        let result = try await self.request.perform(on: sendableBuffer.buffer)
                        // process result
                        DispatchQueue.main.async {
                            if result.isEmpty {
                                completion(nil)
                            } else {
                                completion(result[0])
                            }
                        }
                    } catch {
                        DispatchQueue.main.async {
                            completion(nil)
                        }
                    }
                }
            }
        }
    }
Replies: 1 · Boosts: 0 · Views: 139 · Activity: Jun ’25