Photos and Imaging

Integrate still images and other forms of photography into your apps.

76 Posts

iPhone 14 Pro Max produces corrupted images
My iPhone produces corrupted images under certain conditions. If I shoot the same scene (with a slightly varying angle) in the same lighting conditions, I almost always receive a corrupted photo that contains a magenta copy of the image and a green rectangle. If I add other objects to the scene, changing the overall brightness, the chance of hitting the bug drops significantly.

Device info: iPhone 14 Pro Max (iOS 17 RC), iPhone 14 Pro (iOS 17 beta 6)

Images with the issue:

f/1.664, 1/25s, ISO 500, digitalZoom=1.205, brightness=-0.568
f/1.664, 1/25s, ISO 500, digitalZoom=1.205, brightness=-0.514
f/1.664, 1/25s, ISO 640, digitalZoom=1.205, brightness=-0.641
f/1.664, 1/25s, ISO 500, digitalZoom=1.205, brightness=-0.448
f/1.664, 1/25s, ISO 500, digitalZoom=1.205, brightness=-0.132

Images without the issue:

f/1.664, 1/25s, ISO 500, digitalZoom=1.205, brightness=-0.456
f/1.664, 1/20s, ISO 640, digitalZoom=1.205, brightness=-1.666
f/1.664, 1/100s, ISO 50, digitalZoom=1.205, brightness=4.840
f/1.664, 1/25s, ISO 640, digitalZoom=1.205, brightness=-0.774

I'm using builtInWideAngleCamera with continuousAutoExposure, continuousAutoFocus, and a slight videoZoomFactor.

```swift
func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
    if let error = error {
        capturePhotoCallback?(.failure(.internalError(error.localizedDescription)))
        return
    }
    guard let data = photo.fileDataRepresentation() else {
        capturePhotoCallback?(.failure(.internalError("Can not get data representation.")))
        return
    }
    guard let image = UIImage(data: data) else {
        capturePhotoCallback?(.failure(.internalError("Can not get image from data representation.")))
        return
    }
    capturePhotoCallback?(.success(image))
}
```
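For anyone trying to reproduce this, below is a minimal sketch (assembled from the description above, not the poster's code) of the capture setup involved: the wide-angle camera with continuous auto exposure/focus and the slight digital zoom taken from the EXIF data in the report.

```swift
import AVFoundation

// Hypothetical helper reproducing the configuration described in the post.
// The zoom factor 1.205 matches the EXIF data listed above.
func makeCaptureSession() throws -> AVCaptureSession {
    let session = AVCaptureSession()
    session.sessionPreset = .photo

    guard let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                               for: .video,
                                               position: .back) else {
        throw NSError(domain: "CaptureSetup", code: -1, userInfo: nil)
    }

    try device.lockForConfiguration()
    if device.isExposureModeSupported(.continuousAutoExposure) {
        device.exposureMode = .continuousAutoExposure
    }
    if device.isFocusModeSupported(.continuousAutoFocus) {
        device.focusMode = .continuousAutoFocus
    }
    device.videoZoomFactor = 1.205 // slight digital zoom, as in the report
    device.unlockForConfiguration()

    let input = try AVCaptureDeviceInput(device: device)
    if session.canAddInput(input) { session.addInput(input) }

    let output = AVCapturePhotoOutput()
    if session.canAddOutput(output) { session.addOutput(output) }

    return session
}
```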
0 replies · 0 boosts · 481 views · Sep ’23
[PHImageManager requestImage] crash only on iOS 17.0
Unhandled error (NSCocoaErrorDomain, 134093) occurred during faulting and was thrown: Error Domain=NSCocoaErrorDomain Code=134093 "(null)"

```
Fatal Exception: NSInternalInconsistencyException
0   CoreFoundation          0xed5e0    __exceptionPreprocess
1   libobjc.A.dylib         0x2bc00    objc_exception_throw
2   CoreData                0x129c8    _PFFaultHandlerLookupRow
3   CoreData                0x11d60    _PF_FulfillDeferredFault
4   CoreData                0x11c58    _pvfk_header
5   CoreData                0x98e64    _sharedIMPL_pvfk_core_c
6   PhotoLibraryServices    0x6d8b0    -[PLInternalResource orientation]
7   PhotoLibraryServices    0x6d7bc    -[PLInternalResource orientedWidth]
8   Photos                  0x147e74   ___presentFullResourceAtIndex_block_invoke
9   PhotoLibraryServices    0x174ee4   __53-[PLManagedObjectContext _directPerformBlockAndWait:]_block_invoke
10  CoreData                0x208ec    developerSubmittedBlockToNSManagedObjectContextPerform
11  libdispatch.dylib       0x4300     _dispatch_client_callout
12  libdispatch.dylib       0x136b4    _dispatch_lane_barrier_sync_invoke_and_complete
13  CoreData                0x207f8    -[NSManagedObjectContext performBlockAndWait:]
14  PhotoLibraryServices    0x174e98   -[PLManagedObjectContext _directPerformBlockAndWait:]
15  PhotoLibraryServices    0x1738c8   -[PLManagedObjectContext performBlockAndWait:]
16  Photos                  0x147d30   _presentFullResourceAtIndex
17  Photos                  0x1476bc   PHChooserListContinueEnumerating
18  Photos                  0x1445e0   -[PHImageResourceChooser presentNextQualifyingResource]
19  Photos                  0x2ea74    -[PHImageRequest startRequest]
20  Photos                  0x3f2c0    -[PHMediaRequestContext _registerAndStartRequests:]
21  Photos                  0x3e484    -[PHMediaRequestContext start]
22  Photos                  0x1f0710   -[PHImageManager runRequestWithContext:]
23  Photos                  0x1efdb0   -[PHImageManager requestImageDataAndOrientationForAsset:options:resultHandler:]
24  TeraBox                 0x2497f0c  closure #1 in LocalPhotoLibManager.getDataFrom(_:_:) + 549 (LocalPhotoLibManager.swift:549)
25  TeraBox                 0x1835fc4  thunk for @escaping @callee_guaranteed () -> () (<compiler-generated>)
26  TeraBox                 0x1cb1288  +[DuboxOCException tryOC:catchException:] + 18 (DuboxOCException.m:18)
27  TeraBox                 0x249b4d4  specialized LocalPhotoLibManager.convert(with:_:) + 548 (LocalPhotoLibManager.swift:548)
28  TeraBox                 0x2493b24  closure #1 in closure #1 in closure #1 in LocalPhotoLibManager.scanAlbumUpdateLocalphotoTable(_:) + 173 (LocalPhotoLibManager.swift:173)
29  TeraBox                 0x1835fc4  thunk for @escaping @callee_guaranteed () -> () (<compiler-generated>)
30  libdispatch.dylib       0x26a8     _dispatch_call_block_and_release
31  libdispatch.dylib       0x4300     _dispatch_client_callout
32  libdispatch.dylib       0x744c     _dispatch_queue_override_invoke
33  libdispatch.dylib       0x15be4    _dispatch_root_queue_drain
34  libdispatch.dylib       0x163ec    _dispatch_worker_thread2
35  libsystem_pthread.dylib 0x1928     _pthread_wqthread
36  libsystem_pthread.dylib 0x1a04     start_wqthread
```
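The exception is thrown inside PhotoKit's Core Data layer, so it likely cannot be handled from app code. For reference, here is a hedged sketch of the failing call with the checks that *are* available to the app; the helper name is mine, not from the trace.

```swift
import Photos

// A minimal sketch of the call that crashes in the trace above, with the
// result-handler checks an app can do; the NSInternalInconsistencyException
// itself is raised inside PhotoKit and cannot be caught from Swift.
func loadImageData(for asset: PHAsset,
                   completion: @escaping (Data?) -> Void) {
    let options = PHImageRequestOptions()
    options.isNetworkAccessAllowed = true
    options.deliveryMode = .highQualityFormat

    PHImageManager.default().requestImageDataAndOrientation(for: asset,
                                                            options: options) { data, _, _, info in
        // PHImageErrorKey is set when PhotoKit reports a recoverable error.
        if let error = info?[PHImageErrorKey] as? NSError {
            print("PhotoKit error: \(error)")
            completion(nil)
            return
        }
        completion(data)
    }
}
```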
6 replies · 5 boosts · 1.4k views · Nov ’23
iPhone system album video cannot play
We have received a lot of user feedback saying that our app caused videos in the user's system album to stop playing. We did reproduce this after operating some modules of our app many times. After monitoring the device log and tapping an affected video in the system album, we saw roughly the following abnormal output:

```
VideoContentProvider received result:<AVPlayerItem: 0x281004850, asset = <AVURLAsset: 0x28128fce0, URL = file:///var/mobile/Media/DCIM/100APPLE/IMG_0085.MP4>>, info:{ PHImageResultRequestIDKey = 316; }, priority:oneup automatic, strategy:<PXDisplayAssetVideoContentDeliveryStrategy: 0x2836c3000>quality: medium+(med-high), segment:{ nan - nans }, streaming:YES, network:YES, audio:YES, targetSize:{1280, 1280}, displayAsset:8E30C461-B089-4142-82D9-3A8CFF3B5DE9

<PUBrowsingVideoPlayer: 0xc46a59770>
Asset : <PHAsset: 0xc48f5fc50> 8E30C461-B089-4142-82D9-3A8CFF3B5DE9/L0/001 mediaType=2/524288, sourceType=1, (828x1792), creationDate=2023-07-19 上午7:36:41 +0000, location=0, hidden=0, favorite=0, adjusted=0
VideoSession : <PXVideoSession 0xc48a1ec50> {
  Content Provider: <PXPhotoKitVideoContentProvider: 0x282d441e0>, Asset <PHAsset: 0xc48f5fc50> 8E30C461-B089-4142-82D9-3A8CFF3B5DE9/L0/001 mediaType=2/524288, sourceType=1, (828x1792), creationDate=2023-07-19 上午7:36:41 +0000, location=0, hidden=0, favorite=0, adjusted=0
  Media Provider: <PUPhotoKitMediaProvider: 0x28104da70>
  Desired Play State: Paused
  Play State: Paused
  Stalled: 0
  At Beginning: 1 End: 0
  Playback: ‖ Paus √ b0 a0 s0 l1 f0 e0 r0.0 0.000/60.128
  VideoOutput: (null)
  Got First Pixel Buffer: NO
  Pixel Buffer Frame Drops: 0
  Buffering: 0
}: Starting disabling of video loading for reason: OutOfFocus

<PUBrowsingVideoPlayer: 0xc46de66e0>
Asset : <PHAsset: 0xc48f5f1d0> 11ECA95E-0B79-4C7C-97C6-5958EE139BAB/L0/001 mediaType=2/0, sourceType=1, (1080x1920), creationDate=2023-09-21 上午7:54:46 +0000, location=1, hidden=0, favorite=0, adjusted=0
VideoSession : (null): Starting disabling of video loading for reason: OutOfFocus
```

I think this message is the important one:

```
VideoSession : (null): Starting disabling of video loading for reason: OutOfFocus
```

Restarting the iPhone resolves the anomaly. Do you know the reason, or how to avoid this bug? Similar reports: https://discussionschinese.apple.com/thread/254766045 and https://discussionschinese.apple.com/thread/254787836
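For anyone comparing, here is a minimal sketch (mine, not from the app in question) of how the AVPlayerItem shown in that log is normally requested through PhotoKit, with the error key checked before playback.

```swift
import Photos
import AVKit

// A sketch of the standard way to obtain an AVPlayerItem for a PHAsset;
// checking PHImageErrorKey is the only failure signal PhotoKit exposes here.
func playerItem(for videoAsset: PHAsset,
                completion: @escaping (AVPlayerItem?) -> Void) {
    let options = PHVideoRequestOptions()
    options.isNetworkAccessAllowed = true
    options.deliveryMode = .automatic

    PHImageManager.default().requestPlayerItem(forVideo: videoAsset,
                                               options: options) { item, info in
        if let error = info?[PHImageErrorKey] as? NSError {
            print("PhotoKit error: \(error)")
            completion(nil)
            return
        }
        completion(item)
    }
}
```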
1 reply · 0 boosts · 368 views · Sep ’23
UIImageView preferredImageDynamicRange not working
I am trying to display HDR images (ProRAW) within a UIImageView using preferredImageDynamicRange, as shown in a 2023 WWDC video.

```swift
let imageView = UIImageView()
if #available(iOS 17.0, *) {
    self.imageView.preferredImageDynamicRange = UIImage.DynamicRange.high
}
self.imageView.clipsToBounds = true
self.imageView.isMultipleTouchEnabled = true
self.imageView.contentMode = .scaleAspectFit
self.photoScrollView.addSubview(self.imageView)
```

I pull the image from PHImageManager:

```swift
let options = PHImageRequestOptions()
options.deliveryMode = .highQualityFormat
options.isNetworkAccessAllowed = true
PHImageManager.default().requestImage(for: asset,
                                      targetSize: self.targetSize(),
                                      contentMode: .aspectFit,
                                      options: options) { image, info in
    guard let image = image else { return }
    DispatchQueue.main.async {
        self.imageView.image = image
        if #available(iOS 17.0, *) {
            self.imageView.preferredImageDynamicRange = UIImage.DynamicRange.high
        }
    }
}
```

Issue: the image shows successfully, yet not in HDR mode (no bright specular highlights, as seen when the same ProRAW image is viewed in the native Camera app). What am I missing here?
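One possible workaround to try, assuming the SDR result comes from the decode path of requestImage itself: fetch the original image data and decode it with UIImageReader (iOS 17), which can preserve high dynamic range. This is a sketch, not a confirmed fix.

```swift
import Photos
import UIKit

// Sketch: request the asset's original data and decode it HDR-aware.
@available(iOS 17.0, *)
func loadHDRImage(for asset: PHAsset,
                  completion: @escaping (UIImage?) -> Void) {
    let options = PHImageRequestOptions()
    options.deliveryMode = .highQualityFormat
    options.isNetworkAccessAllowed = true

    PHImageManager.default().requestImageDataAndOrientation(for: asset,
                                                            options: options) { data, _, _, _ in
        guard let data else { completion(nil); return }
        var config = UIImageReader.Configuration()
        config.prefersHighDynamicRange = true
        let reader = UIImageReader(configuration: config)
        completion(reader.image(data: data))
    }
}
```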
1 reply · 0 boosts · 666 views · Sep ’23
Memory increasing when changing image background
```swift
struct ContentView: View {
    @State var listOfImages: [String] = ["One", "Two", "Three", "Four"]
    @State var counter = 0

    var body: some View {
        VStack {
            Button(action: {
                counter += 1
            }, label: {
                Text("Next Image")
            })
        }
        .background(Image(listOfImages[counter % listOfImages.count])) // wrap to avoid an index-out-of-range crash
        .padding()
    }
}
```

When I tap the button, counter increases and the next image is displayed as the background. The app's memory usage grows with each image change. Is there any way to maintain steady memory use?
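One common mitigation, assuming the growth comes from full-size decoded bitmaps being kept in the system image cache (an assumption, not a confirmed diagnosis of this snippet): downsample each image to its display size with ImageIO instead of using `Image(_:)` on the full asset. The function name and the ".jpg" extension below are hypothetical.

```swift
import UIKit
import ImageIO

// Sketch: decode a bundled image at a bounded pixel size, bypassing the
// full-resolution decode that Image(_:)/UIImage(named:) performs and caches.
func downsampledImage(named name: String, maxPixelSize: CGFloat) -> UIImage? {
    guard let url = Bundle.main.url(forResource: name, withExtension: "jpg"),
          let source = CGImageSourceCreateWithURL(url as CFURL, nil) else {
        return nil
    }
    let options: [CFString: Any] = [
        kCGImageSourceCreateThumbnailFromImageAlways: true,
        kCGImageSourceThumbnailMaxPixelSize: maxPixelSize,
        kCGImageSourceCreateThumbnailWithTransform: true
    ]
    guard let cgImage = CGImageSourceCreateThumbnailAtIndex(source, 0, options as CFDictionary) else {
        return nil
    }
    return UIImage(cgImage: cgImage)
}
```

In SwiftUI the result can be shown with `Image(uiImage:)`, so only the downsampled bitmap is ever resident.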
0 replies · 0 boosts · 442 views · Sep ’23
How can we access the new focal lengths on iPhone 15 models?
The latest iPhone 15 Pro models support additional focal lengths on the main 24mm (1x) lens: 28mm ("1.2x") and 35mm ("1.5x"). These are supposed to use data from the full sensor to achieve optical quality images (i.e. no upscaling), so I would expect these new focal lengths to appear in the secondaryNativeResolutionZoomFactors array, just like the 2x option does. However, the activeFormat.secondaryNativeResolutionZoomFactors property still only reports [2.0] when using the main 1x lens. Is this an oversight, or is there something special (other than setting the zoom factor) we need to do to access the high-quality 28mm and 35mm modes? I'm wary of simply setting 1.2 or 1.5 as the zoom factor, as that isn't truly the ratio between the base 24mm and the virtual focal lengths.
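For reference, a short sketch (mine, not from the post) of reading and applying the reported factors; today this prints [2.0] on the 1x lens, and the open question is whether 1.2 and 1.5 should appear here too.

```swift
import AVFoundation

// Inspect the native-resolution zoom factors for the current format and
// apply one if present.
func applySecondaryNativeZoom(on device: AVCaptureDevice) throws {
    let factors = device.activeFormat.secondaryNativeResolutionZoomFactors
    print("Reported factors: \(factors)") // e.g. [2.0] on the main lens today

    // If the 28mm/35mm modes were exposed here, values near 1.2 / 1.5
    // would be expected in this array.
    if let factor = factors.first {
        try device.lockForConfiguration()
        device.videoZoomFactor = CGFloat(truncating: factor)
        device.unlockForConfiguration()
    }
}
```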
2 replies · 3 boosts · 951 views · Oct ’23
Camera view like Photographic Styles in iOS 13
Hi developers. I'm currently developing an iOS camera app, and I need to add a swipeable picker view like Photographic Styles in iOS 13 — just the paged view, not the filters themselves. I tried UIPageViewController with a swipe gesture, but while a page is shown in the background, the main camera view's functions also run. I also tried a single button that should present the styles-like view when pressed, but I can't get that working either. If anyone can help with this, please do. Thanks in advance.
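A minimal sketch of one way to do this, assuming the goal is a swipeable picker overlaid on a live camera preview: a UIPageViewController whose pages are fully transparent, so the preview underneath stays visible and untouched while only the overlay swipes. The class, page titles, and layout below are all placeholders.

```swift
import UIKit

// Hypothetical overlay pager: transparent pages over a camera preview.
final class StylePagerController: UIPageViewController, UIPageViewControllerDataSource {
    // Placeholder titles; the real app would supply its own style views.
    private lazy var pages: [UIViewController] = ["Standard", "Rich Contrast", "Vibrant"].map { title in
        let vc = UIViewController()
        vc.view.backgroundColor = .clear // camera preview shows through
        let label = UILabel()
        label.text = title
        label.textColor = .white
        label.translatesAutoresizingMaskIntoConstraints = false
        vc.view.addSubview(label)
        NSLayoutConstraint.activate([
            label.centerXAnchor.constraint(equalTo: vc.view.centerXAnchor),
            label.bottomAnchor.constraint(equalTo: vc.view.safeAreaLayoutGuide.bottomAnchor, constant: -40)
        ])
        return vc
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        view.backgroundColor = .clear
        dataSource = self
        setViewControllers([pages[0]], direction: .forward, animated: false)
    }

    func pageViewController(_ pvc: UIPageViewController, viewControllerBefore vc: UIViewController) -> UIViewController? {
        guard let i = pages.firstIndex(of: vc), i > 0 else { return nil }
        return pages[i - 1]
    }

    func pageViewController(_ pvc: UIPageViewController, viewControllerAfter vc: UIViewController) -> UIViewController? {
        guard let i = pages.firstIndex(of: vc), i < pages.count - 1 else { return nil }
        return pages[i + 1]
    }
}
```

Instantiate it with `StylePagerController(transitionStyle: .scroll, navigationOrientation: .horizontal, options: nil)` and add it as a child view controller above the preview layer; because the pager's views are transparent, touches and rendering of the camera view below are unaffected.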
0 replies · 0 boosts · 351 views · Oct ’23
How to execute photoLibraryDidChange(_:) when the app is in background?
The goal is to get/save a captured photo (from the default Camera app) immediately in my app while my app is in the background. When I capture a photo with the default camera, the photoLibraryDidChange(_:) function does not execute at that time. Only when I reopen my app does the function execute and deliver the images that were captured in the meantime. How can photoLibraryDidChange(_:) execute while the app is in the background?
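For context, here is a sketch of the standard change-observer setup. The behavior described above is consistent with the iOS app lifecycle: a suspended app does not execute code, so PhotoKit delivers the accumulated change when the app returns to the foreground; whether any background mode can change that is the open question.

```swift
import Photos

// Standard PHPhotoLibraryChangeObserver registration; insertions since the
// last fetch are reported when the callback finally fires.
final class LibraryObserver: NSObject, PHPhotoLibraryChangeObserver {
    private var lastFetch: PHFetchResult<PHAsset>

    override init() {
        lastFetch = PHAsset.fetchAssets(with: .image, options: nil)
        super.init()
        PHPhotoLibrary.shared().register(self)
    }

    deinit { PHPhotoLibrary.shared().unregisterChangeObserver(self) }

    func photoLibraryDidChange(_ changeInstance: PHChange) {
        guard let details = changeInstance.changeDetails(for: lastFetch) else { return }
        lastFetch = details.fetchResultAfterChanges
        for asset in details.insertedObjects {
            print("New asset: \(asset.localIdentifier)")
        }
    }
}
```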
1 reply · 0 boosts · 368 views · Oct ’23
Photos app doesn't properly show OpenEXR images on HDR display
I'm working on a game which uses HDR display output for a much brighter range. One feature of the game is the ability to export in-game photos, and the only appropriate format I found for this is OpenEXR. The built-in Photos app is capable of showing HDR photos on an HDR display. However, if I drop an EXR file with a large range into Photos, it is not displayed in HDR mode with the full range. At the same time, pressing Edit on the file makes it HDR-displayable, and it remains displayable if I save the edit with any change, even a tiny one. Moreover, if the broken EXR file is placed next to a 'true' HDR photo (or an EXR 'fixed' as above), then during a scroll between the files, the broken EXR magically fixes itself at the exact moment the other HDR image scrolls onto the screen. I tested different files with various internal formats; it seems to be a common problem for all of them. Tested on the latest iOS 17.0.3. Thank you in advance.
1 reply · 0 boosts · 564 views · Apr ’24
ICCameraDevice Takes Forever to Be Ready
Using ImageCaptureCore to send PTP commands to cameras over tether, I noticed that all of my Nikon cameras can take up to an entire minute for PTP events to start logging, while my Canons and Sonys are ready instantly. Any idea why? I use ICDeviceBrowser to browse for cameras and then request to open the session. According to the docs, a device is ready only after it enumerates its objects. If that's the case, is there a way to bypass that? Even with an empty SD card it's slow.
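For reference, a minimal sketch of the browse-and-open flow described above, using only the standard ImageCaptureCore delegate methods; per the docs cited in the post, readiness is reported only after content enumeration, which is where the Nikon delay would show up.

```swift
import ImageCaptureCore

// Sketch: find cameras, open a session, and log when each becomes ready.
final class CameraFinder: NSObject, ICDeviceBrowserDelegate, ICDeviceDelegate {
    private let browser = ICDeviceBrowser()

    override init() {
        super.init()
        browser.delegate = self
        browser.browsedDeviceTypeMask = .camera
        browser.start()
    }

    func deviceBrowser(_ browser: ICDeviceBrowser, didAdd device: ICDevice, moreComing: Bool) {
        device.delegate = self
        device.requestOpenSession() // readiness is reported later, after enumeration
    }

    func deviceBrowser(_ browser: ICDeviceBrowser, didRemove device: ICDevice, moreGoing: Bool) {}

    func device(_ device: ICDevice, didOpenSessionWithError error: Error?) {
        print("Session opened, error: \(String(describing: error))")
    }

    // Fires only after object enumeration completes — the delay in question.
    func deviceDidBecomeReady(_ device: ICDevice) {
        print("\(device.name ?? "camera") is ready")
    }

    func device(_ device: ICDevice, didCloseSessionWithError error: Error?) {}
    func didRemove(_ device: ICDevice) {}
}
```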
1 reply · 0 boosts · 519 views · Oct ’23
Crashes in requestSendPTPCommand(_:outData:completion:) of ImageCaptureCore framework
Hi there :) We're on our way to making an app that can communicate with DSLR cameras using the ImageCaptureCore framework for PTP communication. In our app, we send PTP commands to the camera using requestSendPTPCommand(_:outData:completion:). This is our snippet to execute a PTP command:

```swift
public extension ICCameraDevice {
    func sendCommand(command: Command) async {
        do {
            print("sendCommand ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++")
            print("sendCommand \(command.tag()) : sendCommand Started")
            let result = try await self.requestSendPTPCommand(command.encodeCommand().commandBuffer, outData: nil)
            let (data, response) = result
            print("sendCommand \(command.tag()) : sendCommand Finished")
            print("sendCommand data: \(data.bytes.count)")
            print("sendCommand response: \(response.bytes.count)")
            if !response.bytes.isEmpty {
                command.decodeResponse(responseData: response)
            }
            print("sendCommand \(command.tag()) : sendCommand Finished with response code \(command.responseCode)")
            print("sendCommand ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++")
            if command.responseCode != .ok {
                isRunning = false
                errorResponseCode = command.responseCode.rawValue
                print("response error with code = \(command.responseCode)")
                return
            }
            let copiedData = data.deepCopy()
            command.decodeData(data: copiedData)
        } catch {
            isRunning = false
            print("Error Send Command with error: \(error.localizedDescription)")
        }
    }
}
```

The function sendCommand(command:) is called in a while loop in an async/await way, so it waits for a sent command to finish before executing the next one. The loop keeps running from the moment the device connects to the camera. The result: the process runs with no problem at all for several minutes — it can get the camera's settings, device info, even its images. But then problems occur. The amount of time is random: sometimes it takes only 15 minutes, sometimes 1 hour. Two problems were recorded in our case:

1. requestSendPTPCommand(_:outData:completion:) returns empty data without throwing any error — nothing is ever caught in the catch block. This is my printed result:

```
sendCommand +++++++++++++++++++++++++++++++++++++
sendCommand GetObjectHandlesCommand : sendCommand Started
sendCommand GetObjectHandlesCommand : sendCommand Finished
sendCommand data: 0
sendCommand response: 0
sendCommand GetObjectHandlesCommand : sendCommand Finished with response code undefined
sendCommand +++++++++++++++++++++++++++++++++++++
```

2. It crashes, with the last messages in my logger:

```
sendCommand +++++++++++++++++++++++++++++++++++++
sendCommand GetObjectHandlesCommand : sendCommand Started
2023-10-27 10:44:37.186768+0700 PTPHelper_Example[76486:11538353] [PTPHelper_Example] remoteCamera ! Canon EOS 200D - Error Domain=NSCocoaErrorDomain Code=4097 "connection to service with pid 76493 created from an endpoint" UserInfo={NSDebugDescription=connection to service with pid 76493 created from an endpoint}
2023-10-27 10:44:37.187146+0700 PTPHelper_Example[76486:11538353] [PTPHelper_Example] failureBlock ! Canon EOS 200D - Failure block was called due to a connection failure.
```

For the crash issue, I tried to attach the log to this post, but it always failed with the message "An error occured while uploading this log. Please try again later.", so I uploaded it to my Google Drive instead: https://drive.google.com/file/d/1IvJohGwp1zSkncTWHc3430weGntciB3K/view?usp=sharing

Reproduced on iOS 16.3.1. I've checked the stack traces of the other threads, but nothing suspicious got my attention. It might relate to this issue: https://developer.apple.com/forums/thread/104576 — but I can't be sure. Any good ideas on how to address these crashes? Thank you!
3 replies · 0 boosts · 492 views · Oct ’23
Getting 16bit RGBA data from ProRAW by using CIRAWFilter is quite slow in iOS 17
```swift
guard let rawfilter = CoreImage.CIRAWFilter(imageData: data, identifierHint: nil) else { return }
guard let ciImage = rawfilter.outputImage else { return }

let width = Int(ciImage.extent.width)
let height = Int(ciImage.extent.height)
let rect = CGRect(x: 0, y: 0, width: width, height: height)

let context = CIContext()
guard let cgImage = context.createCGImage(ciImage,
                                          from: rect,
                                          format: .RGBA16,
                                          colorSpace: CGColorSpaceCreateDeviceRGB()) else { return }
print("cgImage prepared")

guard let dataProvider = cgImage.dataProvider else { return }
let rgbaData = CFDataCreateMutableCopy(kCFAllocatorDefault, 0, dataProvider.data)
```

In iOS 16 this process is much faster than the same process in iOS 17. Is there a method to boost the decoding speed?
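One thing worth trying (an assumption on my part, not a confirmed fix for the iOS 17 regression): reuse a single CIContext across images and render straight into a preallocated RGBA16 buffer, which skips both the CGImage round-trip and the CFData copy in the snippet above.

```swift
import CoreImage

// Create once and reuse; CIContext setup is expensive.
let sharedContext = CIContext()

// Sketch: decode ProRAW data directly into a 16-bit-per-channel RGBA buffer.
func rgba16Pixels(from rawData: Data) -> [UInt8]? {
    guard let rawFilter = CIRAWFilter(imageData: rawData, identifierHint: nil),
          let ciImage = rawFilter.outputImage else { return nil }

    let width = Int(ciImage.extent.width)
    let height = Int(ciImage.extent.height)
    let bytesPerRow = width * 8 // 4 channels * 2 bytes each for RGBA16
    var buffer = [UInt8](repeating: 0, count: bytesPerRow * height)

    buffer.withUnsafeMutableBytes { ptr in
        sharedContext.render(ciImage,
                             toBitmap: ptr.baseAddress!,
                             rowBytes: bytesPerRow,
                             bounds: ciImage.extent,
                             format: .RGBA16,
                             colorSpace: CGColorSpaceCreateDeviceRGB())
    }
    return buffer
}
```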
1 reply · 0 boosts · 550 views · Nov ’23
Photogrammetry failed with crash (Assert: in line 417)
I'm developing a 3D scanner that works on an iPad (6th gen, 12-inch). Photogrammetry with ObjectCaptureSession was successful, but my other attempts are not. I've tried Photogrammetry with URL inputs — pictures from AVCapturePhoto. It is strange: if the metadata is not replaced, photogrammetry finishes, but it seems no depthData or gravity info is used (depth and gravity are separate files). If metadata is injected, the trial fails. This time I tried Photogrammetry with a PhotogrammetrySamples sequence, and it also failed. The settings are:

camera: back LiDAR camera
image format: kCVPixelFormatType_32BGRA (fails with a crash) or HEVC (just fails)
image depth format: kCVPixelFormatType_DisparityFloat32 or kCVPixelFormatType_DepthFloat32
photoSetting: isDepthDataDeliveryEnabled = true, isDepthDataFiltered = false, embedded = true

I wonder whether iPad supports Photogrammetry with PhotogrammetrySamples. I've already tested some sample code provided by Apple: https://developer.apple.com/documentation/realitykit/creating_a_photogrammetry_command-line_app https://developer.apple.com/documentation/avfoundation/additional_data_capture/capturing_depth_using_the_lidar_camera https://developer.apple.com/documentation/realitykit/taking_pictures_for_3d_object_capture What should I do to make Photogrammetry succeed?
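For comparison, here is a minimal sketch, assuming iOS 17's RealityKit API, of driving PhotogrammetrySession from a sequence of PhotogrammetrySample values; the sample construction itself (images, depth maps, gravity) is deliberately omitted, since that is where the report above hits trouble.

```swift
import RealityKit

// Sketch: reconstruct a model file from prepared PhotogrammetrySample values.
@available(iOS 17.0, *)
func runPhotogrammetry(samples: some Sequence<PhotogrammetrySample>,
                       outputURL: URL) async throws {
    var configuration = PhotogrammetrySession.Configuration()
    configuration.featureSensitivity = .normal

    let session = try PhotogrammetrySession(input: samples, configuration: configuration)
    try session.process(requests: [.modelFile(url: outputURL)])

    for try await output in session.outputs {
        switch output {
        case .processingComplete:
            print("Done: \(outputURL)")
        case .requestError(let request, let error):
            print("Request \(request) failed: \(error)")
        default:
            break
        }
    }
}
```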
2 replies · 1 boost · 444 views · Apr ’24
iOS 17 ObjectCaptureSession beginNewScanPassAfterFlip
I am trying to implement Object Capture using ObjectCaptureSession in iOS 17. I have been following the example supplied by Apple, but I cannot get the session object into the correct state to allow ObjectCaptureSession.beginNewScanPassAfterFlip() to be called. I get the following error when I call session.beginNewScanPassAfterFlip():

Can't beginNewScanPassAfterFlip() from state == capturing. Must be .paused from .capturing

To start with, there is no .paused state on ObjectCaptureSession, so is this referring to .isPaused? I have tried using session.pause() and confirmed the session is paused using .isPaused, but I get the same error as above. I have checked the output of session.state and confirmed it is .capturing. I have put print statements in the example and confirmed that before session.beginNewScanPassAfterFlip() is called at line 104 of OnboardingButtonView, the state is .capturing. This goes against the documentation on this page: https://developer.apple.com/documentation/realitykit/objectcapturesession/beginnewscanpassafterflip() Note: I have also tried pausing the session and then calling beginNewScanPassAfterFlip(), but this results in a warning. I am hoping for some clarification in case there is something I am missing.
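For clarity, a short sketch, assuming iOS 17's ObjectCaptureSession API, of the sequence the error message appears to ask for; whether pause() alone satisfies the internal "paused" precondition is exactly the open question in this post.

```swift
import RealityKit

// Sketch: pause from .capturing, then request the flip scan pass.
@available(iOS 17.0, *)
@MainActor
func flipAndContinue(_ session: ObjectCaptureSession) {
    guard case .capturing = session.state else {
        print("Can only flip from .capturing; state is \(session.state)")
        return
    }
    session.pause()
    print("isPaused: \(session.isPaused)") // reportedly true, yet the call still warns
    session.beginNewScanPassAfterFlip()
}
```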
0 replies · 0 boosts · 477 views · Nov ’23