I'm developing a 3D scanner that runs on an iPad (6th gen, 12-inch).
Photogrammetry with ObjectCaptureSession was successful, but my other attempts were not.
I've tried Photogrammetry with URL inputs pointing at pictures taken with AVCapturePhoto.
It is strange: if the metadata is not replaced, photogrammetry finishes, but it seems no depth data or gravity info is used (depth and gravity are in separate files). But if the metadata is injected, the attempt fails.
This time I tried Photogrammetry with a PhotogrammetrySample sequence, and it also failed.
The settings are:
camera: back LiDAR camera
image format: kCVPixelFormatType_32BGRA (failed with a crash) or HEVC (just failed)
depth format: kCVPixelFormatType_DisparityFloat32 or kCVPixelFormatType_DepthFloat32
photo settings: isDepthDataDeliveryEnabled = true, isDepthDataFiltered = false, embedsDepthDataInPhoto = true
I wonder whether iPad supports Photogrammetry with PhotogrammetrySample sequences at all.
I've already tested some sample code provided by Apple:
https://developer.apple.com/documentation/realitykit/creating_a_photogrammetry_command-line_app
https://developer.apple.com/documentation/avfoundation/additional_data_capture/capturing_depth_using_the_lidar_camera
https://developer.apple.com/documentation/realitykit/taking_pictures_for_3d_object_capture
What should I do to make Photogrammetry successful?
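For reference, here is a minimal sketch of the sample-sequence path I'm attempting. Everything here is an assumption to be checked against the docs: `captures` is a hypothetical container for buffers already collected from AVCapturePhoto, the color buffer is 32BGRA, the depth map is DepthFloat32 (disparity would need converting), and the sample-sequence initializer may not be available on iPadOS at all — that availability is exactly what's in question.

```swift
import RealityKit
import CoreVideo
import simd

@available(iOS 17.0, *)
func reconstruct(captures: [(color: CVPixelBuffer, depth: CVPixelBuffer, gravity: simd_float3)],
                 outputURL: URL) async throws {
    var samples: [PhotogrammetrySample] = []
    for (index, capture) in captures.enumerated() {
        var sample = PhotogrammetrySample(id: index, image: capture.color)
        sample.depthDataMap = capture.depth   // expected: kCVPixelFormatType_DepthFloat32
        sample.gravity = capture.gravity      // gravity vector captured alongside the photo
        samples.append(sample)
    }
    let session = try PhotogrammetrySession(input: samples,
                                            configuration: PhotogrammetrySession.Configuration())
    try session.process(requests: [.modelFile(url: outputURL)])
    for try await output in session.outputs {
        switch output {
        case .processingComplete:
            return
        case .invalidSample(let id, let reason):
            // Worth logging: this names the exact sample and field that was rejected.
            print("sample \(id) rejected: \(reason)")
        default:
            break
        }
    }
}
```

Watching the `.invalidSample` outputs should at least reveal which field (image format, depth map, or gravity) the session rejects.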
Photos and Imaging
Integrate still images and other forms of photography into your apps.
Posts under Photos and Imaging tag (76 Posts)
guard let rawfilter = CoreImage.CIRAWFilter(imageData: data, identifierHint: nil) else { return }
guard let ciImage = rawfilter.outputImage else { return }
let width = Int(ciImage.extent.width)
let height = Int(ciImage.extent.height)
let rect = CGRect(x: 0, y: 0, width: width, height: height)
let context = CIContext()
guard let cgImage = context.createCGImage(ciImage, from: rect, format: .RGBA16, colorSpace: CGColorSpaceCreateDeviceRGB()) else { return }
print("cgImage prepared")
guard let dataProvider = cgImage.dataProvider else { return }
let rgbaData = CFDataCreateMutableCopy(kCFAllocatorDefault, 0, dataProvider.data)
In iOS 16 this process is much faster than the same process in iOS 17.
Is there a way to speed up the decoding?
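One general optimization worth trying, regardless of the iOS 16/17 difference: create the `CIContext` once and reuse it, instead of constructing one per image as in the snippet above. Context creation compiles and caches render pipelines, so a fresh context on every decode pays that cost each time. A minimal sketch (the `RAWDecoder` wrapper is illustrative, not from the original code):

```swift
import CoreImage

final class RAWDecoder {
    // Shared context, created once and reused for every decode.
    private let context = CIContext(options: [.cacheIntermediates: false])

    func decodeRGBA16(from data: Data) -> CGImage? {
        guard let rawFilter = CIRAWFilter(imageData: data, identifierHint: nil),
              let ciImage = rawFilter.outputImage else { return nil }
        return context.createCGImage(ciImage,
                                     from: ciImage.extent,
                                     format: .RGBA16,
                                     colorSpace: CGColorSpaceCreateDeviceRGB())
    }
}
```

This will not by itself explain an iOS 16 vs. 17 regression, but it removes per-image setup cost from the measurement.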
Hi! I am creating a document-based app and I am wondering how I can save images in my model and integrate them into the document file.
"Image" does not conform to "PersistentModel", and I am not sure that "Data" will be saved in the document file.
Any clue on how I can do it? Many thanks.
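A common pattern, sketched under the assumption that the model is SwiftData: image types don't conform to `PersistentModel`, but `Data` is a persistable property type, and `.externalStorage` keeps large blobs out of the main store. The `PortfolioItem` name is made up for illustration.

```swift
import SwiftData
import Foundation

@Model
final class PortfolioItem {
    var title: String
    // Store the encoded image bytes; .externalStorage lets SwiftData
    // keep large payloads outside the primary store file.
    @Attribute(.externalStorage) var imageData: Data?

    init(title: String, imageData: Data? = nil) {
        self.title = title
        self.imageData = imageData
    }
}
```

At display time, rebuild the image from the bytes (e.g. `UIImage(data:)` / `NSImage(data:)`), and encode with `pngData()` or `jpegData(compressionQuality:)` when saving.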
Hi there :)
We're working on an app that communicates with DSLR cameras, using the ImageCaptureCore framework for PTP communication. In our app, we send PTP commands to the camera using requestSendPTPCommand(_:outData:completion:). This is our code snippet for executing a PTP command:
public extension ICCameraDevice {
    func sendCommand(command: Command) async {
        do {
            print("sendCommand ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++")
            print("sendCommand \(command.tag()) : sendCommand Started")
            let result = try await self.requestSendPTPCommand(command.encodeCommand().commandBuffer, outData: nil)
            let (data, response) = result
            print("sendCommand \(command.tag()) : sendCommand Finished")
            print("sendCommand data: \(data.bytes.count)")
            print("sendCommand response: \(response.bytes.count)")
            if !response.bytes.isEmpty {
                command.decodeResponse(responseData: response)
            }
            print("sendCommand \(command.tag()) : sendCommand Finished with response code \(command.responseCode)")
            print("sendCommand ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++")
            if command.responseCode != .ok {
                isRunning = false
                errorResponseCode = command.responseCode.rawValue
                print("response error with code = \(command.responseCode)")
                return
            }
            let copiedData = data.deepCopy()
            command.decodeData(data: copiedData)
        } catch {
            isRunning = false
            print("Error Send Command with error: \(error.localizedDescription)")
        }
    }
}
The function sendCommand(command: Command) async is called in a while loop using async/await, so it waits for each command to finish before executing the next. The loop keeps running as long as the device is connected to the camera.
As a result, the process runs with no problem at all for several minutes: it can get the camera's settings, device info, even its images. But then problems occur. The amount of time is random; sometimes it takes only 15 minutes, sometimes 1 hour.
There are two problems recorded in our case:
1. requestSendPTPCommand(_:outData:completion:) returns empty data without throwing any error; nothing is ever caught in the catch block.
This is my printed result:
sendCommand +++++++++++++++++++++++++++++++++++++
sendCommand GetObjectHandlesCommand : sendCommand Started
sendCommand GetObjectHandlesCommand : sendCommand Finished
sendCommand data: 0
sendCommand response: 0
sendCommand GetObjectHandlesCommand : sendCommand Finished with response code undefined
sendCommand +++++++++++++++++++++++++++++++++++++
2. It crashes, with this as the last message in my logger:
sendCommand +++++++++++++++++++++++++++++++++++++
sendCommand GetObjectHandlesCommand : sendCommand Started
2023-10-27 10:44:37.186768+0700 PTPHelper_Example[76486:11538353] [PTPHelper_Example] remoteCamera ! Canon EOS 200D - Error Domain=NSCocoaErrorDomain Code=4097 “connection to service with pid 76493 created from an endpoint” UserInfo={NSDebugDescription=connection to service with pid 76493 created from an endpoint}
2023-10-27 10:44:37.187146+0700 PTPHelper_Example[76486:11538353] [PTPHelper_Example] failureBlock ! Canon EOS 200D - Failure block was called due to a connection failure.
For the crash, I tried to attach the log to this post, but the upload always failed with the message "An error occurred while uploading this log. Please try again later." So I uploaded it to my Google Drive: https://drive.google.com/file/d/1IvJohGwp1zSkncTWHc3430weGntciB3K/view?usp=sharing
Reproduced on iOS 16.3.1.
I've checked the stack traces of the other threads, but nothing suspicious caught my attention. It might relate to this issue: https://developer.apple.com/forums/thread/104576, but I can't be sure.
Any good idea of how to address these crashes shown above?
Thank you!
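One defensive measure for the first symptom (the call returning empty data without ever throwing): wrap each await in a timeout so the loop fails fast instead of silently proceeding with empty buffers. This is a generic Swift concurrency sketch; the names are illustrative and nothing here is part of ImageCaptureCore.

```swift
import Foundation

struct TimeoutError: Error {}

// Race the operation against a sleeping task; whichever child finishes
// first wins, and the loser is cancelled.
func withTimeout<T: Sendable>(seconds: Double,
                              operation: @escaping @Sendable () async throws -> T) async throws -> T {
    try await withThrowingTaskGroup(of: T.self) { group in
        group.addTask { try await operation() }
        group.addTask {
            try await Task.sleep(nanoseconds: UInt64(seconds * 1_000_000_000))
            throw TimeoutError()
        }
        // next() returns (or rethrows from) the first child to complete.
        let result = try await group.next()!
        group.cancelAll()
        return result
    }
}
```

Usage would look like `try await withTimeout(seconds: 10) { try await self.requestSendPTPCommand(buffer, outData: nil) }`; a `TimeoutError` then distinguishes "camera went silent" from a genuine empty response.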
Using ImageCaptureCore to send PTP commands to cameras via tether, I noticed that all of my Nikon cameras can take up to an entire minute for PTP events to start logging. My Canons and Sonys are ready instantly. Any idea why? I use ICDeviceBrowser to browse for cameras, and then request to open the session. According to the docs, the device is ready after it enumerates its objects. If that's the case, is there a way to bypass that? Even with an empty SD card it's slow.
I'm working on a game which uses HDR display output for a much brighter range.
One of the features of the game is the ability to export in-game photos. The only appropriate format I found for this is OpenEXR.
The embedded Photos app is capable of showing HDR photos on an HDR display.
However, if I drop an EXR file with a large range into Photos, it won't be properly displayed in HDR mode with the full range. At the same time, pressing Edit on the file makes it HDR-displayable, and it remains displayable if I save the edit with any change, even a tiny one.
Moreover, if the EXR file is placed next to a "true" HDR one (or an EXR "fixed" as above), then during a scroll between the files, the broken EXR magically fixes itself at the exact moment the other HDR image comes up on the screen.
I tested different files with various internal formats. It seems to be a common problem for all of them.
Tested on the latest iOS 17.0.3.
Thank you in advance.
Hello,
I want to create a photo portfolio (in VueJS) and I'd like to use my iCloud photo library to store the photos.
Is there an API that allows me to access one or more albums in my photo library?
I am using DeepAR and PixelSDK in my project, and both SDKs have a file with the same name, "cameraVieewController". When I try to run the application I get an error because of this. Is there any solution to resolve this naming collision?
The goal is to get/save a captured photo (from the default camera) immediately from my app while the app is in the background.
When I capture a photo with the default camera, the photoLibraryDidChange(_:) function does not execute at that time. But when I reopen my app, the function executes and delivers the images that were captured in the meantime.
How can photoLibraryDidChange(_:) be executed while the app is in the background?
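For context, iOS does not launch or wake a suspended app for photo-library changes; `photoLibraryDidChange(_:)` only fires while the app is running, which matches the behavior described above. The usual pattern is to diff against the last-seen fetch result on return to the foreground. A sketch (the `LibraryWatcher` class is illustrative):

```swift
import Photos

final class LibraryWatcher: NSObject, PHPhotoLibraryChangeObserver {
    private var lastFetch: PHFetchResult<PHAsset>

    override init() {
        let options = PHFetchOptions()
        options.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
        lastFetch = PHAsset.fetchAssets(with: .image, options: options)
        super.init()
        PHPhotoLibrary.shared().register(self)
    }

    // Called while the app is running; after a return from background,
    // the accumulated change arrives here with all assets captured meanwhile.
    func photoLibraryDidChange(_ changeInstance: PHChange) {
        guard let details = changeInstance.changeDetails(for: lastFetch) else { return }
        lastFetch = details.fetchResultAfterChanges
        let newAssets = details.insertedObjects
        print("assets added since last check: \(newAssets.count)")
    }
}
```

So rather than forcing background delivery, the diff on foregrounding catches up on everything captured while the app was suspended.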
Hi developers. I'm currently developing an iOS camera app, and I need to add a view like the Photographic Styles page in the stock Camera app — just that swipeable page view, not the filters themselves. I used a UIPageViewController with a swipe gesture, but when I swipe between pages, the main camera view's functions also run in the background. What I want is a button that, when pressed, presents a view like the Photographic Styles picker. I can't get this working, so if anyone can help, please do. Thanks in advance.
I am trying to locate information or documentation on how to pull in photos from the iCloud Shared Albums, but have not been able to find anything yet. Dakboard is currently doing it so I know it is possible, but I cannot find an API or any documentation covering how to access the photos in a Shared Album for incorporation into web applications. Can anyone help?
With AVFoundation, how do you set up the new 24MP capture on the new iPhone 15 models?
I strongly believed it would be in the videoDevice.activeFormat.supportedMaxPhotoDimensions array, but it isn't.
The latest iPhone 15 Pro models support additional focal lengths on the main 24mm (1x) lens: 28mm ("1.2x") and 35mm ("1.5x"). These are supposed to use data from the full sensor to achieve optical quality images (i.e. no upscaling), so I would expect these new focal lengths to appear in the secondaryNativeResolutionZoomFactors array, just like the 2x option does. However, the activeFormat.secondaryNativeResolutionZoomFactors property still only reports [2.0] when using the main 1x lens. Is this an oversight, or is there something special (other than setting the zoom factor) we need to do to access the high-quality 28mm and 35mm modes? I'm wary of simply setting 1.2 or 1.5 as the zoom factor, as that isn't truly the ratio between the base 24mm and the virtual focal lengths.
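For the 24MP question above, a sketch of the intended flow on iOS 16+, assuming the high-resolution dimensions are listed for the active format (if they don't appear there at all, as reported above, the active format itself may need changing first): the opt-in goes through `AVCapturePhotoOutput.maxPhotoDimensions` rather than a separate capture format.

```swift
import AVFoundation

// Pick the largest photo dimensions the active format advertises and
// opt the photo output into them. Per-capture, AVCapturePhotoSettings
// also carries its own maxPhotoDimensions.
func enableMaxResolution(device: AVCaptureDevice, photoOutput: AVCapturePhotoOutput) {
    let dims = device.activeFormat.supportedMaxPhotoDimensions
    if let best = dims.max(by: { $0.width * $0.height < $1.width * $1.height }) {
        photoOutput.maxPhotoDimensions = best
    }
}
```

Whether a 24MP entry (e.g. 8064×6048) shows up depends on device and format, so enumerating `supportedMaxPhotoDimensions` for every format is a useful first diagnostic.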
struct ContentView: View {
    @State var listOfImages: [String] = ["One", "Two", "Three", "Four"]
    @State var counter = 0

    var body: some View {
        VStack {
            Button(action: {
                // Wrap around so the index stays in bounds.
                counter = (counter + 1) % listOfImages.count
            }, label: {
                Text("Next Image")
            })
        }
        .background(Image(listOfImages[counter]))
        .padding()
    }
}
When I click the button, counter increases and the next image is displayed as the background. The memory usage of the app increases with each image change. Is there any way to maintain steady memory use?
I am trying to display HDR images (ProRAW) within a UIImageView using preferredImageDynamicRange. This was shown in a 2023 WWDC video.
let imageView = UIImageView()
if #available(iOS 17.0, *) {
    self.imageView.preferredImageDynamicRange = UIImage.DynamicRange.high
}
self.imageView.clipsToBounds = true
self.imageView.isMultipleTouchEnabled = true
self.imageView.contentMode = .scaleAspectFit
self.photoScrollView.addSubview(self.imageView)
I pull the image from PHImageManager:
let options = PHImageRequestOptions()
options.deliveryMode = .highQualityFormat
options.isNetworkAccessAllowed = true
PHImageManager.default().requestImage(for: asset, targetSize: self.targetSize(), contentMode: .aspectFit, options: options, resultHandler: { image, info in
    guard let image = image else {
        return
    }
    DispatchQueue.main.async {
        self.imageView.image = image
        if #available(iOS 17.0, *) {
            self.imageView.preferredImageDynamicRange = UIImage.DynamicRange.high
        }
    }
})
Issue
The image shows successfully, yet not in HDR mode (no bright specular highlights, as seen when the same ProRAW image is viewed in the native Camera app).
What am I missing here?
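One possibility, sketched here as an assumption rather than a confirmed fix: `requestImage` may hand back an already tone-mapped SDR `UIImage`. The WWDC23 material pairs `preferredImageDynamicRange` with decoding the original data through `UIImageReader`, so requesting the asset's data and decoding it with HDR enabled is worth trying:

```swift
import UIKit
import Photos

// Request the original image data and decode it with HDR support
// (iOS 17), so the gain map survives to display time.
func loadHDRImage(for asset: PHAsset, into imageView: UIImageView) {
    let options = PHImageRequestOptions()
    options.deliveryMode = .highQualityFormat
    options.isNetworkAccessAllowed = true
    PHImageManager.default().requestImageDataAndOrientation(for: asset, options: options) { data, _, _, _ in
        guard let data else { return }
        var config = UIImageReader.Configuration()
        config.prefersHighDynamicRange = true
        let reader = UIImageReader(configuration: config)
        let image = reader.image(data: data)
        DispatchQueue.main.async {
            imageView.preferredImageDynamicRange = .high
            imageView.image = image
        }
    }
}
```

If this still renders SDR, the remaining variables are the view hierarchy (the hosting screen must support EDR) and whether the ProRAW file actually carries HDR rendering metadata.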
We have received a lot of user feedback saying that our app causes videos in the user's system photo album to stop playing. We did reproduce this phenomenon after operating some modules of our app many times. After monitoring the device log and tapping the affected video in the system album, we received roughly the following abnormal output:
VideoContentProvider received result:<AVPlayerItem: 0x281004850, asset = <AVURLAsset: 0x28128fce0, URL = file:///var/mobile/Media/DCIM/100APPLE/IMG_0085.MP4>>, info:{
PHImageResultRequestIDKey = 316;
}, priority:oneup automatic, strategy:<PXDisplayAssetVideoContentDeliveryStrategy: 0x2836c3000>quality: medium+(med-high), segment:{ nan - nans }, streaming:YES, network:YES, audio:YES, targetSize:{1280, 1280}, displayAsset:8E30C461-B089-4142-82D9-3A8CFF3B5DE9
<PUBrowsingVideoPlayer: 0xc46a59770>
Asset : <PHAsset: 0xc48f5fc50> 8E30C461-B089-4142-82D9-3A8CFF3B5DE9/L0/001 mediaType=2/524288, sourceType=1, (828x1792), creationDate=2023-07-19 上午7:36:41 +0000, location=0, hidden=0, favorite=0, adjusted=0
VideoSession : <PXVideoSession 0xc48a1ec50> {
Content Provider: <PXPhotoKitVideoContentProvider: 0x282d441e0>, Asset <PHAsset: 0xc48f5fc50> 8E30C461-B089-4142-82D9-3A8CFF3B5DE9/L0/001 mediaType=2/524288, sourceType=1, (828x1792), creationDate=2023-07-19 上午7:36:41 +0000, location=0, hidden=0, favorite=0, adjusted=0 , Media Provider: <PUPhotoKitMediaProvider: 0x28104da70>
Desired Play State: Paused
Play State: Paused
Stalled: 0
At Beginning: 1 End: 0
Playback: ‖ Paus √ b0 a0 s0 l1 f0 e0 r0.0 0.000/60.128
VideoOutput: (null)
Got First Pixel Buffer: NO
Pixel Buffer Frame Drops: 0 Buffering: 0
}: Starting disabling of video loading for reason: OutOfFocus
<PUBrowsingVideoPlayer: 0xc46de66e0>
Asset : <PHAsset: 0xc48f5f1d0> 11ECA95E-0B79-4C7C-97C6-5958EE139BAB/L0/001 mediaType=2/0, sourceType=1, (1080x1920), creationDate=2023-09-21 上午7:54:46 +0000, location=1, hidden=0, favorite=0, adjusted=0
VideoSession : (null): Starting disabling of video loading for reason: OutOfFocus
I think this message is important:
VideoSession : (null): Starting disabling of video loading for reason: OutOfFocus
Restarting the iPhone resolves the anomaly. Do you know the reason, or how to avoid this bug?
The bug resembles these threads:
https://discussionschinese.apple.com/thread/254766045
https://discussionschinese.apple.com/thread/254787836
HEIF Decompression Crash on iOS 17.
Unhandled error (NSCocoaErrorDomain, 134093) occurred during faulting and was thrown: Error Domain=NSCocoaErrorDomain Code=134093 "(null)"
Fatal Exception: NSInternalInconsistencyException
0 CoreFoundation 0xed5e0 __exceptionPreprocess
1 libobjc.A.dylib 0x2bc00 objc_exception_throw
2 CoreData 0x129c8 _PFFaultHandlerLookupRow
3 CoreData 0x11d60 _PF_FulfillDeferredFault
4 CoreData 0x11c58 _pvfk_header
5 CoreData 0x98e64 _sharedIMPL_pvfk_core_c
6 PhotoLibraryServices 0x6d8b0 -[PLInternalResource orientation]
7 PhotoLibraryServices 0x6d7bc -[PLInternalResource orientedWidth]
8 Photos 0x147e74 ___presentFullResourceAtIndex_block_invoke
9 PhotoLibraryServices 0x174ee4 __53-[PLManagedObjectContext _directPerformBlockAndWait:]_block_invoke
10 CoreData 0x208ec developerSubmittedBlockToNSManagedObjectContextPerform
11 libdispatch.dylib 0x4300 _dispatch_client_callout
12 libdispatch.dylib 0x136b4 _dispatch_lane_barrier_sync_invoke_and_complete
13 CoreData 0x207f8 -[NSManagedObjectContext performBlockAndWait:]
14 PhotoLibraryServices 0x174e98 -[PLManagedObjectContext _directPerformBlockAndWait:]
15 PhotoLibraryServices 0x1738c8 -[PLManagedObjectContext performBlockAndWait:]
16 Photos 0x147d30 _presentFullResourceAtIndex
17 Photos 0x1476bc PHChooserListContinueEnumerating
18 Photos 0x1445e0 -[PHImageResourceChooser presentNextQualifyingResource]
19 Photos 0x2ea74 -[PHImageRequest startRequest]
20 Photos 0x3f2c0 -[PHMediaRequestContext _registerAndStartRequests:]
21 Photos 0x3e484 -[PHMediaRequestContext start]
22 Photos 0x1f0710 -[PHImageManager runRequestWithContext:]
23 Photos 0x1efdb0 -[PHImageManager requestImageDataAndOrientationForAsset:options:resultHandler:]
24 TeraBox 0x2497f0c closure #1 in LocalPhotoLibManager.getDataFrom(_:_:) + 549 (LocalPhotoLibManager.swift:549)
25 TeraBox 0x1835fc4 thunk for @escaping @callee_guaranteed () -> () (<compiler-generated>)
26 TeraBox 0x1cb1288 +[DuboxOCException tryOC:catchException:] + 18 (DuboxOCException.m:18)
27 TeraBox 0x249b4d4 specialized LocalPhotoLibManager.convert(with:_:) + 548 (LocalPhotoLibManager.swift:548)
28 TeraBox 0x2493b24 closure #1 in closure #1 in closure #1 in LocalPhotoLibManager.scanAlbumUpdateLocalphotoTable(_:) + 173 (LocalPhotoLibManager.swift:173)
29 TeraBox 0x1835fc4 thunk for @escaping @callee_guaranteed () -> () (<compiler-generated>)
30 libdispatch.dylib 0x26a8 _dispatch_call_block_and_release
31 libdispatch.dylib 0x4300 _dispatch_client_callout
32 libdispatch.dylib 0x744c _dispatch_queue_override_invoke
33 libdispatch.dylib 0x15be4 _dispatch_root_queue_drain
34 libdispatch.dylib 0x163ec _dispatch_worker_thread2
35 libsystem_pthread.dylib 0x1928 _pthread_wqthread
36 libsystem_pthread.dylib 0x1a04 start_wqthread
My iPhone produces corrupted images under certain conditions. If I shoot the same scene (with a slightly varying angle) in the same lighting conditions, I almost always receive a corrupted photo containing a magenta copy of the image and a green rectangle. If I add other objects to the scene, changing its overall brightness, the chance of the bug drops significantly.
Device info: iPhone 14 Pro Max (iOS 17 RC), iPhone 14 Pro (iOS 17 beta 6)
Images with issue:
f/1.664, 1/25s, ISO 500, digitalZoom=1.205, brightness=-0.568
f/1.664, 1/25s, ISO 500, digitalZoom=1.205, brightness=-0.514
f/1.664, 1/25s, ISO 640, digitalZoom=1.205, brightness=-0.641
f/1.664, 1/25s, ISO 500, digitalZoom=1.205, brightness=-0.448
f/1.664, 1/25s, ISO 500, digitalZoom=1.205, brightness=-0.132
Images without issue:
f/1.664, 1/25s, ISO 500, digitalZoom=1.205, brightness=-0.456
f/1.664, 1/20s, ISO 640, digitalZoom=1.205, brightness=-1.666
f/1.664, 1/100s, ISO 50, digitalZoom=1.205, brightness=4.840
f/1.664, 1/25s, ISO 640, digitalZoom=1.205, brightness=-0.774
I'm using builtInWideAngleCamera with continuousAutoExposure, continuousAutoFocus, and a slight videoZoomFactor:
func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
    if let error = error {
        capturePhotoCallback?(.failure(.internalError(error.localizedDescription)))
        return
    }
    guard let data = photo.fileDataRepresentation() else {
        capturePhotoCallback?(.failure(.internalError("Can not get data representation.")))
        return
    }
    guard let image = UIImage(data: data) else {
        capturePhotoCallback?(.failure(.internalError("Can not get image from data representation.")))
        return
    }
    capturePhotoCallback?(.success(image))
}
struct ImageTest: View {
    @State private var selectedPhotos: [PhotosPickerItem] = []
    @State private var photosData: [Data] = []
    @State private var text: String = ""

    var body: some View {
        NavigationStack {
            Form {
                Section {
                    HStack {
                        PhotosPicker(selection: $selectedPhotos,
                                     maxSelectionCount: 10,
                                     matching: .images,
                                     photoLibrary: .shared()) {
                            AddPhotoIcon(numOfImages: photosData.count)
                        }
                        .task(id: selectedPhotos) {
                            for selectedPhoto in selectedPhotos {
                                if let data = try? await selectedPhoto.loadTransferable(type: Data.self) {
                                    self.photosData.append(data)
                                }
                            }
                            self.selectedPhotos = []
                        }
                        if !photosData.isEmpty {
                            ScrollView(.horizontal) {
                                HStack {
                                    ForEach(photosData, id: \.self) { data in
                                        if let image = UIImage(data: data) {
                                            Image(uiImage: image)
                                                .resizable()
                                                .scaledToFill()
                                                .frame(width: 50, height: 50)
                                        }
                                    }
                                }
                            }
                        }
                        Spacer()
                    }
                }
                Section {
                    TextField("Any", text: $text)
                }
            }
        }
    }
}
Here, with SwiftData, I'm making a model that contains images and a title.
After selecting photos, there is a delay of about 2–3 seconds when entering just ten characters of text.
In the debugger, memory usage soars to 700 MB.
How can I improve it?
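One likely contributor: each `UIImage(data:)` in the `ForEach` decodes the full-resolution photo on every body re-evaluation, even though only a 50×50 thumbnail is shown. Decoding a small thumbnail once per pick and displaying that instead keeps memory flat. A sketch using ImageIO (the helper name is made up; the picked `Data` would feed this once, with the resulting thumbnails stored in a separate `@State` array):

```swift
import ImageIO
import UIKit

// Decode a downsampled thumbnail directly from encoded image data,
// without ever materializing the full-resolution bitmap.
func thumbnail(from data: Data, maxPixelSize: Int = 100) -> UIImage? {
    let sourceOptions = [kCGImageSourceShouldCache: false] as CFDictionary
    guard let source = CGImageSourceCreateWithData(data as CFData, sourceOptions) else { return nil }
    let thumbOptions = [
        kCGImageSourceCreateThumbnailFromImageAlways: true,
        kCGImageSourceShouldCacheImmediately: true,
        kCGImageSourceThumbnailMaxPixelSize: maxPixelSize
    ] as CFDictionary
    guard let cgImage = CGImageSourceCreateThumbnailAtIndex(source, 0, thumbOptions) else { return nil }
    return UIImage(cgImage: cgImage)
}
```

The full-size `Data` can still be kept for the SwiftData model; the view layer just stops paying for it on every keystroke.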