I use loadFileRepresentation to register a video URL. After that, loadItem takes less time to return the video from the picker. But after upgrading my iPhone to iOS 16.6, this approach no longer works. Has something changed, or has the dev team removed loadItem? Here is my sample code:
provider.loadItem(forTypeIdentifier: "public.movie", options: nil) { url, error in
    guard let url = url as? URL else {
        return
    }
    self.parent.selectedVideoURL = url
    // Check the cache: if loadFileRepresentation already ran for this URL,
    // reusing the URL from loadItem is much faster.
    let storage = LocalStorageHelper()
    if storage.checkStringExistStorage(MEDIA_PICKER_STORAGE_NAMESPACE, url.relativeString) {
        DispatchQueue.main.async {
            self.parent.presentationMode.wrappedValue.dismiss()
            self.parent.videoURL = url
        }
    } else {
        provider.loadFileRepresentation(forTypeIdentifier: "public.movie") { urlResult, error in
            DispatchQueue.main.async {
                self.parent.presentationMode.wrappedValue.dismiss()
            }
            if let error = error {
                // Handle errors loading the video
                LOGGING.error("Error loading video: \(error.localizedDescription)")
                return
            }
            guard let urlFile = urlResult else { return }
            // Create a new file name
            let fileName = "\(Int(Date().timeIntervalSince1970)).\(urlFile.pathExtension)"
            // Create the destination URL
            let newUrl = URL(fileURLWithPath: NSTemporaryDirectory() + fileName)
            // Copy the item into app storage before the provider's temporary file is deleted
            try? FileManager.default.copyItem(at: urlFile, to: newUrl)
            DispatchQueue.main.async {
                storage.saveStringToStorage(MEDIA_PICKER_STORAGE_NAMESPACE, url.relativeString)
                self.parent.videoURL = newUrl
            }
        }
    }
}
PhotoKit
Work with image and video assets managed by the Photos app, including those from iCloud Photos and Live Photos, using PhotoKit.
Posts under PhotoKit tag
I know that I can uniquely identify a PHAsset on a given device using localIdentifier, but if that asset is synced (through iCloud, say) to another device, how do I uniquely identify that asset across multiple devices?
My app allows users to store their images in the standard photo gallery, but I have no way of referring to them when they sync their app profile to another iOS device with my app installed.
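One approach, as a sketch: since iOS 15, PhotoKit exposes PHCloudIdentifier, which is stable across devices signed into the same iCloud Photos library, unlike localIdentifier. The function names below are my own illustration.

```swift
import Photos

// Map a device-local identifier to a cloud identifier you can sync with the profile.
func cloudIdentifier(forLocalIdentifier localID: String) -> String? {
    let mappings = PHPhotoLibrary.shared()
        .cloudIdentifierMappings(forLocalIdentifiers: [localID])
    guard case .success(let cloudID)? = mappings[localID] else { return nil }
    return cloudID.stringValue
}

// On the other device, map the stored cloud identifier back to a local one.
func localIdentifier(forCloudIdentifierString stored: String) -> String? {
    let cloudID = PHCloudIdentifier(stringValue: stored)
    let mappings = PHPhotoLibrary.shared()
        .localIdentifierMappings(for: [cloudID])
    guard case .success(let localID)? = mappings[cloudID] else { return nil }
    return localID
}
```

Note the lookup can fail per-identifier (e.g. the asset was deleted or iCloud Photos is off), which is why each mapping entry is a Result.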
It is 2023 and Photos.app still provides no way to sort photos/movies by file size.
I ended up writing some AppleScript that displays the file sizes of images/movies in a specific album. It is also simple to create a shell script to examine the photo library contents directly but mapping the UUID-based filenames back to the names used in Photos.app is not straightforward (to me at least).
You can't even create a smart album based upon file size.
Why is there no native support for this in Photos.app/PhotoKit?
(And, yes, I have submitted many feature requests over the years)
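For what it's worth, one public-API workaround is to stream each asset's primary resource and count the bytes; this is slow for a large library but avoids the undocumented "fileSize" resource key. A sketch (function name is mine):

```swift
import Photos

// Approximate an asset's on-disk size by streaming its primary resource.
func fetchSize(of asset: PHAsset, completion: @escaping (Int64) -> Void) {
    guard let resource = PHAssetResource.assetResources(for: asset).first else {
        completion(0)
        return
    }
    let options = PHAssetResourceRequestOptions()
    options.isNetworkAccessAllowed = false   // count only locally available data
    var total: Int64 = 0
    PHAssetResourceManager.default().requestData(for: resource,
                                                 options: options,
                                                 dataReceivedHandler: { chunk in
        total += Int64(chunk.count)
    }, completionHandler: { _ in
        completion(total)
    })
}
```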
Currently, PhotosPickerItem cannot be used from async/await code safely, for example in Tasks etc., because it does not conform to Sendable.
The public api has only two properties, both of which are of a Sendable type: UTType and String?.
Is this something that's going to be changed so I can mark it as @unchecked Sendable in my code, or is there an actual underlying reason why PhotosPickerItem is not Sendable?
PHAsset, PHAssetCollection, PHFetchRequest, PHObject are all sendable, because - similarly to PhotosPickerItem - they contain no actual image data, they are just a preliminary representation of an image in the Photos app.
Hello, is it possible to reference a photo from the photo picker (either UIKit or SwiftUI), such that I do not need to copy it somewhere else? Currently, I am storing the copy in CoreData in a field with external storage, but it did cross my mind that I don't actually need a copy in my own storage if I could point back to the photo library. If the user deleted the photo from the library that would be fine.
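This should be possible by persisting the asset identifier instead of the bytes, provided the picker is configured against the shared library (and the app has library authorization). A sketch, with my own helper name:

```swift
import Photos
import PhotosUI
import UIKit

// Configure the picker so results carry an assetIdentifier you can persist
// (e.g. in your Core Data field) instead of a copy of the image.
var config = PHPickerConfiguration(photoLibrary: .shared())
config.filter = .images
let picker = PHPickerViewController(configuration: config)

// Later, resolve the stored identifier back to an image; this fails gracefully
// if the user deleted the photo from the library.
func loadImage(storedIdentifier: String, targetSize: CGSize,
               completion: @escaping (UIImage?) -> Void) {
    let assets = PHAsset.fetchAssets(withLocalIdentifiers: [storedIdentifier],
                                     options: nil)
    guard let asset = assets.firstObject else { completion(nil); return }
    PHImageManager.default().requestImage(for: asset, targetSize: targetSize,
                                          contentMode: .aspectFit,
                                          options: nil) { image, _ in
        completion(image)
    }
}
```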
I am able to detect a screenshot only after it has been taken, via UIApplication.userDidTakeScreenshotNotification.
But my requirement is to detect it at the moment the user is taking the screenshot, so that I can hide my confidential data.
Please comment if there are any better solutions or callback functions available.
The documentation for this API mentions:
The system uses the current representation and avoids transcoding, if possible.
What are the scenarios in which transcoding takes place?
The reason for asking is that we've had a user reaching out saying they selected a video file from their Photos app, which resulted in a decrease in size from ~110MB to 35MB. We find it unlikely it's transcoding-related, but we want to gain more insights into the possible scenarios.
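In our understanding, transcoding typically happens when the stored representation is not considered broadly compatible (e.g. HEVC video re-encoded to H.264), which could plausibly explain a large size drop. Requesting the current representation explicitly should avoid it:

```swift
import PhotosUI

// Ask the picker for the asset's current (as-stored) representation so the
// system avoids transcoding, e.g. handing over HEVC as-is instead of H.264.
var config = PHPickerConfiguration(photoLibrary: .shared())
config.filter = .videos
config.preferredAssetRepresentationMode = .current
let picker = PHPickerViewController(configuration: config)
```

The default mode is .automatic, which lets the system decide; .compatible forces the broadly compatible (possibly transcoded) form.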
I would like to use a third-party app to edit the metadata of a photo to change its Caption and then be able to search in the Photos app to find that image with the edited caption.
I have managed to do this by duplicating the photo with the edited metadata. The Photos app recognizes it as a new photo and indexes it with the new caption, making it searchable. However, when editing the photo in-place, the Photos app will not re-index the photo, therefore it will not be searchable.
Is there a way to edit photos in-place and have them searchable with the new metadata?
The new Live Photos format in iOS 17 works only with captures from the camera. We need an API or method to generate Live Photos from existing images and videos without capturing new ones. In the past this was possible by pairing a .jpg and a .mov with the same content identifier, but now the photo data format is not documented, nor is the exact way to combine an image (or series of images) with a video file and convert them to a Live Photo outside an AVCaptureSession. Is it intentional that Live Photos and the new PhotoKit only accept camera-captured sessions, or will there be a method that lets users take their existing movies and generate Live Photos on iOS 17?
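For reference, the documented entry points for assembling a Live Photo from a still/video pair look like this; note both files must already share matching content-identifier metadata, and writing that metadata into arbitrary files is not covered by documented API (function names below are mine):

```swift
import Photos
import UIKit

// Build a PHLivePhoto for display from a paired still + video on disk.
// The resultHandler may fire more than once (a degraded version first).
func makeLivePhoto(imageURL: URL, videoURL: URL, targetSize: CGSize,
                   completion: @escaping (PHLivePhoto?) -> Void) {
    PHLivePhoto.request(withResourceFileURLs: [imageURL, videoURL],
                        placeholderImage: nil,
                        targetSize: targetSize,
                        contentMode: .aspectFit) { livePhoto, _ in
        completion(livePhoto)
    }
}

// Save the pair to the library as a single Live Photo asset.
func saveLivePhoto(imageURL: URL, videoURL: URL) {
    PHPhotoLibrary.shared().performChanges {
        let request = PHAssetCreationRequest.forAsset()
        request.addResource(with: .photo, fileURL: imageURL, options: nil)
        request.addResource(with: .pairedVideo, fileURL: videoURL, options: nil)
    }
}
```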
Problem:
While calling PHAsset.fetchAssets() and iterating over its results, if the Photos.app is simultaneously running an import operation (File | Import), the photolibraryd process crashes. I have already flagged this issue to Apple (FB13178379) but wanted to check if anyone else has encountered this behavior.
Steps to Reproduce:
Initiate an import in the Photos.app.
Run the following code snippet in the Playground:
import Photos
PHPhotoLibrary.requestAuthorization(for: .readWrite) { authorizationStatus in
    guard authorizationStatus == .authorized else { return }
    let fetchResult = PHAsset.fetchAssets(with: nil)
    print(fetchResult)
    for i in 0..<fetchResult.count {
        print(fetchResult[i])
    }
}
Upon doing this, I consistently receive the error: Connection to assetsd was interrupted - photolibraryd exited, died, or closed the photo library in the Console, causing my code to hang.
Environment:
macOS Version: 13.5.2 (22G91)
Xcode Version: Version 14.3.1 (14E300c)
Additional Info:
After the crash, a report pops up in the Console, and typically, the Photos.app import operation freezes too. I've noticed that after terminating all processes and/or rebooting, the Photos.app displays "Restoring from iCloud…" and the recovery process lasts overnight.
Seeking Suggestions:
I'm exploring potential workarounds for this issue. I've attempted to use fetchLimit to obtain assets in batches, but the problem persists. Is there a method to detect that the Photos.app is executing an import, allowing my process to wait until completion? Alternatively, can I catch the photolibraryd crash and delay until it's restored?
I'm operating in a batch processing mode for photos, so pausing and retrying later in the event of a Photos.app import isn't an issue.
Any guidance or shared experiences would be greatly appreciated!
Cheers and happy coding!
Unhandled error (NSCocoaErrorDomain, 134093) occurred during faulting and was thrown: Error Domain=NSCocoaErrorDomain Code=134093 "(null)"
Fatal Exception: NSInternalInconsistencyException
0 CoreFoundation 0xed5e0 __exceptionPreprocess
1 libobjc.A.dylib 0x2bc00 objc_exception_throw
2 CoreData 0x129c8 _PFFaultHandlerLookupRow
3 CoreData 0x11d60 _PF_FulfillDeferredFault
4 CoreData 0x11c58 _pvfk_header
5 CoreData 0x98e64 _sharedIMPL_pvfk_core_c
6 PhotoLibraryServices 0x6d8b0 -[PLInternalResource orientation]
7 PhotoLibraryServices 0x6d7bc -[PLInternalResource orientedWidth]
8 Photos 0x147e74 ___presentFullResourceAtIndex_block_invoke
9 PhotoLibraryServices 0x174ee4 __53-[PLManagedObjectContext _directPerformBlockAndWait:]_block_invoke
10 CoreData 0x208ec developerSubmittedBlockToNSManagedObjectContextPerform
11 libdispatch.dylib 0x4300 _dispatch_client_callout
12 libdispatch.dylib 0x136b4 _dispatch_lane_barrier_sync_invoke_and_complete
13 CoreData 0x207f8 -[NSManagedObjectContext performBlockAndWait:]
14 PhotoLibraryServices 0x174e98 -[PLManagedObjectContext _directPerformBlockAndWait:]
15 PhotoLibraryServices 0x1738c8 -[PLManagedObjectContext performBlockAndWait:]
16 Photos 0x147d30 _presentFullResourceAtIndex
17 Photos 0x1476bc PHChooserListContinueEnumerating
18 Photos 0x1445e0 -[PHImageResourceChooser presentNextQualifyingResource]
19 Photos 0x2ea74 -[PHImageRequest startRequest]
20 Photos 0x3f2c0 -[PHMediaRequestContext _registerAndStartRequests:]
21 Photos 0x3e484 -[PHMediaRequestContext start]
22 Photos 0x1f0710 -[PHImageManager runRequestWithContext:]
23 Photos 0x1efdb0 -[PHImageManager requestImageDataAndOrientationForAsset:options:resultHandler:]
24 TeraBox 0x2497f0c closure #1 in LocalPhotoLibManager.getDataFrom(_:_:) + 549 (LocalPhotoLibManager.swift:549)
25 TeraBox 0x1835fc4 thunk for @escaping @callee_guaranteed () -> () (<compiler-generated>)
26 TeraBox 0x1cb1288 +[DuboxOCException tryOC:catchException:] + 18 (DuboxOCException.m:18)
27 TeraBox 0x249b4d4 specialized LocalPhotoLibManager.convert(with:_:) + 548 (LocalPhotoLibManager.swift:548)
28 TeraBox 0x2493b24 closure #1 in closure #1 in closure #1 in LocalPhotoLibManager.scanAlbumUpdateLocalphotoTable(_:) + 173 (LocalPhotoLibManager.swift:173)
29 TeraBox 0x1835fc4 thunk for @escaping @callee_guaranteed () -> () (<compiler-generated>)
30 libdispatch.dylib 0x26a8 _dispatch_call_block_and_release
31 libdispatch.dylib 0x4300 _dispatch_client_callout
32 libdispatch.dylib 0x744c _dispatch_queue_override_invoke
33 libdispatch.dylib 0x15be4 _dispatch_root_queue_drain
34 libdispatch.dylib 0x163ec _dispatch_worker_thread2
35 libsystem_pthread.dylib 0x1928 _pthread_wqthread
36 libsystem_pthread.dylib 0x1a04 start_wqthread
Hello, after updating a physical device to iOS 17, there seems to be an issue with the image picker's functionality. In our app, even though NSItemProvider's canLoadObject(ofClass: UIImage.self) returns true, loadObject(ofClass: UIImage.self) consistently delivers nil to its completion handler.
There's also a possibility that the same phenomenon is occurring with the standard Notes app's image picker, preventing images from being passed to the app.
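One workaround that has helped with similar iOS 17 picker reports is falling back to a file representation and decoding the data manually when loadObject hands back nil. A sketch (the wrapper function is my own):

```swift
import PhotosUI
import UniformTypeIdentifiers
import UIKit

func loadImage(from provider: NSItemProvider,
               completion: @escaping (UIImage?) -> Void) {
    guard provider.canLoadObject(ofClass: UIImage.self) else {
        completion(nil)
        return
    }
    provider.loadObject(ofClass: UIImage.self) { object, _ in
        if let image = object as? UIImage {
            completion(image)
        } else {
            // Fallback: load the raw file and decode it ourselves.
            provider.loadFileRepresentation(forTypeIdentifier: UTType.image.identifier) { url, _ in
                guard let url, let data = try? Data(contentsOf: url) else {
                    completion(nil)
                    return
                }
                completion(UIImage(data: data))
            }
        }
    }
}
```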
When selecting more photos with previous limited authorization, I get this crash on iOS 17.0
*** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '-[PHPhotoLibrary presentLimitedLibraryPickerFromViewController:completionHandler:]: unrecognized selector sent to instance 0x105ea2a60'
when using the synchronous and asynchronous methods:
PHPhotoLibrary.shared().presentLimitedLibraryPicker(from: viewController) { newlySelectedPhotoIDs in ...
OR
let newlySelectedPhotoIDs = await PHPhotoLibrary.shared().presentLimitedLibraryPicker(from: viewController)
Debugger output is unexpected... after all these methods are in the header...
(lldb) po PHPhotoLibrary.shared().responds(to: #selector(PHPhotoLibrary.presentLimitedLibraryPicker(from:)))
false
(lldb) po PHPhotoLibrary.shared().responds(to: #selector(PHPhotoLibrary.presentLimitedLibraryPicker(from:completionHandler:)))
false
HEIF Decompression Crash on iOS 17.
We have received a lot of user feedback saying that our app causes videos in the user's system album to stop playing. We reproduced this phenomenon after exercising some modules of our app many times. After monitoring the device log and tapping a video in the system album, we received roughly the following abnormal output:
VideoContentProvider received result:<AVPlayerItem: 0x281004850, asset = <AVURLAsset: 0x28128fce0, URL = file:///var/mobile/Media/DCIM/100APPLE/IMG_0085.MP4>>, info:{
PHImageResultRequestIDKey = 316;
}, priority:oneup automatic, strategy:<PXDisplayAssetVideoContentDeliveryStrategy: 0x2836c3000>quality: medium+(med-high), segment:{ nan - nans }, streaming:YES, network:YES, audio:YES, targetSize:{1280, 1280}, displayAsset:8E30C461-B089-4142-82D9-3A8CFF3B5DE9
<PUBrowsingVideoPlayer: 0xc46a59770>
Asset : <PHAsset: 0xc48f5fc50> 8E30C461-B089-4142-82D9-3A8CFF3B5DE9/L0/001 mediaType=2/524288, sourceType=1, (828x1792), creationDate=2023-07-19 上午7:36:41 +0000, location=0, hidden=0, favorite=0, adjusted=0
VideoSession : <PXVideoSession 0xc48a1ec50> {
Content Provider: <PXPhotoKitVideoContentProvider: 0x282d441e0>, Asset <PHAsset: 0xc48f5fc50> 8E30C461-B089-4142-82D9-3A8CFF3B5DE9/L0/001 mediaType=2/524288, sourceType=1, (828x1792), creationDate=2023-07-19 上午7:36:41 +0000, location=0, hidden=0, favorite=0, adjusted=0 , Media Provider: <PUPhotoKitMediaProvider: 0x28104da70>
Desired Play State: Paused
Play State: Paused
Stalled: 0
At Beginning: 1 End: 0
Playback: ‖ Paus √ b0 a0 s0 l1 f0 e0 r0.0 0.000/60.128
VideoOutput: (null)
Got First Pixel Buffer: NO
Pixel Buffer Frame Drops: 0 Buffering: 0
}: Starting disabling of video loading for reason: OutOfFocus
<PUBrowsingVideoPlayer: 0xc46de66e0>
Asset : <PHAsset: 0xc48f5f1d0> 11ECA95E-0B79-4C7C-97C6-5958EE139BAB/L0/001 mediaType=2/0, sourceType=1, (1080x1920), creationDate=2023-09-21 上午7:54:46 +0000, location=1, hidden=0, favorite=0, adjusted=0
VideoSession : (null): Starting disabling of video loading for reason: OutOfFocus
I think this message is important:
VideoSession : (null): Starting disabling of video loading for reason: OutOfFocus
Restarting the iPhone resolves this anomaly. Do you know the reason, or how to avoid this bug?
Similar threads: https://discussionschinese.apple.com/thread/254766045
https://discussionschinese.apple.com/thread/254787836
I am trying to display HDR Images (ProRAW) within UIImageView using preferredImageDynamicRange. This was shown in a 2023 WWDC Video
let imageView = UIImageView()
if #available(iOS 17.0, *) {
    self.imageView.preferredImageDynamicRange = UIImage.DynamicRange.high
}
self.imageView.clipsToBounds = true
self.imageView.isMultipleTouchEnabled = true
self.imageView.contentMode = .scaleAspectFit
self.photoScrollView.addSubview(self.imageView)
I pull the image from PHImageManager:
let options = PHImageRequestOptions()
options.deliveryMode = .highQualityFormat
options.isNetworkAccessAllowed = true
PHImageManager.default().requestImage(for: asset, targetSize: self.targetSize(), contentMode: .aspectFit, options: options, resultHandler: { image, info in
    guard let image = image else {
        return
    }
    DispatchQueue.main.async {
        self.imageView.image = image
        if #available(iOS 17.0, *) {
            self.imageView.preferredImageDynamicRange = UIImage.DynamicRange.high
        }
    }
})
Issue
The image shows successfully, yet not in HDR (no bright specular highlights, as seen when the same ProRAW image is viewed in the native Camera app).
What am I missing here?
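A possible explanation, as a sketch: requestImage appears to return an SDR-tonemapped UIImage, so preferredImageDynamicRange on the view has no HDR content to display. Requesting the raw image data and decoding it with UIImageReader (iOS 17) preserves the high dynamic range (the wrapper function is my own):

```swift
import Photos
import UIKit

@available(iOS 17.0, *)
func loadHDRImage(for asset: PHAsset, completion: @escaping (UIImage?) -> Void) {
    let options = PHImageRequestOptions()
    options.deliveryMode = .highQualityFormat
    options.isNetworkAccessAllowed = true
    // Fetch the original encoded bytes rather than a pre-decoded UIImage.
    PHImageManager.default().requestImageDataAndOrientation(for: asset,
                                                            options: options) { data, _, _, _ in
        guard let data else { completion(nil); return }
        // Decode with HDR preserved.
        var config = UIImageReader.Configuration()
        config.prefersHighDynamicRange = true
        let reader = UIImageReader(configuration: config)
        completion(reader.image(data: data))
    }
}
```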
Is it possible to access "From my mac" photos/PHAssetCollection through PhotoKit in iOS?
"From my mac" photos/videos are media synced from a mac where iCloud Photos are turned off on the iOS device, like what we did in the ole' days before iCloud Photos.
I have set up an iOS device with "From my mac" albums present in Photos.app, but in my own app I don't seem to be able to access those collections/photos through PhotoKit using all the defined PHAssetCollectionTypes.
Are these directly synced photos simply not available through PhotoKit and you would have to revert to the deprecated ALAssetLibrary?
The goal is to get/save the captured photo (from the default camera) immediately in my app while the app is in the background.
When I capture a photo with the default camera, the photoLibraryDidChange(_:) function does not execute at that time. But when I reopen my app, the function executes and delivers the images that were captured in the meantime.
How can photoLibraryDidChange(_:) be executed while the app is in the background?
The native Camera app on iPhone 15 can record videos directly to an external hard drive. Is there an API to achieve the same in the Photos framework?
In iOS 17.0.3, photos taken using Apple's native camera app can't be loaded immediately (within approximately 30 seconds) through PHPickerViewController. Specifically, the method itemProvider.canLoadObject(ofClass: UIImage.self) returns false. However, after about 30 seconds post-capture, the photos load without any hindrance.
Initially, I considered an issue with my own photo-loading code, but the same problem persists even with Apple's official PHPickerDemo sample code.
[PHPickerDemo - SelectingPhotosAndVideosInIOS.zip]
https://developer.apple.com/documentation/photokit/selecting_photos_and_videos_in_ios
ViewController.swift Line 89 (PHPickerDemo)
func displayNext() {
    guard let assetIdentifier = selectedAssetIdentifierIterator?.next() else { return }
    currentAssetIdentifier = assetIdentifier
    let progress: Progress?
    let itemProvider = selection[assetIdentifier]!.itemProvider
    if itemProvider.canLoadObject(ofClass: PHLivePhoto.self) {
        progress = itemProvider.loadObject(ofClass: PHLivePhoto.self) { [weak self] livePhoto, error in
            DispatchQueue.main.async {
                self?.handleCompletion(assetIdentifier: assetIdentifier, object: livePhoto, error: error)
            }
        }
    }
    else if itemProvider.canLoadObject(ofClass: UIImage.self) { // <====== returns FALSE
        progress = itemProvider.loadObject(ofClass: UIImage.self) { [weak self] image, error in
            DispatchQueue.main.async {
                self?.handleCompletion(assetIdentifier: assetIdentifier, object: image, error: error)
            }
        }
    }
    ...omitted...
}
Environment & Settings:
iPhone 12
iOS 17.0.3
Settings -> Camera -> Formats -> High Efficiency (Enabled)
Reproduction Steps:
Take a photo in normal photo mode with Apple's native camera app (not in portrait mode).
Launch the PHPickerDemo app.
Tap the photo icon located in the top right.
Observe that the photo fails to load.
Workarounds:
Wait for over 30 seconds prior to selecting the photo.
Opt to shoot in portrait mode rather than the standard photo mode.
Switch on Settings -> Camera -> Formats -> Most Compatible.
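Another workaround sketch: when canLoadObject fails right after capture, falling back to PHImageManager via the result's assetIdentifier can still deliver the image. This requires the picker to be configured with photoLibrary: .shared() and library authorization; the wrapper function below is my own:

```swift
import Photos
import PhotosUI
import UIKit

func loadImage(from result: PHPickerResult, targetSize: CGSize,
               completion: @escaping (UIImage?) -> Void) {
    let provider = result.itemProvider
    if provider.canLoadObject(ofClass: UIImage.self) {
        provider.loadObject(ofClass: UIImage.self) { object, _ in
            completion(object as? UIImage)
        }
    } else if let identifier = result.assetIdentifier,
              let asset = PHAsset.fetchAssets(withLocalIdentifiers: [identifier],
                                              options: nil).firstObject {
        // Fallback: go through PhotoKit directly instead of the item provider.
        let options = PHImageRequestOptions()
        options.isNetworkAccessAllowed = true
        options.deliveryMode = .highQualityFormat
        PHImageManager.default().requestImage(for: asset, targetSize: targetSize,
                                              contentMode: .aspectFit,
                                              options: options) { image, _ in
            completion(image)
        }
    } else {
        completion(nil)
    }
}
```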
I am developing a photo editing app, and I have received many emails from users stating that they cannot select photos since updating to iOS 17.
Thanks.