I have an app that allows the user to change a photo’s EXIF metadata. To do this, I request a content editing input, get the full-size image, modify its properties, create a content editing output, write the output image to the rendered content URL, then call performChanges on the PHPhotoLibrary, creating an asset change request for that asset and setting its content editing output. This works as expected for regular photos, but Live Photos get converted to regular photos (the Live badge turns off).
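For reference, a minimal sketch of that regular-photo flow, assuming asset is the PHAsset being edited and "com.example.exif-edit" is a hypothetical adjustment-data identifier:

import Photos
import CoreImage

let options = PHContentEditingInputRequestOptions()
options.isNetworkAccessAllowed = true
asset.requestContentEditingInput(with: options) { input, _ in
    guard let input = input,
          let url = input.fullSizeImageURL,
          let image = CIImage(contentsOf: url, options: [.applyOrientationProperty: true]) else { return }
    var properties = image.properties
    // ...modify the EXIF keys in `properties` as desired...
    let output = PHContentEditingOutput(contentEditingInput: input)
    // Hypothetical identifier/version; use whatever your app records.
    output.adjustmentData = PHAdjustmentData(formatIdentifier: "com.example.exif-edit",
                                             formatVersion: "1.0",
                                             data: Data())
    let edited = image.settingProperties(properties)
    // Write the output image to the rendered content URL.
    try? CIContext().writeJPEGRepresentation(of: edited,
                                             to: output.renderedContentURL,
                                             colorSpace: edited.colorSpace ?? CGColorSpaceCreateDeviceRGB())
    PHPhotoLibrary.shared().performChanges({
        let request = PHAssetChangeRequest(for: asset)
        request.contentEditingOutput = output
    }, completionHandler: { success, error in
        print("saved:", success, error ?? "")
    })
}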
To address the Live Photo issue, I’m doing something similar by changing the properties of the still image in the Live Photo. I detect when the content editing input has a Live Photo, create a Live Photo editing context, and set a frame processor that, when the frame type is photo, returns the frame’s image after applying the updated properties. Then I create the content editing output and save the Live Photo to that output. This modifies the Live Photo successfully, but the metadata is not updated: if you request the full-size image again, the properties are the original ones, and an app like Metapho shows the EXIF metadata unchanged. What am I doing wrong here? Thanks!
let imageURL = contentEditingInput.fullSizeImageURL!
let inputImage = CIImage(contentsOf: imageURL, options: [.applyOrientationProperty: true])!
var metadata: [AnyHashable: Any] = inputImage.properties
// Edit the metadata as desired...
let editingContext = PHLivePhotoEditingContext(livePhotoEditingInput: contentEditingInput)!
editingContext.frameProcessor = { frame, error -> CIImage? in
    // Edit only the still photo
    if frame.type == .photo {
        return frame.image.settingProperties(metadata)
    }
    return frame.image
}
let contentEditingOutput = try await withCheckedThrowingContinuation { continuation in
    let editingOutput = PHContentEditingOutput(contentEditingInput: contentEditingInput)
    editingOutput.adjustmentData = adjustmentData
    editingContext.saveLivePhoto(to: editingOutput) { success, error in
        if success {
            continuation.resume(returning: editingOutput)
        } else {
            continuation.resume(throwing: error!)
        }
    }
}
try await PHPhotoLibrary.shared().performChanges {
    let request = PHAssetChangeRequest(for: asset)
    request.contentEditingOutput = contentEditingOutput
}
Hi fellow iOS developers! 👋
I've written Swift code that converts a video (from a URL) into a Live Photo after downloading it. The conversion process seems fine, but when I try to set the generated Live Photo as a wallpaper on iOS 17+, it shows the message 'Motion not Available.'
Has anyone else experienced this issue or know why this might be happening? Could it be related to changes in iOS 17 Live Photo handling or the generated file structure? Any help or suggestions would be greatly appreciated! 🙏
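For context, the save step of such a conversion typically looks something like the sketch below; photoURL and videoURL are assumed to be the converted still/video pair, which must already carry matching content identifiers for Photos to treat them as a Live Photo:

import Photos

PHPhotoLibrary.shared().performChanges({
    let request = PHAssetCreationRequest.forAsset()
    // The still image and the paired video are added as two resources.
    request.addResource(with: .photo, fileURL: photoURL, options: nil)
    request.addResource(with: .pairedVideo, fileURL: videoURL, options: nil)
}, completionHandler: { success, error in
    print("Saved Live Photo:", success, error ?? "")
})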
I tried several times to use the PhotosPickerItem loadTransferable function with the goal of receiving some progress value, especially when loading large video media.
However, the Progress object returned by the function always has isIndeterminate == true, and so doesn't have any progress value to observe.
Is there some way to make it work? Some configuration I might have overlooked? Or is it just not working?
I might have to revert to the UIKit photo picker because of this.
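For reference, a minimal sketch of the call in question, assuming item is a PhotosPickerItem obtained from a SwiftUI PhotosPicker:

import PhotosUI

let progress: Progress = item.loadTransferable(type: Data.self) { result in
    switch result {
    case .success(let data):
        print("loaded \(data?.count ?? 0) bytes")
    case .failure(let error):
        print("load failed:", error)
    }
}
// Observed behavior: isIndeterminate is true, so there is no fraction to observe.
print("indeterminate:", progress.isIndeterminate)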
Hello!
I am trying to remove a photo from my Photos Library using PhotosUI. I run into this error when I attempt to delete it:
"Error returned from daemon: Error Domain=com.apple.accounts Code=7 "(null)""
No photos access scope requirements declared for changes
Failed to log access with error: access= accessor:<> identifier:82068C12-FD10-4DE2-9867-B4406FBFB706 kind:intervalEvent timestampAdjustment:0 visibilityState:0 assetIdentifierCount:0 accessCount:0 tccService:kTCCServicePhotos, error=Error Domain=NSCocoaErrorDomain Code=4097 "connection to service named com.apple.privacyaccountingd" UserInfo={NSDebugDescription=connection to service named com.apple.privacyaccountingd}
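For reference, the deletion itself is a standard change request; a minimal sketch, assuming asset is the PHAsset to remove. Note that deleting requires .readWrite authorization, which may be what the access-scope message above is pointing at (an assumption, not a confirmed cause):

import Photos

PHPhotoLibrary.shared().performChanges({
    // Deletion prompts the user for confirmation.
    PHAssetChangeRequest.deleteAssets([asset] as NSArray)
}, completionHandler: { success, error in
    print("deleted:", success, error ?? "")
})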
While customizing an image picker and using it, we found that metadata is not reflected correctly, so we are reporting it.
The situation is as follows.
The time or time zone of an image is changed in the Photos app.
(For example, the time zone of an image with an actual capture date of 2024:11:08 08:27:44 is changed so that it reads 2024:11:07 17:27:44.)
Image data is extracted from the PHAsset using PHImageManager.
The metadata is obtained from this image data.
The time zone information exposed in the Exif tags does not reflect the time or time zone changed in the Photos app.
let asset: PHAsset = ...
....
let options = PHImageRequestOptions()
options.isSynchronous = true
options.version = .current
options.deliveryMode = .highQualityFormat
options.resizeMode = .none
options.normalizedCropRect = .zero
options.isNetworkAccessAllowed = true
options.progressHandler = { progress, error, _, _ in }
PHImageManager.default().requestImageDataAndOrientation(for: asset, options: options) { imageData, uti, orientation, info in
    let cgImageSource = CGImageSourceCreateWithData(imageData! as CFData, nil)
    let properties = CGImageSourceCopyPropertiesAtIndex(cgImageSource!, 0, nil) as? Dictionary<String, Any>
    let exif = properties!["{Exif}"]
    let dictionary = exif as? Dictionary<String, Any>
}
Metadata Check
In this case, the change is reflected in the creationDate of the PHAsset, so it can be partially compensated for by forcibly replacing the metadata (a sketch follows below).
However, because PHAsset does not include time zone information, when the time zone is also changed it is impossible to calculate the correct time for that zone.
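A minimal sketch of that compensation, rewriting the Exif capture date from the asset's creationDate; the helper below is hypothetical and formats in the device's current time zone, since the original zone is unavailable:

import Photos

func compensatedExifDate(for asset: PHAsset) -> String? {
    guard let date = asset.creationDate else { return nil }
    let formatter = DateFormatter()
    formatter.dateFormat = "yyyy:MM:dd HH:mm:ss" // Exif DateTimeOriginal format
    // Assumption: the device's current time zone; PHAsset does not expose
    // the time zone the user picked in the Photos app.
    return formatter.string(from: date)
}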
PHPicker
This issue is resolved when using the PHPickerResult provided by PHPicker.
extension PhotosPickerViewController: PHPickerViewControllerDelegate {
    public func picker(_ picker: PHPickerViewController,
                       didFinishPicking results: [PHPickerResult]) {
        .....
        for result in results {
            let identifier = UTType.image.identifier
            if result.itemProvider.hasItemConformingToTypeIdentifier(identifier) {
                result.itemProvider.loadDataRepresentation(forTypeIdentifier: identifier) { data, error in
                    guard let data = data,
                          let cgImageSource = CGImageSourceCreateWithData(data as CFData, nil),
                          let properties = CGImageSourceCopyPropertiesAtIndex(cgImageSource, 0, nil) as? Dictionary<String, Any>,
                          let exif = properties["{Exif}"],
                          let dictionary = exif as? Dictionary<String, Any>
                    else {
                        return
                    }
                }
            }
        }
    }
}
Metadata Check
Question
I wonder why this happens, and whether this is expected behavior.
When using a customized picker instead of the system picker Apple provides, is there any way to work around this situation?
iOS (Official) Photos app can display some EXIF-related metadata (e.g. camera and lens info, ISO, shutter speed, F-number) even when photos are offloaded to iCloud and the device is not connected to internet (e.g. airplane mode).
However, with the Photos.framework, we need to download the photos to retrieve that metadata (which means it will not work in airplane mode).
I tried the following methods, but none of those worked when photos were offloaded to iCloud and the device was in airplane mode:
Requesting data with PHImageManager.default().requestImageDataAndOrientation
Result: It does not return Data if the photo is not stored locally on the device, even with options.deliveryMode = .fastFormat
Converting PHAsset#localIdentifier to an AssetsLibrary.framework URL (assets-library://asset/...)
(I am aware that AssetsLibrary.framework is deprecated, but this was just a test.)
Result: If PHImageManager does not return Data, ALAsset#defaultRepresentation().metadata() returns an empty NSDictionary. (A sketch of the first attempt follows below.)
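A sketch of the first attempt, assuming asset is an offloaded PHAsset and the device is in airplane mode:

import Photos

let options = PHImageRequestOptions()
options.deliveryMode = .fastFormat
options.isNetworkAccessAllowed = false // moot in airplane mode either way
PHImageManager.default().requestImageDataAndOrientation(for: asset, options: options) { data, _, _, info in
    // Observed result: data is nil when the original is offloaded to iCloud.
    print("data bytes:", data?.count ?? 0, "info:", info ?? [:])
}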
I have an app that allows you to edit your photos. To preserve HDR, I edit both the SDR image and gain map image, like so:
let sdrImage = CIImage(data: data, options: [.applyOrientationProperty: true])!
let gainMapImage = CIImage(data: data, options: [.applyOrientationProperty: true, .auxiliaryHDRGainMap: true])!
// edit them...
try CIContext().writeHEIFRepresentation(of: sdrImage, to: url, format: .RGBA8, colorSpace: colorSpace, options: [.hdrGainMapImage: gainMapImage])
I also support editing the still photo in Live Photos. To do this, you create a PHLivePhotoEditingContext, set the frameProcessor block (which gives you a CIImage that I edit when the frame.type is .photo), then create a PHContentEditingOutput and call saveLivePhoto. I’m not seeing any way to preserve HDR here. Interestingly, the frame processor is called twice with the .photo frame type, but I don’t see any difference between those images. How can I edit a gain map image to preserve HDR in the still photo of a Live Photo?
I've requested authorization in my main app.
PHPhotoLibrary.requestAuthorization(for: .readWrite) { status in }
I added the privacy description in both the main app and the extension.
But no matter whether the device is locked or unlocked, when I call
let fetchResult = PHAsset.fetchAssets(with: .image, options: nil)
let count = fetchResult.count
the count is always zero, even after a new photo is saved to the album in the same session.
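A sketch of a sanity check worth running in the extension itself; a .limited or .denied status there would explain a zero count even though the main app was granted access (an assumption, not a confirmed cause):

import Photos

let status = PHPhotoLibrary.authorizationStatus(for: .readWrite)
print("extension auth status:", status.rawValue)
if status == .authorized || status == .limited {
    let fetchResult = PHAsset.fetchAssets(with: .image, options: nil)
    print("count:", fetchResult.count)
}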
I have an app that edits photos in your library. When I call
try CIContext().writeHEIFRepresentation(of: editedImage, to: fileURL, format: .RGBA8, colorSpace: originalImage.colorSpace!)
The following is logged to the console:
writeImageAtIndex:1012: ⭕️ ERROR: 'App' is trying to save an opaque image (5712x4284) with 'AlphaLast'. This would unnecessarily increase the file size and will double (!!!) the required memory when decoding the image --> ignoring alpha.
What does that mean and how can I resolve it?
Xcode Version 16.0 (16A242d)
iOS 18.1 (22B82)
We are currently in the process of migrating our application from using ALAssetsLibrary to PHPhotoLibrary to ensure compatibility with the latest versions of iOS. However, we have noticed a discrepancy in the file sizes of images obtained using PHPhotoLibrary compared to those obtained using ALAssetsLibrary.
Specifically, we would like to understand the following points:
1. Reason for File Size Differences:
What are the reasons for the difference in file sizes between images obtained using ALAssetsLibrary and those obtained using PHPhotoLibrary?
Could you provide detailed information on the settings and options in PHPhotoLibrary that affect the size and quality of the images?
2. Optimal Settings:
What are the optimal settings in PHPhotoLibrary to obtain images with the same quality and file size as those obtained using ALAssetsLibrary?
If possible, could you provide code examples or recommended option settings?
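Not an official answer, but a sketch of the options that come closest to ALAssetRepresentation's original bytes: request the unadjusted original with no resizing, assuming asset is the PHAsset being migrated:

import Photos

let options = PHImageRequestOptions()
options.version = .original          // unmodified bytes, akin to defaultRepresentation
options.deliveryMode = .highQualityFormat
options.resizeMode = .none
options.isNetworkAccessAllowed = true
PHImageManager.default().requestImageDataAndOrientation(for: asset, options: options) { data, uti, _, _ in
    print("bytes:", data?.count ?? 0, "uti:", uti ?? "unknown")
}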
I have noticed a problem when a PHAsset creation request is made with the resource type PHAssetResourceType.photoProxy.
let creationRequest = PHAssetCreationRequest.forAsset()
creationRequest.addResource(with: .photoProxy, data: photoData, options: nil)
creationRequest.location = location
creationRequest.isFavorite = true
After successfully saving the resulting asset through PHPhotoLibrary.shared().performChanges, I could verify it in the Photos app.
I noticed that the created photo was initially marked as Favorite and that the location was added to the info as expected. The title of the image also changed from "Today" to "".
Next, the photo was refreshed, and the location data was purged. However, the title remained unchanged and still displays the .
This refresh was also observed in code: the PHPhotoLibraryChangeObserver protocol's func photoLibraryDidChange(_ changeInstance: PHChange) receives a change notification. The same asset has been changed, and there is no location information anymore. The isFavorite information persists correctly.
After debugging for a few hours, I discovered that changing the resource type to .photo fixes this issue. Location data is not removed in the Photos app, and no refresh callback is seen in func photoLibraryDidChange(_ changeInstance: PHChange).
I initially used .photoProxy because in the AVCapturePhotoCaptureDelegate implementation class, I always get the call in func photoOutput(_ output: AVCapturePhotoOutput, didFinishCapturingDeferredPhotoProxy deferredPhotoProxy: AVCaptureDeferredPhotoProxy?, error: Error?). So here is where I am capturing the photo data as photoData = deferredPhotoProxy?.fileDataRepresentation().
Hello,
I’ve encountered an issue where a green line appears only in the thumbnail view of an image in the iPhone photo gallery. The green line is not present in the actual image itself.
Here are some additional details:
• The issue occurs only when saving the image as a JPEG. When saved as PNG, the green line does not appear.
• The green line also shows up when using the PHAsset method requestImage(for:targetSize:). Depending on the targetSize, the resulting image may contain the green line.
• Interestingly, this issue does not appear on iPhone Xs Max running iOS 15.2.1. However, the green line does appear on iPhone 15 Pro Max running iOS 18.0.1 when viewing the same image.
I have attached the problematic image for your reference.
The following images are screen captures that show the issue occurring on my iPhones.
iPhone 15 Pro Max (iOS 18.0.1)
iPhone Xs Max (iOS 15.2.1)
Could this be related to a display or gallery app issue on iOS? Any advice or solutions would be greatly appreciated.
Thank you!
A very common use case in our iOS app is that users take a large number of pictures (about 30) in low-light conditions using the camera app, and immediately after, they try to upload them to our servers.
We measured the time to load photos from the PHPickerResult. For most photos, it takes less than 100 milliseconds, but for some of them, it takes several seconds—we even saw minutes in some extreme cases.
We believe this started happening with iOS 17, when deferred photo processing was introduced. If users take the pictures using our in-app camera experience, the options to customize the camera are enough to avoid the long waiting times. However, the majority of our users still prefer to take the photos with the camera app, and there is little we can do about that.
In the past few weeks, we tried many combinations:
Without asking for permissions, we tried loadFileRepresentation, loadData, and loadObject.
We explored the PHImageManager route, asking permissions and with different options for deliveryMode, resizeMode, version, isSynchronous, and allowSecondaryDegradedImage.
We also tried fetching the photos in parallel, with very bad results.
In summary, nothing helped the long waiting times—minutes in some cases.
The first question is then, is there anything we can do to ignore the post-processing of the photos and get them fast? We could accept the unprocessed images.
At a minimum, we would like to show our users what we are doing and why we are taking so much time. We tried to load thumbnails with loadPreviewImage and put a progress indicator on top. This method consistently gives us an error for all photos:
(lldb) p error.localizedDescription
(String) "Cannot load preview."
We can load thumbnails with the PHImageManager option, but it seems excessive to need to get permissions only for that.
Second question would be then, what can we do to load thumbnails without asking for permission?
I created a feedback report with a video and sample code to reproduce -> FB15493683
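For reference, a sketch of the thumbnail attempt described above, assuming result is one of the PHPickerResult values; this is the call that consistently returned "Cannot load preview." in our tests:

import PhotosUI
import UIKit

result.itemProvider.loadPreviewImage(options: nil) { item, error in
    if let image = item as? UIImage {
        // Show `image` under a progress indicator while the full photo loads.
        _ = image
    } else {
        print("preview error:", error ?? "unknown")
    }
}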
I'm using PhotoKit in macOS to add photos to the user's library. Experimenting with Shared Photo Library, it seems that these new photos always end up in the Personal Library, not the Shared Library. I'd like to get them into the Shared Photo Library somehow. Is this possible?
Things I've considered (the basic creation call is sketched after this list):
A variation/option for PHAssetChangeRequest.creationRequestForAsset: doesn't seem to exist
A property of PHAsset: can't find anything
A special PHAssetCollection that I could add to: again, doesn't seem to exist
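For reference, a minimal sketch of the creation call in question, assuming photoURL points at the image file; nothing on this API surface appears to target the Shared Library:

import Photos

PHPhotoLibrary.shared().performChanges({
    let request = PHAssetCreationRequest.forAsset()
    request.addResource(with: .photo, fileURL: photoURL, options: nil)
    // No known option here selects the iCloud Shared Photo Library;
    // the asset lands in the Personal Library.
}, completionHandler: { success, error in
    print("created:", success, error ?? "")
})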
Hello developer community.
I recently purchased my new iPhone 16 Pro Max; it is a premium device with great quality overall.
However, I am having big trouble shooting in ProRAW MAX (48 MP mode) with the native camera.
Just to be clear, the problem I will describe does not happen in third-party apps such as ProCam; it happens only with the native camera.
When I use ProRAW Max, take the photo, and view it in the gallery, the image can't load and render properly. When I zoom the image all the way in, I can see pixelated portions, defects, very low resolution, and excessive denoising.
For comparison, this does not occur with my previous iPhone 15 Pro Max, or when I capture photos with ProCam (same settings and configuration) on the 16 Pro Max: I take the photo, open the gallery, and see full detail when zoomed to 100%.
I tried formatting the phone and reinstalling the software via my Mac. I even searched some forums to see whether anyone has the same issue; the information available so far is very sparse.
I'm in contact with Apple support in my country (Portugal), and they escalated this problem to the engineers (that's what I've been told).
They ran all the tests remotely (via analysis and diagnostics) and told me that my phone is perfect in the hardware department. I will wait to be contacted again in the coming days.
I'm on iOS 18.0.1 (the latest software available at this time).
I tried multiple 16 Pro Max units from friends, family, and stores (roughly 10 units), and they all showed the exact same problem.
I'm a professional photographer, so I find this frustrating and unacceptable.
I would appreciate any additional suggestions or information. Thank you!
Hey, I have an app where the user selects a wallpaper for their iPhone. I want a feature where the user can set the wallpaper for the Lock Screen and Home Screen directly from the app itself, rather than downloading the image and setting it manually. From my research, there was a PhotoLibrary API containing PLWallpaperImageViewController.h that allowed you to set the wallpaper directly.
Thank You!
I'm updating my Photo Editing Extension to support HDR. To do this I set imageView.preferredImageDynamicRange = .high. But the user can turn off the option to view HDR photos in the complete dynamic range in Settings > Photos. When that option is off and you open a photo and tap the edit button, the photo does not appear in the full range, as expected; but when you select my app from More > Extensions, it unexpectedly does appear in the complete dynamic range. I need to set imageView.preferredImageDynamicRange = .standard when View Full HDR is off, but I don't see any way to read that setting in my PHContentEditingController.
In a photo editing extension, is it possible to display the photo in HDR? In this context you only have a placeholder UIImage and a PHContentEditingInput which has a displaySizeImage and fullSizeImageURL. The displaySizeImage has isHighDynamicRange false.
In the WWDC 24 session "Use HDR for dynamic image experiences in your app" it's noted this is how you save edits for Adaptive HDR:
SDR + HDR: writeHEIFRepresentation(of: sdrImage, to: url, colorSpace: p3Space, options: [.hdrImage: hdrImage])
SDR + Gain: writeHEIFRepresentation(of: sdrImage, to: url, colorSpace: p3Space, options: [.hdrGainMapImage: gainImage])
This won't compile because the format argument is missing. What format should be used?
In the WWDC 23 session "Support HDR images in your app", RGBAf, RGBAh, RGBA16, and RGB10 were mentioned, but I'm not sure which one to use.
If relevant, I'm editing photos from the user's photo library, so the image was probably taken on iPhone but perhaps not. Thanks!
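For reference, the call compiles once the format argument is supplied, as in the sketch below; .RGBA8 is purely a placeholder here, since which format is appropriate for Adaptive HDR output is exactly the open question:

import CoreImage

try CIContext().writeHEIFRepresentation(of: sdrImage,
                                        to: url,
                                        format: .RGBA8, // placeholder format
                                        colorSpace: p3Space,
                                        options: [.hdrGainMapImage: gainImage])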
[[PHImageManager defaultManager]
    requestAVAssetForVideo:asset
                   options:videoOptions
             resultHandler:^(AVAsset *_Nullable avAsset,
                             AVAudioMix *_Nullable audioMix,
                             NSDictionary *_Nullable info) {
    if ([avAsset isKindOfClass:[AVURLAsset class]]) {
        AVURLAsset *urlAsset = (AVURLAsset *)avAsset;
        NSURL *videoURL = urlAsset.URL;
        mediaInfo[@"path"] = videoURL.absoluteString;
    } else {
        // Failed to get video asset
        completion(nil);
    }
}];
Before iOS 18, I was able to access the video's AVAsset using the method above and play it from the URL, but starting with iOS 18, the following error appears:
'You don’t have permission. - The AVPlayerItem instance has failed with the error code 257 and domain "NSCocoaErrorDomain".'