Post not yet marked as solved
I have my change-observer callback registered, and it's called repeatedly even though nothing in the photo album has changed. I'm fetching the collections via
PHAssetCollection.fetchAssetCollections
In the callback, I replace my stored PHFetchResult and PHCollection with the ones passed in, and I process the changeDetails,
yet the callback keeps getting called.
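For reference, the pattern I'm using looks roughly like this (class and property names are illustrative). Note that changeDetails(for:) returns nil when a change doesn't affect the tracked fetch result, so callbacks that fire for unrelated library activity can be ignored:

```swift
import Photos

final class AlbumObserver: NSObject, PHPhotoLibraryChangeObserver {
    // Current fetch result; replaced whenever a change touches it.
    private var collections: PHFetchResult<PHAssetCollection>

    init(collections: PHFetchResult<PHAssetCollection>) {
        self.collections = collections
        super.init()
        PHPhotoLibrary.shared().register(self)
    }

    func photoLibraryDidChange(_ changeInstance: PHChange) {
        // Only act when the change actually affects our fetch result;
        // otherwise keep the old one and do no work.
        guard let details = changeInstance.changeDetails(for: collections) else { return }
        collections = details.fetchResultAfterChanges
        // ...process details.insertedObjects / removedObjects / changedObjects...
    }

    deinit {
        PHPhotoLibrary.shared().unregisterChangeObserver(self)
    }
}
```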
Any ideas?
Post not yet marked as solved
What technology is used to add an album to an app in iOS 16 by long-pressing it and dragging it into the app?
Post not yet marked as solved
Hi, I'm on an iPhone 14 Pro Max. My wife (on an iPhone 14 Pro) and I were eager to try iCloud Shared Photo Library. All was fine while we were on the beta, but when we switched back to the regular iOS 16 public release I noticed that our photos were no longer syncing. I tried to remove her from the photo library and start over after upgrading us both to iOS 16.1 build 20B5056e, but now nothing works.
We have 41,160 photos and 4,354 videos all told. Right now, the beta is hung up on my iPhone 14 Pro Max on "Deleting Shared Library", moving everything from my shared library back to my personal library so I can go back to normal. On her Pro, it's stuck on "Leaving Shared Library". In my personal library I can see 19,056 photos and 3,092 videos.
In my shared library at beta.icloud.com I can see 22,104 photos and 1,262 videos. These totals of course add up to the 41,160 photos and 4,354 videos. But it's stuck, and I don't know where: it will not transfer the remaining 22,104 photos and 1,262 videos back into my personal library so I can start fresh. I can still see them on beta.icloud.com, but my iPhone will not budge on moving them back, and neither will hers on leaving the shared library. Please tell me my photos are not lost or unrecoverable.
I've tried suggestions such as those at https://forums.macrumors.com/threads/shared-photo-library-risky.2359656/ posted by Fibrozyt, but to no avail. Nothing triggers the sync to resume, and so I have 22,104 photos and 1,262 videos that are stuck in some phantom zone "Shared Library" online now.
Please tell me a future iOS beta solves this, because we cannot sync our photos. I've rebooted both iPhones, killed off the apps, turned iCloud Photos off and back on on both devices, and looked for a way on beta.icloud.com to move the photos back to my personal library so that we could start again. Nothing is working. HELP, please. These are precious memories of course: my wife, our family, our wedding, our kids, our homes, our lives.
Post not yet marked as solved
We are developing a widget for an iOS app that displays photos from the user's local photo library. To get photos, we use the PHImageManager.requestImage method with the following parameters:
let requestOptions = PHImageRequestOptions()
requestOptions.isNetworkAccessAllowed = false
requestOptions.isSynchronous = true
requestOptions.resizeMode = .exact
requestOptions.deliveryMode = .highQualityFormat

// context is the TimelineProviderContext of the widget
PHImageManager.default().requestImage(for: asset,
                                      targetSize: context.displaySize,
                                      contentMode: .aspectFill,
                                      options: requestOptions) { image, info in
    // ...
}
We load about 12 images per update cycle, calling the requestImage method that many times. While executing this code we can regularly observe that a single requestImage call increases memory consumption by 20 MB, even when loading an image whose size is about 3 MB. (The image size was retrieved from PHAssetResource.) The memory spikes before the callback closure is executed.
We tinkered with displaySize and the PHImageRequestOptions and tried to reduce our overall memory consumption, to no avail. We integrated a DispatchGroup in our code to ensure that requesting images is strictly synchronous, and we do so in an autoreleasepool to free up memory after processing the image data.
I'd like to stress that we are reasonably sure the memory spike does not occur in the closure that receives the requested image, but in the time between calling the requestImage method and the callback. We tested this by placing a breakpoint on the line where we call requestImage and a breakpoint on the first line of the callback closure. The first breakpoint stops execution; when continuing execution, the process ends immediately, we get a "" warning, and the second breakpoint is never hit.
This is a huge problem for us because iOS terminates our widget process every time the memory consumption spikes due to exceeding the 30MB memory limit (our widget consumes about 13-15MB of memory while updating the timeline).
The issue described above was observed on an iPhone 11, iOS 16.4 and occurs while trying to load JPG photos between 0.7-4.0MB.
We kindly ask for help on why memory spikes when using PHImageManager.requestImage, how to prevent this or if this is a known issue.
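One workaround we are considering (a sketch, not yet verified to avoid the spike; the function name and option values are our assumptions) is to skip requestImage entirely, fetch the encoded bytes with requestImageDataAndOrientation, and downsample with ImageIO, which can decode at thumbnail size instead of materializing the full-resolution bitmap:

```swift
import Photos
import ImageIO
import UIKit

/// Downsample an asset to `maxPixelSize` without decoding the full-size image.
func downsampledImage(for asset: PHAsset, maxPixelSize: CGFloat,
                      completion: @escaping (UIImage?) -> Void) {
    let options = PHImageRequestOptions()
    options.isNetworkAccessAllowed = false
    options.isSynchronous = true
    PHImageManager.default().requestImageDataAndOrientation(for: asset, options: options) { data, _, _, _ in
        guard let data = data,
              let source = CGImageSourceCreateWithData(data as CFData, nil) else {
            completion(nil)
            return
        }
        // Decode directly at thumbnail size; the full bitmap is never created.
        let thumbOptions: [CFString: Any] = [
            kCGImageSourceCreateThumbnailFromImageAlways: true,
            kCGImageSourceCreateThumbnailWithTransform: true,
            kCGImageSourceShouldCacheImmediately: true,
            kCGImageSourceThumbnailMaxPixelSize: maxPixelSize
        ]
        guard let cgImage = CGImageSourceCreateThumbnailAtIndex(source, 0, thumbOptions as CFDictionary) else {
            completion(nil)
            return
        }
        completion(UIImage(cgImage: cgImage))
    }
}
```

This still loads the compressed file data (a few MB) into memory, but avoids the cost of decoding a full-resolution bitmap, which for a 12 MP JPEG is on the order of 48 MB.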
Post not yet marked as solved
item.loadTransferable(type: Data.self) { result in
    switch result {
    case .success(let data):
        guard let data = data, let image = UIImage(data: data) else { break }
        // The completion handler is not guaranteed to run on the main queue.
        DispatchQueue.main.async {
            imageView.image = image
        }
    case .failure(let error):
        print("load failed: \(error)")
    }
}
I load RAW-format photos with the code above, but the displayed image is very blurry and garbled.
Not all RAW-format photos come out like this. What could be the reason?
I use PHPicker to let users import photos, but UIImage doesn't support .AVIF images, so I want to get the original data of the .AVIF file. This is my code:
func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
    picker.dismiss(animated: true)
    for image in results {
        loadImagePickerResult(image: image)
    }
}

func loadImagePickerResult(image: PHPickerResult) {
    if image.itemProvider.canLoadObject(ofClass: UIImage.self) {
        image.itemProvider.loadObject(ofClass: UIImage.self) { [weak self] newImage, error in
            guard let self = self else { return }
            if error != nil {
                return
            } else if let needAddImage = newImage as? UIImage {
                let imageItem = ContentImageItem()
                imageItem.image = needAddImage
                self.viewModel.selectedImageList.append(imageItem)
                DispatchQueue.main.async {
                    self.scrollImageView.reloadData()
                    self.checkConfirmState()
                }
            }
        }
    } else if image.itemProvider.hasItemConformingToTypeIdentifier(kUTTypeImage as String) {
        image.itemProvider.loadItem(forTypeIdentifier: kUTTypeImage as String, options: nil) { [weak self] item, error in
            guard let self = self else { return }
            guard let url = item as? URL else { return }
            var imageData: Data?
            do {
                imageData = try Data(contentsOf: url, options: [.mappedIfSafe, .alwaysMapped])
            } catch {
                print("failed to read data: \(error)")
            }
            guard let selectedImageData = imageData else { return }
            /// selectedImageData is empty data
        }
    }
}
When I choose an .AVIF picture, the itemProvider can load the image via the "kUTTypeImage" type identifier, and I successfully get the local path of the picture. But when I use Data(contentsOf:) to read the original data, I only get empty data. Is there a problem with this code? Does anyone have experience handling this?
FileManager.default.contents(atPath: url.path) and NSData(contentsOf:) also return empty Data.
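One alternative I'm considering is loadFileRepresentation(forTypeIdentifier:), which hands the completion handler a temporary file URL that is only valid while the handler runs, so the bytes must be copied out immediately (sketch, untested with .AVIF):

```swift
import PhotosUI
import UniformTypeIdentifiers

func loadOriginalData(from result: PHPickerResult, completion: @escaping (Data?) -> Void) {
    let provider = result.itemProvider
    provider.loadFileRepresentation(forTypeIdentifier: UTType.image.identifier) { url, error in
        // The temporary file is deleted when this closure returns,
        // so read the data here rather than storing the URL.
        guard let url = url, error == nil else {
            completion(nil)
            return
        }
        completion(try? Data(contentsOf: url))
    }
}
```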
Post not yet marked as solved
I have been unable to capture Live Photos using UIImagePickerController. I can capture still photos and even video (which is not my scenario but I checked just to make sure), but the camera does not capture live photos. The documentation suggests it should (source):
To obtain the motion and sound content of a live photo for display (using the PHLivePhotoView class), include the kUTTypeImage and kUTTypeLivePhoto identifiers in the allowed media types when configuring an image picker controller. When the user picks or captures a Live Photo, the editingInfo dictionary contains the livePhoto key, with a PHLivePhoto representation of the photo as the corresponding value.
I've set up my controller:
let camera = UIImagePickerController()
camera.sourceType = .camera
camera.mediaTypes = [UTType.image.identifier, UTType.livePhoto.identifier]
camera.delegate = context.coordinator
In the delegate I check for the Live Photo:
func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey : Any]) {
if let live = info[.livePhoto] as? PHLivePhoto {
// handle live photo
} else if let takenImage = info[.originalImage] as? UIImage, let metadata = info[.mediaMetadata] as? [AnyHashable:Any] {
// handle still photo
}
}
But I never get the Live Photo.
I've tried adding NSMicrophoneUsageDescription to the info.plist thinking it needs permissions for the microphone, but that did not help. Of course, I've added the NSCameraUsageDescription to give camera permissions.
Has anyone successfully captured Live Photos using UIImagePickerController?
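As a diagnostic, it may be worth checking whether the camera source even reports the live-photo type as capturable on the device (a short sketch):

```swift
import UIKit
import UniformTypeIdentifiers

// Lists the media types the camera source can actually capture on this device.
if let types = UIImagePickerController.availableMediaTypes(for: .camera) {
    print(types)
    print("Live Photo capturable:", types.contains(UTType.livePhoto.identifier))
}
```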
Since iOS 16 introduced shared albums, is there an API for developers to access the assets inside a shared album?
Post not yet marked as solved
Hey guys. I am trying to get the URL of an image selected from an imagePicker in SwiftUI (target: iPhone). Any ideas on how to do this?
Post not yet marked as solved
Is this accessible from Swift directly?
Visual Look Up
Lift subject from background
Lift the subject from an image or isolate the subject by removing the background. This works in Photos, Screenshot, Quick Look, Safari, and more.
Source: macOS Ventura Preview - New Features - Apple
I see that Shortcuts now has a native Remove Background command that wasn't there in iOS 15 or macOS 12. Is there any way to call that from Swift besides x-callback URL schemes?
Post not yet marked as solved
My app has crashed a lot on iOS 14 since I added photo library authorization.
The Firebase stack trace is below:
Crashed: NSPersistentStoreCoordinator 0x28031ddc0
EXC_BREAKPOINT 0x00000001976f9cd0
0   libdispatch.dylib        __DISPATCH_WAIT_FOR_QUEUE__ + 492
1   libdispatch.dylib        _dispatch_sync_f_slow + 148
2   AssetsLibraryServices    -[PLAssetsdClient libraryClient] + 168
3   PhotoLibraryServices     -[PLXPCPhotoLibraryStoreEndpointFactory newEndpoint] + 32
4   CoreData                 -[NSXPCStoreConnection createConnectionWithOptions:] + 572
5   CoreData                 -[NSXPCStoreConnection initForStore:] + 100
6   CoreData                 -[NSXPCStoreConnectionManager initForStore:] + 216
7   CoreData                 -[NSXPCStore initWithPersistentStoreCoordinator:configurationName:URL:options:] + 1068
8   CoreData                 __91-[NSPersistentStoreCoordinator addPersistentStoreWithType:configuration:URL:options:error:]_block_invoke + 1452
9   CoreData                 gutsOfBlockToNSPersistentStoreCoordinatorPerform + 208
10  libdispatch.dylib        _dispatch_client_callout + 20
11  libdispatch.dylib        _dispatch_lane_barrier_sync_invoke_and_complete + 60
12  CoreData                 _perform + 184
13  CoreData                 -[NSPersistentStoreCoordinator addPersistentStoreWithType:configuration:URL:options:error:] + 484
14  PhotoLibraryServices     -[PLPersistentContainer _configureXPCPersistentStoreCoordinator:error:] + 672
15  PhotoLibraryServices     -[PLPersistentContainer newSharedPersistentStoreCoordinator] + 172
16  PhotoLibraryServices     __57-[PLPersistentContainer sharedPersistentStoreCoordinator]_block_invoke + 60
17  AssetsLibraryServices    PLResultWithUnfairLock + 60
18  PhotoLibraryServices     -[PLPersistentContainer sharedPersistentStoreCoordinator] + 104
19  PhotoLibraryServices     -[PLPhotoLibraryBundle newChangeHandlingContainer] + 116
20  PhotoLibraryServices     __60-[PLPhotoLibraryBundle initWithLibraryURL:bundleController:]_block_invoke.66 + 48
21  AssetsLibraryServices    __27-[PLLazyObject objectValue]_block_invoke + 52
22  AssetsLibraryServices    PLResultWithUnfairLock + 60
23  AssetsLibraryServices    -[PLLazyObject objectValue] + 104
24  PhotoLibraryServices     -[PLPhotoLibraryBundle distributeChangesSinceLastCheckpoint] + 28
25  Photos                   __93-[PHPhotoLibrary _commitTransactionOnExecutionContext:withInstrumentation:completionHandler:]_block_invoke + 224
26  Photos                   __94-[PHPhotoLibrary _sendChangesRequest:onExecutionContext:withInstrumentation:retryCount:reply:]_block_invoke + 324
27  Photos                   __83-[PHPhotoLibrary _sendChangesRequest:onExecutionContext:withInstrumentation:reply:]_block_invoke + 92
28  AssetsLibraryServices    __70+[PLAssetsdPhotoKitClient sendChangesRequest:usingProxyFactory:reply:]_block_invoke + 120
29  AssetsLibraryServices    -[PLAssetsdServiceProxyFactory _inq_createServiceProxyWithCallStackSymbols:errorHandler:] + 1188
30  AssetsLibraryServices    __74-[PLAssetsdServiceProxyFactory remoteObjectProxyWithErrorHandler:handler:]_block_invoke_2 + 64
31  libdispatch.dylib        _dispatch_call_block_and_release + 32
32  libdispatch.dylib        _dispatch_client_callout + 20
33  libdispatch.dylib        _dispatch_lane_serial_drain + 620
34  libdispatch.dylib        _dispatch_lane_invoke + 404
35  libdispatch.dylib        _dispatch_workloop_worker_thread + 780
36  libsystem_pthread.dylib  _pthread_wqthread + 276
37  libsystem_pthread.dylib  start_wqthread + 8
And another suspicious thread stack below:
com.apple.photos.accessCallbacks
0   libsystem_kernel.dylib   __ulock_wait + 8
1   libdispatch.dylib        _dlock_wait + 56
2   libdispatch.dylib        _dispatch_group_wait_slow + 60
3   libdispatch.dylib        dispatch_block_wait + 308
4   PhotoLibraryServices     -[PLLimitedLibraryPicker _presentLimitedLibraryPickerFromViewController:options:] + 572
5   AssetsLibraryServices    PLPresentLimitedLibraryPicker + 308
6   AssetsLibraryServices    -[PLPrivacy _checkAuthStatusForPhotosAccessScope:promptIfUnknown:resultHandler:] + 1008
7   libdispatch.dylib        _dispatch_call_block_and_release + 32
8   libdispatch.dylib        _dispatch_client_callout + 20
9   libdispatch.dylib        _dispatch_lane_serial_drain + 620
10  libdispatch.dylib        _dispatch_lane_invoke + 456
11  libdispatch.dylib        _dispatch_workloop_worker_thread + 780
12  libsystem_pthread.dylib  _pthread_wqthread + 276
13  libsystem_pthread.dylib  start_wqthread + 8
It looks like a deadlock issue.
Anybody any clues?
Thanks a lot!
Post not yet marked as solved
I have a SwiftUI application that processes image files from Fujifilm cameras, both RAW and JPEG. When the image files are imported into the Photos app, they are stacked so that you see only a single image when both RAW and JPEG versions of the same image exist. Using Swift, I cannot work out how to access both files: with the following code you get the JPEG file (or the RAW file if there is only a single file), but if there are both RAW and JPEG files you only get the JPEG file.
import SwiftUI
import PhotosUI

struct PhotoPicker1: UIViewControllerRepresentable {
    typealias UIViewControllerType = PHPickerViewController

    @ObservedObject var mediaItems: PickedMediaItems
    var didFinishPicking: (_ didSelectItems: Bool) -> Void

    func makeUIViewController(context: Context) -> PHPickerViewController {
        var config = PHPickerConfiguration(photoLibrary: .shared())
        config.filter = .any(of: [.images])
        config.selectionLimit = 0
        config.preferredAssetRepresentationMode = .current
        let controller = PHPickerViewController(configuration: config)
        controller.delegate = context.coordinator
        return controller
    }

    func updateUIViewController(_ uiViewController: PHPickerViewController, context: Context) { }

    func makeCoordinator() -> Coordinator { Coordinator(with: self) }

    class Coordinator: PHPickerViewControllerDelegate {
        var photoPicker1: PhotoPicker1

        init(with photoPicker1: PhotoPicker1) {
            self.photoPicker1 = photoPicker1
        }

        func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
            photoPicker1.didFinishPicking(!results.isEmpty)
            guard !results.isEmpty else {
                return
            }
            for result in results {
                let itemProvider = result.itemProvider
                let typeIdentifier = itemProvider.registeredTypeIdentifiers.first ?? ""
                print("typeIdentifier: \(typeIdentifier)")
            }
        }
    }
}
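Since the picker is created with photoLibrary: .shared(), each result should carry an assetIdentifier, and one avenue I'm exploring is enumerating the asset's PHAssetResources, where a RAW+JPEG pair should appear as separate resources (sketch; the function name is mine, and it requires library read access):

```swift
import Photos

// Inspect every resource (JPEG, RAW, adjustment data, ...) behind a picked asset.
func listResources(forAssetIdentifier identifier: String) {
    let assets = PHAsset.fetchAssets(withLocalIdentifiers: [identifier], options: nil)
    guard let asset = assets.firstObject else { return }
    for resource in PHAssetResource.assetResources(for: asset) {
        // A RAW+JPEG pair typically shows up as .photo plus .alternatePhoto.
        print(resource.type.rawValue, resource.uniformTypeIdentifier, resource.originalFilename)
    }
}
```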
Post not yet marked as solved
Dear Experts,
PHAsset.creationDate is an NSDate, which does not have a timezone associated with it, right?
Consider a photo viewer app. If I take a photo of the sunrise at 0600 local time while I am away, when I get home and view the photo in the app, I believe I want the timestamp shown with the photo to be 0600. Do you agree?
But NSDate is just a time-point, and I don't think Foundation (or anything else in iOS) has a type that combines a time-point with a time zone. Nor does PHAsset have any other useful attributes - unless I were to determine the time zone from the location!
Am I missing anything?
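For what it's worth, the location-based idea can be sketched like this: reverse-geocode the asset's location and use the placemark's timeZone when formatting creationDate. It is asynchronous, rate-limited, and only works when the photo has location data (function name is illustrative):

```swift
import Photos
import CoreLocation

func localTimestamp(for asset: PHAsset, completion: @escaping (String?) -> Void) {
    guard let date = asset.creationDate, let location = asset.location else {
        completion(nil)
        return
    }
    CLGeocoder().reverseGeocodeLocation(location) { placemarks, _ in
        let formatter = DateFormatter()
        formatter.dateStyle = .medium
        formatter.timeStyle = .short
        // Fall back to the device's zone if geocoding yields no time zone.
        formatter.timeZone = placemarks?.first?.timeZone ?? .current
        completion(formatter.string(from: date))
    }
}
```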
Post not yet marked as solved
I am using the code below to import images into a Photos library. The code works for JPEGs and HEICs, but when it encounters Apple ProRaw (DNG) files it gives the following error messages:
Error Domain=PHPhotosErrorDomain Code=3302
findWriterForTypeAndAlternateType:119: unsupported file format 'com.adobe.raw-image'
Here is the code:
func createPhotoOnAlbum(photo: UIImage, album: PHAssetCollection) {
    PHPhotoLibrary.shared().performChanges({
        // Request creating an asset from the image
        let createAssetRequest = PHAssetChangeRequest.creationRequestForAsset(from: photo)
        // Request editing the album
        guard let albumChangeRequest = PHAssetCollectionChangeRequest(for: album) else {
            // Album change request has failed
            print("album change request has failed")
            return
        }
        // Get a placeholder for the new asset and add it to the album editing request
        guard let photoPlaceholder = createAssetRequest.placeholderForCreatedAsset else {
            // Photo placeholder is nil
            return
        }
        albumChangeRequest.addAssets([photoPlaceholder] as NSArray)
    }, completionHandler: { success, error in
        if success {
            // Saved successfully!
            print("saved successfully")
            self.importCount += 1
        } else if let e = error {
            // Save photo failed with error
            print("error saving: \(e)")
        } else {
            // Save photo failed with no error
            print("error -> ")
        }
    })
}
These are definitely unedited ProRaw DNGs.
On a Mac they can be imported into a Photos Library using the "Import" menu command.
On an iPad they can be brought into the Photos Library by selecting them in the file system on the iPad, tapping Share, then tapping Save Images.
Thank you for any assistance with this.
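For comparison, here is a variant that hands Photos the DNG file itself instead of a decoded UIImage, via PHAssetCreationRequest (a sketch; I haven't confirmed it clears the com.adobe.raw-image error, and the function name is mine):

```swift
import Photos

// Import a file (e.g. a ProRaw DNG) by URL so Photos keeps the original bytes
// rather than re-encoding a decoded UIImage.
func importFile(at url: URL, into album: PHAssetCollection,
                completion: @escaping (Bool, Error?) -> Void) {
    PHPhotoLibrary.shared().performChanges({
        let creationRequest = PHAssetCreationRequest.forAsset()
        creationRequest.addResource(with: .photo, fileURL: url, options: nil)
        // Add the new asset to the target album via its placeholder.
        if let placeholder = creationRequest.placeholderForCreatedAsset,
           let albumRequest = PHAssetCollectionChangeRequest(for: album) {
            albumRequest.addAssets([placeholder] as NSArray)
        }
    }, completionHandler: completion)
}
```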
Post not yet marked as solved
Hello. It looks like there is some kind of block on using UIImagePickerController for full-screen views, and only the popup presentation is available. Why? We can use photos from the library and try to create a custom grid, but there are a lot of already-solved issues around memory usage and so on.
I wrote my own scrolling and pinch-to-zoom tool, like the native photo grid, but it works on UIImageView and only with a modest number of UIImageViews (creating more than several thousand on old iPhones like the 6s starts to become problematic). So I decided to build a more specialized tool: I create a UIImage and set it as the cgImage for the view's content, then update this image and the view content afterwards. But it takes a really long time to draw thousands of images, even with PHCachingImageManager. It looks like imageCachingManager.requestImage takes too much time even with ready-to-use thumbnails, or there is an error in the startCachingImages tool, or it's a bad idea to draw images one by one in the background, and there are lots of other questions and problems. Why, really? It's a complete tool and you just blocked it.
Or is there a complete, ready-to-use example somewhere of how to implement a grid of photos like the native app on a main view, without popups and so on?
https://developer.apple.com/documentation/uikit/uiimagepickercontroller
Post not yet marked as solved
Users can run our apps on Macs with Apple Silicon via the "iPad Apps on Mac" feature.
The apps use PHPhotoLibrary.requestAuthorization(for: .addOnly, handler: callback) to request write-only access to the user's Photo Library during image export. This works as intended on macOS, but a huge problem arises when the user denies access (by accident or intentionally) and later decides that they want us to add their image to Photos: There is no way to grant this permission again.
In System Preferences → Privacy & Security → Photos, the app is just not listed – in fact, none of the "iPad Apps on Mac" apps appear here.
Not even tccutil reset all my.bundle.id works. It just reports
tccutil: Failed to reset all approval status for my.bundle.id.
Uninstalling, restarting the Mac, and reinstalling the app also doesn't work. The system seems to remember the initial decision.
Is this an oversight in the integration of those apps with macOS, or are we missing something fundamental here? Is there maybe a way to prompt the user again?
Post not yet marked as solved
I cannot find any documentation on isPrivacySensitiveAlbum. I've granted my app access to all photos. Not sure what else to try.
Code that triggers the crash:
let options = PHFetchOptions()
options.fetchLimit = 1
let assetColl = PHAssetCollection.fetchAssetCollections(withLocalIdentifiers: [localId], options: options)
if assetColl.count > 0 {
    if let asset = PHAsset.fetchKeyAssets(in: assetColl.firstObject!, options: options) {
        // stack trace from here on
    }
}
2023-04-15 06:34:41.628537-0700 DPF[33615:6484880] -[PHCollectionList isPrivacySensitiveAlbum]: unrecognized selector sent to instance 0x7ff09232aec0
2023-04-15 06:34:41.632378-0700 DPF[33615:6484880] *** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '-[PHCollectionList isPrivacySensitiveAlbum]: unrecognized selector sent to instance 0x7ff09232aec0'
*** First throw call stack:
(
0   CoreFoundation   0x00007ff80045478b __exceptionPreprocess + 242
1   libobjc.A.dylib  0x00007ff80004db73 objc_exception_throw + 48
2   CoreFoundation   0x00007ff8004638c4 +[NSObject(NSObject) instanceMethodSignatureForSelector:] + 0
3   CoreFoundation   0x00007ff800458c66 ___forwarding___ + 1443
4   CoreFoundation   0x00007ff80045ae08 _CF_forwarding_prep_0 + 120
5   Photos           0x00007ff80b8480e1 +[PHAsset fetchKeyAssetsInAssetCollection:options:] + 86
6   DPF              0x0000000100791029 $s3DPF16AlbumListFetcherV22loadKeyImageForLocalIdySo7UIImageCSgSSYaFTY0_ + 569
Post not yet marked as solved
Hello,
I am trying to use the new SwiftUI method called ImageRenderer to render any SwiftUI view into an image. However, I am encountering an issue when using it with AsyncImage:
AsyncImage(url: URL(string: imageURL)) { phase in
    if let sampleImage = phase.image {
        sampleImage.onAppear {
            Task {
                render(content: sampleImage)
            }
        }
    } else if phase.error != nil {
        Color.red  // Indicates an error.
    } else {
        Color.blue // Acts as a placeholder.
    }
}

@MainActor
private func render(content: some View) {
    let renderer = ImageRenderer(content: content)
    renderer.scale = 500
    if let cgImage = renderer.cgImage {
        print("Rendering")
        ImageSaver.shared.saveCGImageToDisk(cgImage)
    }
}

public func saveCGImageToDisk(_ cgImage: CGImage) {
    let image = UIImage(cgImage: cgImage, scale: 500, orientation: .down)
    PHPhotoLibrary.shared().performChanges({
        PHAssetChangeRequest.creationRequestForAsset(from: image)
    }) { success, error in
        if success {
            print("Image saved to photo library")
        } else {
            print("Error saving image: \(error?.localizedDescription ?? "unknown error")")
        }
    }
}
When rendering a different image, such as Image(systemName: "pencil"), everything works as expected. However, when using AsyncImage, it seems that the rendering process is not working as intended.
Here are some screenshots:
What could be causing this issue when rendering AsyncImages with ImageRenderer, and how can it be resolved? Any insights or suggestions are greatly appreciated.
Thank you in advance.
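In case it's useful, a workaround I'm testing sidesteps AsyncImage entirely: fetch the image data directly, build a UIImage, and hand ImageRenderer a plain, fixed-size Image view (sketch; the function name is mine):

```swift
import SwiftUI
import UIKit

@MainActor
func renderRemoteImage(from url: URL) async throws -> CGImage? {
    let (data, _) = try await URLSession.shared.data(from: url)
    guard let uiImage = UIImage(data: data) else { return nil }
    // Give the renderer a concrete, fixed-size view rather than AsyncImage's phase view.
    let view = Image(uiImage: uiImage)
        .resizable()
        .frame(width: uiImage.size.width, height: uiImage.size.height)
    let renderer = ImageRenderer(content: view)
    return renderer.cgImage
}
```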
Post not yet marked as solved
Hey all, a question about PhotosUI. I see a weird behaviour where the completion handler for presentLimitedLibraryPicker gets called twice. The first call returns only one identifier, and the second one comes with all the selected photos.
Here is the sample code:
import UIKit
import PhotosUI

class ViewController: UIViewController {
    let button = UIButton(type: .system)

    override func viewDidLoad() {
        super.viewDidLoad()
        view.backgroundColor = .systemBackground
        view.addSubview(button)
        PHPhotoLibrary.requestAuthorization(for: .readWrite) { status in }
        button.frame = CGRect(origin: view.center, size: .init(width: 200, height: 50))
        button.center = view.center
        button.setTitle("Open Picker", for: .normal)
        button.addAction(UIAction(handler: { _ in
            PHPhotoLibrary.shared().presentLimitedLibraryPicker(from: self) { ids in
                print(ids)
            }
        }), for: .touchUpInside)
    }
}
After selecting 3 photos, the console output looks like:
["CC95F08C-88C3-4012-9D6D-64A413D254B3/L0/001"] //1 item
["CC95F08C-88C3-4012-9D6D-64A413D254B3/L0/001", "ED7AC36B-A150-4C38-BB8C-B6D696F4F2ED/L0/001", "99D53A1F-FEEF-40E1-8BB3-7DD55A43C8B7/L0/001"] // 3 items
Post not yet marked as solved
I am working on an app that lets a user upload or take a photo of a clothing item (shirt, pants, shorts, hoodie, etc.) and then determines the color or colors of the item. Currently, the app just takes a picture of the clothing item, and everything in the background is included as well. Once the user selects or takes a picture, is there a way for the app to automatically select the subject (the clothing item) so that the color calculator is not confused by background colors?
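If targeting iOS 17 or later, one option worth considering is Vision's subject-lifting request, which produces a mask of the foreground subject before you sample colors (a sketch under that assumption; the function name is illustrative):

```swift
import Vision
import CoreVideo

// Returns a pixel buffer containing only the lifted subject, background removed.
func subjectOnlyImage(from cgImage: CGImage) throws -> CVPixelBuffer? {
    let request = VNGenerateForegroundInstanceMaskRequest()
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try handler.perform([request])
    guard let observation = request.results?.first else { return nil }
    // Crop to the detected subject instances and zero out everything else.
    return try observation.generateMaskedImage(ofInstances: observation.allInstances,
                                               from: handler,
                                               croppedToInstancesExtent: true)
}
```

The resulting buffer can then be fed to the color calculator so background pixels never enter the computation.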