PhotoKit


Work with image and video assets managed by the Photos app, including those from iCloud Photos and Live Photos, using PhotoKit.

PhotoKit Documentation

Posts under PhotoKit tag

99 Posts
Post marked as solved
2 Replies
1.4k Views
Hello, I'm currently stuck trying to load a video previously picked by a PHPicker. In the photos you can see the current views. The VideoPlayer view stays unresponsive, but in the first frames after the picker disappears you can see the thumbnail and a play button. What am I doing wrong? Should I load the file differently? This is my picker:

```swift
struct VideoPicker: UIViewControllerRepresentable {
    @Binding var videoURL: String?

    func makeUIViewController(context: Context) -> PHPickerViewController {
        var config = PHPickerConfiguration()
        config.filter = .videos
        let picker = PHPickerViewController(configuration: config)
        picker.delegate = context.coordinator
        return picker
    }

    func updateUIViewController(_ uiViewController: PHPickerViewController, context: Context) {}

    func makeCoordinator() -> Coordinator {
        Coordinator(self)
    }

    class Coordinator: NSObject, PHPickerViewControllerDelegate {
        let parent: VideoPicker

        init(_ parent: VideoPicker) {
            self.parent = parent
        }

        func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
            picker.dismiss(animated: true) {
                // do something on dismiss
            }
            guard let provider = results.first?.itemProvider else { return }
            provider.loadFileRepresentation(forTypeIdentifier: "public.movie") { url, error in
                guard let url = url else { return }
                self.parent.videoURL = url.absoluteString
                print(url)
                print(FileManager.default.fileExists(atPath: url.path))
            }
        }
    }
}
```

I'm totally able to get the URL (a local URL, e.g. file:///private/var/mobile/Containers/Data/Application/22126131-CBF4-4CAF-B943-22540F1096E1/tmp/.com.apple.Foundation.NSItemProvider. ) But for the life of me, the VideoPlayer won't play it:

```swift
struct VideoView: View {
    @Binding var videoURL: String?
    @Binding var showVideoPicker: Bool

    var body: some View {
        if let videoURL = videoURL {
            VideoPlayer(player: AVPlayer(url: URL(fileURLWithPath: videoURL)))
                .frame(width: 100, height: 100, alignment: .center)
                .clipShape(RoundedRectangle(cornerRadius: 16))
                .onLongPressGesture {
                    generator.feedback.notificationOccurred(.success)
                    showVideoPicker.toggle()
                }
        } else {
            Text("...")
        }
    }
}
```

Maybe somebody can point me in the right direction, because every tutorial uses assets bundled with the app to play a video. I want to use videos from Apple's Photos app. The videoURL is a @State in my ContentView; it gets updated through the VideoPicker. Sorry for the formatting, this is my first post.
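A common cause of this symptom is that the URL handed to loadFileRepresentation points at a temporary file the system deletes as soon as the completion handler returns, so the player later has nothing to open. A minimal sketch of a helper (the name `stableCopy` is illustrative, not from the post) that copies the movie to a location the app owns before the handler returns; the resulting URL can then be stored and handed to AVPlayer:

```swift
import Foundation

// Sketch: copy the provider's short-lived temporary file somewhere we own
// before the completion handler returns, then use that stable URL.
func stableCopy(of temporaryURL: URL) throws -> URL {
    let destination = FileManager.default.temporaryDirectory
        .appendingPathComponent(UUID().uuidString)
        .appendingPathExtension(temporaryURL.pathExtension)
    // Remove any stale file at the destination, then copy synchronously
    // while the source still exists.
    if FileManager.default.fileExists(atPath: destination.path) {
        try FileManager.default.removeItem(at: destination)
    }
    try FileManager.default.copyItem(at: temporaryURL, to: destination)
    return destination
}
```

Inside the loadFileRepresentation callback this would be `let copy = try stableCopy(of: url)`, storing `copy.path` rather than `url.absoluteString` — note that the posted code stores an absoluteString (a `file://…` string) but later rebuilds the URL with `URL(fileURLWithPath:)`, which expects a plain path.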
Posted by brunzbus. Last updated.
Post not yet marked as solved
3 Replies
723 Views
I wrote the code below to save an image obtained from PHImageManager, but it crashes. The data comes from PHImageManager.default().requestImageDataAndOrientation.

Crashing code:

```swift
guard let cgImage = UIImage(data: data)?.cgImage else { return }
let metadata = ciImage.properties
let destination: CGImageDestination = CGImageDestinationCreateWithURL(url as CFURL, uti as CFString, 1, nil)!
CGImageDestinationAddImage(destination, cgImage, metadata as CFDictionary?)
let success: Bool = CGImageDestinationFinalize(destination) // <- crashes
```

Code that does not crash:

```swift
guard let cgImage = UIImage(data: data)?.cgImage else { return }
let metadata = ciImage.properties
let destination: CGImageDestination = CGImageDestinationCreateWithURL(url as CFURL, uti as CFString, 1, nil)!
CGImageDestinationAddImage(destination, cgImage, nil)
let success: Bool = CGImageDestinationFinalize(destination) // <- does not crash
```

metadata:

```
{
    ColorModel = RGB;
    DPIHeight = 72;
    DPIWidth = 72;
    Depth = 8;
    PixelHeight = 2160;
    PixelWidth = 2880;
    ProfileName = "sRGB IEC61966-2.1";
    "{Exif}" = {
        ApertureValue = "1.356143809255609";
        BrightnessValue = "0.1278596944592232";
        ColorSpace = 1;
        ComponentsConfiguration = ( 1, 2, 3, 0 );
        CompositeImage = 2;
        DateTimeDigitized = "2021:12:28 08:38:28";
        DateTimeOriginal = "2021:12:28 08:38:28";
        DigitalZoomRatio = "1.300085984522786";
        ExifVersion = ( 2, 2, 1 );
        ExposureBiasValue = "0.09803208290449658";
        ExposureMode = 0;
        ExposureProgram = 2;
        ExposureTime = "0.025";
        FNumber = "1.6";
        Flash = 16;
        FlashPixVersion = ( 1, 0 );
        FocalLenIn35mmFilm = 33;
        FocalLength = "4.2";
        ISOSpeedRatings = ( 400 );
        LensMake = Apple;
        LensModel = "iPhone 12 back camera 4.2mm f/1.6";
        LensSpecification = ( "4.2", "4.2", "1.6", "1.6" );
        MeteringMode = 5;
        OffsetTime = "+09:00";
        OffsetTimeDigitized = "+09:00";
        OffsetTimeOriginal = "+09:00";
        PixelXDimension = 2880;
        PixelYDimension = 2160;
        SceneCaptureType = 0;
        SceneType = 1;
        SensingMethod = 2;
        ShutterSpeedValue = "5.321697281908764";
        SubjectArea = ( 2011, 1509, 2216, 1329 );
        SubsecTimeDigitized = 686;
        SubsecTimeOriginal = 686;
        WhiteBalance = 0;
    };
    "{IPTC}" = { DateCreated = 20211228; DigitalCreationDate = 20211228; DigitalCreationTime = 083828; TimeCreated = 083828; };
    "{JFIF}" = { DensityUnit = 0; JFIFVersion = ( 1, 0, 1 ); XDensity = 72; YDensity = 72; };
    "{TIFF}" = { DateTime = "2021:12:28 08:38:28"; HostComputer = "iPhone 12"; Make = Apple; Model = "iPhone 12"; Orientation = 0; ResolutionUnit = 2; Software = "Snowcorp SODA 5.4.8 / 15.2"; XResolution = 72; YResolution = 72; };
}
```

What are the reasons? If I use CGImageDestinationAddImageFromSource instead of CGImageDestinationAddImage, there is no crash even if I add metadata. If I use PHImageManager.default().requestImage instead of PHImageManager.default().requestImageDataAndOrientation and extract the cgImage, there is no crash even if I add metadata.
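As the poster observed, routing through CGImageDestinationAddImageFromSource avoids the crash while preserving metadata. A minimal sketch of that workaround, assuming the same `data`, `url`, and `uti` values as in the question (the function name is illustrative):

```swift
import ImageIO
import Foundation

// Sketch: write the original image data out with its metadata intact by
// going through a CGImageSource instead of a decoded CGImage, which the
// poster found does not crash.
func writeImagePreservingMetadata(data: Data, to url: URL, uti: String) -> Bool {
    guard let source = CGImageSourceCreateWithData(data as CFData, nil),
          let destination = CGImageDestinationCreateWithURL(url as CFURL,
                                                            uti as CFString, 1, nil) else {
        return false
    }
    // Copy the first image together with the properties stored in the
    // source; passing nil options keeps the source's metadata as-is.
    CGImageDestinationAddImageFromSource(destination, source, 0, nil)
    return CGImageDestinationFinalize(destination)
}
```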
Posted by mj.lee123. Last updated.
Post not yet marked as solved
0 Replies
294 Views
I've tried to load an iCloud image using PHAsset with the options:

```swift
requestOptions.isSynchronous = false
requestOptions.isNetworkAccessAllowed = true
```

I get a CloudPhotoLibraryErrorDomain Code=1005 error. I don't understand where I've made a mistake. I have used the SDWebImagePhotosPlugin methods as well as Photos methods like requestImageDataAndOrientation and requestImageData; I still get the image as nil along with the above error. This is my code:

```swift
imageManager.requestImageDataAndOrientation(for: deviceImage, options: phImageRequestOptions()) { data, deliveryMode, orentation, _ in
    if data != nil {
        completion(data)
    } else {
        SDImageLoadersManager.shared.loaders = [SDWebImageDownloader.shared, SDImagePhotosLoader.shared]
        SDWebImageManager.defaultImageLoader = SDImageLoadersManager.shared
        let photosURL = NSURL.sd_URL(with: deviceImage)
        SDImagePhotosLoader.shared.requestImage(with: photosURL as URL?, options: [.highPriority, .retryFailed, .refreshCached], context: [.customManager: self.manager], progress: nil) { image, data, error, success in
            if image != nil {
                completion(image?.pngData())
            } else {
                completion(nil)
            }
        }
    }
```
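When an iCloud download fails, the info dictionary of the Photos request usually carries the underlying error under PHImageErrorKey, which may say more than the bare 1005 code. A rough sketch of a request (the function name is illustrative) that enables network access, reports download progress, and surfaces that error:

```swift
import Photos

// Sketch: request iCloud-backed image data with network access enabled and
// surface the underlying error from the info dictionary when data is nil.
func loadImageData(for asset: PHAsset,
                   completion: @escaping (Data?, Error?) -> Void) {
    let options = PHImageRequestOptions()
    options.isNetworkAccessAllowed = true   // required for iCloud originals
    options.deliveryMode = .highQualityFormat
    options.progressHandler = { progress, error, _, _ in
        // Called on a background queue while the iCloud download proceeds.
        print("download progress: \(progress), error: \(String(describing: error))")
    }
    PHImageManager.default().requestImageDataAndOrientation(for: asset,
                                                            options: options) { data, _, _, info in
        let error = info?[PHImageErrorKey] as? Error
        completion(data, error)
    }
}
```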
Posted. Last updated.
Post not yet marked as solved
0 Replies
201 Views
The latest albums/videos in iCloud are not available from code until I open the Photos app. I figured out that when I open the PHCollection picker and close it (even within a fraction of a second), it triggers the iCloud sync of the local Photos albums and I am able to get the latest content. Is there a way to trigger the iCloud sync of Photos albums programmatically, or at least a workaround like opening/closing the PHCollection picker in the background?
Posted by aspiricx. Last updated.
Post not yet marked as solved
0 Replies
306 Views
I have been unable to capture Live Photos using UIImagePickerController. I can capture still photos and even video (which is not my scenario, but I checked just to make sure), but the camera does not capture Live Photos. The documentation suggests it should (source):

To obtain the motion and sound content of a live photo for display (using the PHLivePhotoView class), include the kUTTypeImage and kUTTypeLivePhoto identifiers in the allowed media types when configuring an image picker controller. When the user picks or captures a Live Photo, the editingInfo dictionary contains the livePhoto key, with a PHLivePhoto representation of the photo as the corresponding value.

I've set up my controller:

```swift
let camera = UIImagePickerController()
camera.sourceType = .camera
camera.mediaTypes = [UTType.image.identifier, UTType.livePhoto.identifier]
camera.delegate = context.coordinator
```

In the delegate I check for the Live Photo:

```swift
func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey : Any]) {
    if let live = info[.livePhoto] as? PHLivePhoto {
        // handle live photo
    } else if let takenImage = info[.originalImage] as? UIImage, let metadata = info[.mediaMetadata] as? [AnyHashable: Any] {
        // handle still photo
    }
}
```

But I never get the Live Photo. I've tried adding NSMicrophoneUsageDescription to the Info.plist, thinking it needs permission for the microphone, but that did not help. Of course, I've added NSCameraUsageDescription to grant camera permission. Has anyone successfully captured Live Photos using UIImagePickerController?
Posted. Last updated.
Post not yet marked as solved
0 Replies
206 Views
In a Photos extension, the function finishProject(completionHandler completion: () -> Void) of PHProjectExtensionController is not called when Photos quits. Is there anything I need to set up for Photos to call it? When the user leaves the extension (and keeps Photos alive), the function is called.
Posted by danielkbx. Last updated.
Post not yet marked as solved
0 Replies
252 Views
Does PhotosPicker change the hash value of a photo after permission has been given to third-party apps?
Posted. Last updated.
Post not yet marked as solved
0 Replies
190 Views
The sample code in the Apple documentation found in PHCloudIdentifier does not compile in Xcode 13.2.1. Can the interface for identifier conversion be clarified so that the answer values are more accessible/readable? The values are 'hidden' inside a Result enum. It was difficult (for me) to rewrite the sample code because I made the mistake of interpreting the Result type as a tuple; Result is really an enum. Using the Result type as the return from library.cloudIdentifierMappings(forLocalIdentifiers:) and .localIdentifierMappings(for:) puts the actual mapped identifiers inside the enum, where they need additional access via a .stringValue message or an evaluation of an element of the result enum. For others hitting the same compile issue, here is a working version of the sample code, which compiles in Xcode 13.2.1:

```swift
func localId2CloudId(localIdentifiers: [String]) -> [String] {
    var mappedIdentifiers = [String]()
    let library = PHPhotoLibrary.shared()
    let iCloudIDs = library.cloudIdentifierMappings(forLocalIdentifiers: localIdentifiers)
    for aCloudID in iCloudIDs {
        let cloudResult: Result = aCloudID.value
        // Result is an enum, not a tuple
        switch cloudResult {
        case .success(let success):
            let newValue = success.stringValue
            mappedIdentifiers.append(newValue)
        case .failure(let failure):
            // notify the user of the error
            break
        }
    }
    return mappedIdentifiers
}
```

```swift
func cloudId2LocalId(assetCloudIdentifiers: [PHCloudIdentifier]) -> [String] {
    // patterned error handling per documentation
    var localIDs = [String]()
    let localIdentifiers: [PHCloudIdentifier: Result<String, Error>] = PHPhotoLibrary.shared()
        .localIdentifierMappings(for: assetCloudIdentifiers)
    for cloudIdentifier in assetCloudIdentifiers {
        guard let identifierMapping = localIdentifiers[cloudIdentifier] else {
            print("Failed to find a mapping for \(cloudIdentifier).")
            continue
        }
        switch identifierMapping {
        case .success(let success):
            localIDs.append(success)
        case .failure(let failure):
            let thisError = failure as? PHPhotosError
            switch thisError?.code {
            case .identifierNotFound:
                // Skip the missing or deleted assets.
                print("Failed to find the local identifier for \(cloudIdentifier). \(String(describing: thisError?.localizedDescription))")
            case .multipleIdentifiersFound:
                // Prompt the user to resolve the cloud identifier that matched multiple assets.
                print("Found multiple local identifiers for \(cloudIdentifier). \(String(describing: thisError?.localizedDescription))")
//                if let selectedLocalIdentifier = promptUserForPotentialReplacement(with: thisError.userInfo[PHLocalIdentifiersErrorKey]) {
//                    localIDs.append(selectedLocalIdentifier)
//                }
            default:
                print("Encountered an unexpected error looking up the local identifier for \(cloudIdentifier). \(String(describing: thisError?.localizedDescription))")
            }
        }
    }
    return localIDs
}
```
Posted. Last updated.
Post not yet marked as solved
1 Reply
470 Views
Related to this question: loading slow-mo videos with PHPicker via NSItemProvider.loadFileRepresentation seems to ignore PHPickerConfiguration.preferredAssetRepresentationMode and always re-encodes (presumably to bake the slow-mo time segments into the video file). This re-encode appears to run at roughly realtime, so it can take a minute for a one-minute video. Normal videos are near-instant. Is there a faster way to import slow-mo videos than this API?

One nuance: if I use loadInPlaceFileRepresentation I get no error and inPlace is set to true, but the file is 0 bytes/nonexistent. This seems like a perfect fit for the case mentioned in the docs where it would set inPlace to false and take time to make a local copy, so this looks like a bug. Interestingly, on a slow-mo for which I have previously used loadFileRepresentation, loadInPlaceFileRepresentation then works instantly, but I think it's just using a cached version.

```swift
var configuration = PHPickerConfiguration(photoLibrary: PHPhotoLibrary.shared())
configuration.filter = .videos
configuration.selectionLimit = 1
configuration.preferredAssetRepresentationMode = .current

let photoPicker = PHPickerViewController(configuration: configuration)
photoPicker.delegate = self
present(photoPicker, animated: true, completion: nil)

// ...

func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
    guard let provider = results.first?.itemProvider, provider.hasItemConformingToTypeIdentifier(UTType.movie.identifier) else {
        print("Failed to get photo asset")
        picker.dismiss(animated: true, completion: nil)
        return
    }

    provider.loadFileRepresentation(forTypeIdentifier: UTType.movie.identifier) { url, error in
        DispatchQueue.main.async {
            picker.dismiss(animated: true, completion: nil)
        }

        guard error == nil, let url = url, let copyTo = DocumentsHelper.getCachedVideoDirectory()?.appendingPathComponent(url.lastPathComponent) else {
            print("Failed to load video")
            return
        }
        do {
            if FileManager.default.fileExists(atPath: copyTo.path) {
                try FileManager.default.removeItem(at: copyTo)
            }
            try FileManager.default.copyItem(at: url, to: copyTo)
        } catch let error {
            print("error")
            return
        }
        // load video into a player
    }
}
```
Posted by ryan204. Last updated.
Post not yet marked as solved
4 Replies
565 Views
Hello: When I use iOS 15 to save a camera video to an album, I get PHPhotosErrorInvalidResource. The video stored in the sandbox plays normally, and on versions lower than iOS 15 it can be saved. Only this one camera video fails to save; the others succeed. Confused and hoping for help. Here is the code:

```objc
[[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
    PHAssetChangeRequest *photoRequest = [PHAssetChangeRequest creationRequestForAssetFromVideoAtFileURL:url];
    PHAssetCollectionChangeRequest *albumChangeRequest = [PHAssetCollectionChangeRequest changeRequestForAssetCollection:collection];
    PHObjectPlaceholder *assetPlaceholder = [photoRequest placeholderForCreatedAsset];
    [albumChangeRequest addAssets:@[ assetPlaceholder ]];
} completionHandler:^(BOOL success, NSError * _Nullable error) {
    [self performInMainThreadBlock:^{
        if (success) {
            successBlock();
        } else {
            failureBlock(error);
        }
    }];
}]
```
Posted by Roffa. Last updated.
Post not yet marked as solved
4 Replies
489 Views
On Xcode 12 beta 6, in the Simulator, I cannot present a PHPickerViewController. I get the error message:

```
[Picker] Picker failed with error: Error Domain=PXErrorDomain Code=-1 "PHPickerViewController did timeout." UserInfo={NSDebugDescription=PHPickerViewController did timeout.}
```

And the view that is presented says, "Picker Unavailable". This is occurring in an existing app. If I start a new project and copy the presentation code, it works as expected.
Posted. Last updated.
Post not yet marked as solved
0 Replies
411 Views
I'm trying to implement a grid view with photos stored in the Photos app so the user can choose one picture as his profile picture. Before SwiftUI I used a collection view and PhotoKit to fetch the images and display them in a grid. Now that I've switched to SwiftUI I tried to use LazyVGrid. I am able to fetch all the photos of the user and display them in a grid; however, it uses a lot of memory. I had a memory leak before, but now Instruments isn't showing any leak anymore. I thought it might be that the grid isn't really unloading the displayed images once they become invisible to the user. However, if you scroll up and down multiple times it just uses a lot more memory than before, as if the grid were always creating new views and the old ones never got deleted. Am I using something wrong, or misunderstanding the principles of LazyVGrid? My current code
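One pattern that tends to keep memory bounded in this situation, regardless of how LazyVGrid recycles views, is to hand each cell a thumbnail sized to the cell rather than the full-resolution image. A rough sketch under that assumption (the type name and sizing are illustrative, not from the post):

```swift
import Photos
import UIKit

// Sketch: request a small, cell-sized thumbnail for each PHAsset instead of
// the full-resolution image, so the grid never holds full images in memory.
final class ThumbnailLoader {
    private let manager = PHCachingImageManager()

    func thumbnail(for asset: PHAsset,
                   side: CGFloat,
                   completion: @escaping (UIImage?) -> Void) {
        let options = PHImageRequestOptions()
        options.deliveryMode = .opportunistic   // fast degraded image first
        options.resizeMode = .fast
        // Ask for roughly the pixel size of the cell, not the full asset.
        let scale = UIScreen.main.scale
        let target = CGSize(width: side * scale, height: side * scale)
        manager.requestImage(for: asset,
                             targetSize: target,
                             contentMode: .aspectFill,
                             options: options) { image, _ in
            completion(image)
        }
    }
}
```

Each grid cell would then request its image in onAppear and drop it in onDisappear, so memory tracks only the visible cells.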
Posted by fbertuzzi. Last updated.
Post not yet marked as solved
1 Reply
472 Views
Apple Photos has support for "Optimize iPhone Storage" within iCloud Photos that removes Videos and Photos off the device when device storage is constrained. Is there a way to test this on simulators? Specifically, I'd like to know if it would be possible to make a Video always require a download so I can ensure my app handles downloading iCloud videos gracefully. I've tried signing in to iCloud on the simulator but it never seems to load my photos and videos. Thanks in advance!
Posted by externvar. Last updated.
Post not yet marked as solved
0 Replies
344 Views
I have enabled runtime concurrency warnings to check for future problems concerning concurrency, via Build Settings / Other Swift Flags:

```
-Xfrontend -warn-concurrency -Xfrontend -enable-actor-data-race-checks
```

When trying to call the async form of PHPhotoLibrary.shared().performChanges {} I get the following runtime warning on the line containing performChanges: warning: data race detected: @MainActor function at ... was not called on the main thread. My sample code inside a default Xcode multiplatform app template is as follows:

```swift
import SwiftUI
import Photos

@MainActor
class FotoChanger {
    func addFotos() async throws {
        await PHPhotoLibrary.requestAuthorization(for: .addOnly)
        try! await PHPhotoLibrary.shared().performChanges {
            let data = NSDataAsset(name: "Swift")!.data
            let creationRequest = PHAssetCreationRequest.forAsset()
            creationRequest.addResource(with: .photo, data: data, options: PHAssetResourceCreationOptions())
        }
    }
}

struct ContentView: View {
    var body: some View {
        ProgressView()
            .task {
                try! await FotoChanger().addFotos()
            }
    }
}
```

You would need a Swift data asset inside the asset catalog to run the above code, but the error can be recreated even if the data is invalid. What am I doing wrong? I have not found a way to run performChanges, the block, or whatever causes the error on the main thread. PS: This is only test code to show the problem; don't mind the forced unwraps.
Posted by Dirk-FU. Last updated.
Post marked as solved
2 Replies
538 Views
Hi, I currently let the user pick a photo using PHPickerViewController. The advantage of PHPickerViewController is that the user is not asked to grant photo access (no permission alerts). After the user picks the photo I need to record both the local identifier and the cloud identifier (PHCloudIdentifier). For the local identifier there's no problem: I just use the assetIdentifier from the PHPickerResult. For obtaining the cloud identifier (PHCloudIdentifier, on iOS 15 only) I need to use the cloudIdentifierMappings method of PHPhotoLibrary. The problem is that this method causes the photo library access permission alert to be displayed. Does anyone know another way to get the cloud identifier from a local identifier without having to prompt the user for photo library access? Thank you
Posted by DaleOne. Last updated.
Post marked as solved
1 Reply
422 Views
Inside func picker(_ picker: didFinishPicking results:) I am trying to obtain a UIImage from a Live Photo. I have parsed the results and filtered regular still photos from live photos. For the still photos I already have the UIImage, but for the live photos I do not understand how to get from a PHLivePhoto to the PHAsset.

```swift
if let livePhoto = object as? PHLivePhoto {
    DispatchQueue.main.async {
        // what code do I insert here to get from PHLivePhoto to a UIImage?
        // I need to extract a UIImage from a PHLivePhoto
    }
}
```

Thanks for the help in advance!
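One route that may sidestep PHLivePhoto entirely: the same PHPickerResult item provider that vends the PHLivePhoto can typically also vend a plain UIImage (the still key photo). A hedged sketch of that approach (the function name is illustrative):

```swift
import PhotosUI
import UIKit

// Sketch: ask the picker result's item provider for a plain UIImage
// instead of going PHLivePhoto -> PHAsset -> image.
func loadStill(from result: PHPickerResult,
               completion: @escaping (UIImage?) -> Void) {
    let provider = result.itemProvider
    guard provider.canLoadObject(ofClass: UIImage.self) else {
        completion(nil)
        return
    }
    provider.loadObject(ofClass: UIImage.self) { object, _ in
        // Deliver on the main queue for UI use.
        DispatchQueue.main.async {
            completion(object as? UIImage)
        }
    }
}
```

Alternatively, if the picker was created with PHPickerConfiguration(photoLibrary:), the result's assetIdentifier can be turned into a PHAsset via PHAsset.fetchAssets(withLocalIdentifiers:options:), though that path requires photo library access.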
Posted by ninumedia. Last updated.