PhotoKit

Work with image and video assets managed by the Photos app, including those from iCloud Photos and Live Photos, using PhotoKit.

PhotoKit Documentation

Posts under PhotoKit tag

105 Posts
Post not yet marked as solved
1 Reply
62 Views
Hey, for a very long time I just can't jump to a point in a video in my gallery, whether it's one of my own or one I downloaded from social media to my gallery. Hope there is a bug bounty, lol. Thanks.
Posted by Ygroos. Last updated.
Post not yet marked as solved
6 Replies
3.5k Views
In Xcode 12 beta 4 I could load both a JPEG and an HEIC image using itemProvider.loadObject(ofClass: UIImage.self). But in beta 5, when using it with an HEIC file, I get "Error Domain=NSItemProviderErrorDomain Code=-1000 "Cannot load representation of type public.jpeg"". Is this by design? Also, itemProvider.canLoadObject(ofClass: UIImage.self) still returns true for both JPEG and HEIC files.
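One possible workaround (a sketch, not a confirmed fix for the beta 5 behaviour, and the loadImage(from:completion:) helper name is hypothetical): when loadObject(ofClass:) fails for an HEIC item, fall back to loading the raw data for a registered image type and decoding it with UIImage(data:).

import UIKit
import UniformTypeIdentifiers

func loadImage(from itemProvider: NSItemProvider, completion: @escaping (UIImage?) -> Void) {
    guard itemProvider.canLoadObject(ofClass: UIImage.self) else {
        completion(nil)
        return
    }
    itemProvider.loadObject(ofClass: UIImage.self) { object, _ in
        if let image = object as? UIImage {
            completion(image)
            return
        }
        // Fallback for items that fail with NSItemProviderErrorDomain -1000:
        // pick the first registered type that conforms to public.image and decode its data.
        let imageTypeID = itemProvider.registeredTypeIdentifiers.first { id in
            UTType(id)?.conforms(to: .image) == true
        }
        guard let typeID = imageTypeID else {
            completion(nil)
            return
        }
        itemProvider.loadDataRepresentation(forTypeIdentifier: typeID) { data, _ in
            // Note: this completion may run on a background queue.
            completion(data.flatMap { UIImage(data: $0) })
        }
    }
}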
Posted. Last updated.
Post not yet marked as solved
0 Replies
34 Views
PhotosPicker in iOS 16 can only be declared with a separate trigger view. If this view is part of a regular view, it works. But if it is declared in a Menu as one of the options, it does not appear after triggering (Xcode 14 beta 5).

So, this works:

struct ContentView: View {
    @State private var selectedPhotoItem: PhotosPickerItem?

    var body: some View {
        PhotosPicker(
            selection: $selectedPhotoItem,
            matching: .images,
            photoLibrary: .shared()
        ) {
            Image(systemName: "photo.on.rectangle.angled")
        }
    }
}

And this does not:

struct ContentView: View {
    @State private var selectedPhotoItem: PhotosPickerItem?

    var body: some View {
        Menu {
            PhotosPicker(
                selection: $selectedPhotoItem,
                matching: .images,
                photoLibrary: .shared()
            ) {
                Label("Photo Library", systemImage: "photo.on.rectangle.angled")
            }
            Button("Another action") { }
        } label: {
            Text("Menu")
        }
    }
}

The same behaviour occurs for a toolbar in NavigationStack: if PhotosPicker is declared directly in a ToolbarItem, it is fine, but if it is inside a Menu of a ToolbarItem, it does not appear after triggering. My guess is that PhotosPicker cannot appear for the same reason a .sheet or .alert attached to a Button inside a Menu cannot: the Menu disappears after an option is picked. This also won't work:

struct ContentView: View {
    @State private var isPresentedSheet = false

    var body: some View {
        Menu {
            Button("Some action") { isPresentedSheet = true }
                .sheet(isPresented: $isPresentedSheet) {
                    SomeView()
                }
            Button("Another action") { }
        } label: {
            Text("Menu")
        }
    }
}

struct SomeView: View {
    var body: some View {
        Text("Hello")
    }
}

But in that case we can simply move the .sheet modifier up in the view hierarchy and it works:

struct ContentView: View {
    @State private var isPresentedSheet = false

    var body: some View {
        Menu {
            Button("Some action") { isPresentedSheet = true }
            Button("Another action") { }
        } label: {
            Text("Menu")
        }
        .sheet(isPresented: $isPresentedSheet) {
            SomeView()
        }
    }
}

struct SomeView: View {
    var body: some View {
        Text("Hello")
    }
}

With PhotosPicker, however, there is no way to trigger it from anywhere else in the body outside the Menu: it has no isPresented parameter or anything similar. So the only way to show PhotosPicker right now is the explicit view you tap on screen, not from a Menu. I have a design where PhotosPicker should be triggered from a menu item, and I have no idea how to show it. Is this a bug, or a fundamental limitation of the SwiftUI presentation system?
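One possible way around this, assuming you can require a newer SDK: later SwiftUI releases (iOS 17 and later, if the availability is remembered correctly) added a .photosPicker(isPresented:selection:matching:) view modifier that attaches the picker presentation to a view the way .sheet does, so the menu item only needs to flip a Boolean. A minimal sketch:

import SwiftUI
import PhotosUI

struct ContentView: View {
    @State private var selectedPhotoItem: PhotosPickerItem?
    @State private var isPickerPresented = false

    var body: some View {
        Menu {
            // The menu item only flips state; the picker itself is attached outside the Menu.
            Button("Photo Library") { isPickerPresented = true }
            Button("Another action") { }
        } label: {
            Text("Menu")
        }
        .photosPicker(isPresented: $isPickerPresented,
                      selection: $selectedPhotoItem,
                      matching: .images)
    }
}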
Posted. Last updated.
Post not yet marked as solved
2 Replies
143 Views
Pick 2 photos with PhotosPicker, deselect the 2 photos in PhotosPicker, and tap Done. The selection ([PhotosPickerItem]) doesn't change.

PhotosPicker(selection: $photoPickerItems,
             maxSelectionCount: 0,
             selectionBehavior: .ordered,
             matching: nil,
             preferredItemEncoding: .current,
             photoLibrary: .shared()) {
    Image(systemName: "photo")
}
Posted by zunda. Last updated.
Post marked as solved
3 Replies
365 Views
I use loadFileRepresentation(forTypeIdentifier:completionHandler:) to load a video with PHPickerViewController. What can I use to load a video with PhotosPickerItem?

// my sample code
func loadPhoto(pickerItem: PhotosPickerItem) async throws -> Photo {
    if let livePhoto = try await pickerItem.loadTransferable(type: PHLivePhoto.self) {
        let photo: Photo = .init(id: pickerItem.itemIdentifier, item: livePhoto)
        return photo
    } else if let url = try await pickerItem.loadTransferable(type: URL.self) {
        let photo: Photo = .init(id: pickerItem.itemIdentifier, item: url)
        return photo
    } else if let data = try await pickerItem.loadTransferable(type: Data.self) {
        let photo: Photo = .init(id: pickerItem.itemIdentifier, item: data)
        return photo
    }
    throw PhotoError.load
}
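One common pattern for loading a movie through loadTransferable (a sketch; the Movie type and the destination file name are hypothetical, not part of PhotosUI): a Transferable backed by a FileRepresentation that copies the received temporary file somewhere the app owns before the picker's temporary file goes away.

import CoreTransferable
import UniformTypeIdentifiers
import Foundation

struct Movie: Transferable {
    let url: URL

    static var transferRepresentation: some TransferRepresentation {
        FileRepresentation(contentType: .movie) { movie in
            SentTransferredFile(movie.url)
        } importing: { received in
            // Copy the picker's temporary file into the app's documents directory so the URL stays valid.
            let destination = URL.documentsDirectory.appending(path: "picked-\(UUID().uuidString).mov")
            try FileManager.default.copyItem(at: received.file, to: destination)
            return Movie(url: destination)
        }
    }
}

// Usage inside the existing loadPhoto(pickerItem:) chain could look like:
// } else if let movie = try await pickerItem.loadTransferable(type: Movie.self) {
//     return .init(id: pickerItem.itemIdentifier, item: movie.url)
// }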
Posted by zunda. Last updated.
Post not yet marked as solved
1 Reply
82 Views
Is it OK to call requestContentEditingInput for a lot of PHAssets to get URLs for their full-size images? It seems odd because I would not be using the content editing input to actually modify these images. Is that OK, or are there implications to be aware of? Use case: I want to allow the user to share multiple PHAssets via UIActivityViewController. I can download and share an array of UIImage, which works, but I found that if you tap Copy the app freezes for about 1 second per photo (10 seconds if you shared 10 photos). Profiling the app, it looks like iOS is spending the time creating a PNG for each image. It's also probably not a good idea to store huge images in memory like that. I figured I'd try sharing an array of URLs to the images. Seemingly the only way to get a URL for a photo is to request a content editing input for the asset and access its fullSizeImageURL property. Is this a good idea, and is this the right approach to share PHAssets?
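An alternative worth considering (a sketch, not an authoritative recommendation; temporaryFileURL(for:) is a hypothetical helper name): skip the content-editing input entirely and write each asset's original resource to a temporary file with PHAssetResourceManager, then hand those URLs to UIActivityViewController.

import Photos

func temporaryFileURL(for asset: PHAsset, completion: @escaping (URL?) -> Void) {
    // Pick the primary photo or video resource for the asset.
    guard let resource = PHAssetResource.assetResources(for: asset)
        .first(where: { $0.type == .photo || $0.type == .video }) else {
        completion(nil)
        return
    }
    let destination = FileManager.default.temporaryDirectory
        .appendingPathComponent(resource.originalFilename)
    try? FileManager.default.removeItem(at: destination)   // writeData fails if the file already exists

    let options = PHAssetResourceRequestOptions()
    options.isNetworkAccessAllowed = true   // allow download from iCloud Photos if needed

    PHAssetResourceManager.default().writeData(for: resource, toFile: destination, options: options) { error in
        completion(error == nil ? destination : nil)
    }
}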
Posted by Jordan. Last updated.
Post not yet marked as solved
5 Replies
1.2k Views
I use the following code to parse photo metadata and it works well. However, I am unable to pull the new iOS 14 "caption" from this metadata (it worked in early iOS 14 betas, but stopped working in the GM). Does anyone know how I can get the caption data from a PHAsset? Thanks! Stephen

let options = PHContentEditingInputRequestOptions()
options.isNetworkAccessAllowed = true
asset.requestContentEditingInput(with: options, completionHandler: { (contentEditingInput, _) -> Void in
    if let url = contentEditingInput?.fullSizeImageURL {
        let fullImage = CIImage(contentsOf: url)

        // get all the metadata
        self.allPhotoMetadata = fullImage?.properties ?? [:]

        // {TIFF}
        if let tiffDict = self.allPhotoMetadata["{TIFF}"] as? [String: Any] {
            if tiffDict["Make"] != nil {
                self.cameraData[cameraKeys.make] = tiffDict["Make"]
            }
            if tiffDict["Model"] != nil {
                self.cameraData[cameraKeys.model] = tiffDict["Model"]
            }
            if tiffDict["ImageDescription"] != nil {
                self.imageData[imageKeys.caption] = tiffDict["ImageDescription"]
            }
        }

        // {IPTC}
        if let iptcDict = self.allPhotoMetadata["{IPTC}"] as? [String: Any] {
            // if we didn't find a caption in the TIFF dict, try to get it from IPTC data
            // first try Caption/Abstract, then ArtworkContentDescription
            if self.imageData[imageKeys.caption] == nil {
                if iptcDict["Caption/Abstract"] != nil {
                    self.imageData[imageKeys.caption] = iptcDict["Caption/Abstract"]
                } else if iptcDict["ArtworkContentDescription"] != nil {
                    self.imageData[imageKeys.caption] = iptcDict["ArtworkContentDescription"]
                }
            }
        }
    }
})
Posted by sorth. Last updated.
Post not yet marked as solved
0 Replies
69 Views
I want to display a PhotosPicker at "half size" with the .presentationDetents([.medium, .large]) approach, but I cannot make it work. Here is a code example. If you run it, you will see that the bottom button opens a sheet at half screen height. The top button opens the PhotosPicker, but not at "half size". Any help or info is greatly appreciated.

import SwiftUI
import PhotosUI

struct ContentView: View {
    @State private var showingGreeting = false
    @State private var selectedItem: PhotosPickerItem? = nil
    @State private var selectedImageData: Data? = nil

    var body: some View {
        VStack {
            PhotosPicker(
                selection: $selectedItem,
                matching: .images,
                photoLibrary: .shared()) {
                    Text("Open PhotosPicker")
                        .presentationDetents([.medium, .large])
                }
                .onChange(of: selectedItem) { newItem in
                    Task {
                        // Retrieve selected asset in the form of Data
                        if let data = try? await newItem?.loadTransferable(type: Data.self) {
                            selectedImageData = data
                        }
                    }
                }

            if let selectedImageData,
               let uiImage = UIImage(data: selectedImageData) {
                Image(uiImage: uiImage)
                    .resizable()
                    .scaledToFit()
                    .frame(width: 250, height: 250)
            }

            Divider()

            Button("Open sheet in half screen") {
                showingGreeting.toggle()
            }
            .sheet(isPresented: $showingGreeting) {
                Text("Hello there!")
                    .presentationDetents([.medium, .large])
            }
        }
    }
}

struct ContentView_Previews: PreviewProvider {
    static var previews: some View {
        ContentView()
    }
}
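One way to get an actual medium-detent picker (a sketch, assuming that dropping down to PHPickerViewController is acceptable; HalfSheetPhotoPicker is a hypothetical name) is to wrap PHPickerViewController in a UIViewControllerRepresentable and present it from a regular .sheet, where .presentationDetents applies to the sheet content.

import SwiftUI
import PhotosUI

struct HalfSheetPhotoPicker: UIViewControllerRepresentable {
    var onPick: ([PHPickerResult]) -> Void

    func makeUIViewController(context: Context) -> PHPickerViewController {
        var config = PHPickerConfiguration(photoLibrary: .shared())
        config.filter = .images
        let picker = PHPickerViewController(configuration: config)
        picker.delegate = context.coordinator
        return picker
    }

    func updateUIViewController(_ uiViewController: PHPickerViewController, context: Context) {}

    func makeCoordinator() -> Coordinator { Coordinator(onPick: onPick) }

    final class Coordinator: NSObject, PHPickerViewControllerDelegate {
        let onPick: ([PHPickerResult]) -> Void
        init(onPick: @escaping ([PHPickerResult]) -> Void) { self.onPick = onPick }

        func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
            onPick(results)   // the presenting view is expected to dismiss the sheet itself
        }
    }
}

// Usage: present from a state-driven sheet so the detents apply to the sheet that hosts the picker.
// .sheet(isPresented: $showPicker) {
//     HalfSheetPhotoPicker { results in showPicker = false /* handle results here */ }
//         .presentationDetents([.medium, .large])
//         .ignoresSafeArea()
// }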
Posted by Ezrap. Last updated.
Post not yet marked as solved
0 Replies
84 Views
I have received a lot of crash logs, only on iOS 16. The crash occurred when I called:

[[PHImageManager defaultManager] requestImageDataForAsset:asset options:options resultHandler:resultHandler]

Here is the crash log:

Exception Type: NSInternalInconsistencyException
ExtraInfo:
Code Type: arm64
OS Version: iPhone OS 16.0 (20A5328h)
Hardware Model: iPhone14,3
Launch Time: 2022-07-30 18:43:25
Date/Time: 2022-07-30 18:49:17

*** Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: Unhandled error (NSCocoaErrorDomain, 134093) occurred during faulting and was thrown: Error Domain=NSCocoaErrorDomain Code=134093 "(null)"

Last Exception Backtrace:
0   CoreFoundation            0x00000001cf985dc4 0x1cf97c000 + 40388
1   libobjc.A.dylib           0x00000001c8ddfa68 0x1c8dc8000 + 96872
2   CoreData                  0x00000001d56d2358 0x1d56cc000 + 25432
3   CoreData                  0x00000001d56fa19c 0x1d56cc000 + 188828
4   CoreData                  0x00000001d5755be4 0x1d56cc000 + 564196
5   CoreData                  0x00000001d57b0508 0x1d56cc000 + 935176
6   PhotoLibraryServices      0x00000001df1783e0 0x1df0ed000 + 570336
7   Photos                    0x00000001df8aa88c 0x1df85d000 + 317580
8   PhotoLibraryServices      0x00000001df291de0 0x1df0ed000 + 1723872
9   CoreData                  0x00000001d574e518 0x1d56cc000 + 533784
10  libdispatch.dylib         0x00000001d51fc0fc 0x1d51f8000 + 16636
11  libdispatch.dylib         0x00000001d520b634 0x1d51f8000 + 79412
12  CoreData                  0x00000001d574e0a0 0x1d56cc000 + 532640
13  PhotoLibraryServices      0x00000001df291d94 0x1df0ed000 + 1723796
14  PhotoLibraryServices      0x00000001df291434 0x1df0ed000 + 1721396
15  Photos                    0x00000001df8a8380 0x1df85d000 + 308096
16  Photos                    0x00000001df89d050 0x1df85d000 + 262224
17  Photos                    0x00000001df87f62c 0x1df85d000 + 140844
18  Photos                    0x00000001df87ee94 0x1df85d000 + 138900
19  Photos                    0x00000001df87e594 0x1df85d000 + 136596
20  Photos                    0x00000001df86b5c8 0x1df85d000 + 58824
21  Photos                    0x00000001df86d938 0x1df85d000 + 67896
22  Photos                    0x00000001dfa37a64 0x1df85d000 + 1944164
23  Photos                    0x00000001dfa37d18 0x1df85d000 + 1944856
24  youavideo                 -[YouaImageManager requestImageDataForAsset:options:resultHandler:] (in youavideo) (YouaImageManager.m:0) 27
25  youavideo                 -[YouaAlbumTransDataController requstTransImageHandler:] (in youavideo) (YouaAlbumTransDataController.m:0) 27
26  youavideo                 -[YouaAlbumTransDataController requstTransWithHandler:] (in youavideo) (YouaAlbumTransDataController.m:77) 11
27  youavideo                 -[YouaUploadTransDataOperation startTrans] (in youavideo) (YouaUploadTransDataOperation.m:102) 19
28  Foundation                0x00000001c9e78038 0x1c9e3c000 + 245816
29  Foundation                0x00000001c9e7d704 0x1c9e3c000 + 268036
30  libdispatch.dylib         0x00000001d51fa5d4 0x1d51f8000 + 9684
31  libdispatch.dylib         0x00000001d51fc0fc 0x1d51f8000 + 16636
32  libdispatch.dylib         0x00000001d51ff58c 0x1d51f8000 + 30092
33  libdispatch.dylib         0x00000001d51febf4 0x1d51f8000 + 27636
34  libdispatch.dylib         0x00000001d520db2c 0x1d51f8000 + 88876
35  libdispatch.dylib         0x00000001d520e338 0x1d51f8000 + 90936
36  libsystem_pthread.dylib   0x00000002544b9dbc 0x2544b9000 + 3516
37  libsystem_pthread.dylib   0x00000002544b9b98 0x2544b9000 + 2968

I can't find the definition of error code 134093, and I don't know what is going wrong in iOS 16. Would anyone have a hint of why this could happen and how to resolve it? Thanks very much.
Posted. Last updated.
Post not yet marked as solved
0 Replies
112 Views
My train of thought:

- Create a ScrollView with SwiftUI and use LazyVGrid for layout and performance optimization
- Use .onAppear {} to perform the Photos fetch
- Dynamically load photos in the LazyVGrid (turn each asset into a UIImage, then into an Image view)

Code shown below:

import SwiftUI
import UIKit
import Photos

struct ContentView: View {
    var albumManager = Album()
    @State var count: Int = 0

    var thumbnailSize: CGSize {
        CGSize(width: 150, height: 150)
    }

    var body: some View {
        VStack {
            Text("Photos Count: \(count)")
                .font(.title)
                .foregroundColor(.blue)
            if count > 0 {
                ScrollView {
                    LazyVGrid(columns: [GridItem(.adaptive(minimum: 150), spacing: 20)], spacing: 20) {
                        ForEach(0 ..< count) { index in
                            let asset = albumManager.fetchResult.object(at: index)
                            var thumbImage: UIImage? = UIImage(systemName: "photo")
                            let aaa = albumManager.imageManager.requestImage(for: asset, targetSize: thumbnailSize, contentMode: .aspectFill, options: nil) { image, _ in
                                if image != nil {
                                    thumbImage = image!
                                    print("\(asset.localIdentifier)")
                                }
                            }
                            if thumbImage != nil {
                                Image(uiImage: thumbImage!)
                                    .resizable()
                                    .aspectRatio(contentMode: .fill)
                                    .frame(width: 150, height: 150)
                                    .cornerRadius(5)
                            }
                        }
                    }
                }
            }
        }
        .onAppear {
            Task {
                await MainActor.run {
                    albumManager.loadAllImages()
                    albumManager.options.deliveryMode = .highQualityFormat
                    count = albumManager.count
                }
            }
        }
    }
}

class Album: ObservableObject {
    @Published var count: Int = 0
    @Published var fetchResult: PHFetchResult<PHAsset>!

    let allPhotosOptions = PHFetchOptions()
    let imageManager = PHCachingImageManager()
    var options = PHImageRequestOptions()

    func loadAllImages() {
        allPhotosOptions.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
        fetchResult = PHAsset.fetchAssets(with: allPhotosOptions)
        count = fetchResult.count
    }
}

Here comes the question. This call,

.requestImage(for: asset, targetSize: thumbnailSize, contentMode: .aspectFill, options: nil) { image, _ in }

fetches photo thumbnails asynchronously and may call back multiple times. When options is nil, it quickly returns an image that can be shown immediately, but the thumbnails are blurry. When options is set to .highQualityFormat, it returns high-quality images, but they arrive too late and never get loaded into the view. If I could use await to get the image directly, all the problems would be solved, but Apple only provides a closure callback here. I also tried another solution: use a @State variable to store the image, which gives instantly refreshed images, but then the pictures refresh all the time and are hard to manage. So, how can I handle this like a cell in a UIKit UICollectionView?
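One way to get the UIKit-cell behaviour in SwiftUI (a sketch; ThumbnailCell and thumbnail(for:size:) are hypothetical names, not Apple API): wrap the callback in a checked continuation so it can be awaited, and give each grid cell its own @State image loaded in .task, so only that cell refreshes when its high-quality thumbnail arrives.

import SwiftUI
import Photos

extension PHImageManager {
    func thumbnail(for asset: PHAsset, size: CGSize) async -> UIImage? {
        let options = PHImageRequestOptions()
        options.deliveryMode = .highQualityFormat   // single callback with the best available image
        options.isNetworkAccessAllowed = true
        return await withCheckedContinuation { continuation in
            var didResume = false
            self.requestImage(for: asset, targetSize: size, contentMode: .aspectFill, options: options) { image, _ in
                guard !didResume else { return }   // guard against resuming the continuation twice
                didResume = true
                continuation.resume(returning: image)
            }
        }
    }
}

struct ThumbnailCell: View {
    let asset: PHAsset
    let manager: PHCachingImageManager
    @State private var image: UIImage?

    var body: some View {
        Group {
            if let image {
                Image(uiImage: image)
                    .resizable()
                    .aspectRatio(contentMode: .fill)
            } else {
                Image(systemName: "photo")   // placeholder while the thumbnail loads
            }
        }
        .frame(width: 150, height: 150)
        .clipped()
        .cornerRadius(5)
        .task {
            // Request a slightly larger target size than the display size for sharper results.
            image = await manager.thumbnail(for: asset, size: CGSize(width: 300, height: 300))
        }
    }
}

In the LazyVGrid, the ForEach body would then simply be ThumbnailCell(asset: asset, manager: albumManager.imageManager), mirroring how a UICollectionView cell owns and updates its own image view.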
Posted by NeilWang. Last updated.
Post not yet marked as solved
1 Reply
752 Views
Hi,Let's say - i have quite big image in Photo Library on my iPad (21600x21600 pixels, jpeg) and i'm trying to retrieve it with PHImageManager.requestImage(). If i'm passing targetSize with some smaller value (for example 1000x1000) - image retrieved properly. Othervise i'm getting following error (on iOS 9, on iOS 11 app crashes due to memory):CGBitmapContextInfoCreate: unable to allocate 1866240000 bytes for bitmap dataMy use case is to retrieve image from Photos, change it's size and jpeg quality as defined by user in settings and send it to server. Is there any way to make this work for really big images?
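One approach that avoids allocating the full-size bitmap (a sketch; downsampledJPEGData is a hypothetical helper, and the actual memory behaviour should be profiled on device): get the full-size image URL via a content-editing input and let ImageIO build a capped-size thumbnail directly from the file, then re-encode it as JPEG at the user's chosen quality.

import Photos
import ImageIO
import UniformTypeIdentifiers

func downsampledJPEGData(for asset: PHAsset, maxPixelSize: CGFloat, quality: CGFloat, completion: @escaping (Data?) -> Void) {
    let options = PHContentEditingInputRequestOptions()
    options.isNetworkAccessAllowed = true
    asset.requestContentEditingInput(with: options) { input, _ in
        guard let url = input?.fullSizeImageURL,
              let source = CGImageSourceCreateWithURL(url as CFURL, nil) else {
            completion(nil); return
        }
        // ImageIO builds a thumbnail capped at maxPixelSize without a full-size bitmap context.
        let thumbOptions: [CFString: Any] = [
            kCGImageSourceCreateThumbnailFromImageAlways: true,
            kCGImageSourceCreateThumbnailWithTransform: true,
            kCGImageSourceThumbnailMaxPixelSize: maxPixelSize
        ]
        guard let cgImage = CGImageSourceCreateThumbnailAtIndex(source, 0, thumbOptions as CFDictionary) else {
            completion(nil); return
        }
        // Re-encode the downsampled image as JPEG at the requested quality.
        let data = NSMutableData()
        guard let destination = CGImageDestinationCreateWithData(data, UTType.jpeg.identifier as CFString, 1, nil) else {
            completion(nil); return
        }
        CGImageDestinationAddImage(destination, cgImage, [kCGImageDestinationLossyCompressionQuality: quality] as CFDictionary)
        CGImageDestinationFinalize(destination)
        completion(data as Data)
    }
}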
Posted by JL3. Last updated.
Post not yet marked as solved
0 Replies
87 Views
While uploading photos to iCloud in multiple albums at the same time, the same photos get uploaded to all of the selected albums. This should not happen; the user should see only the photos belonging to each respective album.
Posted. Last updated.
Post not yet marked as solved
1 Reply
93 Views
In the case where an app uses the photo library to reload images via a fetch request with a localIdentifier, there is a hole in the following guidance from the WWDC21 session "Improve access to Photos in your app": "When the picker session is completed, selected photos will be returned to your app. Suppose the configuration is initialized with a PHPhotoLibrary object, picker results will contain both item providers and asset identifiers. Your app can use the item provider to load and display selected photos." The picker result's item providers do not load the image if the image is not in the limited-library selection, even though the user has just selected it with the PHPicker. It looks like the workaround is to raise a user-facing error that the image was not loaded because it is not in the limited selection, then call PHPhotoLibrary.shared().presentLimitedLibraryPicker so the limited selection is the way the user wants it. This workaround seems like an awful thing to do to the user, since they have to pick the same photo(s) twice. Also, this scenario works fine with the PHPickerDemo sample because the privacy setting for the demo does not appear in Settings, presumably because PHPickerDemo is not directly using the PhotoKit API methods. Any thoughts or workaround alternatives that you see? Or should I file a bug? Thanks!
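For completeness, a sketch of the workaround described above (extendLimitedSelectionIfNeeded is a hypothetical name; iOS 15 added a completion-handler variant that reports which identifiers the user just granted access to, which could then be refetched):

import Photos
import UIKit

func extendLimitedSelectionIfNeeded(from presentingViewController: UIViewController) {
    guard PHPhotoLibrary.authorizationStatus(for: .readWrite) == .limited else { return }
    if #available(iOS 15, *) {
        PHPhotoLibrary.shared().presentLimitedLibraryPicker(from: presentingViewController) { addedIdentifiers in
            // addedIdentifiers are the local identifiers newly added to the limited selection;
            // refetch them with PHAsset.fetchAssets(withLocalIdentifiers:options:).
            print("Newly accessible assets: \(addedIdentifiers)")
        }
    } else {
        PHPhotoLibrary.shared().presentLimitedLibraryPicker(from: presentingViewController)
    }
}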
Posted. Last updated.
Post not yet marked as solved
1 Reply
198 Views
Hello, recently we made a change in our app to how resources are added to Photos. Until now we called addResourceWithType:fileURL:options: with PHAssetResourceCreationOptions.shouldMoveFile set to YES; when we changed it to NO (the default value), we observed many more asset creation failures. Specifically, we see a new error code in the procedure: PHPhotosErrorNotEnoughSpace. One can clearly see a connection between requiring more storage on the file system and an asset creation failure related to storage, but we are struggling to understand a few things:

- The free storage on the device is always much higher than the video size, usually by a great amount. We observed failures on devices with 120 GB of free storage while the video was 200 MB.
- Generally we save quite a lot of resources to the file system, so it is quite surprising to see what appear to be storage issues when adding a relatively small amount of extra storage.
- The asset creation is part of a bigger procedure of encoding a video to the file system and then moving/copying it to Photos. Is copying a 100-200 MB video instead of moving it such a big difference that the overall failure rate increases drastically?

Appreciate any help.
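For reference, a sketch of the move-based configuration the post says avoided the failures (saveVideoToPhotos and videoURL are placeholder names; with shouldMoveFile the file is moved rather than duplicated into the Photos library, so no second full copy is needed during import):

import Photos

func saveVideoToPhotos(at videoURL: URL, completion: @escaping (Bool, Error?) -> Void) {
    let options = PHAssetResourceCreationOptions()
    options.shouldMoveFile = true   // move the encoded file instead of copying it

    PHPhotoLibrary.shared().performChanges({
        let request = PHAssetCreationRequest.forAsset()
        request.addResource(with: .video, fileURL: videoURL, options: options)
    }, completionHandler: completion)
}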
Posted by galk1d. Last updated.
Post not yet marked as solved
2 Replies
142 Views
iOS 16 beta 3 introduces the "Shared Photo Library". PhotoKit, however, currently lacks any functionality to import photos/videos into the Shared Photo Library. It would be welcome if PHAssetCreationRequest could be extended with e.g. a "destination" property that defaults to the personal library but can also be set to the shared library. Scenarios:

- Third-party camera apps that should be able to import into the shared library
- Any type of family-oriented app that should be able to import into the shared library

Filed this also as FB10577456.
Posted by hhtouch. Last updated.