Photos and Imaging


Integrate still images and other forms of photography into your apps.

Posts under Photos and Imaging tag

76 Posts

INCOMPATIBLE DISK error message upon launching Big Sur: APFS vs. Mac OS Extended issue in Big Sur?
Running on: iMac 27" 5K (late 2015), 64 GB RAM, and a 16 TB Pegasus Promise2 R4 RAID 5 via Thunderbolt.

After trying Big Sur I found issues with the Luminar photo app and decided to return to Catalina on the iMac. I reformatted my internal drive, reinstalled Catalina 10.15.5, and reformatted the RAID. But I keep getting the following message upon restarting: "Incompatible Disk. This disk uses features that are not supported on this version of macOS", and my Pegasus2 R4 portion no longer appears on the desktop or in Disk Utility.

Looking into this, I discovered that it may be an issue of Mac OS Extended vs. APFS. The iMac was formatted as APFS prior to installing OS 11, so I reformatted to APFS when returning to Catalina. The issues persisted, so I re-reformatted from a bootable USB, this time to Mac OS Extended (Journaled), and the issues seem to be resolved. The iMac runs slower on Mac OS Extended, but it is running and the RAID is recognised.

I'd love to go back to APFS but am afraid it will "break" things. Any thoughts on this would be welcome. Thanks, Nick
6 replies · 0 boosts · 15k views · Oct ’23
How to correctly load video selected with PHPickerViewController?
Hello! I am playing around with PHPickerViewController and so far I was able to get the selected images by loading them into UIImage instances, but I don't know how to get the selected video. Below is the relevant implementation of the delegate method:

func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
    let provider = result.itemProvider
    guard provider.hasItemConformingToTypeIdentifier(AVFileType.mov.rawValue) else { return }
    provider.loadItem(forTypeIdentifier: AVFileType.mov.rawValue, options: nil) { (fileURL, error) in
        if let error = error {
            print(error)
            return
        }
        guard let videoURL = fileURL as? URL else { return }
        DispatchQueue.main.async {
            let fm = FileManager.default
            let destination = fm.temporaryDirectory.appendingPathComponent("video123.mov")
            try! fm.copyItem(at: videoURL, to: destination)
            let playerVC = AVPlayerViewController()
            playerVC.player = AVPlayer(url: destination)
            self.present(playerVC, animated: true, completion: nil)
        }
    }
}

I get a crash trying to copy the item. It says the source file does not exist, but the path looks real to me: "The file "3C2BCCBC-4474-491B-90C2-93DF848AADF5.mov" couldn't be opened because there is no such file." I tried it without copying first and just passing the URL to AVPlayer, but nothing would play. I am testing this on a simulator. Thanks for help!
13 replies · 0 boosts · 7.4k views · Sep ’23
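A minimal sketch of the pattern that generally resolves this, assuming the delegate lives in a UIViewController subclass (the class name is hypothetical): loadFileRepresentation(forTypeIdentifier:) hands over a temporary URL that is only valid for the lifetime of its completion handler, so the file must be copied out inside that handler; copying later is a common cause of the "no such file" error seen above.

```swift
import AVKit
import PhotosUI
import UniformTypeIdentifiers

class VideoPickerViewController: UIViewController, PHPickerViewControllerDelegate {

    func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
        picker.dismiss(animated: true)

        guard let provider = results.first?.itemProvider,
              provider.hasItemConformingToTypeIdentifier(UTType.movie.identifier) else { return }

        provider.loadFileRepresentation(forTypeIdentifier: UTType.movie.identifier) { url, error in
            guard let sourceURL = url else {
                print(error ?? "No URL returned")
                return
            }
            // The URL is only valid inside this closure, so copy the file out right here.
            let destination = FileManager.default.temporaryDirectory
                .appendingPathComponent(UUID().uuidString)
                .appendingPathExtension(sourceURL.pathExtension)
            do {
                try FileManager.default.copyItem(at: sourceURL, to: destination)
                DispatchQueue.main.async {
                    let playerVC = AVPlayerViewController()
                    playerVC.player = AVPlayer(url: destination)
                    self.present(playerVC, animated: true)
                }
            } catch {
                print("Copy failed: \(error)")
            }
        }
    }
}
```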
PhotoKit - non-system library
Hi, I'm wondering if it is possible to manipulate "other" Photos libraries with PhotoKit, not only the System one. The PhotoKit documentation only mentions objects of the PHPhotoLibrary class, which is "a shared object that manages access and changes to the user’s shared photo library". But the Photos app on macOS allows creating and switching between multiple photo libraries. Thanks.
2 replies · 2 boosts · 696 views · Sep ’23
iOS 14 Photo caption - how to access it in metadata?
I use the following code to parse photo metadata and this works well. However, I am unable to pull the new iOS 14 "caption" from this metadata (it worked in early iOS 14 betas, but has since stopped working in the GM). Does anyone know how I can get the caption data from a PHAsset? Thanks! Stephen

let options = PHContentEditingInputRequestOptions()
options.isNetworkAccessAllowed = true
asset.requestContentEditingInput(with: options, completionHandler: { (contentEditingInput, _) -> Void in
    if let url = contentEditingInput?.fullSizeImageURL {
        let fullImage = CIImage(contentsOf: url)

        // get all the metadata
        self.allPhotoMetadata = fullImage?.properties ?? [:]

        // {TIFF}
        if let tiffDict = self.allPhotoMetadata["{TIFF}"] as? [String: Any] {
            if tiffDict["Make"] != nil {
                self.cameraData[cameraKeys.make] = tiffDict["Make"]
            }
            if tiffDict["Model"] != nil {
                self.cameraData[cameraKeys.model] = tiffDict["Model"]
            }
            if tiffDict["ImageDescription"] != nil {
                self.imageData[imageKeys.caption] = tiffDict["ImageDescription"]
            }
        }

        // {IPTC}
        if let iptcDict = self.allPhotoMetadata["{IPTC}"] as? [String: Any] {
            // if we didn't find a caption in the TIFF dict, try to get it from IPTC data
            // first try Caption/Abstract, then ArtworkContentDescription
            if self.imageData[imageKeys.caption] == nil {
                if iptcDict["Caption/Abstract"] != nil {
                    self.imageData[imageKeys.caption] = iptcDict["Caption/Abstract"]
                } else if iptcDict["ArtworkContentDescription"] != nil {
                    self.imageData[imageKeys.caption] = iptcDict["ArtworkContentDescription"]
                }
            }
        }
    }
})
9 replies · 2 boosts · 3.4k views · Sep ’23
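Not an answer to where the iOS 14 caption lives, but a sketch of a complementary way to dump everything ImageIO can read from the exported full-size image via CGImageSource; this at least confirms whether a caption is present in the {IPTC} or {TIFF} blocks of the file the editing input points to. The function name is illustrative.

```swift
import ImageIO
import Photos

/// Dumps every metadata dictionary ImageIO can read from the asset's full-size image.
func dumpMetadata(for asset: PHAsset) {
    let options = PHContentEditingInputRequestOptions()
    options.isNetworkAccessAllowed = true
    asset.requestContentEditingInput(with: options) { input, _ in
        guard let url = input?.fullSizeImageURL,
              let source = CGImageSourceCreateWithURL(url as CFURL, nil),
              let props = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [String: Any] else {
            return
        }
        // Look specifically at the IPTC block, where captions are usually stored.
        if let iptc = props[kCGImagePropertyIPTCDictionary as String] as? [String: Any] {
            print("IPTC:", iptc)
        }
        print("All properties:", props)
    }
}
```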
iOS16: localIdentifier of PHAsset gets changed after saving to camera roll
Environment: iOS 16 beta 2, beta 3. iPhone 11 Pro, 12 mini.

Steps to reproduce:

Subscribe to photo library changes via PHPhotoLibraryChangeObserver and put some logs in place to track inserted/deleted objects:

func photoLibraryDidChange(_ changeInstance: PHChange) {
    if let changeDetails = changeInstance.changeDetails(for: allPhotosFetchResult) {
        for insertion in changeDetails.insertedObjects {
            print("🥶 INSERTED: ", insertion.localIdentifier)
        }
        for deletion in changeDetails.removedObjects {
            print("🥶 DELETED: ", deletion.localIdentifier)
        }
    }
}

Save a photo to the camera roll with PHAssetCreationRequest.

Go to the photo library and delete the newly saved photo.

Come back to the app and watch the logs:

🥶 INSERTED:  903933C3-7B83-4212-8DF1-37C2AD3A923D/L0/001
🥶 DELETED:  39F673E7-C5AC-422C-8BAA-1BF865120BBF/L0/001

Expected result: the localIdentifier of the saved and deleted asset is the same string in both logs. In fact, it's different. So it appears that either the localIdentifier of an asset gets changed after successful saving, or it's a bug in the Photos framework in iOS 16. I've checked that in iOS 15 it works fine (the IDs in the logs match).
2 replies · 1 boost · 1.4k views · Nov ’23
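For comparison, a sketch of how the identifier of a newly created asset is typically captured at save time, via the creation request's placeholder; logging this value alongside the change-observer output above should make it clearer at which point the identifier diverges on iOS 16. The function and completion are illustrative names, not an existing API.

```swift
import Photos

/// Saves an image file to the library and reports the identifier Photos assigns to it.
func saveImage(at fileURL: URL, completion: @escaping (String?) -> Void) {
    var placeholderIdentifier: String?
    PHPhotoLibrary.shared().performChanges({
        let request = PHAssetCreationRequest.forAsset()
        request.addResource(with: .photo, fileURL: fileURL, options: nil)
        // The placeholder already carries the localIdentifier the new asset should receive.
        placeholderIdentifier = request.placeholderForCreatedAsset?.localIdentifier
    }) { success, error in
        if let error = error { print("Save failed: \(error)") }
        completion(success ? placeholderIdentifier : nil)
    }
}
```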
PHPicker fails to load RAW images
We observed that the PHPicker is unable to load RAW images captured on an iPhone in some scenarios, and it is also somehow related to iCloud. Here is the setup:

The PHPickerViewController is configured with preferredAssetRepresentationMode = .current to avoid transcoding. The image is loaded from the item provider like this:

if itemProvider.hasItemConformingToTypeIdentifier(kUTTypeImage) {
    itemProvider.loadFileRepresentation(forTypeIdentifier: kUTTypeImage) { url, error in
        // work
    }
}

This usually works, also for RAW images. However, when trying to load a RAW image that has just been captured with the iPhone, the loading fails with the following errors on the console:

[claims] 43A5D3B2-84CD-488D-B9E4-19F9ED5F39EB grantAccessClaim reply is an error: Error Domain=NSCocoaErrorDomain Code=4097 "Couldn’t communicate with a helper application." UserInfo={NSUnderlyingError=0x2804a8e70 {Error Domain=NSCocoaErrorDomain Code=4097 "connection from pid 19420 on anonymousListener or serviceListener" UserInfo={NSDebugDescription=connection from pid 19420 on anonymousListener or serviceListener}}}

Error copying file type public.image. Error: Error Domain=NSItemProviderErrorDomain Code=-1000 "Cannot load representation of type public.image" UserInfo={NSLocalizedDescription=Cannot load representation of type public.image, NSUnderlyingError=0x280480540 {Error Domain=NSCocoaErrorDomain Code=4097 "Couldn’t communicate with a helper application." UserInfo={NSUnderlyingError=0x2804a8e70 {Error Domain=NSCocoaErrorDomain Code=4097 "connection from pid 19420 on anonymousListener or serviceListener" UserInfo={NSDebugDescription=connection from pid 19420 on anonymousListener or serviceListener}}}}}

We observed that on some devices, loading the image will actually work after a short time (~30 sec), but on others it will always fail. We think it is related to iCloud Photos: on a device that has iCloud Photos sync enabled, the picker is able to load the image right after it was synced to the cloud. On devices that don't sync the image, loading always fails. It seems that the sync process is doing some processing (?) of the image that will later enable the picker to load it successfully, but that's just guessing.

Additional observations:

This seems to only occur for images that were taken with the stock Camera app. When using Halide to capture RAW (either ProRAW or RAW), the picker is able to load the image.

When trying to load the image as kUTTypeRawImage instead of kUTTypeImage, it also fails.

The picker also can't load RAW images that were AirDropped from another device, unless they synced to iCloud first.

This is reproducible using the Selecting Photos and Videos in iOS sample code project. We observed this happening in other apps that use the PHPicker, not just ours.

Is this a bug, or is there something that we are missing?
5 replies · 1 boost · 1.8k views · Sep ’23
{Error Domain=PHPhotosErrorDomain Code=3305 "(null)"}
When I use the following to save a video:

BOOL compatible = UIVideoAtPathIsCompatibleWithSavedPhotosAlbum([pathUrl path]);
if (compatible) {
    UISaveVideoAtPathToSavedPhotosAlbum([pathUrl path], self, @selector(savedPhotoImage:didFinishSavingWithError:contextInfo:), nil);
}

I get the error shown in the title. My phone has 33.72 GB available and the video is only 4.2 GB in size. What should I do?

PS: I also tried to use [PHPhotoLibrary sharedPhotoLibrary] performChanges to save, but I got the same error, code -3305.
2 replies · 1 boost · 2.6k views · Aug ’23
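Not a confirmed fix for error 3305, but a sketch (in Swift, although the post is Objective-C) of the PhotoKit route often suggested for multi-gigabyte videos: add the file as a resource via PHAssetCreationRequest and set shouldMoveFile so the system moves the file rather than copying it, which reduces the temporary space needed during the save.

```swift
import Photos

/// Saves a large video file to the photo library without duplicating it on disk.
func saveLargeVideo(at fileURL: URL) {
    PHPhotoLibrary.shared().performChanges({
        let options = PHAssetResourceCreationOptions()
        // Move the file into the library instead of copying it, so the save
        // does not need twice the free space for a multi-gigabyte video.
        options.shouldMoveFile = true
        let request = PHAssetCreationRequest.forAsset()
        request.addResource(with: .video, fileURL: fileURL, options: options)
    }) { success, error in
        print(success ? "Saved" : "Failed: \(String(describing: error))")
    }
}
```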
Permissions when creating an album for an app and saving media
I am trying to create an album for an app with PhotoKit and store images in it. Is there any way to do this under the NSPhotoLibraryAddUsageDescription permission alone? At first glance, NSPhotoLibraryAddUsageDescription seems to be the best choice for this app, since I will not be loading any images. However, there are two album operations that can only be done under NSPhotoLibraryUsageDescription:

Creating an album. Even though creating an album does not involve loading media, it is necessary to use NSPhotoLibraryUsageDescription, which asks the user to allow loading media. This is a bit unconvincing.

Saving images in the created album. Before saving, you must check whether the app has already created the album; you need to fetch it by name. This is where NSPhotoLibraryUsageDescription is needed. I understand that NSPhotoLibraryUsageDescription is needed for fetching, but if iOS forbade the creation of albums with the same name and ignored attempts to create an album with an already existing name, this fetching would not be necessary.

In summary, I just want to create an album for my app and store media in it, but in order to do so I need to get permission from the user to read the photos, which goes against the idea that the permissions I request should be minimal and only what I need. If there is a way to do this under the NSPhotoLibraryAddUsageDescription permission I would like to know. I am new to Swift, so sorry if I am wrong.
1 reply · 0 boosts · 803 views · Jul ’23
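A minimal sketch of what add-only access does cover: requesting .addOnly authorization and saving an image straight to the library. As the post notes, creating or looking up a named album still requires read/write authorization, so this sketch deliberately stops short of album handling.

```swift
import Photos
import UIKit

/// Saves an image using only add-level access (NSPhotoLibraryAddUsageDescription).
func saveWithAddOnlyAccess(_ image: UIImage) {
    PHPhotoLibrary.requestAuthorization(for: .addOnly) { status in
        guard status == .authorized else { return }
        PHPhotoLibrary.shared().performChanges({
            // Creating the asset itself is allowed; placing it into a custom album is not,
            // because finding or creating the album requires read/write authorization.
            PHAssetChangeRequest.creationRequestForAsset(from: image)
        }) { success, error in
            print(success ? "Saved" : "Failed: \(String(describing: error))")
        }
    }
}
```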
HEIF10 representation doesn't contain alpha channel
When using the heif10Representation and writeHEIF10Representation APIs of CIContext, the resulting image doesn’t contain an alpha channel. When using the heifRepresentation and writeHEIFRepresentation APIs, the alpha channel is properly preserved, i.e., the resulting HEIC will contain a urn:mpeg:hevc:2015:auxid:1 auxiliary image. This image is missing when exporting as HEIF10. Is this a bug or is this intentional? If I understand the spec correctly, HEIF10 should be able to support alpha via auxiliary image (like HEIF8).
1 reply · 0 boosts · 770 views · Jul ’23
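For reference, a sketch of the two export paths being compared, assuming a CIImage that carries alpha; the 8-bit call is the one reported to write the alpha auxiliary image, and the 10-bit call the one reported to drop it. The color space and file names are arbitrary choices for the example.

```swift
import CoreImage

func exportBothHEIFVariants(_ image: CIImage, to directory: URL) throws {
    let context = CIContext()
    let colorSpace = CGColorSpace(name: CGColorSpace.displayP3)!

    // 8-bit HEIF: alpha is written as an auxiliary image (urn:mpeg:hevc:2015:auxid:1).
    try context.writeHEIFRepresentation(of: image,
                                        to: directory.appendingPathComponent("out8.heic"),
                                        format: .RGBA8,
                                        colorSpace: colorSpace,
                                        options: [:])

    // 10-bit HEIF: reportedly comes out without the alpha auxiliary image.
    try context.writeHEIF10Representation(of: image,
                                          to: directory.appendingPathComponent("out10.heic"),
                                          colorSpace: colorSpace,
                                          options: [:])
}
```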
Display HDR images for PhotoKit assets
In my app I get a UIImage for a PHAsset via PHImageManager.requestImage(for:targetSize:contentMode:options:resultHandler:). I directly display that image in a UIImageView that has preferredImageDynamicRange set to .high. The problem is I do not see the high dynamic range. I see the HDRDemo23 sample code uses PhotosPicker to get a UIImage from Data through UIImageReader whose config enables prefersHighDynamicRange. Is there a way to support HDR when using the Photos APIs to request display images? And is there support for PHLivePhoto displayed in PHLivePhotoView retrieved via PHImageManager.requestLivePhoto?
4 replies · 1 boost · 999 views · Apr ’24
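A sketch of one way to combine the APIs mentioned above: request the asset's original data through PHImageManager and decode it with a UIImageReader configured to prefer high dynamic range (iOS 17+), instead of relying on requestImage to hand back an HDR-capable UIImage. Whether this is the intended path for PhotoKit assets, and what the equivalent is for PHLivePhoto, is exactly the open question.

```swift
import Photos
import UIKit

/// Requests an asset's data and decodes it with HDR preferred (iOS 17+).
func requestHDRImage(for asset: PHAsset, completion: @escaping (UIImage?) -> Void) {
    let options = PHImageRequestOptions()
    options.isNetworkAccessAllowed = true
    options.deliveryMode = .highQualityFormat

    PHImageManager.default().requestImageDataAndOrientation(for: asset, options: options) { data, _, _, _ in
        guard let data = data else { completion(nil); return }
        // Decode with HDR preferred, mirroring the UIImageReader approach in HDRDemo23.
        var config = UIImageReader.Configuration()
        config.prefersHighDynamicRange = true
        let reader = UIImageReader(configuration: config)
        completion(reader.image(data: data))
    }
}
```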
Proper way to import screen recordings with SwiftUI PhotosPicker?
Hello, I am building a contact form that allows attaching screenshots and screen recordings. The PhotosPicker part is relatively straightforward, but I am not sure how to properly import the selected items. The binding is of type [PhotosPickerItem], which requires (at least for my current implementation) first knowing whether the item is an image or a video. I have this not-so-pretty code to detect if the item is a video:

let isVideo = item.supportedContentTypes.first(where: { $0.conforms(to: .video) }) != nil || item.supportedContentTypes.contains(.mpeg4Movie)

which for screen recordings seems to work only because I ask about .mpeg4Movie. And then I have this struct:

struct ScreenRecording: Transferable {
    let url: URL

    static var transferRepresentation: some TransferRepresentation {
        FileRepresentation(contentType: .mpeg4Movie) { video in
            SentTransferredFile(video.url)
        } importing: { received in
            let copy = URL.temporaryDirectory.appending(path: "\(UUID().uuidString).mp4")
            try FileManager.default.copyItem(at: received.file, to: copy)
            return Self.init(url: copy)
        }
    }
}

Notice here I have just the .mpeg4Movie content type; I couldn't get it to work with more generic ones like .movie, and I am afraid this implementation could soon break if the screen recordings change video format/codec.

And finally my logic to load the item:

if isVideo {
    if let movie = try? await item.loadTransferable(type: ScreenRecording.self) {
        viewModel.addVideoAttachment(movie)
    }
} else {
    if let data = try? await item.loadTransferable(type: Data.self) {
        if let uiImage = UIImage(data: data) {
            viewModel.addScreenshotAttachment(uiImage)
        }
    }
}

I would like to make this more "future proof" and less error prone, particularly the screen recordings part. I don't even need the UIImage since I am saving the attachments as files; I just need to know whether the attachment is a screenshot or a video and get its URL. Thanks!
0 replies · 0 boosts · 585 views · Aug ’23
What are the available NSSortDescriptors for PHFetchOptions?
I'm working with Apple's sample code for PhotoBrowse. In the main view controller's viewDidLoad() method we have this code:

let allPhotosOptions = PHFetchOptions()
allPhotosOptions.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: true)]

This code sorts the photos by "creationDate". How do I determine/find the other valid values for these sort descriptors? For example, what if I wanted to sort photos by their file size, last modified date, etc.? TIA!
2 replies · 0 boosts · 425 views · Aug ’23
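A small sketch of the same pattern with another key that corresponds to a documented PHAsset property, modificationDate. Photos accepts only a limited set of keys in fetch sort descriptors, and file size is not among the documented ones, so any key beyond the documented PHAsset properties should be treated as an assumption to verify.

```swift
import Photos

// Fetch assets sorted by when they were last modified, newest first.
let options = PHFetchOptions()
options.sortDescriptors = [NSSortDescriptor(key: "modificationDate", ascending: false)]
let recentlyModified = PHAsset.fetchAssets(with: options)
```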
PHAsset unique identifier across devices.
I know that I can uniquely identify a PHAsset on a given device using localIdentifier, but if that asset is synced (through iCloud, say) to another device, how do I uniquely identify that asset across multiple devices? My app allows users to store their images in the standard photo gallery, but I have no way of referring to them when users sync their app profile to another iOS device with my app installed.
2 replies · 0 boosts · 736 views · Aug ’23
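A sketch of the API that appears aimed at this problem (iOS 15+): mapping local identifiers to PHCloudIdentifier values, which are intended to stay stable for assets synced through iCloud Photos; a reverse mapping back to local identifiers exists on the other device. Whether a cloud identifier is available still depends on the user's iCloud Photos setup.

```swift
import Photos

/// Maps local identifiers to cloud identifier strings that remain stable across devices (iOS 15+).
func cloudIdentifiers(forLocalIdentifiers localIdentifiers: [String]) -> [String: String] {
    let mappings = PHPhotoLibrary.shared().cloudIdentifierMappings(forLocalIdentifiers: localIdentifiers)
    var result: [String: String] = [:]
    for (localIdentifier, mapping) in mappings {
        switch mapping {
        case .success(let cloudIdentifier):
            // Persist this string; convert it back with localIdentifierMappings(for:) on the other device.
            result[localIdentifier] = cloudIdentifier.stringValue
        case .failure(let error):
            print("No cloud identifier for \(localIdentifier): \(error)")
        }
    }
    return result
}
```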
Is there a technical reason why Photos.app/PhotoKit does not support displaying file sizes?
It is 2023 and Photos.app still provides no way to sort photos/movies by file size. I ended up writing some AppleScript that displays the file sizes of images/movies in a specific album. It is also simple to create a shell script to examine the photo library contents directly but mapping the UUID-based filenames back to the names used in Photos.app is not straightforward (to me at least). You can't even create a smart album based upon file size. Why is there no native support for this in Photos.app/PhotoKit? (And, yes, I have submitted many feature requests over the years)
0 replies · 0 boosts · 387 views · Aug ’23
Reference photo from picker, rather than copy?
Hello, is it possible to reference a photo from the photo picker (either UIKit or SwiftUI), such that I do not need to copy it somewhere else? Currently, I am storing the copy in CoreData in a field with external storage, but it did cross my mind that I don't actually need a copy in my own storage if I could point back to the photo library. If the user deleted the photo from the library that would be fine.
1 reply · 0 boosts · 395 views · Aug ’23
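A sketch of the usual way to keep a reference rather than a copy: create the picker configuration with the shared photo library so each result carries an assetIdentifier, persist that string, and refetch the PHAsset later. Note that resolving the identifier back to an asset requires photo library read authorization, unlike using the picker alone.

```swift
import Photos
import PhotosUI

/// Build a picker whose results include asset identifiers that can be stored.
func makePicker() -> PHPickerViewController {
    var config = PHPickerConfiguration(photoLibrary: .shared())
    config.selectionLimit = 1
    return PHPickerViewController(configuration: config)
}

/// Later, resolve a stored identifier back to the asset (requires read access).
func asset(for storedIdentifier: String) -> PHAsset? {
    PHAsset.fetchAssets(withLocalIdentifiers: [storedIdentifier], options: nil).firstObject
}
```

In the picker delegate, results.first?.assetIdentifier is the string to persist.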
Understanding PHPickerConfiguration.AssetRepresentationMode.current
The documentation for this API mentions: The system uses the current representation and avoids transcoding, if possible. What are the scenarios in which transcoding takes place? The reason for asking is that we've had a user reaching out saying they selected a video file from their Photos app, which resulted in a decrease in size from ~110MB to 35MB. We find it unlikely it's transcoding-related, but we want to gain more insights into the possible scenarios.
3 replies · 1 boost · 599 views · Sep ’23
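Not an answer to when transcoding happens, but a small diagnostic sketch: with preferredAssetRepresentationMode set to .current, printing the item provider's registered type identifiers and the size of the delivered file makes it easier to tell whether the picker handed over the original representation or a re-encoded one. The function name is illustrative.

```swift
import Foundation
import PhotosUI
import UniformTypeIdentifiers

/// Logs which representations a picked item offers and how large the delivered file is.
func inspectDeliveredRepresentation(of result: PHPickerResult) {
    let provider = result.itemProvider

    // The representations the provider offers, e.g. com.apple.quicktime-movie vs. public.mpeg-4.
    print("Registered types:", provider.registeredTypeIdentifiers)

    provider.loadFileRepresentation(forTypeIdentifier: UTType.movie.identifier) { url, _ in
        guard let url = url,
              let size = try? FileManager.default.attributesOfItem(atPath: url.path)[.size] else { return }
        print("Delivered file size in bytes:", size)
    }
}
```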
Photos App: Re-indexing of edited photos
I would like to use a third-party app to edit the metadata of a photo to change its Caption and then be able to search in the Photos app to find that image with the edited caption. I have managed to do this by duplicating the photo with the edited metadata. The Photos app recognizes it as a new photo and indexes it with the new caption, making it searchable. However, when editing the photo in-place, the Photos app will not re-index the photo, therefore it will not be searchable. Is there a way to edit photos in-place and have them searchable with the new metadata?
1 reply · 0 boosts · 629 views · Sep ’23
HDR Image capture/conversion
Hello! After the recent talk at WWDC2023 about HDR support and finding the documentation page on applying the Apple HDR effect to photos, I became very interested in the HDR Gain Map format. From the documentation page it is clear how we can restore the original HDR from the SDR and Gain Map representation, but my question is: how can we convert back from HDR to the SDR + Gain Map representation?

As I understand it right now, conversion from HDR to SDR + Gain Map involves two steps:

Tone mapping the HDR image to get a correct SDR image.

Once we have both HDR and SDR, calculating the Gain Map from the equation on the documentation page.

Am I correct? If so, what tone-mapping algorithm is used for the HDR -> SDR conversion right now? I can't find any information about this on the internet. I would be very grateful for your response!
3 replies · 2 boosts · 1.3k views · Sep ’23
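Not Apple's published formula, but a sketch of the algebra the two steps imply: once tone mapping has produced an SDR image, the gain map follows by inverting, per pixel, whatever reconstruction equation the documentation defines. Assuming, purely for illustration, a log-encoded multiplicative model with headroom H:

```latex
% Illustrative model only; the actual encoding is whatever the Apple documentation specifies.
\mathrm{HDR}(x,y) = \mathrm{SDR}(x,y)\cdot 2^{\,G(x,y)\,\log_2 H}
\quad\Longrightarrow\quad
G(x,y) = \frac{\log_2\!\left(\mathrm{HDR}(x,y)/\mathrm{SDR}(x,y)\right)}{\log_2 H}
```

The choice of tone-mapping operator in the first step is exactly what the post is asking about; the second step is then fully determined by the reconstruction equation.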