Post not yet marked as solved
There's only a single method in PHPickerViewControllerDelegate, picker(_:didFinishPicking:), according to the documentation. How do I dismiss the picker when the user taps the Cancel button that appears with PHPickerViewController?
I have no problem if I continue using UIImagePickerControllerDelegate, since it provides imagePickerControllerDidCancel. However, with the new PHPickerViewController, which currently has only a single delegate method, how do I dismiss the picker properly when the Cancel button is tapped, rather than relying on the swipe-down gesture to dismiss the screen? Kindly advise. Thanks.
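One approach that may help: in practice, picker(_:didFinishPicking:) is also called when the user taps Cancel, in which case the results array is empty, so dismissal for both cases can live in that single delegate method. A minimal sketch:

```swift
import PhotosUI

final class PickerDelegate: NSObject, PHPickerViewControllerDelegate {
    // picker(_:didFinishPicking:) fires for Cancel as well;
    // in that case `results` is an empty array.
    func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
        picker.dismiss(animated: true)   // dismiss for both Cancel and a selection
        guard !results.isEmpty else {
            // User tapped Cancel (or dismissed without picking anything).
            return
        }
        // Handle the selected items here.
    }
}
```

If you need to distinguish Cancel from a real selection, branch on results.isEmpty as shown.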
There is a strip of frames displayed at the bottom of the screen when videos are played in the Photos app on iPhone. Which API is used for this functionality in the iOS Photos app? Is it possible to check this?
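Apple doesn't expose the Photos app's own scrubber as public API, but a similar filmstrip can be built with AVAssetImageGenerator, which produces thumbnails at requested times. A sketch (the function name and sizes are my own, not an Apple API):

```swift
import AVFoundation
import UIKit

// Sketch: generate evenly spaced thumbnails for a scrubber strip.
func scrubberThumbnails(for url: URL, count: Int, completion: @escaping ([UIImage]) -> Void) {
    let asset = AVAsset(url: url)
    let generator = AVAssetImageGenerator(asset: asset)
    generator.appliesPreferredTrackTransform = true
    generator.maximumSize = CGSize(width: 120, height: 120)   // keep thumbnails small

    let duration = asset.duration.seconds
    let times = (0..<count).map { i in
        NSValue(time: CMTime(seconds: duration * Double(i) / Double(count),
                             preferredTimescale: 600))
    }

    var images = [UIImage]()
    var processed = 0
    generator.generateCGImagesAsynchronously(forTimes: times) { _, cgImage, _, result, _ in
        if result == .succeeded, let cgImage = cgImage {
            images.append(UIImage(cgImage: cgImage))
        }
        processed += 1
        if processed == times.count { completion(images) }   // all requests handled
    }
}
```

The handler runs off the main thread, so dispatch back to main before updating UI with the thumbnails.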
I would like to select a photo from the Photos library in iOS using Swift and UIImagePicker, then copy the image (with modified Exif metadata) to a new photo that I save back to Photos. If I use UIActivityViewController to choose the Save option (from copy/save/assign to contact/print/add to shared album/save to file), the input image's Exif metadata is not transferred when I create a new UIImage from the loaded image data with the modified metadata. How can I get the image with the modified Exif metadata attached to the saved photo?
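The metadata is lost because UIImage carries only pixel data; Exif lives in the encoded image data. One way to keep it is to modify the original Data with ImageIO before saving, rather than round-tripping through UIImage. A sketch (the helper name is mine):

```swift
import ImageIO
import Foundation

// Sketch: merge Exif updates into the original encoded image data,
// keeping the pixel data and all other metadata intact.
func imageData(byUpdatingExif exifUpdates: [CFString: Any], in originalData: Data) -> Data? {
    guard let source = CGImageSourceCreateWithData(originalData as CFData, nil),
          let type = CGImageSourceGetType(source) else { return nil }

    // Merge the updates into the existing Exif dictionary.
    var properties = (CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any]) ?? [:]
    var exif = (properties[kCGImagePropertyExifDictionary] as? [CFString: Any]) ?? [:]
    for (key, value) in exifUpdates { exif[key] = value }
    properties[kCGImagePropertyExifDictionary] = exif

    let output = NSMutableData()
    guard let destination = CGImageDestinationCreateWithData(output as CFMutableData, type, 1, nil)
    else { return nil }
    // Copies the image from the source unchanged and writes the merged properties.
    CGImageDestinationAddImageFromSource(destination, source, 0, properties as CFDictionary)
    guard CGImageDestinationFinalize(destination) else { return nil }
    return output as Data
}
```

The resulting Data can then be saved with PHAssetCreationRequest's addResource(with:data:options:), which preserves the embedded metadata.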
I started working on a photo editing extension in Xcode. No matter what I do, whenever I select my extension to edit a photo, the app crashes with the following error.
After searching for a solution for hours, the only suggestion I found was to uninstall and reinstall the app, but that didn't work either.
The following is a screenshot of the error.
The same code can generate a Live Photo on iOS 14, but fails to generate one on iOS 15.1.
Does anyone know how to solve this problem? Please help me. Thanks.
tl;dr: Is there a way to ensure that only "Never" and "Read and Write" appear under Settings -> App Name -> Photos on devices running iOS 14 or higher?
The Issue:
The enhanced photo library access permissions introduced in iOS 14 are creating problems with a custom image selection flow in our app. I'm curious whether there is a way to eliminate the "Selected Photos" option from the app's settings until the app can be fully updated to offer users the enhanced privacy model.
I've removed the "Select Photos..." option from the iOS permission alert for the photo library by setting the value of the PHPhotoLibraryPreventAutomaticLimitedAccessAlert key to true, as recommended in the Apple documentation:
https://developer.apple.com/documentation/photokit/phauthorizationstatus/limited
However, if the device is running iOS 14 or higher, the "Selected Photos" option is still available in the app's settings in the device's Settings menu. If a user changes the permission at this level, the app does not function properly. I was wondering if anyone else has experienced this and possibly come up with an interim solution.
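As far as I know, the "Selected Photos" row in Settings cannot be removed; the Info.plist key only suppresses the automatic alert inside the app. An interim mitigation (a sketch, not a fix) is to detect limited access at runtime and route the user accordingly:

```swift
import Photos

// Sketch: detect limited library access (granted via Settings) at runtime
// and report it, so the app can prompt the user toward full access.
func checkPhotoAccess(completion: @escaping (Bool) -> Void) {
    PHPhotoLibrary.requestAuthorization(for: .readWrite) { status in
        switch status {
        case .authorized:
            completion(true)
        case .limited:
            // The user chose "Selected Photos" in Settings; the custom
            // selection flow will not see the whole library.
            completion(false)
        default:
            completion(false)
        }
    }
}
```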
I am using a supervised device running iOS 15.4.
Scenario:
I pushed a restriction payload to the device with the value true for the allowOpenFromUnmanagedToManaged key.
Case 1:
When I open a photo from the photo library and try to open it with a managed app, the managed apps are not listed in the suggestions. (Working as expected.)
My problem is:
Case 2:
When I open the managed app and try to add a photo, it lets me open the photo library, from where I can add it.
If sharing data from an unmanaged app to a managed app is restricted, then it shouldn't be possible to add the photo in case 2, right?
FYI: The managed app I used is the Outlook app.
Can anyone help me with this strange behaviour? Thanks in advance.
I've tried to load an iCloud image using PHAsset with these options:
requestOptions.isSynchronous = false
requestOptions.isNetworkAccessAllowed = true
I get a CloudPhotoLibraryErrorDomain Code=1005 error, and I don't understand where I've made a mistake. I have tried SDWebImagePhotosPlugin methods as well as Photos framework methods such as requestImageDataAndOrientation and requestImageData, but the image is still nil and I get the above error.
this is my code:
imageManager.requestImageDataAndOrientation(for: deviceImage, options: requestOptions) { data, deliveryMode, orientation, _ in
    if data != nil {
        completion(data)
    } else {
        SDImageLoadersManager.shared.loaders = [SDWebImageDownloader.shared, SDImagePhotosLoader.shared]
        SDWebImageManager.defaultImageLoader = SDImageLoadersManager.shared
        let photosURL = NSURL.sd_URL(with: deviceImage)
        SDImagePhotosLoader.shared.requestImage(with: photosURL as URL?, options: [.highPriority, .retryFailed, .refreshCached], context: [.customManager: self.manager], progress: nil) { image, data, error, success in
            if image != nil {
                completion(image?.pngData())
            } else {
                completion(nil)
            }
        }
    }
}
The latest albums/videos in iCloud are not available from code until I open the Photos app.
I figured out that when I open the PHCollection picker and close it (even within a fraction of a second), it triggers the iCloud sync of the local Photos albums and I am able to get the latest content.
Is there a way to trigger the iCloud sync of the Photos albums programmatically, or at least a workaround such as opening/closing the PHCollection picker in the background?
I want to change the background color of PHPicker's navigation bar in iOS 15.
The following answer did not work with PHPicker:
https://developer.apple.com/forums/thread/682420
How can I customize it?
I have been unable to capture Live Photos using UIImagePickerController. I can capture still photos and even video (which is not my scenario but I checked just to make sure), but the camera does not capture live photos. The documentation suggests it should (source):
To obtain the motion and sound content of a live photo for display (using the PHLivePhotoView class), include the kUTTypeImage and kUTTypeLivePhoto identifiers in the allowed media types when configuring an image picker controller. When the user picks or captures a Live Photo, the editingInfo dictionary contains the livePhoto key, with a PHLivePhoto representation of the photo as the corresponding value.
I've set up my controller:
let camera = UIImagePickerController()
camera.sourceType = .camera
camera.mediaTypes = [UTType.image.identifier, UTType.livePhoto.identifier]
camera.delegate = context.coordinator
In the delegate I check for the Live Photo:
func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey : Any]) {
if let live = info[.livePhoto] as? PHLivePhoto {
// handle live photo
} else if let takenImage = info[.originalImage] as? UIImage, let metadata = info[.mediaMetadata] as? [AnyHashable:Any] {
// handle still photo
}
}
But I never get the Live Photo.
I've tried adding NSMicrophoneUsageDescription to the Info.plist, thinking it needs microphone permission, but that did not help. Of course, I've added NSCameraUsageDescription to grant camera permission.
Has anyone successfully captured Live Photos using UIImagePickerController?
In a Photos extension, the method finishProject(completionHandler:) of PHProjectExtensionController is not called when Photos quits. Is there anything I need to set up for Photos to call it?
When the user leaves the extension (and keeps Photos running), the method is called.
Does PHPicker change the hash of the image when it is shared?
Does the Photos picker change the hash value of a photo after the user grants permission to third-party apps?
The sample code in the Apple documentation for PHCloudIdentifier does not compile in Xcode 13.2.1.
Could the interface for identifier conversion be clarified so that the returned values are more accessible/readable? The values are 'hidden' inside a Result enum.
It was difficult (for me) to rewrite the sample code because I made the mistake of interpreting the Result type as a tuple; Result is actually an enum.
Using the Result type returned from library.cloudIdentifierMappings(forLocalIdentifiers:) and localIdentifierMappings(for:) puts the actual mapped identifiers inside the enum, where they need additional access via a .stringValue message or by evaluating the associated value of the Result.
For others hitting the same compile issue, here is a working version of the sample code. It compiles in Xcode 13.2.1.
func localId2CloudId(localIdentifiers: [String]) -> [String] {
    var mappedIdentifiers = [String]()
    let library = PHPhotoLibrary.shared()
    let iCloudIDs = library.cloudIdentifierMappings(forLocalIdentifiers: localIdentifiers)
    for aCloudID in iCloudIDs {
        // The value is a Result enum, not a tuple
        switch aCloudID.value {
        case .success(let cloudIdentifier):
            mappedIdentifiers.append(cloudIdentifier.stringValue)
        case .failure(let error):
            // Notify the user of the error
            print("Failed to map \(aCloudID.key): \(error.localizedDescription)")
        }
    }
    return mappedIdentifiers
}
func cloudId2LocalId(assetCloudIdentifiers: [PHCloudIdentifier]) -> [String] {
    // patterned error handling per documentation
    var localIDs = [String]()
    let localIdentifiers: [PHCloudIdentifier: Result<String, Error>] =
        PHPhotoLibrary.shared().localIdentifierMappings(for: assetCloudIdentifiers)
    for cloudIdentifier in assetCloudIdentifiers {
        guard let identifierMapping = localIdentifiers[cloudIdentifier] else {
            print("Failed to find a mapping for \(cloudIdentifier).")
            continue
        }
        switch identifierMapping {
        case .success(let localIdentifier):
            localIDs.append(localIdentifier)
        case .failure(let failure):
            let thisError = failure as? PHPhotosError
            switch thisError?.code {
            case .identifierNotFound:
                // Skip the missing or deleted assets.
                print("Failed to find the local identifier for \(cloudIdentifier). \(String(describing: thisError?.localizedDescription))")
            case .multipleIdentifiersFound:
                // Prompt the user to resolve the cloud identifier that matched multiple assets.
                print("Found multiple local identifiers for \(cloudIdentifier). \(String(describing: thisError?.localizedDescription))")
                // if let selectedLocalIdentifier = promptUserForPotentialReplacement(with: thisError.userInfo[PHLocalIdentifiersErrorKey]) {
                //     localIDs.append(selectedLocalIdentifier)
                // }
            default:
                print("Encountered an unexpected error looking up the local identifier for \(cloudIdentifier). \(String(describing: thisError?.localizedDescription))")
            }
        }
    }
    return localIDs
}
I'm trying to implement a grid view of the photos stored in the Photos app so the user can choose one picture as their profile picture. Before SwiftUI, I used a collection view and PhotoKit to fetch the images and display them in a grid.
Now that I've switched to SwiftUI, I tried to use LazyVGrid. I am able to fetch all of the user's photos and display them in a grid. However, it uses a lot of memory. I had a memory leak before, but now Instruments isn't showing any leaks anymore.
I thought the grid might not be unloading the displayed images when they become invisible to the user. However, if you scroll up and down multiple times, it just uses a lot more memory than before, as if the grid keeps creating new views and the old ones never get deleted. Am I using something wrong, or misunderstanding the principles of LazyVGrid?
My current code
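(The original code was not included in the post.) One pattern that usually keeps the grid's memory bounded, sketched here assuming a PHAsset per cell, is to request small fixed-size thumbnails through PHCachingImageManager instead of decoding full-resolution images:

```swift
import SwiftUI
import Photos
import UIKit

// Sketch: each cell requests only a thumbnail-sized image, so the grid
// never holds full-resolution UIImages in memory.
struct PhotoCell: View {
    let asset: PHAsset
    let manager: PHCachingImageManager
    @State private var image: UIImage?

    var body: some View {
        Group {
            if let image = image {
                Image(uiImage: image).resizable().scaledToFill()
            } else {
                Color.gray   // placeholder while loading
            }
        }
        .frame(width: 100, height: 100)
        .clipped()
        .onAppear {
            let options = PHImageRequestOptions()
            options.deliveryMode = .opportunistic
            // Ask only for a thumbnail-sized image; Photos scales and caches it.
            manager.requestImage(for: asset,
                                 targetSize: CGSize(width: 200, height: 200),
                                 contentMode: .aspectFill,
                                 options: options) { result, _ in
                image = result
            }
        }
        .onDisappear { image = nil }   // release the thumbnail when scrolled away
    }
}
```

Clearing the image in onDisappear trades a brief placeholder flash on re-scroll for a much flatter memory profile.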
I have enabled runtime concurrency warnings to check for future concurrency problems, via Build Settings / Other Swift Flags:
-Xfrontend -warn-concurrency -Xfrontend -enable-actor-data-race-checks
When I call the async form of PHPhotoLibrary.shared().performChanges {}, I get the following runtime warning on the line containing performChanges: "warning: data race detected: @MainActor function at ... was not called on the main thread".
My sample code inside a default Xcode multi platform app template is as follows:
import SwiftUI
import Photos

@MainActor
class FotoChanger {
    func addFotos() async throws {
        await PHPhotoLibrary.requestAuthorization(for: .addOnly)
        try! await PHPhotoLibrary.shared().performChanges {
            let data = NSDataAsset(name: "Swift")!.data
            let creationRequest = PHAssetCreationRequest.forAsset()
            creationRequest.addResource(with: .photo, data: data, options: PHAssetResourceCreationOptions())
        }
    }
}

struct ContentView: View {
    var body: some View {
        ProgressView()
            .task {
                try! await FotoChanger().addFotos()
            }
    }
}
You would have to have a Swift data asset inside the asset catalog to run the above code, but the error can even be recreated if the data is invalid.
But what am I doing wrong? I have not found a way to run performChanges (the block, or whatever causes the error) on the main thread.
PS: This is only test code to show the problem, don't mind the forced unwraps.
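One workaround, offered as a sketch rather than an official fix: skip the async overload and wrap the completion-handler version, performChanges(_:completionHandler:), in a checked continuation, which avoids the code path that emits the warning:

```swift
import Photos

// Sketch of a workaround: bridge the completion-handler API to async/await
// manually instead of using the async overload that triggers the warning.
func performLibraryChanges(_ changes: @escaping () -> Void) async throws {
    try await withCheckedThrowingContinuation { (continuation: CheckedContinuation<Void, Error>) in
        PHPhotoLibrary.shared().performChanges(changes) { success, error in
            if let error = error {
                continuation.resume(throwing: error)
            } else {
                continuation.resume()
            }
        }
    }
}
```

The change block then runs on the photo library's own queue, exactly as it did before async/await.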
I have been trying to load an image from the photo library in a SwiftUI app. I am running Xcode 13.2.1 and building for iOS 15.2.
My code is as below:
@Binding var image: UIImage?

func makeUIViewController(context: Context) -> PHPickerViewController {
    var config = PHPickerConfiguration()
    config.filter = .images
    config.selectionLimit = 1
    config.preferredAssetRepresentationMode = .compatible
    let controller = PHPickerViewController(configuration: config)
    controller.delegate = context.coordinator
    return controller
}

func updateUIViewController(_ uiViewController: PHPickerViewController, context: Context) { }

func makeCoordinator() -> Coordinator {
    Coordinator(self)
}

// Use a Coordinator to act as your PHPickerViewControllerDelegate
class Coordinator: NSObject, PHPickerViewControllerDelegate {
    private let parent: PhotoPicker

    init(_ parent: PhotoPicker) {
        self.parent = parent
    }

    func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
        picker.dismiss(animated: true)
        print(results)
        guard !results.isEmpty else {
            return
        }
        guard let itemProvider = results.first?.itemProvider else { return }
        print("Invoking getPhoto")
        self.getPhoto2(from: itemProvider)
        //parent.didFinishPicking(!results.isEmpty)
    }

    private func getPhoto2(from itemProvider: NSItemProvider) {
        print("getPhoto")
        if itemProvider.canLoadObject(ofClass: UIImage.self) {
            itemProvider.loadObject(ofClass: UIImage.self) { image, error in
                // loadObject calls back on a background queue; update the binding on main.
                DispatchQueue.main.async {
                    self.parent.image = image as? UIImage
                }
                print("Loaded Image \(String(describing: error))")
            }
        }
    }
}
}
On the console I see the following error:
2022-01-27 00:40:01.485085-0500 Vescense[3174:884964] [Picker] Showing picker unavailable UI (reason: still loading) with error: (null)
Further, when I print the result, I see:
[PhotosUI.PHPickerResult(itemProvider: <PUPhotosFileProviderItemProvider: 0x2818fc980> {types = (
"public.jpeg",
"public.heic"
)}, assetIdentifier: nil)]
It doesn't appear that the error from loadObject has a value. And it's suspicious that assetIdentifier is nil.
Any thoughts on what I might be missing here would be most helpful.
Live Photos taken with an iPhone 13 cannot be saved on an iOS 11 iPhone.
The application includes a function to send image data between iPhones and save it to the photo library on the receiving iPhone.
We receive the image data and save it to the photo library from the URL where the data is located, following the method on Apple's official website for saving Live Photos to the photo library.
However, when I receive Live Photos taken with an iPhone 13 and try to save them the same way on an iPhone 7 (iOS 11), I get the following error and cannot complete the process:
"error: The operation couldn't be completed. (Cocoa error -1.)"
func saveLivePhotoToPhotosLibrary(stillImageData: Data, livePhotoMovieURL: URL) {
    PHPhotoLibrary.requestAuthorization { status in
        guard status == .authorized else { return }
        PHPhotoLibrary.shared().performChanges({
            let creationRequest = PHAssetCreationRequest.forAsset()
            creationRequest.addResource(with: .photo, data: stillImageData, options: nil)
            let options = PHAssetResourceCreationOptions()
            options.shouldMoveFile = true
            creationRequest.addResource(with: .pairedVideo, fileURL: livePhotoMovieURL, options: options)
        }) { success, error in
            // Handle completion.
        }
    }
}
When I send the same Live Photo data via AirDrop, I also get an error message and cannot transfer it.
Does iOS 11 not support Live Photos taken on the latest iPhones?
Inside
func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult])
I am trying to obtain a UIImage from a Live Photo. I have parsed the results and filtered them into regular still photos and Live Photos. For the still photos I already have the UIImage, but for the Live Photos I don't understand how to get from a PHLivePhoto to a PHAsset.
if let livePhoto = object as? PHLivePhoto {
    DispatchQueue.main.async {
        // what code do I insert here to get from PHLivePhoto to a UIImage?
        // I need to extract a UIImage from a PHLivePhoto
    }
}
Thanks for the help in advance!
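Two possible routes, sketched below under the assumption that the PHPickerResult is still available: load the Live Photo's still image directly from the item provider, or create the picker with PHPickerConfiguration(photoLibrary: .shared()) so that assetIdentifier is populated and a PHAsset can be fetched:

```swift
import PhotosUI
import Photos
import UIKit

// Sketch: two ways to get a still UIImage for a picked Live Photo.
func handle(result: PHPickerResult) {
    // 1) Easiest: the item provider can usually also vend a plain UIImage
    //    (the Live Photo's key photo), with no need to go through PHAsset.
    if result.itemProvider.canLoadObject(ofClass: UIImage.self) {
        result.itemProvider.loadObject(ofClass: UIImage.self) { object, _ in
            if let stillImage = object as? UIImage {
                DispatchQueue.main.async {
                    // use stillImage here
                    _ = stillImage
                }
            }
        }
    }

    // 2) If a PHAsset is really needed, initialize the picker with
    //    PHPickerConfiguration(photoLibrary: .shared()) so assetIdentifier is non-nil.
    if let identifier = result.assetIdentifier {
        let assets = PHAsset.fetchAssets(withLocalIdentifiers: [identifier], options: nil)
        if let asset = assets.firstObject {
            // request an image via PHImageManager.default().requestImage(for:...)
            _ = asset
        }
    }
}
```

Option 2 requires the app to have photo library authorization, whereas option 1 works without any permission because PHPicker runs out of process.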