Post not yet marked as solved
Specifically:
https://www.macrumors.com/2022/06/15/ios-16-remove-subject-from-background/
Post not yet marked as solved
Environment: iOS 16 beta 2, beta 3. iPhone 11 Pro, 12 mini
Steps to reproduce:
Subscribe to Photo Library changes via PHPhotoLibraryChangeObserver, put some logs to track inserted/deleted objects:
func photoLibraryDidChange(_ changeInstance: PHChange) {
    if let changeDetails = changeInstance.changeDetails(for: allPhotosFetchResult) {
        for insertion in changeDetails.insertedObjects {
            print("🥶 INSERTED: ", insertion.localIdentifier)
        }
        for deletion in changeDetails.removedObjects {
            print("🥶 DELETED: ", deletion.localIdentifier)
        }
    }
}
Save a photo to camera roll with PHAssetCreationRequest
Go to the Photo Library, delete the newly saved photo
Come back to the app and watch the logs:
🥶 INSERTED: 903933C3-7B83-4212-8DF1-37C2AD3A923D/L0/001
🥶 DELETED: 39F673E7-C5AC-422C-8BAA-1BF865120BBF/L0/001
Expected result: localIdentifier of the saved and deleted asset is the same string in both logs.
Actual result: the identifiers differ.
So it appears that either the localIdentifier of an asset gets changed after successful saving, or it's a bug in the Photos framework in iOS 16. I've checked - in iOS 15 it works fine (IDs in logs match).
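For what it's worth, one way to correlate the saved asset with later change notifications is to capture the placeholder's localIdentifier at creation time. A sketch (assuming a `jpgFileUrl` file URL, which is hypothetical here); on iOS 16 beta this is the identifier that reportedly no longer matches the deletion log:

```swift
import Photos

// Sketch: capture the identifier Photos assigns at creation time,
// so it can be compared against insertedObjects/removedObjects later.
var savedLocalIdentifier: String?
PHPhotoLibrary.shared().performChanges({
    let request = PHAssetCreationRequest.forAsset()
    request.addResource(with: .photo, fileURL: jpgFileUrl, options: nil)
    // The placeholder already carries the localIdentifier the asset will have.
    savedLocalIdentifier = request.placeholderForCreatedAsset?.localIdentifier
}, completionHandler: { success, error in
    print("saved:", success,
          "id:", savedLocalIdentifier ?? "nil",
          "error:", String(describing: error))
})
```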
Post not yet marked as solved
Can't save GIFs; only a single frame is saved as a still photo.
Post not yet marked as solved
1. Quit an app, e.g. Photos.
2. Immediately reopen it.
3. Repeat steps 1-2 several times.
The app crashes without any dialog, so it's impossible to share a crash report.
Post not yet marked as solved
Where is the .isEditing option???
Post not yet marked as solved
Hi there.
I'm stuck on creating files with a wide dynamic range (Smart HDR?) like those the iPhone 12 and later create; they look bright on the OLED screen. What I'm currently trying to do is create a HEIF or JPEG image with an embedded auxiliary grayscale HDR gain map. I have a script in Swift, and it partly works: I can see the AuxiliaryImageType tag (for HEIF) or MPImage2 (for JPEG), but my file still looks dull and flat. I didn't find anything in the documentation. What might be wrong?
Post not yet marked as solved
In my app, I first request PHAccessLevel .addOnly; then, on another page, I request .readWrite. After that, I can't read the photos on my phone for the rest of the application's life cycle.
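On iOS 14+, the access level is part of the authorization request itself, and a prior .addOnly grant does not imply .readWrite. A minimal sketch of explicitly requesting read/write access and checking the returned status:

```swift
import Photos

// Request read/write access explicitly; the status must be checked
// per access level, not just once for the whole app.
PHPhotoLibrary.requestAuthorization(for: .readWrite) { status in
    switch status {
    case .authorized, .limited:
        print("can read photos")
    default:
        print("read access denied, status:", status.rawValue)
    }
}
```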
Post not yet marked as solved
Our team has implemented PHPickerViewController to allow multiple photos to be selected. It works fine in the simulator, but when tested on an actual iPad we see this strange behaviour: the first time the gallery is displayed it works fine. If we try to display the gallery again, it shows a window with the message "Unable to Load Items - [Try Again]". If I tap [Try Again], or if I use our Gallery button, it does display the gallery, and it keeps cycling through this behaviour, i.e. every second attempt to display the gallery leads to the "Unable to Load Items" window.
Every time we display the gallery we dismiss the old instance and create/initialize a new instance of the controller. Our code is written in Objective-C:
- (void)initPHPickerController
{
    [self dismiss];
    mPHPickerController = nil;
    PHPickerConfiguration *config = [[PHPickerConfiguration alloc] init];
    config.selectionLimit = 0; // 0 represents no selection limit.
    config.filter = [PHPickerFilter imagesFilter];
    config.preferredAssetRepresentationMode = PHPickerConfigurationAssetRepresentationModeCurrent;
    PHPickerViewController *pickerViewController = [[PHPickerViewController alloc] initWithConfiguration:config];
    pickerViewController.delegate = self;
    mPHPickerController = pickerViewController;
}

- (void)dismiss
{
    if (mPHPickerController)
    {
        [mPHPickerController dismissViewControllerAnimated:YES completion:nil];
    }
    ...
}
Any suggestions on how to fix this ?
I have seen a post on Stack Overflow suggesting this may be a bug in iOS.
Post not yet marked as solved
Using open func loadDataRepresentation(forTypeIdentifier typeIdentifier: String, completionHandler: @escaping (Data?, Error?) -> Void) -> Progress:
if self.hasItemConformingToTypeIdentifier(UTType.webP.identifier) {
    return try await self.loadDataRepresentation(forTypeIdentifier: UTType.webP.identifier)
}
At this point you can load the WebP image, but you cannot tell from the type information alone whether the resulting Data is WebP, because it is indistinguishable from JPEG. You can check the magic bytes instead:
extension Data {
    var isWebP: Bool {
        // Ensure the size of the data is large enough
        // for us to properly check.
        guard self.count >= 12 else {
            return false
        }
        return withUnsafeBytes { bytes in
            // The first 4 bytes are the ASCII letters "RIFF".
            // Skipping 4 bytes for the file size, the next 4 bytes
            // after that should read "WEBP".
            if String(decoding: bytes[0..<4], as: UTF8.self) != "RIFF" ||
               String(decoding: bytes[8..<12], as: UTF8.self) != "WEBP" {
                return false
            }
            return true
        }
    }
}
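A quick usage check of the `isWebP` extension above with hand-built in-memory data (the byte layout is constructed purely for illustration):

```swift
import Foundation

// Minimal RIFF/WEBP header: "RIFF" + 4 file-size bytes + "WEBP" = 12 bytes.
var webpHeader = Data("RIFF".utf8)
webpHeader.append(contentsOf: [0, 0, 0, 0])   // placeholder file size
webpHeader.append(Data("WEBP".utf8))
print(webpHeader.isWebP)                      // true

// A JPEG starts with FF D8 FF, which fails the RIFF check.
let jpegStart = Data([0xFF, 0xD8, 0xFF, 0xE0] + Array(repeating: 0, count: 8))
print(jpegStart.isWebP)                       // false
```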
Post not yet marked as solved
Code from wwdc20-10652 is used.
1. Open PHPickerViewController.
2. Choose 1 photo.
3. Do nothing and dismiss.
A leak is then reported, as shown below.
@IBAction private func chooseImagePressed(_ sender: Any) {
    if #available(iOS 14, *) {
        var configuration = PHPickerConfiguration()
        configuration.filter = .images
        let picker = PHPickerViewController(configuration: configuration)
        picker.delegate = self
        present(picker, animated: true)
    } else {
        // Fallback on earlier versions
    }
}

extension PhotosVC: PHPickerViewControllerDelegate {
    @available(iOS 14, *)
    func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
        dismiss(animated: true)
    }
}
Post not yet marked as solved
I use PHPhotoLibrary in iOS to save JPG files to the camera roll.
There is a difference in the date and time displayed in the "Photos" app depending on whether the 24-hour display setting in iOS is set to On or Off when saving.
When saving with "24-hour display: Off", the date and time from the Exif of the JPG file is displayed.
When saved with "24 Hour Display: On", the date and time when the file was saved is displayed.
JPG files are saved with the following code.
PHPhotoLibrary.shared().performChanges({
    PHAssetChangeRequest.creationRequestForAssetFromImage(atFileURL: jpgFileUrl)
}, completionHandler: { (success: Bool, error: Error?) in
    print("success=\(success), error=\(String(describing: error))")
})
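As an experiment (an assumption, not a documented fix), setting the creation date explicitly on the change request removes any ambiguity about which timestamp Photos displays. Here `exifDate` is a hypothetical Date already parsed from the file's Exif:

```swift
import Photos

PHPhotoLibrary.shared().performChanges({
    let request = PHAssetChangeRequest.creationRequestForAssetFromImage(atFileURL: jpgFileUrl)
    // Explicitly set the asset's creation date instead of relying on
    // whatever Photos parses from the file's metadata.
    request?.creationDate = exifDate
}, completionHandler: { success, error in
    print("success=\(success), error=\(String(describing: error))")
})
```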
Is this an iOS specification or a bug? Or is there a problem with the code?
I would appreciate it if you could provide me with some information.
Thank you in advance.
Post not yet marked as solved
Apologies if this has been asked. I was reviewing the transcript of the iOS 16 camera improvements as they relate to depth and depth maps. It's my understanding that models with LiDAR scanners play a big role in the depth maps captured when taking an image. I know this is an improvement over models that rely on TrueDepth cameras, but how does this play into the new Lock Screen setup?
Is this effect created solely through software, or does the presence of LiDAR and depth maps influence the results when creating the depth effect that pulls subjects to the front while the background remains behind?
Thanks so much in advance!
Post not yet marked as solved
Will the code for the SwiftUI PhotosPicker app shown during the session be made available in the code sample area? Trying to follow along and duplicate the app, but not all of the code is visible. Thanks!
Post not yet marked as solved
I try to get fullSizeImageURL and then calculate md5 of asset via url:
var assetUrl = syncUrl(asset: asset)
let assetMD5 = md5File(url: assetUrl)
To get the URL I use requestContentEditingInput. The problem is that contentEditingInput can be nil. On my new iPhone I hit this the first few times; after some tries the problem was gone and contentEditingInput was always non-nil. I'm not sure, but I think that is because all the assets were cached.
So my questions are:
Why can contentEditingInput be nil?
How can I get the asset URL if contentEditingInput is nil?
How can I clear the photo asset cache to reproduce this issue, if it is cache-related?
Part of code which I use:
extension PHAsset {
    func getURL(completionHandler: @escaping ((_ responseURL: URL?) -> Void)) {
        let options: PHContentEditingInputRequestOptions = PHContentEditingInputRequestOptions()
        options.canHandleAdjustmentData = { (adjustmeta: PHAdjustmentData) -> Bool in
            return true
        }
        options.isNetworkAccessAllowed = true
        self.requestContentEditingInput(with: options, completionHandler: { (contentEditingInput: PHContentEditingInput?, info: [AnyHashable: Any]) -> Void in
            // contentEditingInput can be nil here
            completionHandler(contentEditingInput?.fullSizeImageURL as URL?)
        })
    }
}
md5:
func md5File(url: URL, isCancelled: @escaping () -> Bool) -> Data? {
    let bufferSize = 1024 * 1024
    do {
        // Open file for reading:
        let file = try FileHandle(forReadingFrom: url)
        defer {
            file.closeFile()
        }
        // Create and initialize MD5 context:
        var context = CC_MD5_CTX()
        CC_MD5_Init(&context)
        // Read up to `bufferSize` bytes, until EOF is reached, and update MD5 context:
        while autoreleasepool(invoking: {
            if isCancelled() { return false }
            let data = file.readData(ofLength: bufferSize)
            if data.count > 0 {
                data.withUnsafeBytes {
                    _ = CC_MD5_Update(&context, $0.baseAddress, numericCast(data.count))
                }
                return true // Continue
            } else {
                return false // End of file
            }
        }) { }
        if isCancelled() { return nil }
        // Compute the MD5 digest:
        var digest: [UInt8] = Array(repeating: 0, count: Int(CC_MD5_DIGEST_LENGTH))
        _ = CC_MD5_Final(&digest, &context)
        return Data(digest)
    } catch {
        print("Cannot open file:", error.localizedDescription)
        return nil
    }
}
Post not yet marked as solved
In iOS 14, Live Photos could display as video in the memories generated by the Photos app, but in iOS 15 a Live Photo is displayed as a still photo in memories, making them more like a slide show instead of the lively memories they used to be. The great feature of Live Photos should be harnessed and displayed as video in the memories generated by the Photos app.
Live Photos becoming still photos in memories really downgrades the value of the photos and the memories.
Post not yet marked as solved
I'm trying to move a video I create from images within my app from a temporary path to the photo library.
I've verified that the movie exists by downloading the app data via Devices in Xcode, and the movie then plays fine on my MacBook.
I've tried:
UISaveVideoAtPathToSavedPhotosAlbum(
    videoPath,
    self,
    #selector(self.video(_:didFinishSavingWithError:contextInfo:)),
    nil)
with Error:
Optional(Error Domain=ALAssetsLibraryErrorDomain Code=-1 "Unknown error" UserInfo={NSLocalizedDescription=Unknown error, NSUnderlyingError=0x283684570 {Error Domain=ALAssetsLibraryErrorDomain Code=-1 "Unknown error" UserInfo={NSLocalizedDescription=Unknown error, NSUnderlyingError=0x283681860 {Error Domain=ALAssetsLibraryErrorDomain Code=-1 "Unknown error" UserInfo={NSLocalizedDescription=Unknown error, NSUnderlyingError=0x28366e490 {Error Domain=com.apple.photos.error Code=42001 "(null)"}}}}}})
and
PHPhotoLibrary.requestAuthorization { status in
    // Return if unauthorized
    guard status == .authorized else {
        print("Error saving video: unauthorized access")
        return
    }
    PHPhotoLibrary.shared().performChanges({
        PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: videoURL as URL)
    }) { success, error in
        if !success {
            print("Error saving video: \(String(describing: error))")
        }
    }
}
with Error:
Domain=ALAssetsLibraryErrorDomain Code=-1 "Unknown error" UserInfo= ...... {Error Domain=com.apple.photos.error Code=42001 "(null)"
both compile fine and are called, but end up giving me errors that do not help in the slightest.
I have a full help request on StackOverflow with a link to the project (that it does not let me post here): https://stackoverflow.com/questions/63575539/swift-ios-save-video-to-library
Post not yet marked as solved
UIImageWriteToSavedPhotosAlbum works well in iOS 9/10 but crashes in iOS 11. What's the problem?
NSData *imgdata = [NSData dataWithContentsOfFile:path];
if (nil != imgdata) {
    UIImage *saveImg = [[UIImage alloc] initWithData:imgdata];
    if (nil != saveImg) {
        UIImageWriteToSavedPhotosAlbum(saveImg, nil, nil, nil);
        return 1;
    }
}
Post not yet marked as solved
I would like to use SwiftUI AsyncImage with a Photos app image that is associated with a string name, possibly using a Core Data model. A Picker might be used to determine which image is associated with a user-entered string name, and a List view would display the saved images and their names. Is there a way to use a Photos image ID or path as a URL?
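One caveat worth noting: PHAsset doesn't expose a stable file URL, so AsyncImage can't point at a library photo directly. A common alternative (a sketch; `savedIdentifier` is the hypothetical string stored in Core Data next to the name) is to persist the asset's localIdentifier and load a UIImage through PHImageManager:

```swift
import Photos
import UIKit

// Fetch the asset back by the identifier stored alongside the name,
// then request a thumbnail-sized UIImage for display in the List.
func loadImage(for savedIdentifier: String, completion: @escaping (UIImage?) -> Void) {
    let assets = PHAsset.fetchAssets(withLocalIdentifiers: [savedIdentifier], options: nil)
    guard let asset = assets.firstObject else {
        completion(nil)
        return
    }
    PHImageManager.default().requestImage(for: asset,
                                          targetSize: CGSize(width: 300, height: 300),
                                          contentMode: .aspectFill,
                                          options: nil) { image, _ in
        completion(image)
    }
}
```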
Post not yet marked as solved
I have been trying to load an image from the photo library in a SwiftUI app. I am running Xcode 13.2.1 and building for iOS 15.2.
My code is below:
@Binding var image: UIImage?

func makeUIViewController(context: Context) -> PHPickerViewController {
    var config = PHPickerConfiguration()
    config.filter = .images
    config.selectionLimit = 1
    config.preferredAssetRepresentationMode = .compatible
    let controller = PHPickerViewController(configuration: config)
    controller.delegate = context.coordinator
    return controller
}

func updateUIViewController(_ uiViewController: PHPickerViewController, context: Context) { }

func makeCoordinator() -> Coordinator {
    Coordinator(self)
}

// Use a Coordinator to act as your PHPickerViewControllerDelegate
class Coordinator: NSObject, PHPickerViewControllerDelegate {
    private let parent: PhotoPicker

    init(_ parent: PhotoPicker) {
        self.parent = parent
    }

    func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
        picker.dismiss(animated: true)
        print(results)
        guard !results.isEmpty else {
            return
        }
        guard let itemProvider = results.first?.itemProvider else { return }
        print("Invoking getPhoto")
        self.getPhoto2(from: itemProvider)
        //parent.didFinishPicking(!results.isEmpty)
    }

    private func getPhoto2(from itemProvider: NSItemProvider) {
        print("getPhoto")
        if itemProvider.canLoadObject(ofClass: UIImage.self) {
            itemProvider.loadObject(ofClass: UIImage.self) { image, error in
                self.parent.image = image as? UIImage
                print("Loaded Image \(error)")
            }
        }
    }
}
}
On the console I see the following error
022-01-27 00:40:01.485085-0500 Vescense[3174:884964] [Picker] Showing picker unavailable UI (reason: still loading) with error: (null)
Further when I print the result I see
[PhotosUI.PHPickerResult(itemProvider: <PUPhotosFileProviderItemProvider: 0x2818fc980> {types = (
"public.jpeg",
"public.heic"
)}, assetIdentifier: nil)]
The error from loadObject doesn't appear to have a value, and it's suspicious that assetIdentifier is nil.
Any thoughts on what I might be missing here would be most helpful.
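One thing worth checking (an assumption, not a confirmed fix for the picker error): loadObject delivers its completion handler on a background queue, so the @Binding should be updated on the main thread. A variant of the getPhoto2 method above:

```swift
import PhotosUI
import UIKit

// Variant of getPhoto2 that hops to the main queue before touching UI state.
private func getPhoto2(from itemProvider: NSItemProvider) {
    if itemProvider.canLoadObject(ofClass: UIImage.self) {
        itemProvider.loadObject(ofClass: UIImage.self) { image, error in
            DispatchQueue.main.async {
                // @Binding mutations drive SwiftUI view updates and
                // must happen on the main thread.
                self.parent.image = image as? UIImage
            }
        }
    }
}
```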
Post not yet marked as solved
I've been working for a while on removing duplicates in a 5k+ photo library. I have a decent program to find the dups and first move them to a special subdirectory.
I want to make a library from them, then make it the System Photo Library so that I can see it on all my Apple devices.
Yes, I know the feedback site, and I hate it. There's no interactive help if you run a beta, and I've not been able to successfully go back to the shipping version.