Photos and Imaging


Integrate still images and other forms of photography into your apps.

Posts under Photos and Imaging tag

84 Posts
Post not yet marked as solved
1 Reply
192 Views
tl;dr: Is there a way to ensure that only "Never" and "Read and Write" appear under Settings -> App Name -> Photos on devices running iOS 14 or higher?

The issue: the enhanced photo library access permissions introduced in iOS 14 are creating problems with a custom image selection flow in our app, and I'm curious whether there is a way to eliminate the "Selected Photos" option from the app settings until the app can be fully updated to offer users the enhanced security. I've removed the "Select Photos..." option from the iOS permission alert for the photo library by setting the PHPhotoLibraryPreventAutomaticLimitedAccessAlert key to true, as recommended in the Apple documentation: https://developer.apple.com/documentation/photokit/phauthorizationstatus/limited

However, on devices running iOS 14 or higher, the "Selected Photos" option is still available under the app's entry in the Settings app. If a user changes the permission at this level, the app does not function properly. Has anyone experienced this as well and come up with an interim solution?
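For reference, a minimal sketch of how an app might detect the limited state at runtime and degrade gracefully until full support lands; the handling choices here are assumptions for illustration, not from the original post:

```swift
import Photos
import UIKit

// Request read/write access; on iOS 14+ the user can still end up in the
// .limited state via the Settings app even when the in-app alert is
// suppressed with PHPhotoLibraryPreventAutomaticLimitedAccessAlert.
func checkPhotoAccess(from viewController: UIViewController) {
    PHPhotoLibrary.requestAuthorization(for: .readWrite) { status in
        DispatchQueue.main.async {
            switch status {
            case .authorized:
                break // full library access: the custom picker works as designed
            case .limited:
                // One interim option: let the user extend their selection
                // rather than failing silently.
                PHPhotoLibrary.shared().presentLimitedLibraryPicker(from: viewController)
            default:
                break // denied, restricted, or not determined
            }
        }
    }
}
```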
Post not yet marked as solved
0 Replies
202 Views
I'm using a supervised device running OS 15.4.

Scenario: I pushed a restrictions payload to the device with the value true for the allowOpenFromUnmanagedToManaged key.

Case 1: When I open a photo from the Photos library and try to open it in a managed app, the managed apps are not listed in the suggestions. (Working as expected.)

My problem is case 2: When I open the managed app and try to add a photo, it allows me to open the photo library and add one from there. If sharing data from unmanaged to managed apps is restricted, then it shouldn't be possible to add it in case 2, right?

FYI: the managed app I used is the Outlook app. Can anyone help me with this strange behaviour? Thanks in advance.
Post marked as solved
2 Replies
208 Views
In cellForItemAt I dequeue the cell, cast it to PhoneDetailsCC, and use SDWebImage's sd_setImage to load the image for each row. phoneModel?.imageUrl is a [String] of URLs, and SDWebImage converts each URL into an image inside the collection view:

```swift
func collectionView(_ collectionView: UICollectionView, cellForItemAt indexPath: IndexPath) -> UICollectionViewCell {
    let cell = collectionView.dequeueReusableCell(withReuseIdentifier: "PhoneDetailsCC", for: indexPath) as? PhoneDetailsCC
    let imageUrl = phoneModel?.imageUrl[indexPath.row]
    cell?.phoneDetailsImage.sd_setImage(with: URL(string: imageUrl!))
    return cell!
}
```

I also create a PDF with PDFKit from the content. Before, I used just one image: I prepared it and passed it to the other view controller, and that worked. But I don't know how to pass an array of images to the other view controller. I'll share the rest of the code below:

```swift
import FirebaseStorage
import FirebaseFirestore
import SDWebImage
import ProgressHUD
import PDFKit

class PhoneListViewController: UITableViewController {
    @IBOutlet weak var phoneModelText: UITextView!
    @IBOutlet weak var imeiAdressText: UITextView!
    @IBOutlet weak var userNameText: UITextView!
    @IBOutlet weak var idText: UITextView!
    @IBOutlet weak var phoneNumberText: UITextView!
    @IBOutlet weak var detailsText: UITextView!
    @IBOutlet weak var dateText: UITextView!
    @IBOutlet weak var priceText: UITextView!
    @IBOutlet weak var adressText: UITextView!
    @IBOutlet weak var imageView: UIImageView!
    @IBOutlet weak var imageCollectionView: UICollectionView!

    public var documentData: Data?
    var phoneModel: PhoneModel?

    override func viewDidLoad() {
        super.viewDidLoad()
        imageCollectionView.delegate = self
        imageCollectionView.dataSource = self
        view.backgroundColor = UIColor.systemGray3
        tableView.backgroundView = UIImageView(image: UIImage(named: "SplashScreen.jpeg"))
        tableView.backgroundView?.alpha = 0.2
        phoneModelText.text = phoneModel?.phoneModelText
        imeiAdressText.text = phoneModel?.imeiAdressText
        userNameText.text = phoneModel?.userNameText
        idText.text = phoneModel?.idText
        phoneNumberText.text = phoneModel?.phoneNumberText
        detailsText.text = phoneModel?.detailsText
        dateText.text = phoneModel?.dateText
        priceText.text = phoneModel?.priceText
        adressText.text = phoneModel?.adressText
        imageCollectionView.register(UINib(nibName: "PhoneDetailsCC", bundle: Bundle.main), forCellWithReuseIdentifier: "PhoneDetailsCC")
    }

    @IBAction func printAction(_ sender: Any) {
        let pdfPreview = PDFView()
        if let data = documentData {
            pdfPreview.document = PDFDocument(data: data)
            pdfPreview.autoScales = true
        }
        view.addSubview(pdfPreview)
    }

    override func shouldPerformSegue(withIdentifier identifier: String, sender: Any?) -> Bool {
        if
            let _ = phoneModelText.text,
            let _ = userNameText.text,
            let _ = imageView.image, // this is what I used before with only one image, and it worked for me
            let _ = adressText.text {
            return true
        }

        let alert = UIAlertController(title: "Please Wait, Try again", message: "You Need to be wait downloading all image", preferredStyle: .alert)
        alert.addAction(UIAlertAction(title: "OK", style: .default, handler: nil))
        present(alert, animated: true, completion: nil)

        return false
    }

    override func prepare(for segue: UIStoryboardSegue, sender: Any?) {
        if segue.identifier == K.pdfSegue {
            guard let vc = segue.destination as? PDFViewController else { return }

            if let phoneM = phoneModelText.text,
               let imeiA = imeiAdressText.text,
               let nameS = userNameText.text,
               let idN = idText.text,
               let phoneN = phoneNumberText.text,
               let adressT = adressText.text,
               let detailS = detailsText.text,
               let priceC = priceText.text,
               let dateT = dateText.text,
               let imageV = imageView.image {
                let pdfCreate = PDFCreate(phoneModel: phoneM, imeiAdress: imeiA, nameSurname: nameS, id: idN, phoneNumber: phoneN, adress: adressT, details: detailS, price: priceC, date: dateT, image: imageV)
                vc.documentData = pdfCreate.createPdf()
            } // this model is used to create the inside of the PDF; the other text is passed fine
        }
    }
}

extension PhoneListViewController: UICollectionViewDelegate, UICollectionViewDataSource {

    func collectionView(_ collectionView: UICollectionView, numberOfItemsInSection section: Int) -> Int {
        return phoneModel?.imageUrl.count ?? 0
    }

    func collectionView(_ collectionView: UICollectionView, didSelectItemAt indexPath: IndexPath) {
        collectionView.deselectItem(at: indexPath, animated: true)
    }

    func collectionView(_ collectionView: UICollectionView, didDeselectItemAt indexPath: IndexPath) {
        collectionView.deselectItem(at: indexPath, animated: true)
    }

    func collectionView(_ collectionView: UICollectionView, cellForItemAt indexPath: IndexPath) -> UICollectionViewCell {
        let cell = collectionView.dequeueReusableCell(withReuseIdentifier: "PhoneDetailsCC", for: indexPath) as? PhoneDetailsCC
        let imageUrl = phoneModel?.imageUrl[indexPath.row]
        cell?.phoneDetailsImage.sd_setImage(with: URL(string: imageUrl!))
        return cell!
    }
}
```

How can I pass the array of images loaded with SDWebImage in the collection view to the other view controller? Thank you for any answers.
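For context, a minimal sketch of one way to prefetch the URL array into a [UIImage] that could then be handed to the destination controller in prepare(for:). This is a suggestion, not the poster's code, and it assumes SDWebImage 5.x's SDWebImageManager API:

```swift
import SDWebImage
import UIKit

// Sketch: fetch every URL into memory (SDWebImage serves cached copies
// for images the collection view already downloaded), then report back
// once all requests have finished.
func loadImages(from urlStrings: [String], completion: @escaping ([UIImage]) -> Void) {
    var images: [UIImage] = []
    let group = DispatchGroup()
    for urlString in urlStrings {
        guard let url = URL(string: urlString) else { continue }
        group.enter()
        SDWebImageManager.shared.loadImage(with: url, options: [], progress: nil) { image, _, _, _, _, _ in
            if let image = image { images.append(image) }
            group.leave()
        }
    }
    group.notify(queue: .main) { completion(images) }
}
```

The resulting [UIImage] could be assigned to a property on the destination view controller before the segue is performed.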
Post not yet marked as solved
0 Replies
265 Views
Hi. I'd like to be able to do a flood fill on images, either UIImage or CGImage, and was wondering if there is a built-in way to do this in Apple's standard frameworks? That is, take a bitmap image, specify a point and a color, and have it fill the surrounding area with that color, no matter what shape it is. I've seen a few examples of algorithm code to do this, but they're quite large and complicated, so I'm trying to avoid them.
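As far as I know there is no built-in flood fill in UIKit or Core Image, but a queue-based fill is smaller than the examples make it look. A sketch on a plain 2D array (adapting it to a real bitmap means reading the CGImage's pixel buffer, which is omitted here):

```swift
// Sketch: iterative 4-way flood fill on a 2D grid of color values.
// For a real image, each grid cell would be one pixel read from a
// CGContext-backed buffer.
func floodFill(_ grid: inout [[Int]], startX: Int, startY: Int, newColor: Int) {
    let height = grid.count
    guard height > 0 else { return }
    let width = grid[0].count
    let target = grid[startY][startX]
    guard target != newColor else { return }

    var queue = [(startX, startY)]
    while let (x, y) = queue.popLast() {
        guard x >= 0, x < width, y >= 0, y < height, grid[y][x] == target else { continue }
        grid[y][x] = newColor
        queue.append((x + 1, y)); queue.append((x - 1, y))
        queue.append((x, y + 1)); queue.append((x, y - 1))
    }
}
```

An explicit queue avoids the stack-overflow risk of the naive recursive version on large images.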
Post not yet marked as solved
0 Replies
256 Views
Hi, if I generate an image from a symbol, the resulting image is surrounded by some margins. For example this code: let image = UIImage(systemName: "suit.heart") will generate this image: As you can see, there are margins around the content. Is there a way to build an image cropped to the actual content, as shown in the example below? Thank you
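One possible approach, sketched here as an assumption rather than a known framework facility: render the image into a bitmap, find the bounding box of non-transparent pixels, and crop to it.

```swift
import UIKit

// Sketch: compute the opaque bounding box of an image's alpha channel
// and crop to it, trimming transparent margins around symbol content.
func croppedToOpaqueContent(_ image: UIImage) -> UIImage? {
    guard let cgImage = image.cgImage else { return nil }
    let width = cgImage.width, height = cgImage.height
    var pixels = [UInt8](repeating: 0, count: width * height * 4)
    guard let context = CGContext(data: &pixels, width: width, height: height,
                                  bitsPerComponent: 8, bytesPerRow: width * 4,
                                  space: CGColorSpaceCreateDeviceRGB(),
                                  bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue) else { return nil }
    context.draw(cgImage, in: CGRect(x: 0, y: 0, width: width, height: height))

    // Scan the alpha byte (index 3 of each RGBA pixel) for content bounds.
    var minX = width, minY = height, maxX = -1, maxY = -1
    for y in 0..<height {
        for x in 0..<width where pixels[(y * width + x) * 4 + 3] > 0 {
            minX = min(minX, x); maxX = max(maxX, x)
            minY = min(minY, y); maxY = max(maxY, y)
        }
    }
    guard maxX >= minX, maxY >= minY,
          let cropped = cgImage.cropping(to: CGRect(x: minX, y: minY,
                                                    width: maxX - minX + 1,
                                                    height: maxY - minY + 1)) else { return nil }
    return UIImage(cgImage: cropped, scale: image.scale, orientation: image.imageOrientation)
}
```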
Post not yet marked as solved
0 Replies
305 Views
I have been unable to capture Live Photos using UIImagePickerController. I can capture still photos and even video (which is not my scenario, but I checked just to make sure), but the camera does not capture Live Photos. The documentation suggests it should (source):

"To obtain the motion and sound content of a live photo for display (using the PHLivePhotoView class), include the kUTTypeImage and kUTTypeLivePhoto identifiers in the allowed media types when configuring an image picker controller. When the user picks or captures a Live Photo, the editingInfo dictionary contains the livePhoto key, with a PHLivePhoto representation of the photo as the corresponding value."

I've set up my controller:

```swift
let camera = UIImagePickerController()
camera.sourceType = .camera
camera.mediaTypes = [UTType.image.identifier, UTType.livePhoto.identifier]
camera.delegate = context.coordinator
```

In the delegate I check for the Live Photo:

```swift
func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey : Any]) {
    if let live = info[.livePhoto] as? PHLivePhoto {
        // handle live photo
    } else if let takenImage = info[.originalImage] as? UIImage,
              let metadata = info[.mediaMetadata] as? [AnyHashable: Any] {
        // handle still photo
    }
}
```

But I never get the Live Photo. I've tried adding NSMicrophoneUsageDescription to the Info.plist, thinking it needs microphone permission, but that did not help. Of course, I've added NSCameraUsageDescription to grant camera permission. Has anyone successfully captured Live Photos using UIImagePickerController?
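For comparison, a sketch of the lower-level AVFoundation route sometimes used when the picker falls short; it assumes an already-configured AVCaptureSession with the photo output attached, which is not shown here:

```swift
import AVFoundation

// Sketch: enable Live Photo capture on AVCapturePhotoOutput and build
// settings that include a movie file URL for the motion content.
func configureLivePhotoCapture(output: AVCapturePhotoOutput) -> AVCapturePhotoSettings? {
    guard output.isLivePhotoCaptureSupported else { return nil }
    output.isLivePhotoCaptureEnabled = true
    let settings = AVCapturePhotoSettings()
    settings.livePhotoMovieFileURL = FileManager.default.temporaryDirectory
        .appendingPathComponent(UUID().uuidString)
        .appendingPathExtension("mov")
    return settings
}
```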
Post marked as solved
1 Reply
249 Views
Is it possible for apps to pull all photos from your gallery and upload them to their server if you grant the app full access to your gallery? I find it bizarre that this would be possible, even though Apple requires apps to request permission to access your photos. I looked through UIImagePickerController, and it seems that users must select a photo or video to grant access through its delegate. However, on forums like Quora, users say apps can upload your entire library if you give them full access. If this is true, can someone outline how it's possible?
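For what it's worth, with full ("Read and Write") library access an app can enumerate every asset and request its data through PhotoKit, which is what makes bulk upload technically possible. A sketch, with the upload step left as a hypothetical placeholder:

```swift
import Photos

// Sketch: under full (.authorized) access, every image asset in the
// library is enumerable and its bytes can be requested.
func enumerateAllPhotos() {
    let assets = PHAsset.fetchAssets(with: .image, options: PHFetchOptions())
    assets.enumerateObjects { asset, _, _ in
        PHImageManager.default().requestImageDataAndOrientation(for: asset, options: nil) { data, _, _, _ in
            // `data` holds the full-resolution image bytes; an app could
            // send them to its server here (hypothetical upload step).
            _ = data
        }
    }
}
```

This is exactly why iOS 14 added the "Selected Photos" (limited) access level: it confines an app to the assets the user explicitly picked.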
Post marked as solved
2 Replies
369 Views
For example, when Apple engineers design something new like PHPickerViewController, I imagine they test all the functionality, and to test all these functions they create applications. Where can I download the code of those applications? I mean code from Apple engineers that is already tested and works. I'm sure they have tons of tested code, which would be very useful for us. Thanks!
Post not yet marked as solved
3 Replies
1.2k Views
I have been trying to load an image from the photo library in a SwiftUI app. I am running Xcode 13.2.1 and building for iOS 15.2. My code is below:

```swift
@Binding var image: UIImage?

func makeUIViewController(context: Context) -> PHPickerViewController {
    var config = PHPickerConfiguration()
    config.filter = .images
    config.selectionLimit = 1
    config.preferredAssetRepresentationMode = .compatible
    let controller = PHPickerViewController(configuration: config)
    controller.delegate = context.coordinator
    return controller
}

func updateUIViewController(_ uiViewController: PHPickerViewController, context: Context) { }

func makeCoordinator() -> Coordinator {
    Coordinator(self)
}

// Use a Coordinator to act as your PHPickerViewControllerDelegate
class Coordinator: NSObject, PHPickerViewControllerDelegate {
    private let parent: PhotoPicker

    init(_ parent: PhotoPicker) {
        self.parent = parent
    }

    func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
        picker.dismiss(animated: true)
        print(results)
        guard !results.isEmpty else {
            return
        }
        guard let itemProvider = results.first?.itemProvider else { return }
        print("Invoking getPhoto")
        self.getPhoto2(from: itemProvider)
        //parent.didFinishPicking(!results.isEmpty)
    }

    private func getPhoto2(from itemProvider: NSItemProvider) {
        print("getPhoto")
        if itemProvider.canLoadObject(ofClass: UIImage.self) {
            itemProvider.loadObject(ofClass: UIImage.self) { image, error in
                self.parent.image = image as? UIImage
                print("Loaded Image \(error)")
            }
        }
    }
}
```

On the console I see the following error:

2022-01-27 00:40:01.485085-0500 Vescense[3174:884964] [Picker] Showing picker unavailable UI (reason: still loading) with error: (null)

Further, when I print the result I see:

[PhotosUI.PHPickerResult(itemProvider: <PUPhotosFileProviderItemProvider: 0x2818fc980> {types = ( "public.jpeg", "public.heic" )}, assetIdentifier: nil)]

It doesn't appear that the error from loadObject has a value, and it's suspicious that assetIdentifier is nil. Any thoughts on what I might be missing here would be most helpful.
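One detail worth noting about the nil assetIdentifier: PHPickerResult.assetIdentifier is only populated when the configuration is created with a photo library. A sketch of that tweak (a suggestion, not the poster's code):

```swift
import PhotosUI

// Sketch: initializing PHPickerConfiguration with the shared photo
// library makes PHPickerResult.assetIdentifier non-nil for picked items.
func makeConfiguration() -> PHPickerConfiguration {
    var config = PHPickerConfiguration(photoLibrary: PHPhotoLibrary.shared())
    config.filter = .images
    config.selectionLimit = 1
    return config
}
```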
Post not yet marked as solved
0 Replies
432 Views
G'day, I'm working on a share extension for our iOS app where users can select multiple photos or videos and share them to conversations inside the app, similar to what other chat apps do. I ran into a problem where the file URL extracted through the input item's attachments does not exist for some of the files. For example, the share extension gives a file URL such as:

file:///var/folders/4r/qlw_jjvj3w7gh5mssgzhx15w0000gp/T/com.apple.Photos/ShareKit-Exports/2918015E-B07C-4F35-9D98-86B58464DE88/B699E8DE-4965-4C0F-9D9B-4956E5E02730/IMG_1234.jpg

But there is no such file at that location. Instead, for certain items in the user's photo library, there is a file named IMG_1234.png (or with a different extension). How does this happen? Am I supposed to list the files in the parent directory and locate the file that actually exists, ignoring the returned file name, or is there some other way? When the same file is shared to other apps such as AirDrop, Mail, or Notes, they seem to have no problem opening it, and the saved file (e.g. from AirDrop) has an identical file name and properties (IMG_1234.png) at the final destination.

Below is a code snippet showing how I process the extension context in my share extension:

```swift
// I'm getting the first input item here. I've checked, and there is no
// more than one input item for the affected assets causing the problem.
guard let itemProviders = (extensionContext?.inputItems.first as? NSExtensionItem)?.attachments else { return }

var items: [Attachment] = [] // Attachment is a custom type that contains a file URL or image data

itemProviders.forEach { itemProvider in
    if itemProvider.hasItemConformingToTypeIdentifier(kUTTypeImage as String) {
        let desiredTypeIdentifier: String = kUTTypeImage as String
        itemProvider.loadItem(forTypeIdentifier: desiredTypeIdentifier, options: nil) { data, _ in
            if let url = data as? URL {
                // The URL returned above is incorrect for the affected
                // items. It's reproducible every time.
                items.append(.init(data: .file(url)))
            } else if let image = data as? UIImage {
                items.append(.init(data: .memory(image)))
            }
        }
    }
}
```

Regards,
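A possible alternative worth sketching (a suggestion, not from the original post): NSItemProvider's loadFileRepresentation(forTypeIdentifier:) asks the provider to materialize the file and hands back a URL that exists for the duration of the completion handler, so the file must be copied out before the handler returns:

```swift
import Foundation
import UniformTypeIdentifiers

// Sketch: materialize the shared file, then copy it to a stable location,
// because the provided URL is only valid inside the completion handler.
func loadImageFile(from itemProvider: NSItemProvider, completion: @escaping (URL?) -> Void) {
    itemProvider.loadFileRepresentation(forTypeIdentifier: UTType.image.identifier) { url, _ in
        guard let url = url else { completion(nil); return }
        let destination = FileManager.default.temporaryDirectory
            .appendingPathComponent(url.lastPathComponent)
        try? FileManager.default.removeItem(at: destination)
        do {
            try FileManager.default.copyItem(at: url, to: destination)
            completion(destination)
        } catch {
            completion(nil)
        }
    }
}
```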
Post marked as solved
1 Reply
422 Views
Inside func picker(_:didFinishPicking:) I am trying to obtain a UIImage from a Live Photo. I have parsed the results and separated regular still photos from Live Photos. For the still photos I already have the UIImage, but for the Live Photos I do not understand how to get from a PHLivePhoto to the PHAsset.

```swift
if let livePhoto = object as? PHLivePhoto {
    DispatchQueue.main.async {
        // What code do I insert here to get from a PHLivePhoto to a UIImage?
        // I need to extract a UIImage from a PHLivePhoto.
    }
}
```

Thanks for the help in advance!
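One possible route, sketched as a suggestion: skip the PHLivePhoto entirely and use the picker result's assetIdentifier (populated when the configuration is created with a photo library) to fetch the PHAsset, then ask PHImageManager for a still image:

```swift
import Photos
import UIKit

// Sketch: resolve a picked asset identifier to a PHAsset, then request
// a still UIImage representation of the Live Photo from PHImageManager.
func stillImage(forAssetIdentifier identifier: String, completion: @escaping (UIImage?) -> Void) {
    guard let asset = PHAsset.fetchAssets(withLocalIdentifiers: [identifier], options: nil).firstObject else {
        completion(nil)
        return
    }
    let options = PHImageRequestOptions()
    options.deliveryMode = .highQualityFormat
    PHImageManager.default().requestImage(for: asset,
                                          targetSize: PHImageManagerMaximumSize,
                                          contentMode: .aspectFit,
                                          options: options) { image, _ in
        completion(image)
    }
}
```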
Post not yet marked as solved
0 Replies
384 Views
I am using the default HelloPhotogrammetry app you made: https://developer.apple.com/documentation/realitykit/creating_a_photogrammetry_command-line_app/

My system originally did not meet the specs to run this command-line tool because of a GPU limitation. To solve this, I bought the Apple-supported Blackmagic eGPU. Here is the error I get when I run it despite the eGPU: apply_selection_policy_once: prefer use of removable GPUs (via (null):GPUSelectionPolicy->preferRemovable)

I have deduced that the application running it needs this key: https://developer.apple.com/documentation/bundleresources/information_property_list/gpuselectionpolicy I tried modifying the Terminal.plist with the updated value, but had no luck with it. I believe the command-line tool within Xcode needs the updated value; I need help on that aspect, to allow the system to use the eGPU. I did create a property list within the macOS app and added GPUSelectionPolicy with preferRemovable, and I am still getting the same error as above. Please advise.

Also, to note: I temporarily turned off Prefer External GPU for Terminal, and it did process the photogrammetry, but it was taking a while (more than 30 minutes), so I ended up killing the task. Activity Monitor showed that my internal GPU was being used, not the eGPU I am trying to use. Previously, when the eGPU was not plugged in, I got an error saying my specs did not meet the criteria, so it was interesting to see that with the eGPU it assumed my Mac met the criteria (which it technically did); it just did the processing on the less powerful GPU.
Post not yet marked as solved
0 Replies
329 Views
I haven't been able to generate a HEIC photo with IPTC metadata; with JPEG I get it, but not with HEIC. I'm using the CIContext method heifRepresentation(of: image, format: CIFormat.RGBA8, colorSpace: colorSpace, options: options) to generate the photo data. image is a CIImage and has the IPTC metadata, but the final photo doesn't have it. If I use the CIContext method jpegRepresentation(of: image, colorSpace: colorSpace, options: options), the final JPEG photo has the IPTC information. Is anyone seeing the same issue, or does anyone have an idea about what's going on?
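A hedged workaround sketch (not from the post): write the HEIC through ImageIO instead, attaching the IPTC dictionary explicitly via CGImageDestination frame properties:

```swift
import CoreImage
import ImageIO
import UniformTypeIdentifiers

// Sketch: render the CIImage to a CGImage, then let ImageIO write HEIC
// with an explicit IPTC dictionary attached to the frame.
func writeHEIC(_ image: CIImage, to url: URL, iptc: [CFString: Any]) -> Bool {
    let context = CIContext()
    guard let cgImage = context.createCGImage(image, from: image.extent),
          let destination = CGImageDestinationCreateWithURL(url as CFURL,
                                                            UTType.heic.identifier as CFString,
                                                            1, nil) else { return false }
    let properties: [CFString: Any] = [kCGImagePropertyIPTCDictionary: iptc]
    CGImageDestinationAddImage(destination, cgImage, properties as CFDictionary)
    return CGImageDestinationFinalize(destination)
}
```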
Post not yet marked as solved
1 Reply
308 Views
There seems to be a difference in the way iPadOS and macOS handle raw image files (tested with a Fujifilm X-T3 uncompressed RAF file, which is on the compatibility list). Running the following code (with url set to the file location of the RAF file) on the iPad displays the preview JPEG embedded in the RAF file, whereas on the Mac the raw data are converted:

```swift
AsyncImage(url: item.url) { phase in
    if let image = phase.image {
        image.resizable().aspectRatio(contentMode: .fit)
    } else if phase.error != nil {
        Color.red
    } else {
        Color.blue
    }
}
```

Am I missing something, or is this the expected behaviour, and is it documented somewhere?
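For reference, a sketch of forcing a true raw decode on both platforms via Core Image's raw support (CIRAWFilter, available from iOS 15 / macOS 12), rather than relying on AsyncImage's default decoding; whether this matches the Mac's conversion path is an assumption:

```swift
import CoreImage

// Sketch: decode the RAF through CIRAWFilter so the image is rendered
// from the raw sensor data instead of the embedded preview JPEG.
func decodedRawImage(at url: URL) -> CIImage? {
    guard let rawFilter = CIRAWFilter(imageURL: url) else { return nil }
    return rawFilter.outputImage
}
```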
Post not yet marked as solved
2 Replies
493 Views
Hey all, I'm new to the world of Xcode and building apps. I am wondering whether it's possible to have an app run in the background and listen to the photo library, so that when it finds a new photo has been added it automatically uploads it via some kind of REST API call, without the user having to do it manually. This will only be used in my household, so I'm not looking to get it into the App Store or go through Apple's approval process. So before I spend too much time looking around: is this possible with iOS 14+? The only thing I have come across that remotely sounds like it would work is PHPhotoLibraryChangeObserver, but I'm not sure whether the user has to interact with it for it to be used. I'm open to any suggestions you more experienced programmers have about the above.
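On the observer question specifically, a minimal sketch of registering for library change notifications; note that it only fires while the app is running, so truly unattended background upload is a separate problem:

```swift
import Photos

// Sketch: observe photo library changes while the app is alive.
// PHPhotoLibraryChangeObserver needs no user interaction beyond the
// initial photo-library permission grant, but it does not run when the
// app is terminated.
final class PhotoLibraryWatcher: NSObject, PHPhotoLibraryChangeObserver {
    private var lastFetch: PHFetchResult<PHAsset>

    override init() {
        lastFetch = PHAsset.fetchAssets(with: .image, options: nil)
        super.init()
        PHPhotoLibrary.shared().register(self)
    }

    func photoLibraryDidChange(_ changeInstance: PHChange) {
        guard let changes = changeInstance.changeDetails(for: lastFetch) else { return }
        lastFetch = changes.fetchResultAfterChanges
        for asset in changes.insertedObjects {
            // A new photo appeared; an upload to a REST API could be
            // kicked off here (hypothetical step).
            print("New asset: \(asset.localIdentifier)")
        }
    }

    deinit {
        PHPhotoLibrary.shared().unregisterChangeObserver(self)
    }
}
```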
Post not yet marked as solved
0 Replies
240 Views
I compared several options for getting auxiliary images from a CIImage. These options leak AVSemanticSegmentationMatte when inspected with the debug memory graph:

```swift
CIImage(data: data, options: [.auxiliarySemanticSegmentationSkinMatte: true])
CIImage(data: data, options: [.auxiliarySemanticSegmentationHairMatte: true])
CIImage(data: data, options: [.auxiliarySemanticSegmentationTeethMatte: true])
```

The other options, .auxiliaryDisparity and .auxiliaryPortraitEffectsMatte, do not leak AVDepthData or AVPortraitEffectsMatte.