Present the document picker view:

let picker = DocumentPickerViewController(
    supportedTypes: ["public.image", "png", "jpg", "jpeg", "tiff", "tif", "bmp", "bfm", "public.png", "public.jpeg", "customDoc"],
    onPick: { url in
        print("url: \(url)")
        if url.pathExtension == "customDoc" {
            self.importProject(fileURL: url)
        } else {
            if FileManager.default.fileExists(atPath: url.path) {
                if let image = UIImage(contentsOfFile: url.path) {
                    self.importImage(image)
                } else {
                    print("ERROR loading image \(url.path)")
                }
            }
            self.dismiss(animated: false)
        }
    },
    onDismiss: {
        print("dismiss")
    })
present(picker, animated: false)
I haven't gotten drag and drop to work yet.
Moderators: This was supposed to be posted in the Universal App Quickstart forum. How do I edit or remove a post?
Just to follow up, I got this to work. I'm able to save a Portrait Mode photo on macOS. I found an unrelated bug in my code that was causing my error in CGImageDestinationFinalize. Here is what I found out: setting the image color space to CGColorSpace.sRGB appears to be incompatible with a Portrait Mode photo with embedded depth data. I'm looking into whether this is a framework bug or an implementation bug on my end.
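For context, here is a minimal sketch of how an image with its auxiliary depth/disparity data might be written via ImageIO, leaving the image's original color space alone rather than forcing sRGB. This is an assumption-laden sketch, not the poster's actual code: `cgImage`, `depthData` (an AVDepthData), and `outputURL` are all hypothetical names assumed to exist.

```swift
import AVFoundation
import ImageIO

// Sketch: write a JPEG plus its auxiliary depth/disparity data.
// `cgImage`, `depthData`, and `outputURL` are assumed to exist.
func writeImageWithDepth(cgImage: CGImage, depthData: AVDepthData, to outputURL: URL) -> Bool {
    guard let destination = CGImageDestinationCreateWithURL(
        outputURL as CFURL, "public.jpeg" as CFString, 1, nil) else { return false }

    // Add the image without overriding its color space.
    CGImageDestinationAddImage(destination, cgImage, nil)

    // Ask AVDepthData for the dictionary ImageIO expects, then attach it
    // under the auxiliary data type it reports (depth or disparity).
    var auxType: NSString?
    if let auxInfo = depthData.dictionaryRepresentation(forAuxiliaryDataType: &auxType),
       let auxType = auxType {
        CGImageDestinationAddAuxiliaryDataInfo(destination, auxType as CFString, auxInfo as CFDictionary)
    }
    return CGImageDestinationFinalize(destination)
}
```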
I've discovered this as well and it appears to be a bug. I submitted a radar for this (45061334).
I think the way to save both files at once is to create a PHAdjustmentData object with your blurred settings, then save the original photo with PHContentEditingOutput. This creates the second, edited photo (with an E in the filename). With Image Capture, you can see the two files.

[[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
    // Make a change request for adding an asset.
    PHAssetChangeRequest *changeRequest =
        [PHAssetChangeRequest creationRequestForAssetFromImageAtFileURL:originalJPEGFileURL];
    // Make a content editing output for use with the change request.
    PHObjectPlaceholder *placeholder = changeRequest.placeholderForCreatedAsset;
    PHContentEditingOutput *contentEditingOutput =
        [[PHContentEditingOutput alloc] initWithPlaceholderForCreatedAsset:placeholder];
    // Apply content adjustments to the newly created asset.
    contentEditingOutput.adjustmentData = adjustmentData;
    [adjustedJPEGData writeToURL:contentEditingOutput.renderedContentURL atomically:YES];
    changeRequest.contentEditingOutput = contentEditingOutput;
} completionHandler:^(BOOL success, NSError *error) {
    if (!success) NSLog(@"Can't create asset: %@", error);
}];

What I still need to figure out is how to get a saved photo to show up in the PHAssetCollectionTypeSmartAlbum. I don't know how to programmatically force it to appear there. If I save a photo, edit it in Photos, and turn on Portrait, then it shows up in the album. I found a few keys in kCGImageAuxiliaryDataInfoMetadata, but setting them didn't work:

xmlns:depthBlurEffect="http://ns.apple.com/depthBlurEffect/1.0/"><depthBlurEffect:SimulatedAperture>4.5</depthBlurEffect:SimulatedAperture><depthBlurEffect:RenderingParameters>...
Any luck with this? The docs state: "The CGColorSpace must be kCGColorSpaceModelRGB." I tried this with no luck so far.
I took a look at the sample dog photo and verified that it does have a disparity map in its depth data. This is the default for anything taken with the built-in Portrait mode camera. Once you get the AVDepthData, you can get the kCGImageAuxiliaryDataTypeDisparity data. Hope this helps!

James
How are you reading the depth data? The iPhone 7 Plus saves disparity data by default (closer items are brighter). Are you converting it to the depth data type when loading?
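The loading-and-converting step could look roughly like the sketch below, assuming `imageURL` (a hypothetical name) points at a Portrait Mode photo whose auxiliary data is stored as disparity, as it is on the iPhone 7 Plus:

```swift
import AVFoundation
import ImageIO

// Sketch: load the auxiliary disparity map and convert it to depth.
// `imageURL` is assumed to point at a Portrait Mode photo.
func loadDepthData(from imageURL: URL) -> AVDepthData? {
    guard let source = CGImageSourceCreateWithURL(imageURL as CFURL, nil),
          let auxInfo = CGImageSourceCopyAuxiliaryDataInfoAtIndex(
              source, 0, kCGImageAuxiliaryDataTypeDisparity) as? [AnyHashable: Any],
          let disparityData = try? AVDepthData(fromDictionaryRepresentation: auxInfo)
    else { return nil }

    // Convert disparity (closer items are brighter) to 32-bit depth.
    return disparityData.converting(toDepthDataType: kCVPixelFormatType_DepthFloat32)
}
```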