
Reply to CoreML inference on iOS HW uses only CPU on CoreMLTools imported Pytorch model
This particular issue was caused by the upsampling path of the model containing torch.nn.ConvTranspose2d operations with stride (16, 1) and kernel (16, 1) - for some reason the ANE compiler rejected them. It was resolved by replacing the single 16x upsampling with two consecutive 4x operations. Furthermore, note that the ANE is 16-bit only: coremltools may compile a 32-bit model without complaint, but when the compiled model is loaded at runtime on iOS/macOS it will not be assigned to the ANE.
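The decomposition can be sanity-checked with the transposed-convolution output-size formula, out = (in - 1) * stride + kernel (with no padding): two stride-4/kernel-4 stages reproduce the same 16x upsampling factor as one stride-16/kernel-16 layer. A minimal sketch in plain Python (the sizes mirror the torch.nn.ConvTranspose2d parameters from the post; the feature-map height of 8 is an arbitrary example, and no actual weights are involved):

```python
def convtranspose_out(size: int, kernel: int, stride: int, padding: int = 0) -> int:
    # Output length of a 1-D transposed convolution along one axis;
    # torch.nn.ConvTranspose2d uses the same formula per spatial
    # dimension (assuming output_padding = 0 and dilation = 1).
    return (size - 1) * stride + kernel - 2 * padding

h = 8  # example feature-map height along the upsampled axis

# Single ConvTranspose2d with stride (16, 1), kernel (16, 1):
# this is the shape the ANE compiler rejected.
one_stage = convtranspose_out(h, kernel=16, stride=16)

# Two consecutive stride-4 / kernel-4 stages: same overall 16x factor.
two_stage = convtranspose_out(
    convtranspose_out(h, kernel=4, stride=4), kernel=4, stride=4
)

print(one_stage, two_stage)  # both 128 == 8 * 16
```

Note the two-stage version is not numerically identical to the single layer (it has its own learnable weights and a nonlinearity can sit between the stages); it simply achieves the same upsampling factor with kernel/stride shapes the ANE compiler accepts.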
1w
Reply to Is saving portrait effects matte with image data to PHPhotoLibrary possible?
This was finally resolved. Here are some notes in case someone makes the same mistakes when, e.g., experimenting with the AVCam sample code.

It turned out the portrait matte was indeed written to PHPhotoLibrary together with the image, but reading it back with PHPickerViewController needed some attention. When accessing the image data with NSItemProvider.loadDataRepresentation(forTypeIdentifier:), the type identifier has to match the format the file was actually saved in. For example, my photo container was of type .heic. Even though NSItemProvider.hasRepresentationConforming(toTypeIdentifier:) reported both UTType.heic and UTType.jpeg as supported, only the first loaded the matte correctly. Requesting the data with the .jpeg identifier did not load the portrait matte; instead, a strange error kept appearing in the Xcode console: "Could not create a bookmark: NSError: Cocoa 257 "The file couldn't be opened because you don't have permission to view it"".

The second mistake was capturing and storing Live Photo data together with the portrait matte. It seems possible, but at least the iOS Photos app can't handle those photos properly in edit mode, and saving the edit always fails with the error "Photo could not be saved, please try again later." So don't enable the Live Photo option in the capture settings when capturing a portrait matte.
Aug ’21