Core Image


Use built-in or custom filters to process still and video images using Core Image.

Core Image Documentation

Posts under Core Image tag

50 Posts
Post not yet marked as solved
2 Replies
1.4k Views
Hello. I have a problem with the built-in QR code detection of a vCard 4.0 in iOS. As described in https://tools.ietf.org/html/rfc6350, vCard version 4.0 is always encoded in UTF-8. But the built-in QR code reader in iOS doesn't seem to decode the special characters correctly. Or am I missing something? Here is an example. Encode it with any QR code generator on the internet and point your iOS Camera app at it. You will see that none of the special characters are displayed correctly.

BEGIN:VCARD
VERSION:4.0
N:T€st;Björn
ORG:ÖÜÄ
TEL;CELL:+12 (34) 567890
ADR:;;Blà St.;Blè Town;;12345;ç-Land
URL:https://www.test123.com
EMAIL;WORK;INTERNET:test@test123.com
END:VCARD
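A possible mitigation for apps that scan such codes themselves (the built-in Camera app can't be changed): if a scanner decodes UTF-8 bytes as Latin-1, mojibake appears, and round-tripping the string through Latin-1 bytes and re-decoding as UTF-8 often restores it. A minimal sketch, assuming the misread follows that pattern:

import Foundation

/// Repairs strings whose UTF-8 bytes were mistakenly decoded as ISO Latin-1
/// (e.g. "Björn" scanned as "BjÃ¶rn"). Returns the input unchanged if the
/// round trip fails, i.e. the string was probably decoded correctly already.
func fixMisdecodedUTF8(_ scanned: String) -> String {
    guard let latin1Bytes = scanned.data(using: .isoLatin1),
          let repaired = String(data: latin1Bytes, encoding: .utf8)
    else { return scanned }
    return repaired
}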
Posted by bbbln. Last updated.
Post not yet marked as solved
0 Replies
266 Views
Is there a simple way in newer iOS versions to make all black pixels in an image transparent? I see a whole bunch of super complicated stuff online about how to do this for colors other than black, involving color cube mapping and the like, but it seems like it should be a one-line filter operation. Any ideas from the Apple experts?
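Not quite one line, but a minimal sketch of the color cube approach, assuming a simple per-channel threshold decides what counts as black (the cube size and threshold here are illustrative choices):

import CoreImage

// Builds a CIColorCube that zeroes the alpha of near-black pixels.
// CIColorCube expects premultiplied color values, hence rf * alpha etc.
func makeBlackTransparent(_ input: CIImage, threshold: Float = 0.1) -> CIImage? {
    let size = 16
    var cube = [Float]()
    cube.reserveCapacity(size * size * size * 4)
    for b in 0..<size {
        for g in 0..<size {
            for r in 0..<size {
                let rf = Float(r) / Float(size - 1)
                let gf = Float(g) / Float(size - 1)
                let bf = Float(b) / Float(size - 1)
                // Fully transparent when all channels fall below the threshold.
                let alpha: Float = (rf < threshold && gf < threshold && bf < threshold) ? 0 : 1
                cube.append(contentsOf: [rf * alpha, gf * alpha, bf * alpha, alpha])
            }
        }
    }
    let filter = CIFilter(name: "CIColorCube", parameters: [
        kCIInputImageKey: input,
        "inputCubeDimension": size,
        "inputCubeData": Data(bytes: cube, count: cube.count * MemoryLayout<Float>.size)
    ])
    return filter?.outputImage
}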
Posted by diffent. Last updated.
Post not yet marked as solved
1 Reply
580 Views
Hello, I would like to use the CIColorMap filter. I am using it, but the results are not the ones I would expect. It takes two parameters: inputImage, which is quite clear, and inputGradientImage, which I find a bit confusing. I could find some examples, but I would like to know the algorithm behind how this filter works. Can I change two or three colors at once? I don't exactly understand why I need a gradient image if I just want to change one color to another, or where a color in this gradient image begins and where it ends. Regards, Peter
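For reference, CIColorMap is documented as a nonlinear transformation of source color values using a table: roughly speaking, each source pixel's value selects a position along the gradient image, and the pixel is replaced by the gradient's color at that position. Pixel values, not specific hues, drive the lookup, which is why it can't remap one arbitrary color to another. A minimal sketch, assuming a 256×1 gradient and illustrative endpoint colors:

import CoreImage

func applyColorMap(to input: CIImage) -> CIImage? {
    // Build a 256x1 gradient acting as a 1D lookup table:
    // dark input values map toward dark blue, bright values toward yellow.
    let gradientFilter = CIFilter(name: "CISmoothLinearGradient", parameters: [
        "inputPoint0": CIVector(x: 0, y: 0),
        "inputPoint1": CIVector(x: 255, y: 0),
        "inputColor0": CIColor(red: 0, green: 0, blue: 0.4),
        "inputColor1": CIColor(red: 1, green: 0.9, blue: 0)
    ])
    guard let gradient = gradientFilter?.outputImage?
        .cropped(to: CGRect(x: 0, y: 0, width: 256, height: 1)) else { return nil }

    let colorMap = CIFilter(name: "CIColorMap", parameters: [
        kCIInputImageKey: input,
        "inputGradientImage": gradient
    ])
    return colorMap?.outputImage
}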
Posted by pbuczma. Last updated.
Post marked as solved
1 Reply
272 Views
I am beginning to work with Core Image and applying some filters. Most that I have encountered are fairly easy to understand, but a few are confusing me. As I need a mono image, I came across two: CIMaximumComponent and the corresponding CIMinimumComponent. There are others too, but the description of these two is pretty sparse. Can someone expand just a little so I can understand the difference? It mentions max(r,g,b) and min of the same; what is the source of r, g, b, and how is the result applied?
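For what it's worth: r, g, b are the red, green, and blue values of each individual pixel, and the result is written back to all three channels of that pixel, which is what makes the output grayscale. A tiny sketch on a single solid-color image:

import CoreImage

let input = CIImage(color: CIColor(red: 0.2, green: 0.8, blue: 0.4))
    .cropped(to: CGRect(x: 0, y: 0, width: 1, height: 1))

// CIMaximumComponent: every pixel becomes max(r,g,b) in all channels,
// so this pixel ends up (0.8, 0.8, 0.8) — the brightest channel wins.
let maxGray = CIFilter(name: "CIMaximumComponent",
                       parameters: [kCIInputImageKey: input])?.outputImage

// CIMinimumComponent: every pixel becomes min(r,g,b) in all channels,
// so this pixel ends up (0.2, 0.2, 0.2) — the darkest channel wins.
let minGray = CIFilter(name: "CIMinimumComponent",
                       parameters: [kCIInputImageKey: input])?.outputImage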
Posted. Last updated.
Post marked as solved
1 Reply
313 Views
import SwiftUI

struct CIFilterDemoView: View {
  let monalisa: UIImage = UIImage(named: "monalisa")!

  @State var filtedImage: CGImage?

  func testEVFilter() {
    // note: CIRAWFilter's data initializer also takes an identifierHint; nil works
    guard let data = monalisa.pngData(),
          let rawFilter = CIRAWFilter(imageData: data, identifierHint: nil) else { return }
    rawFilter.setValue(3, forKey: kCIInputEVKey)
    guard let image = rawFilter.outputImage else { return }
    guard let cgImage = CIContext().createCGImage(image, from: image.extent) else { return }
    self.filtedImage = cgImage
  }

  func testMatte() {
    guard let data = monalisa.pngData(),
          let rawFilter = CIRAWFilter(imageData: data, identifierHint: nil) else { return }
    guard let hairImage = rawFilter.semanticSegmentationHairMatte else { return }
    guard let cgImage = CIContext().createCGImage(hairImage, from: hairImage.extent) else { return }
    self.filtedImage = cgImage
  }

  var body: some View {
    List {
      Image(uiImage: monalisa)

      HStack {
        Button {
          testEVFilter()
        } label: {
          Text("Test EV")
        }
        .buttonStyle(.borderedProminent)

        Button {
          testMatte()
        } label: {
          Text("Test Matte")
        }
        .buttonStyle(.borderedProminent)
      }

      if let filtedImage {
        Image(uiImage: UIImage(cgImage: filtedImage))
          .background(Color.red)
      }
    }
  }
}

struct CIFilterDemoView_Previews: PreviewProvider {
  static var previews: some View {
    CIFilterDemoView()
  }
}

When I run testMatte() I get this error log:

2023-01-26 19:42:45.682098+0800 DemoApp[28473:1085301] -[CIRAWFilterImpl semanticSegmentationHairMatte]: unrecognized selector sent to instance 0x7ff0a8740130
2023-01-26 19:42:45.682583+0800 DemoApp[28473:1085301] [General] -[CIRAWFilterImpl semanticSegmentationHairMatte]: unrecognized selector sent to instance 0x7ff0a8740130
2023-01-26 19:42:45.688117+0800 DemoApp[28473:1085301] [General] (
0  CoreFoundation   0x00007ff80ca8d3eb __exceptionPreprocess + 242
1  libobjc.A.dylib  0x00007ff80c5d9e25 objc_exception_throw + 48
2  CoreFoundation   0x00007ff80cb2452b -[NSObject(NSObject) __retain_OA] + 0
3  CoreFoundation   0x00007ff80c9f762b ___forwarding___ + 1324
4  CoreFoundation   0x00007ff80c9f7068 _CF_forwarding_prep_0 + 120
5  DemoApp          0x00000001016584af $s7DemoApp08CIFilterA4ViewV9testMatteyyF + 527
6  DemoApp          0x00000001016598c8 $s7DemoApp08CIFilterA4ViewV4bodyQrvg7SwiftUI05TupleD0VyAE5ImageV_AE6HStackVyAGyAE0D0PAEE11buttonStyleyQrqd__AE015PrimitiveButtonL0Rd__lFQOyAE0N0VyAE4TextVG_AE017BorderedProminentnL0VQo__AWtGGAmEE10background_20ignoresSafeAreaEdgesQrqd___AE4EdgeO3SetVtAE05ShapeL0Rd__lFQOyAI_AE5ColorVQo_SgtGyXEfU_AXyXEfU_yycfU1_ + 40
7  SwiftUI          0x00007ffa0e439df8 __swift_memcpy160_4 + 111106
8  SwiftUI          0x00007ffa0e43a799 __swift_memcpy160_4 + 113571
9  SwiftUI          0x00007ffa0e43a70b __swift_memcpy160_4 + 113429
10 SwiftUI          0x00007ffa0ec1c9d7 __swift_memcpy36_4 + 41329
11 SwiftUI          0x00007ffa0e84b8e4 objectdestroy.142Tm + 42261
12 SwiftUI          0x00007ffa0e84b8f8 objectdestroy.142Tm + 42281
13 SwiftUI          0x00007ffa0e84b8e4 objectdestroy.142Tm + 42261
14 SwiftUI          0x00007ffa0e57a829 block_destroy_helper.15 + 49718
15 SwiftUI          0x00007ffa0e57a18c block_destroy_helper.15 + 48025
16 SwiftUI          0x00007ffa0e6eb2d4 _callVisitToolbarContentType2 + 4283
17 SwiftUI          0x00007ffa0eeffd7d _callVisitStyleContextType2 + 11403
18 SwiftUI          0x00007ffa0eefe3f6 _callVisitStyleContextType2 + 4868
19 SwiftUI          0x00007ffa0eefe4da _callVisitStyleContextType2 + 5096
20 SwiftUI          0x00007ffa0eefdccd _callVisitStyleContextType2 + 3035
21 UIKitCore        0x00007ff9182ce09a -[UIGestureRecognizer _componentsEnded:withEvent:] + 162
22 UIKitCore        0x00007ff917b91620 -[UITouchesEvent _sendEventToGestureRecognizer:] + 776
23 UIKitCore        0x00007ff917b73c1e -[UIGestureEnvironment _deliverEvent:toGestureRecognizers:usingBlock:] + 247
24 UIKitCore        0x00007ff917b73655 -[UIGestureEnvironment _updateForEvent:window:] + 188
25 UIKitCore        0x00007ff917b7333f -[UIWindow sendEvent:] + 5301
26 UIKitCore        0x00007ff917b7131c -[UIApplication sendEvent:] + 984
27 UIKit            0x00007ffb2c015b78 -[UIApplicationAccessibility sendEvent:] + 85
28 UIKitCore        0x00007ff917b6f145 __dispatchPreprocessedEventFromEventQueue + 10186
29 UIKitCore        0x00007ff91883f061 __processEventQueue + 8273
30 UIKitCore        0x00007ff918837724 __eventFetcherSourceCallback + 249
31 CoreFoundation   0x00007ff80ca14b78 __CFRUNLOOP_IS_CALLING_OUT_TO_A_SOURCE0_PERFORM_FUNCTION__ + 17
32 CoreFoundation   0x00007ff80ca14b27 __CFRunLoopDoSource0 + 157
33 CoreFoundation   0x00007ff80ca14901 __CFRunLoopDoSources0 + 212
34 CoreFoundation   0x00007ff80ca1357b __CFRunLoopRun + 929
35 CoreFoundation   0x00007ff80ca12b60 CFRunLoopRunSpecific + 560
36 HIToolbox        0x00007ff816367766 RunCurrentEventLoopInMode + 292
37 HIToolbox        0x00007ff816367576 ReceiveNextEventCommon + 679
38 HIToolbox        0x00007ff8163672b3 _BlockUntilNextEventMatchingListInModeWithFilter + 70
39 AppKit           0x00007ff80fb63233 _DPSNextEvent + 909
40 AppKit           0x00007ff80fb620b4 -[NSApplication(NSEvent) _nextEventMatchingEventMask:untilDate:inMode:dequeue:] + 1219
41 AppKit           0x00007ff80fb546f7 -[NSApplication run] + 586
42 AppKit           0x00007ff80fb28727 NSApplicationMain + 817
43 AppKit           0x00007ff80fde9856 _NSApplicationMainWithInfoDictionary + 16
44 UIKitMacHelper   0x00007ff823ab35d3 UINSApplicationMain + 1413
45 UIKitCore        0x00007ff9178df688 UIApplicationMain + 144
46 SwiftUI          0x00007ffa0eeb7787 __swift_memcpy93_8 + 11978
47 SwiftUI          0x00007ffa0eeb7632 __swift_memcpy93_8 + 11637
48 SwiftUI          0x00007ffa0e52aedf __swift_memcpy195_8 + 12258
49 DemoApp          0x0000000101656cee $s7DemoApp0abB0V5$mainyyFZ + 30
50 DemoApp          0x0000000101656df9 main + 9
51 dyld             0x00007ff80c606310 start + 2432
)
Posted by Abenx. Last updated.
Post not yet marked as solved
2 Replies
511 Views
I take a picture using the iPhone's camera. The taken resolution is 3024.0 x 4032. I then have to apply a watermark to this image. After a bunch of trial and error, the method I decided to use was taking a snapshot of a watermark UIView and drawing that over the image, like so:

// Create the watermarked photo.
let result: UIImage = UIGraphicsImageRenderer(size: image.size).image(actions: { _ in
  image.draw(in: .init(origin: .zero, size: image.size))
  let watermark: Watermark = .init(
    size: image.size,
    scaleFactor: image.size.smallest / self.frame.size.smallest
  )
  watermark.drawHierarchy(in: .init(origin: .zero, size: image.size), afterScreenUpdates: true)
})

Then with the final image — because the client wanted it to have a filename as well when viewed from within the Photos app and exported from it, and also with much trial and error — I save it to a file in a temporary directory. I then save it to the user's photo library using that file. The difference compared to saving the image directly is that, when saved from the file, the filename is used as the filename within the Photos app; in the other case it's just a default photo name generated by Apple. The problem is that in the image saving code I'm getting the following error:

[Metal] 9072 by 12096 iosurface is too large for GPU

And when I view the saved photo, it's basically just a completely black image. This problem only started when I changed the AVCaptureSession preset to .photo. Before then there were no errors.

Now, the worst problem is that the app just completely crashes on drawing of the watermark view in the first place. When using .photo the resolution is significantly higher, so the image size is larger, so the watermark size has to be commensurately larger as well. iOS appears to be okay with the size of the watermark UIView. However, when I try to draw it over the image, the app crashes with a message from Xcode (shown in the screenshot). So there's that problem. But I figured that could be resolved by taking a more manual approach to drawing the watermark than using a UIView snapshot, so it's not the most pressing problem. What is, is that even after the drawing code is commented out, I still get the iosurface is too large error. Here's the code that saves the image to the file and then to the Photos library:

extension UIImage {

  /// Save us with the given name to the user's photo album.
  /// - Parameters:
  ///   - filename: The filename to be used for the saved photo. Behavior is undefined if the filename contains characters other than what is represented by this regular expression: [A-Za-z0-9-_]. A decimal point for the file extension is permitted.
  ///   - location: A GPS location to save with the photo.
  fileprivate func save(_ filename: String, _ location: Optional<Coordinates>) throws {

    // Create a path to a temporary directory. Adding filenames to the Photos app form of images is
    // accomplished by first creating an image file on the file system, saving the photo using the
    // URL to that file, and then deleting that file on the file system.
    //   A documented way of adding filenames to photos saved to Photos was never found.
    // Furthermore, we save everything to a `tmp` directory: if we just tried deleting individual
    // photos after they were saved, and the deletion failed, it would be a little more tricky
    // setting up logic to ensure that the undeleted files are eventually cleaned up. But by using
    // a `tmp` directory, we can save all temporary photos to it, and delete the entire directory
    // following each taken picture.
    guard
      let tmpUrl: URL = try {
        guard let documentsDirUrl = NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true).first else {
          throw GeneralError("Failed to create URL to documents directory.")
        }
        let url: Optional<URL> = .init(string: documentsDirUrl + "/tmp/")
        return url
      }()
    else {
      throw GeneralError("Failed to create URL to temporary directory.")
    }

    // A path to the image file.
    let filePath: String = try {

      // Reduce the likelihood of photos taken in quick succession from overwriting each other.
      let collisionResistantPath: String = "\(tmpUrl.path(percentEncoded: false))\(UUID())/"

      // Make sure all directories required by the path exist before trying to write to it.
      try FileManager.default.createDirectory(atPath: collisionResistantPath, withIntermediateDirectories: true, attributes: nil)

      // Done.
      return collisionResistantPath + filename
    }()

    // Create `CFURL` analogue of file path.
    guard let cfPath: CFURL = CFURLCreateWithFileSystemPath(nil, filePath as CFString, CFURLPathStyle.cfurlposixPathStyle, false) else {
      throw GeneralError("Failed to create `CFURL` analogue of file path.")
    }

    // Create image destination object.
    //
    // You can change your exif type here.
    //   This is a note from the original author. Not quite exactly sure what they mean by it. Link
    //   in method documentation can be used to refer back to the original context.
    guard let destination = CGImageDestinationCreateWithURL(cfPath, UTType.jpeg.identifier as CFString, 1, nil) else {
      throw GeneralError("Failed to create `CGImageDestination` from file url.")
    }

    // Metadata properties.
    let properties: CFDictionary = {

      // Place your metadata here.
      // Keep in mind that metadata follows a standard. You can not use custom property names here.
      let tiffProperties: Dictionary<String, Any> = [:]

      return [
        kCGImagePropertyExifDictionary as String: tiffProperties
      ] as CFDictionary
    }()

    // Create image file.
    guard let cgImage = self.cgImage else {
      throw GeneralError("Failed to retrieve `CGImage` analogue of `UIImage`.")
    }
    CGImageDestinationAddImage(destination, cgImage, properties)
    CGImageDestinationFinalize(destination)

    // Save to the photo library.
    PHPhotoLibrary.shared().performChanges({
      guard let creationRequest: PHAssetChangeRequest = .creationRequestForAssetFromImage(atFileURL: URL(fileURLWithPath: filePath)) else {
        return
      }
      // Add metadata to the photo.
      creationRequest.creationDate = .init()
      if let location = location {
        creationRequest.location = .init(latitude: location.latitude, longitude: location.longitude)
      }
    }, completionHandler: { _, _ in
      try? FileManager.default.removeItem(atPath: tmpUrl.absoluteString)
    })
  }
}

If anyone can provide some insight as to what's causing the iosurface is too large error and what can be done to resolve it, that'd be awesome.
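One possibly relevant detail, offered as a guess: 9072 × 12096 is exactly 3024 × 4032 multiplied by 3, and UIGraphicsImageRenderer defaults to the screen's scale factor (3× on many iPhones), so the backing surface may be three times the photo's size in each dimension. A minimal sketch that pins the renderer to 1× (the format API is standard UIKit; whether this resolves the error here is an assumption):

import UIKit

// Render at the image's own pixel size instead of image size x screen scale.
func watermarked(_ image: UIImage, drawOverlay: (CGSize) -> Void) -> UIImage {
    let format = UIGraphicsImageRendererFormat()
    format.scale = 1 // default would be the device's screen scale (often 3)
    let renderer = UIGraphicsImageRenderer(size: image.size, format: format)
    return renderer.image { _ in
        image.draw(in: CGRect(origin: .zero, size: image.size))
        drawOverlay(image.size) // draw the watermark on top, as before
    }
}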
Posted. Last updated.
Post not yet marked as solved
1 Reply
411 Views
When creating a CIAffineTransform filter on macOS, you are supposed to encode an NSAffineTransform using Objective-C code like: [NSValue valueWithBytes:&xform objCType:@encode(NSAffineTransform)]; But the Objective-C @encode directive does not exist in Swift. How do you encode an NSAffineTransform into an NSValue in Swift so that it will be acceptable to Core Image?
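One approach, as a sketch: on macOS, the filter's inputTransform can be given an NSAffineTransform object directly via setValue(_:forKey:), which sidesteps NSValue and @encode entirely, and CIImage.transformed(by:) avoids the filter object altogether. Whether either fits the original call site is for the reader to judge:

import Cocoa
import CoreImage

func rotated45(_ inputImage: CIImage) -> CIImage? {
    let transform = NSAffineTransform()
    transform.rotate(byDegrees: 45)

    // On macOS, Core Image accepts the NSAffineTransform object itself.
    let filter = CIFilter(name: "CIAffineTransform")!
    filter.setValue(inputImage, forKey: kCIInputImageKey)
    filter.setValue(transform, forKey: kCIInputTransformKey)
    return filter.outputImage
}

// Or skip the filter object entirely:
// let rotated = inputImage.transformed(by: CGAffineTransform(rotationAngle: .pi / 4))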
Posted by eascot. Last updated.
Post not yet marked as solved
4 Replies
1.4k Views
Hello there 👋 I've noticed a different behavior between iOS 15 and iOS 16 when using CIFilter and SpriteKit. Here is some sample code where I want to display text and apply a blur effect to a copy of the same text behind it. The expected behavior (iOS 15) and the broken behavior (iOS 16) are shown in the screenshots in the original post: on iOS 16 the text looks rotated around the x-axis and sits way too deep. Here is the sample code:

import UIKit
import SpriteKit
import CoreImage

class ViewController: UIViewController {

    var skView: SKView?
    var scene: SKScene?

    override func viewDidLoad() {
        super.viewDidLoad()
        skView = SKView(frame: view.frame)
        scene = SKScene(size: skView?.bounds.size ?? .zero)
        scene?.backgroundColor = UIColor.red
        view.addSubview(skView!)
        skView!.presentScene(scene)

        let neonNode = SKNode()
        let glowNode = SKEffectNode()
        glowNode.shouldEnableEffects = true
        glowNode.shouldRasterize = true
        let blurFilter = CIFilter(name: "CIGaussianBlur")
        blurFilter?.setValue(20, forKey: kCIInputRadiusKey)
        glowNode.filter = blurFilter
        glowNode.blendMode = .alpha

        let labelNode = SKLabelNode(text: "MOJO")
        labelNode.fontName = "HelveticaNeue-Medium"
        labelNode.fontSize = 60
        let labelNodeCopy = labelNode.copy() as! SKLabelNode

        glowNode.addChild(labelNode)
        neonNode.addChild(glowNode)
        neonNode.addChild(labelNodeCopy)
        neonNode.position = CGPoint(x: 200, y: 200)
        scene?.addChild(neonNode)
    }
}
Posted. Last updated.
Post marked as solved
4 Replies
635 Views
I want to use CIFilter to create a CGImageRef, but when I get the cgimage buffer, it is empty:

CIFilter<CITextImageGenerator> *filter = [CIFilter textImageGeneratorFilter];
filter.text = @"This is a test text";
filter.fontName = @"HoeflerText-Regular"; // the original post had "HoeflerText-Regula", which is not a valid font name
filter.fontSize = 12;
filter.scaleFactor = 1.0;
CIImage *image = filter.outputImage;
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef resultRef = [context createCGImage:image fromRect:image.extent];
UIImage *resultImage = [UIImage imageWithCGImage:resultRef];

CFDataRef data = CGDataProviderCopyData(CGImageGetDataProvider(resultRef));
const unsigned char *buffer = CFDataGetBytePtr(data);

And then I could not generate an MTLTexture with this cgimage:

MTKTextureLoader *loader = [[MTKTextureLoader alloc] initWithDevice:self.device];
NSError *error;
id<MTLTexture> fontTexture = [loader newTextureWithCGImage:resultRef
                                                   options:@{
                                                     MTKTextureLoaderOptionOrigin : MTKTextureLoaderOriginFlippedVertically,
                                                     MTKTextureLoaderOptionSRGB : @(NO)
                                                   }
                                                     error:&error];

How can I finish my work? Any suggestions about this question would be appreciated.
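A possible alternative, sketched in Swift since the route is the same: skip the CGImage round trip and have CIContext render the CIImage straight into an MTLTexture. The descriptor settings are illustrative assumptions:

import CoreImage
import Metal

func makeTextTexture(device: MTLDevice, text: String) -> MTLTexture? {
    let filter = CIFilter(name: "CITextImageGenerator", parameters: [
        "inputText": text,
        "inputFontName": "HoeflerText-Regular",
        "inputFontSize": 12,
        "inputScaleFactor": 1.0
    ])
    guard let image = filter?.outputImage else { return nil }

    let desc = MTLTextureDescriptor.texture2DDescriptor(
        pixelFormat: .rgba8Unorm,
        width: Int(image.extent.width),
        height: Int(image.extent.height),
        mipmapped: false)
    desc.usage = [.shaderRead, .shaderWrite] // Core Image writes via compute
    guard let texture = device.makeTexture(descriptor: desc) else { return nil }

    // Render the generated text directly into the Metal texture.
    let context = CIContext(mtlDevice: device)
    context.render(image, to: texture, commandBuffer: nil,
                   bounds: image.extent, colorSpace: CGColorSpaceCreateDeviceRGB())
    return texture
}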
Posted by JLTG. Last updated.
Post marked as solved
1 Reply
533 Views
With macOS 13, the CIColorCube and CIColorCubeWithColorSpace filters gained the extrapolate property for supporting EDR content. When setting this property, we observe that the outputImage of the filter sometimes (~1 in 3 tries) just returns nil. And sometimes it "just" causes artifacts to appear when rendering EDR content (see the screenshot below). The artifacts even appear sometimes when extrapolate was not set.

input | correct output | broken output

This was reproduced on Intel-based and M1 Macs. All of the LUT-based filters in our apps are broken in this way, and we could not find a workaround for the issue so far. Does anyone experience the same?
Posted. Last updated.
Post not yet marked as solved
0 Replies
514 Views
I'm running into hard crashes (EXC_BAD_ACCESS) when calling CIContext.writeHEIF10Representation or CIContext.heif10Representation from multiple threads. By contrast, concurrent access to writeHEIFRepresentation works fine. Does anyone know any other way to write a CIImage to 10-bit HEIF? I've tried several alternatives using CVPixelBuffer and MTLTexture without success. While I've filed a report through Feedback Assistant, I'm looking for a workaround. Writing thousands of 10-bit HDR HEIF images within a single thread is an absolute throughput killer, whereas I can write any other image format without concurrency issues. Thanks!
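Not a fix, but a possible middle ground while the crash stands: keep the filter pipelines fully concurrent and funnel only the final HEIF10 encode through one serial queue, so only the crashing call is serialized rather than the whole job. A minimal sketch (the color space choice is an assumption):

import CoreImage

// Serialize only the 10-bit HEIF encode; upstream rendering stays concurrent.
let heif10Queue = DispatchQueue(label: "heif10-writer")

func writeHEIF10Serialized(_ image: CIImage, to url: URL, context: CIContext) throws {
    let colorSpace = CGColorSpace(name: CGColorSpace.itur_2100_PQ)! // assumption: HDR target
    try heif10Queue.sync {
        try context.writeHEIF10Representation(of: image, to: url,
                                              colorSpace: colorSpace, options: [:])
    }
}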
Posted by mallman. Last updated.
Post not yet marked as solved
2 Replies
815 Views
Hi, I have had my app on the App Store for about 3 months. Today, I created an update and started to test via TestFlight. However, the image quality of the PNG files is really bad (though I haven't changed anything!). There are items in the images that shouldn't be there ... to me, it looks like the images were downsampled or whatever. I've included screenshots of how it looks. On my Mac the images are correct, and also in the device simulator. It only appears when distributing the app ... might there be something wrong with the created archives? Has anyone else experienced something like that, and how can I fix this? Thanks for your help, Mario. 1st image: how it looks in the simulator. 2nd image: how it looks on the iPhone. Please note how the buttons appear.
Posted by Mario_mh. Last updated.
Post not yet marked as solved
1 Reply
516 Views
The following code does not behave the same way on iOS 16 as it did on earlier iOS versions. The blur effect does not seem to work correctly on iOS 16.

import SpriteKit
import CoreImage

class GameScene: SKScene {
    override func didMove(to view: SKView) {
        let shapeNode = SKShapeNode(circleOfRadius: 30)
        shapeNode.fillColor = .green
        shapeNode.strokeColor = .clear
        addChild(shapeNode)

        let blurredShapeNode = SKShapeNode(circleOfRadius: 30)
        blurredShapeNode.fillColor = .red
        blurredShapeNode.strokeColor = .clear

        let effectNode = SKEffectNode()
        addChild(effectNode)
        effectNode.addChild(blurredShapeNode)

        let blurAngle = NSNumber(value: 0)
        effectNode.filter = CIFilter(
            name: "CIMotionBlur",
            parameters: [kCIInputRadiusKey: 30, kCIInputAngleKey: blurAngle])
    }
}
Posted by chepiok. Last updated.
Post not yet marked as solved
0 Replies
508 Views
A few of our users reported that images saved with our apps disappear from their library in Photos after a few seconds. All of them own a Mac with an old version of macOS, and all of them have iCloud syncing enabled for Photos. Our apps use Core Image to process images. Core Image will transfer most of the input's metadata to the output. While we thought this was generally a good idea, this seems to be causing the issue: The old version of Photos (or even iPhoto?) that is running on the Mac seems to think that the output image of our app is a duplicate of the original image that was loaded into our app. As soon as the iCloud sync happens, the Mac removes the image from the library, even when it's in sleep mode. When the Mac is turned off or disconnected from the internet, the images stay in the library—until the Mac comes back online. This seems to be caused by the output's metadata, but we couldn't figure out what fields are causing the old Photos to detect the new image as duplicate. It's also very hard to reproduce without installing an old macOS on some machine. Does anyone know what metadata field we need to change to not be considered a duplicate?
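One way to narrow this down, sketched under the assumption that the carried-over metadata is indeed the trigger: strip the inherited properties from the output before writing, confirm the image then survives the sync, and re-add fields one by one until the duplicate detection kicks in.

import CoreImage

// Returns a copy of the image without the properties Core Image carried over
// from the input (EXIF/TIFF dictionaries etc.), as a bisection starting point.
func strippingInheritedMetadata(_ image: CIImage) -> CIImage {
    image.settingProperties([:])
}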
Posted. Last updated.
Post not yet marked as solved
1 Reply
961 Views
I'm processing a 4K video with a complex Core Image pipeline that also invokes a neural style transfer Core ML model. This works very well, but sometimes, for very few frames, the model execution fails with the following error messages:

Execution of the command buffer was aborted due to an error during execution. Internal Error (0000000e:Internal Error)
Error: command buffer exited with error status.
The Metal Performance Shaders operations encoded on it may not have completed.
Error: (null) Internal Error (0000000e:Internal Error)
<CaptureMTLCommandBuffer: 0x280b95d90> -> <AGXG15FamilyCommandBuffer: 0x108f143c0>
  label = <none>
  device = <AGXG15Device: 0x106034e00>
    name = Apple A16 GPU
  commandQueue = <AGXG15FamilyCommandQueue: 0x1206cee40>
    label = <none>
    device = <AGXG15Device: 0x106034e00>
      name = Apple A16 GPU
  retainedReferences = 1
[espresso] [Espresso::handle_ex_plan] exception=Espresso exception: "Generic error": Internal Error (0000000e:Internal Error); code=1 status=-1
[coreml] Error computing NN outputs -1
[coreml] Failure in -executePlan:error:.

It's really hard to reproduce since it only happens occasionally. I also didn't find a way to access the Internal Error mentioned, so I don't know the real reason why it fails. Any advice would be appreciated!
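Since the failure is intermittent and per-frame, one pragmatic mitigation (a sketch, not a diagnosis; the model handle and retry count are assumptions) is to retry the prediction for the affected frame before giving up:

import CoreML

// Retry a Core ML prediction a few times; a transient GPU command-buffer
// failure may well succeed on a subsequent attempt.
func predictWithRetry(model: MLModel, input: MLFeatureProvider,
                      attempts: Int = 3) throws -> MLFeatureProvider {
    var lastError: Error = NSError(domain: "predictWithRetry", code: -1)
    for _ in 0..<attempts {
        do {
            return try model.prediction(from: input)
        } catch {
            lastError = error // e.g. "command buffer exited with error status"
        }
    }
    throw lastError
}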
Posted. Last updated.
Post not yet marked as solved
0 Replies
812 Views
More and more iOS devices can capture content with high/extended dynamic range (HDR/EDR) now, and even more devices have screens that can display that content properly. Apple also gave us developers the means to correctly display and process this EDR content in our apps on macOS and now also on iOS 16. There are a lot of EDR-related sessions from WWDC 2021 and 2022. However, most of them focus on HDR video but not images—even though Camera captures HDR images by default on many devices. Interestingly, those HDR images seem to use a proprietary format that relies on EXIF metadata and an embedded HDR gain map image for displaying the HDR effect in Photos.

Some observations:
- Only Photos will display those metadata-driven HDR images in their proper brightness range. Files, for instance, does not.
- Photos will not display other HDR formats like OpenEXR or HEIC with a BT.2100-PQ color space in their proper brightness.
- When using the PHPicker, it will even automatically tone-map the EDR values of OpenEXR images to SDR. The only way to load those images is to request the original image via PHAsset, which requires photo library access.

And here comes my main point: There is no API that enables us developers to load iPhone HDR images (with metadata and gain map) in a way that decodes image + metadata into EDR pixel values. That means we cannot display and edit those images in our app the same way Photos does. There are ways to extract and embed the HDR gain maps from/into images using Image I/O APIs (see the sketch below). But we don't know the algorithm used to blend the gain map with the image's SDR pixel values to get the EDR result. It would be very helpful to know how decoding and encoding from SDR + gain map to HDR and back works. Alternatively (or in addition), it would be great if common image loading APIs like Image I/O and Core Image would provide APIs to load those images into an EDR image representation (16-bit float linear sRGB with extended values, for example) and write EDR images into SDR + gain map images so that they are correctly displayed in Photos. Thanks for your consideration! We really want to support HDR content in our image editing apps, but without the proper APIs, we can only guess how image HDR works on iOS.
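For the extraction step mentioned above, a minimal sketch: Core Image can hand back the embedded gain map as its own image via an image option; the blending math that combines it with the SDR pixels remains the undocumented part.

import CoreImage

// Loads the auxiliary HDR gain map embedded in an iPhone photo, if present.
// Returns nil for files without a gain map.
func loadGainMap(from url: URL) -> CIImage? {
    CIImage(contentsOf: url, options: [.auxiliaryHDRGainMap: true])
}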
Posted. Last updated.
Post not yet marked as solved
0 Replies
857 Views
Hello, since the release of iOS 16 we see a crash EXC_BAD_ACCESS KERN_INVALID_ADDRESS 0x0000000000000158 when invoking CIContext.startTask(toRender image: CIImage, to destination: CIRenderDestination) in the custom video compositor of our app. The crash happens exclusively on iOS 16. The same call runs fine on iOS 15 and lower. Looking at the Crashlytics logs, the crash occurs mostly on the iPhone 12 (~50% of the occurrences). We are not able to reproduce the bug, as it occurs very randomly. Any suggestion on how to fix this? Or is it a regression in the new OS?

Stack trace:

Thread 7 name:
Thread 7 Crashed:
0  AGXMetalG14  0x00000002060e5b2c AGX::ResourceGroupUsage<AGX::G14::Encoders, AGX::G14::Classes, AGX::G14::ObjClasses>::setTexture(AGXG14FamilyTexture const*, ResourceGroupBindingType, unsigned int) + 40 (agxa_texture_template.h:423)
1  AGXMetalG14  0x000000020601d428 -[AGXG14FamilyComputeContext setTexture:atIndex:] + 168 (agxa_compute_template.hpp:3119)
2  CoreImage    0x000000019b5eb048 CIMetalRenderToTextures + 744 (CIMetalUtils.m:1348)
3  CoreImage    0x000000019b6de5a4 CI::MetalContext::compute_quad(unsigned int, CI::MetalMainProgram const*, CGSize const&, void const**, unsigned long, CI::Dimensions, CI::Dimensions) + 864 (context-metal.mm:1206)
4  CoreImage    0x000000019b6df0e4 CI::MetalContext::render_node(CI::TileTask*, CI::ProgramNode*, CGRect const&, CGRect const&, void const**, __IOSurface**, unsigned long) + 1352 (context-metal.mm:1463)
5  CoreImage    0x000000019b6e0208 CI::MetalContext::render_intermediate_node(CI::TileTask*, CI::ProgramNode*, CGRect const&, CI::intermediate_t*, bool, void () block_pointer) + 472 (context-metal.mm:1621)
6  CoreImage    0x000000019b6e340c CI::Context::recursive_render(CI::TileTask*, CI::roiKey const&, CI::Node*, bool) + 3584 (context.cpp:477)
7  CoreImage    0x000000019b6e2bb4 CI::Context::recursive_render(CI::TileTask*, CI::roiKey const&, CI::Node*, bool) + 1448 (context.cpp:402)
8  CoreImage    0x000000019b6e3c78 CI::Context::render(CI::ProgramNode*, CGRect const&) + 160 (context.cpp:535)
9  CoreImage    0x000000019b7502e4 ___ZN2CI23image_render_to_surfaceEPNS_7ContextEPNS_5ImageE6CGRectP11__IOSurfacePKNS_17RenderDestinationE_block_invoke + 72 (render.cpp:2595)
10 CoreImage    0x000000019b754078 CI::recursive_tile(CI::RenderTask*, CI::Context*, CI::RenderDestination const*, char const*, CI::Node*, CGRect const&, CI::PixelFormat, CI::swizzle_info const&, CI::TileTask* (CI::ProgramNode*, CGR... + 4428 (render.cpp:1824)
11 CoreImage    0x000000019b74ebe4 CI::tile_node_graph(CI::Context*, CI::RenderDestination const*, char const*, CI::Node*, CGRect const&, CI::PixelFormat, CI::swizzle_info const&, CI::TileTask* (CI::ProgramNode*, CGRect) block_pointer) + 444 (render.cpp:1929)
12 CoreImage    0x000000019b74fa54 CI::image_render_to_surface(CI::Context*, CI::Image*, CGRect, __IOSurface*, CI::RenderDestination const*) + 1916 (render.cpp:2592)
13 CoreImage    0x000000019b62e7b4 -[CIContext(CIRenderDestination) _startTaskToRender:toDestination:forPrepareRender:forClear:error:] + 2084 (CIRenderDestination.mm:1943)

Crash report: report.crash
Posted by dimo94. Last updated.
Post not yet marked as solved
0 Replies
1.1k Views
Is this accessible from Swift directly?

Visual Look Up: Lift subject from background
"Lift the subject from an image or isolate the subject by removing the background. This works in Photos, Screenshot, Quick Look, Safari, and more."
Source: macOS Ventura Preview - New Features - Apple

I see that Shortcuts now has a native Remove Background command that wasn't there in iOS 15 or macOS 12. Is there any way to call that from Swift besides x-callback URL schemes?
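For later readers, a hedged sketch of one answer: Apple has since exposed this capability through Vision's subject-lifting request. The API shipped after this post (iOS 17/macOS 14), so availability is the key assumption here.

import Vision
import CoreImage

// Uses Vision's subject-lifting request to remove the background,
// returning the lifted subject(s) over transparency.
func liftSubject(from image: CIImage) throws -> CIImage? {
    let request = VNGenerateForegroundInstanceMaskRequest()
    let handler = VNImageRequestHandler(ciImage: image)
    try handler.perform([request])
    guard let observation = request.results?.first else { return nil } // no subject found
    let pixelBuffer = try observation.generateMaskedImage(
        ofInstances: observation.allInstances,
        from: handler,
        croppedToInstancesExtent: false)
    return CIImage(cvPixelBuffer: pixelBuffer)
}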
Posted. Last updated.