
Post not yet marked as solved
2 Replies
I encountered the same issue. When I re-watched the session, I heard Brad say the following: "And I need to ensure here that my extension's app group is prefixed by the MachServiceName in order for it to pass validation." I changed the CMIOExtensionMachServiceName in the extension's Info.plist to the same value as my app's and extension's app group name, and suddenly it passed validation. I'm not sure if this is the intended configuration. Maybe Apple's Extensions team can clarify what "prefixed" means here and what the values should look like.
Post not yet marked as solved
1 Reply
I guess this is happening because CIPhotoEffectTonal also affects the transparent parts of the image. If you want to limit the effect to the visible part, you can simply crop the image after the effect is applied and before you blend it over the background:

```objc
video1FilteredImage = [video1FilteredImage imageByCroppingToRect:video1FilteredImage.extent];
```

By the way: instead of using CIConstantColorGenerator, you can simply get a colored image like this:

```objc
CIImage *colorBackgroundImage = [CIImage imageWithColor:inputColor];
```
Post marked as solved
1 Reply
I think the whole RAW development is now handled by the new CIRAWFilter. This doesn't have an activeKeys property anymore. Instead, you can check for each property individually whether it's supported, for instance with isContrastSupported or isMoireReductionSupported.
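A minimal sketch of those per-property checks (the file URL is illustrative, and the adjustment values are arbitrary):

```swift
import CoreImage

// CIRAWFilter (macOS 12 / iOS 15 and later) exposes support flags per property.
if let rawFilter = CIRAWFilter(imageURL: URL(fileURLWithPath: "/path/to/photo.dng")) {
    // Only set a property if this particular RAW file/camera supports it.
    if rawFilter.isContrastSupported {
        rawFilter.contrast = 0.5
    }
    if rawFilter.isMoireReductionSupported {
        rawFilter.moireReductionAmount = 0.8
    }
    let developedImage = rawFilter.outputImage
}
```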
Post marked as solved
2 Replies
You can use the QR code as a mask and blend it with a solid color to effectively colorize it. So you can, for example, create a black and a white version and use the register(...) method of UIImageAsset to "bundle" them both into one dynamic image:

```swift
let qrCode = filter.outputImage!.transformed(by: transform)

// Use the QR code as a mask for blending with a color.
// Note that we need to invert the code for that, so the actual code becomes white
// and the background becomes black, because white = let color through, black = transparent.
let maskFilter = CIFilter.blendWithMask()
maskFilter.maskImage = qrCode.applyingFilter("CIColorInvert")

// Create a version of the code with a black foreground...
maskFilter.inputImage = CIImage(color: .black)
let blackCIImage = maskFilter.outputImage!
// ... and one with a white foreground.
maskFilter.inputImage = CIImage(color: .white)
let whiteCIImage = maskFilter.outputImage!

// Render both images.
let blackImage = context.createCGImage(blackCIImage, from: blackCIImage.extent).map(UIImage.init)!
let whiteImage = context.createCGImage(whiteCIImage, from: whiteCIImage.extent).map(UIImage.init)!

// Use the black version in light mode...
qrImage = blackImage
// ... and register the white version to be used in dark mode.
qrImage.imageAsset?.register(whiteImage, with: UITraitCollection(userInterfaceStyle: .dark))

return qrImage
```
Post marked as solved
1 Reply
I recommend using Core Image to read, write, and modify EXR images. Core Image can natively open PNG and EXR files and write PNG data and files. However, there is no convenient API for writing EXR data or files. That's why we wrote extensions for doing exactly that. You can find them over at GitHub.
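To illustrate the built-in part, here is a sketch of opening an EXR and writing PNG data with plain Core Image (the file path is illustrative, and error handling is omitted):

```swift
import CoreImage

// Core Image opens EXR files directly.
let exrImage = CIImage(contentsOf: URL(fileURLWithPath: "/path/to/image.exr"))!

// Writing PNG is built in via CIContext...
let context = CIContext()
let pngData = context.pngRepresentation(of: exrImage,
                                        format: .RGBA8,
                                        colorSpace: CGColorSpaceCreateDeviceRGB())
// ...but there is no equivalent EXR-writing method, which is what the
// extensions mentioned above add on top.
```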
Post marked as solved
1 Reply
You can simply initialize a CIImage with the texture and pass that to the kernel (note that this initializer is failable):

```swift
let ciImage = CIImage(mtlTexture: texture, options: nil)
```

The documentation also mentions what you need to do to let Core Image render into a Metal texture. If you want to incorporate a Metal processing step into a Core Image pipeline instead, I recommend you check out CIImageProcessorKernel.
Post marked as solved
2 Replies
It should work once you remove the -I $MTL_HEADER_SEARCH_PATHS part from your custom build rule. Though it is mentioned in the WWDC video, it actually causes problems when MTL_HEADER_SEARCH_PATHS is empty. See this answer to a related question. Usually, you don't need that parameter unless you have a complicated file graph or external dependencies.
Post not yet marked as solved
2 Replies
I don't exactly know why you are getting all zeros here, but there is a much simpler way to access the data using CIContext.render(_:toBitmap:...). Please check out the implementation (https://github.com/DigitalMasterpieces/CoreImageExtensions/blob/main/Sources/CIContext%2BValueAccess.swift) in this small helper package I wrote (https://github.com/DigitalMasterpieces/CoreImageExtensions), which contains useful Core Image extensions.
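A minimal sketch of reading a single pixel that way (assuming you already have a CIContext and a CIImage; the names and coordinates are illustrative):

```swift
import CoreImage

// Render one pixel of `image` into a small float buffer.
var pixel = [Float32](repeating: 0, count: 4)
context.render(image,
               toBitmap: &pixel,
               rowBytes: MemoryLayout<Float32>.stride * 4,
               bounds: CGRect(x: 10, y: 10, width: 1, height: 1),
               format: .RGBAf,
               colorSpace: image.colorSpace)
// `pixel` now holds the RGBA components of that pixel as floats.
```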
Post not yet marked as solved
1 Reply
What do you mean by "factory image"?
Post marked as solved
1 Reply
I'm not totally sure, but I think it's not possible to just append arbitrary data to the end of an image file like this. Photos has its own database for storing the images, and I guess it performs some kind of sanitizing/cleaning when adding new entries. You can add custom data to a PHAsset that is stored alongside the image using PHAdjustmentData, but this is meant for storing "a description of the edits made to an asset's photo, video, or Live Photo content, which allows your app to reconstruct or revert the effects of prior editing sessions." So you would be able to read this data back, but only in an app that understands it. It won't be accessible when you just export the image out of Photos as a JPEG, for instance. And the amount of data you can store this way is also limited. However, you might be able to store the data in the image's (EXIF) metadata somewhere. That seems like the appropriate place to me.
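A sketch of attaching custom metadata when writing an image with Image I/O (the output path and comment string are illustrative, and error handling is omitted):

```swift
import ImageIO
import UniformTypeIdentifiers

// Write `cgImage` to disk with a custom EXIF UserComment entry.
func writeAnnotated(_ cgImage: CGImage) {
    let url = URL(fileURLWithPath: "/tmp/annotated.jpg") as CFURL
    guard let destination = CGImageDestinationCreateWithURL(url, UTType.jpeg.identifier as CFString, 1, nil) else { return }

    // Metadata is passed as a nested property dictionary alongside the image.
    let properties: [CFString: Any] = [
        kCGImagePropertyExifDictionary: [
            kCGImagePropertyExifUserComment: "my custom data"
        ]
    ]
    CGImageDestinationAddImage(destination, cgImage, properties as CFDictionary)
    CGImageDestinationFinalize(destination)
}
```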
Post not yet marked as solved
1 Reply
The problem is that NSImage doesn't have the .cgImage accessor that UIImage has. Instead, it has the cgImage(forProposedRect:context:hints:) method, whose parameters can all be nil. So to get the CGImage from an NSImage, you need to do this instead:

```swift
let cgImage = self.cgImage(forProposedRect: nil, context: nil, hints: nil)
```
Post not yet marked as solved
4 Replies
@IanOllmann There seems to be a bug in the AppleEXR encoder, causing a BAD_ACCESS when encoding images with a height of 16. Could you please have a look? (FB9080694) Thanks!
Post not yet marked as solved
2 Replies
Just wanted to cross-reference the StackOverflow question here: https://stackoverflow.com/questions/66914332/in-swift-filter-ciareaminmax-provide-incorrect-output
Post not yet marked as solved
1 Reply
In your kernel, colors are usually normalized to [0.0 ... 1.0], based on the underlying color space. So even if values are stored as 10-bit integers in a texture, your shader will get them as normalized floats. I emphasized the color space above because it is used when translating the colors from the source into those normalized values: when you are using the default sRGB color space, the wide gamut of the HDR source doesn't fit into the sRGB [0.0 ... 1.0] range. That's why you may get values outside that range in your kernel. This is actually useful in most cases because most filter operations that are designed for sRGB still work then. The color invert example above, however, does not.

You have two options here that I know of:

1. You can change the workingColorSpace of the CIContext you are using to the HDR color space of the input:

```swift
let ciContext = CIContext(options: [.workingColorSpace: CGColorSpace(name: CGColorSpace.itur_2020)!])
```

Then all color values should be capped to [0.0 ... 1.0] in your kernel, where 0.0 is the darkest HDR color value and 1.0 is the brightest. You can safely perform the inversion with 1.0 - x then. However, keep in mind that some other filters will then not produce the correct result, because they assume the input to be in (linear) sRGB, which is Core Image's default.

2. You can convert ("color match") the input into the correct color space before passing it into your kernel, and back to working space again before returning:

```swift
return kernelOutput.matchedToWorkingSpace(from: colorSpace)
```
Post marked as solved
1 Reply
I'm not sure if you are able to access sub-images using NSImage. However, you should be able to do so with a CGImageSource:

```swift
let source = CGImageSourceCreateWithURL(newURL, nil)!
let numSubImages = CGImageSourceGetCount(source)
for i in 0..<numSubImages {
    let subImage = CGImageSourceCreateImageAtIndex(source, i, nil)!
    // subImage is a CGImage; you can convert it to an NSImage if you prefer:
    let nsImage = NSImage(cgImage: subImage, size: NSZeroSize)
    // handle image...
}
```