The problem might be this: Core Image uses 16-bit float RGBA as its default working format. That means that, whenever it needs an intermediate buffer for rendering, it creates a 4-channel 16-bit float surface to render into. It also means that your 1-channel unsigned integer values are automatically mapped to float values in 0.0...1.0. That's probably where you lose precision.

There are a few options to circumvent this:

- You could set the workingFormat context option to .L8 or .R8. However, this means all intermediate buffers will have that format, so if you want to mix processing of the segmentation mask with other images, this won't work. If you only want to process the mask separately, you can set up a separate CIContext with this option. Note, however, that most built-in CIFilters assume a floating-point working format and might not perform well with this format.
- You can process your segmentation map with Metal (as you suggested) as part of your CIFilter pipeline using a CIImageProcessorKernel.
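For the first option, a minimal sketch of a dedicated context might look like this (the variable names are illustrative):

```swift
import CoreImage

// Sketch: a separate context just for mask processing.
// Setting the working format to R8 keeps intermediates as 8-bit,
// single-channel buffers instead of converting to 16-bit float RGBA.
let maskContext = CIContext(options: [
    .workingFormat: NSNumber(value: CIFormat.R8.rawValue)
])
```

Remember that this format applies to every intermediate buffer the context creates, which is why it only makes sense for a pipeline that processes the mask in isolation.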
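For the Metal route, a rough sketch of a CIImageProcessorKernel subclass could look like the following; the class name and the blit placeholder are just illustrative, you would encode your own compute work there:

```swift
import CoreImage
import Metal

// Sketch: a kernel that processes a 1-channel mask with Metal while
// keeping the 8-bit format end-to-end.
final class MaskProcessorKernel: CIImageProcessorKernel {

    // Ask Core Image to hand us the input and output in R8 instead of
    // the float working format.
    override class func formatForInput(at input: Int32) -> CIFormat { .R8 }
    override class var outputFormat: CIFormat { .R8 }

    override class func process(with inputs: [CIImageProcessorInput]?,
                                arguments: [String: Any]?,
                                output: CIImageProcessorOutput) throws {
        guard let commandBuffer = output.metalCommandBuffer,
              let inputTexture = inputs?.first?.metalTexture,
              let outputTexture = output.metalTexture
        else { return }

        // Encode your Metal work here; a plain blit serves as a placeholder.
        let blit = commandBuffer.makeBlitCommandEncoder()
        blit?.copy(from: inputTexture, to: outputTexture)
        blit?.endEncoding()
    }
}

// Usage inside a pipeline (maskImage is a CIImage of your segmentation map):
// let processed = try MaskProcessorKernel.apply(withExtent: maskImage.extent,
//                                               inputs: [maskImage],
//                                               arguments: nil)
```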