I am using [CIImage imageWithTexture:size:flipped:colorSpace:] to create CIImage objects, which are then fed as input(s) into Core Image filters, and the result is rendered using an OpenGL-based CIContext with a NULL working color space. This works fine on iOS 9, but on iOS 10 beta 8 the rendering is significantly slower or fails outright. When it fails, the output is black and there is a console message like the following:
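For reference, here is roughly how the images and context are set up (simplified; the variable names are illustrative, not my actual code):

```objective-c
@import CoreImage;
@import OpenGLES;

// Wrap an existing GL texture in a CIImage. textureName, width, and height
// come from my own GL setup.
CIImage *input = [CIImage imageWithTexture:textureName
                                      size:CGSizeMake(width, height)
                                   flipped:NO
                                colorSpace:nil];

// OpenGL-based CIContext with a NULL working color space, as described above.
EAGLContext *eaglContext =
    [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
CIContext *context =
    [CIContext contextWithEAGLContext:eaglContext
                              options:@{ kCIContextWorkingColorSpace : [NSNull null] }];
```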
Failed to render 750000 pixels because a CIKernel's ROI function did not allow tiling.
This failure happens with both built-in Core Image filters such as CIBlendWithMask and my own filters built from custom CIKernels. The failure doesn't occur if the input CIImage objects are created from CGImage or CVPixelBuffer objects instead. It fails more often when the input textures are large (long dimension around 4000 pixels, but still under 4096) and when the CIFilter(s) take multiple images as input.
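The CVPixelBuffer path that avoids the failure is roughly the following (sketch only; the copy from texture to pixel buffer is elided, and the attribute dictionary is an assumption on my part):

```objective-c
@import CoreImage;
@import CoreVideo;

// Create a BGRA pixel buffer the same size as the texture. Backing it with
// an IOSurface is my assumption for keeping the CIImage path fast.
CVPixelBufferRef pixelBuffer = NULL;
NSDictionary *attrs = @{ (id)kCVPixelBufferIOSurfacePropertiesKey : @{} };
CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                    kCVPixelFormatType_32BGRA,
                    (__bridge CFDictionaryRef)attrs,
                    &pixelBuffer);

// ...copy the texture contents into pixelBuffer (e.g. via glReadPixels)...

// Wrapping the buffer instead of the raw texture does not trigger the
// ROI/tiling failure.
CIImage *input = [CIImage imageWithCVPixelBuffer:pixelBuffer];
```

This works, but the extra readback/copy defeats the point of feeding the texture to Core Image directly.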
If I build exactly the same code with Xcode 7 and the iOS 9 SDK, it runs fine on the same iOS 10 device (an iPhone 6s).
Has anyone else run into this issue? Is there a viable workaround?