Save texture as TIFF

I'm trying to save Metal textures in a lossless compressed format. I've tried PNG and TIFF, but I run into the same problem: the pixel data changes after a save/load round trip whenever the image has transparency. Here is the code I use to save a TIFF:

import ImageIO
import UIKit
import Metal
import MobileCoreServices

extension MTLTexture {
    func saveAsLosslessTIFF(url: URL) throws {
        // CIContext() is not failable, so no guard is needed here.
        let context = CIContext()
        // RuntimeError is a small custom Error type (not shown).
        guard let colorSpace = CGColorSpace(name: CGColorSpace.linearSRGB) else {
            throw RuntimeError("Unable to create color space.")
        }
        guard let ciImage = CIImage(mtlTexture: self, options: [.colorSpace: colorSpace]) else {
            throw RuntimeError("Unable to create CIImage from texture.")
        }
        guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else {
            throw RuntimeError("Unable to create CGImage.")
        }
        
        // Create a dictionary with TIFF compression options (TIFF tag value 5 = LZW, lossless).
        let tiffCompression_LZW = 5
        
        let options: [String: Any] = [
            kCGImagePropertyTIFFCompression as String: tiffCompression_LZW,
            kCGImagePropertyDepth as String: depth,  // NB: MTLTexture.depth is the z-extent (1 for a 2D texture), not bits per sample
            kCGImagePropertyPixelWidth as String: width,
            kCGImagePropertyPixelHeight as String: height,
        ]
        
        let fileDestination = CGImageDestinationCreateWithURL(url as CFURL, kUTTypeTIFF, 1, nil)
        guard let destination = fileDestination else {
            throw RuntimeError("Unable to create image destination.")
        }
        
        CGImageDestinationAddImage(destination, cgImage, options as CFDictionary)
        
        if !CGImageDestinationFinalize(destination) {
            throw RuntimeError("Unable to save image to destination.")
        }
    }
}
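
For completeness, a minimal call site might look like this (a sketch; the file name is arbitrary and `texture` is any CPU-readable MTLTexture):

let url = FileManager.default.temporaryDirectory.appendingPathComponent("texture.tiff")
do {
    try texture.saveAsLosslessTIFF(url: url)
} catch {
    print("Save failed: \(error)")
}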

I can then load the texture like this:

func loadTexture(url: URL) throws -> MTLTexture {
    let usage = MTLTextureUsage(rawValue: MTLTextureUsage.renderTarget.rawValue
        | MTLTextureUsage.shaderRead.rawValue
        | MTLTextureUsage.shaderWrite.rawValue)

    // `loader` is an MTKTextureLoader created elsewhere.
    return try loader.newTexture(URL: url, options: [
        MTKTextureLoader.Option.textureUsage: usage.rawValue,
        MTKTextureLoader.Option.origin: MTKTextureLoader.Origin.flippedVertically.rawValue
    ])
}

After saving and then loading the texture again, I want to get back the exact same texture. And I do, if there is no transparency. Transparent pixels, however, are transformed in a way that I don't understand. Here is an example pixel:

[120, 145, 195, 170] -> [144, 174, 234, 170]

My first guess would be that something is trying to undo a pre-multiplied alpha that never happened. But the numbers don't seem to work out. For example, if that were the case I'd expect 120 to go to (120 * 255) / 170 = 180, not 144.

Any idea what I am doing wrong?

Replies

My first guess would be that something is trying to undo a pre-multiplied alpha that never happened.

Yes, probably. I have a vague recollection that CGImage's default behaviour is not what you expect and you need to set a flag somewhere.

Call CGImageGetAlphaInfo and see if it claims to be premultiplied or not.
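
For example, a quick check from Swift might look like this (a sketch; `cgImage` is assumed to be the image you're about to write out):

switch cgImage.alphaInfo {
case .premultipliedLast, .premultipliedFirst:
    print("alpha is premultiplied")
case .last, .first:
    print("alpha is straight (non-premultiplied)")
default:
    print("other alpha info: \(cgImage.alphaInfo.rawValue)")
}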

But the numbers don't seem to work out.

Is that something to do with gamma? I notice that you have set linearSRGB as the colour space.

Thanks. Yes, the alpha info is indeed "premultipliedLast", but I can't see how I can change that. Any idea where to set that flag?

The flag seems to be set when you create the CIContext:

let context = CIContext(options: [CIContextOption.outputPremultiplied: false])

After making that change, the alpha info no longer says it's premultiplied, but I'm still getting more or less the same result:

[120, 145, 195, 170] -> [145, 174, 233, 170]

What values do the bytes in the TIFF have?

I can't see a direct way to read the bytes in the TIFF. I've tried:

func printTiffBytes(url: URL) {
    if let imageSource = CGImageSourceCreateWithURL(url as CFURL, nil),
       let image = CGImageSourceCreateImageAtIndex(imageSource, 0, nil) {

        let width = image.width
        let height = image.height
        let bitsPerComponent = image.bitsPerComponent
        let bytesPerRow = image.bytesPerRow
        let totalBytes = height * bytesPerRow

        let colorSpace = CGColorSpaceCreateDeviceRGB()
        // let colorSpace = CGColorSpace(name: CGColorSpace.linearSRGB)!

        // let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.last.rawValue)  // Error! unsupported parameter combination
        // NB: drawing into a premultiplied context will premultiply the output values.
        let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.premultipliedLast.rawValue)

        let data = UnsafeMutableRawPointer.allocate(byteCount: totalBytes, alignment: MemoryLayout<UInt8>.alignment)
        defer { data.deallocate() }

        if let context = CGContext(data: data,
                                   width: width,
                                   height: height,
                                   bitsPerComponent: bitsPerComponent,
                                   bytesPerRow: bytesPerRow,
                                   space: colorSpace,
                                   bitmapInfo: bitmapInfo.rawValue) {

            context.draw(image, in: CGRect(x: 0.0, y: 0.0, width: CGFloat(width), height: CGFloat(height)))

            let buffer = data.assumingMemoryBound(to: UInt8.self)

            for i in 0..<totalBytes {
                print(buffer[i])
            }
        }
    }
}

but I cannot get it to work with CGImageAlphaInfo.last, so I'm not all that confident this is really what's in the TIFF. But for the test pixel [120, 145, 195, 170] I get [155, 116, 97, 170] (I guess the RGB order is reversed?)
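
As an aside, one way to look at the decoded bytes without going through a CGContext draw (which is itself a place where premultiplication can happen) might be to read the image's data provider directly. A sketch; the byte layout depends on image.alphaInfo and image.bitmapInfo:

if let imageSource = CGImageSourceCreateWithURL(url as CFURL, nil),
   let image = CGImageSourceCreateImageAtIndex(imageSource, 0, nil),
   let data = image.dataProvider?.data as Data? {
    // Bytes are in the image's native layout, not drawn through a context.
    print(Array(data.prefix(16)))
}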

I can't see a direct way to read the bytes in the tiff.

Export it off the device and decode it with e.g. ImageMagick.

ImageMagick gives this:

(233,174,145,170) #E9AE91AA srgba(233,174,145,0.666667)

So it seems the problem is in the saving rather than the loading.

How are you reading and comparing pixel values (in the first example)? Looking at the values before and after, the ratio is a constant 1:1.2, so it can't be a matter of gamma encoding (there the ratio would differ with the channel value, since the gamma curve isn't linear).

However, while this ratio indeed doesn't match the alpha value, it does suspiciously match the alpha value had it been gamma-encoded: 1 / (170/255)^0.4545 ≈ 1.2. So I'm guessing that somewhere along the way the alpha channel is being gamma-encoded (or assumed to be) where it shouldn't be.
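
To double-check that arithmetic (a throwaway sketch):

import Foundation

let alpha = 170.0 / 255.0
let ratio = 1.0 / pow(alpha, 1.0 / 2.2)   // gamma-encode alpha (1/2.2 ≈ 0.4545), then take the reciprocal
print(ratio)                               // ≈ 1.2, the observed before/after ratio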

So more info is needed regarding the mechanism of measurement (and possibly also the pixel format of the textures being used).

I'm reading the texture data like this:

let region = MTLRegionMake2D(0, 0, w, h)
var a = Array<UInt8>(repeating: 0, count: 4 * w * h)

texture.getBytes(&a, bytesPerRow: 4 * MemoryLayout<UInt8>.size * w, from: region, mipmapLevel: 0)

I create textures like this:

let descriptor = MTLTextureDescriptor.texture2DDescriptor(pixelFormat: .bgra8Unorm_srgb,
                                                          width: width,
                                                          height: height,
                                                          mipmapped: false)

descriptor.usage = MTLTextureUsage(rawValue: MTLTextureUsage.renderTarget.rawValue
    | MTLTextureUsage.shaderRead.rawValue
    | MTLTextureUsage.shaderWrite.rawValue)
let texture = device.makeTexture(descriptor: descriptor)!

And for testing I'm filling the textures with known pixel data like this:

var a: [UInt8] = ...
texture.replace(region: region, mipmapLevel: 0, withBytes: &a, bytesPerRow: 4 * MemoryLayout<UInt8>.size * w)

Then I load the textures from the TIFF data with MTKTextureLoader:

func loadTexture(data: Data) throws -> MTLTexture {
    let usage = MTLTextureUsage(rawValue: MTLTextureUsage.renderTarget.rawValue
        | MTLTextureUsage.shaderRead.rawValue
        | MTLTextureUsage.shaderWrite.rawValue)

    return try loader.newTexture(data: data, options: [
        MTKTextureLoader.Option.textureUsage: usage.rawValue,
        MTKTextureLoader.Option.origin: MTKTextureLoader.Origin.flippedVertically.rawValue
    ])
}
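
For reference, the comparison itself is roughly this (a sketch; `original` and `reloaded` are hypothetical names for the before/after textures, and `w`, `h`, and `region` come from the snippets above):

var before = [UInt8](repeating: 0, count: 4 * w * h)
var after = [UInt8](repeating: 0, count: 4 * w * h)
original.getBytes(&before, bytesPerRow: 4 * w, from: region, mipmapLevel: 0)
reloaded.getBytes(&after, bytesPerRow: 4 * w, from: region, mipmapLevel: 0)
print(before == after ? "round trip OK" : "bytes differ")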