I'm using Core Data to store images with alpha. I save the data encoded as HEIC, but when I try to use the images in an NSCollectionView the decoding process is too slow: about 0.2 s for a 256px by 256px image.
I notice that performance is sometimes better if there are only one or two images, if all the images are the same size, if the images are stored as app assets, or if I decode the same data a second time. It's as if there were some kind of macOS cache, but I don't know.
But for general purposes, I can't get reasonable decoding performance when I use HEIC images with alpha in an NSCollectionView.
This happens only on the Mac (MacBook Pro 2019); on the iPhone the decoding time is fine.
Am I doing something wrong?
I appreciate any help.
These are the steps to reproduce the problem:
1) Create 4 or 5 very different HEIC images with alpha, at different sizes.
2) Read the image files' data and decode it with:
import Foundation
import ImageIO

// Do not include the pictures as assets in the project. We read the raw
// file data instead of the image itself, to emulate Core Data.
let imgDataA = try! Data(contentsOf: anyUrl)
let imgDataB = ...
let imgDataC = ...
let imgDataD = ...
let imgDataE = ...

readImage(data: imgDataA)
readImage(data: imgDataB)
readImage(data: imgDataC)
readImage(data: imgDataD)
readImage(data: imgDataE)

func readImage(data: Data) {
    let sourceOpt = [kCGImageSourceShouldCache: false] as CFDictionary
    let imgOpt = [kCGImageSourceShouldCacheImmediately: true] as CFDictionary
    let source = CGImageSourceCreateWithData(data as CFData, sourceOpt)!
    let timer = Date()
    let image = CGImageSourceCreateImageAtIndex(source, 0, imgOpt)
    print("Time", -timer.timeIntervalSinceNow, image?.width ?? 0, image?.height ?? 0)
}
3) Check the printed times. Sometimes it is 0.01 s (fine) and other times it is 0.20 s (not usable for browsing).
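Since decoding seems faster when all the images are the same size, here is a variant of readImage that forces every decode through the thumbnail API at one fixed pixel size, so every image goes through the same decode configuration. This is only a sketch (the helper name and the 256 px value are my own choices); I don't know whether it actually avoids the slow path:

```swift
import Foundation
import ImageIO

// Decode via CGImageSourceCreateThumbnailAtIndex at a fixed target size,
// instead of decoding the image at its native size.
func readImageAsThumbnail(data: Data, maxPixelSize: Int = 256) {
    let sourceOpt = [kCGImageSourceShouldCache: false] as CFDictionary
    guard let source = CGImageSourceCreateWithData(data as CFData, sourceOpt) else { return }
    let thumbOpt = [
        kCGImageSourceCreateThumbnailFromImageAlways: true, // decode even if there is no embedded thumbnail
        kCGImageSourceShouldCacheImmediately: true,         // force the actual pixel decode now
        kCGImageSourceThumbnailMaxPixelSize: maxPixelSize
    ] as CFDictionary
    let timer = Date()
    let image = CGImageSourceCreateThumbnailAtIndex(source, 0, thumbOpt)
    print("Thumb time", -timer.timeIntervalSinceNow, image?.width ?? 0, image?.height ?? 0)
}
```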