How is it possible to enable EDR on Apple TV without AVFoundation for custom HDR video playback? The use case is a custom video player for HDR playback via VideoToolbox and Metal, which renders colors correctly on iOS but not on tvOS.
All related documentation and WWDC sessions describe APIs that are unavailable for tvOS:
let metalLayer = CAMetalLayer()
metalLayer.wantsExtendedDynamicRangeContent = true // unavailable on tvOS
metalLayer.edrMetadata = CAEDRMetadata.hdr10(minLuminance: 0.0, maxLuminance: 1000, opticalOutputScale: 100) // CAEDRMetadata is unavailable on tvOS
What's the alternative path for tvOS to have correct system tone mapping for a setup like:
metalLayer.pixelFormat = .rgba16Float // (or .bgr10_xr)
metalLayer.colorspace = CGColorSpace(name: CGColorSpace.itur_2100_PQ)
Video format: HEVC, YUV 4:2:0 10-bit, BT.2020 PQ.
We do set preferredDisplayCriteria on AVDisplayManager, so video range matching is in place (see the sketch below).
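For reference, the range-matching call is along these lines (a minimal sketch; the asset URL and window are placeholders, and both APIs are available on tvOS):

import AVKit
import AVFoundation
import UIKit

// Ask the display to switch to the asset's dynamic range and refresh rate.
func matchDisplay(to assetURL: URL, in window: UIWindow) {
    let asset = AVURLAsset(url: assetURL)
    window.avDisplayManager.preferredDisplayCriteria = asset.preferredDisplayCriteria
}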
WWDC Ref: https://developer.apple.com/videos/play/wwdc2022/110565?time=557
EDR is Apple's High Dynamic Range representation and rendering pipeline.
I'm playing around with HDR images in iOS. I'm able to load an HDR image, but it isn't displaying with the expected "pop" I see for the same image in the Photos app. I exported the photo from Photos (on the Mac) using Export Unmodified, then reimported it to confirm Photos still shows the "pop."
I'm trying both UIImage and CIImage, with the same results. The below is tested on my iPhone 14 Pro (not Simulator).
The Storyboard for the code below has three UIImageViews (top, middle, and bottom).
let fileURL = Bundle.main.url(forResource: "IMG_6972", withExtension:"HEIC")!
var config = UIImageReader.Configuration()
config.prefersHighDynamicRange = true
let imageReader = UIImageReader(configuration: config)
let topImage = imageReader.image(contentsOf: fileURL)!
topImageView.preferredImageDynamicRange = .high
topImageView.image = topImage
// The image appears, looks SDR
print("topImage.isHighDynamicRange: \(topImage.isHighDynamicRange)") // true
let ciimage = CIImage(contentsOf: fileURL)!
bottomImageView.image = UIImage(ciImage: ciimage)
// The image appears, looks SDR - identical to topImage
print("color space: \(ciimage.colorSpace!)") // (kCGColorSpaceICCBased; kCGColorSpaceModelRGB; Display P3)
let gainMap = CIImage(contentsOf: fileURL, options: [.auxiliaryHDRGainMap: true])!
middleImageView.image = UIImage(ciImage: gainMap)
// The gain map image appears as expected
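One variation for the CIImage path (an assumption on my part, relying on iOS 17's .expandToHDR option in Core Image) is to ask Core Image to apply the gain map at load time:

// Sketch, assuming iOS 17+: .expandToHDR asks Core Image to apply the
// embedded gain map so the loaded image is HDR rather than SDR.
let expandedImage = CIImage(contentsOf: fileURL, options: [.expandToHDR: true])!
print("expanded color space: \(expandedImage.colorSpace!)")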
Additionally, for comparison, I tried:
let sdrImage = UIImage(named: "IMG_6972.HEIC")! // UIImage(named:) doesn't support HDR
print("sdrImage.isHighDynamicRange: \(sdrImage.isHighDynamicRange)") // false, as expected
bottomImageView.image = sdrImage
and the image matches the other two, further confirming the HDR version of the image isn't displaying in either the UIImageReader or CIImage version above.
I also track the UIScreen's use of EDR with:
print("currentEDR: \(UIScreen.main.currentEDRHeadroom)")
print("maxEDR: \(UIScreen.main.potentialEDRHeadroom)")
maxEDR is 8.0. currentEDR is 1.0 at launch and 1.3 after a 1.0-second delay, which makes me think it might be showing a little HDR, but not enough to be noticeable.
Is there something special I need to do to put the UIViewController, UIScreen, or UIApplication (etc.) into an "HDR mode" or similar?
I have attempted to use VideoMaterial with an HDR HLS stream, and also a TextureResource.DrawableQueue with rgba16Float in a ShaderGraphMaterial.
I'm capturing to 64RGBAHalf with AVPlayerItemVideoOutput and converting that to rgba16Float, roughly as sketched below.
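For reference, the capture side is along these lines (a minimal sketch; playerItem stands for the AVPlayerItem being played):

import AVFoundation

// Request 16-bit float RGBA frames (kCVPixelFormatType_64RGBAHalf)
// from the player item.
let attributes: [String: Any] = [
    kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_64RGBAHalf
]
let videoOutput = AVPlayerItemVideoOutput(pixelBufferAttributes: attributes)
playerItem.add(videoOutput)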
I don't believe it's displaying HDR properly or behaving like a raw AVPlayer.
Since we can't configure any EDR metadata or color space for a RealityView, how do we display HDR video? Is using rgba16Float supposed to be enough?
Is expecting the 64RGBAHalf capture to handle HDR properly a mistake and should I capture YUV and do the conversion myself?
Thank you
I can't get Core Image to render an HDR image file with correct colors to a CAMetalLayer on macOS 14. I'm comparing the result with NSImageView and the SupportingHDRImagesInYourApp 'HDRDemo23' sample code, both of which use CVPixelBuffer. With CAMetalLayer, the images are displayed as HDR (definitely more highlights), but they're darker, with some kind of saturation increase and color shift.
Files I've tested include the sample ISO HDR files in the SupportingHDRImagesInYourApp sample code. Methods I've tried to render to CAMetalLayer include:
Modifying the GeneratingAnAnimationWithACoreImageRenderDestination sample code's ContentView so it uses HDRDemo23/example-ISO-HDR-images/image_01.heic, loaded using CIImage(contentsOf:)
Creating a test AppKit app that uses MTKView and CIRenderDestination the same way. I have NSImageView and the MTKView in the same window for comparison.
Using CIRAWFilter > CIRenderDestination > IOSurface > MTKView/CAMetalLayer
All these methods produce an image with the exact same appearance: a dark HDR image with some saturation/color shift.
The only clue I've found is that when using the Metal Debugger on the test AppKit app, the CAMetalLayer's 'Present' shows the 'input' thumbnail is HDR without the color shift, but the 'output' thumbnail looks like what I actually see. I tried changing the color profile on the layer to various things but nothing looked more correct.
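For reference, the layer configuration being varied looks roughly like this (a sketch; extendedLinearDisplayP3 is just one candidate, and the CIRenderDestination's color space presumably needs to match it):

let metalLayer = CAMetalLayer()
metalLayer.wantsExtendedDynamicRangeContent = true
metalLayer.pixelFormat = .rgba16Float
// Candidate color space for EDR rendering; must agree with the CI render destination.
metalLayer.colorspace = CGColorSpace(name: CGColorSpace.extendedLinearDisplayP3)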
I've tried this on two Macs, an M1 Mac Studio with an LG display, and a MacBook Air M2. The MacBook Air shows the same color shift, but since it has less dynamic range overall there isn't as much difference between NSImageView and MTKView.
Related APIs: CAMetalLayer, wantsExtendedDynamicRangeContent, potentialEDRHeadroom, currentEDRHeadroom, [UIScreen mainScreen].brightness.
Premise: I am a video player developer. My understanding of currentEDRHeadroom from Apple's developer documentation is that if I set wantsExtendedDynamicRangeContent = YES on a view's layer, the value of currentEDRHeadroom will rise according to the current system brightness, up to a ceiling of potentialEDRHeadroom.
This was not a problem on iOS 16, at least.
Question: Device: iPad Pro 12.9 (2022 model), iPadOS 17.2. Reference Mode off, Night Shift off, auto-brightness off (leaving it on gives the same result), using the same picture or video each time.
Steps:
(1): Set [UIScreen mainScreen].brightness to 1.0 (or use system controls to set brightness to 100%), then create the UIView hosting the CAMetalLayer, set wantsExtendedDynamicRangeContent = YES, and start rendering. potentialEDRHeadroom is always 16 (this has always been the case on the iPad Pro 12.9 2022 model; presumably fixed by the hardware), and currentEDRHeadroom is about 3.7-3.9, meaning the display can deliver roughly 3-4x the standard 100% brightness. At this point no pixels in my picture are overexposed.
(2): Set [UIScreen mainScreen].brightness to 0.8 (or use system controls to set brightness to 80%) or lower, such as 50%, then create the UIView hosting the CAMetalLayer and set wantsExtendedDynamicRangeContent = YES. Now raise [UIScreen mainScreen].brightness back to 1.0 (100%). The maximum value currentEDRHeadroom reaches is only about 1.79, and the entire picture or video has overexposed pixels.
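A minimal sketch of the setup used in both steps (names illustrative; iOS 16+ APIs):

// Create the EDR-capable layer at the current system brightness...
let metalLayer = CAMetalLayer()
metalLayer.wantsExtendedDynamicRangeContent = true
view.layer.addSublayer(metalLayer)

// ...then change brightness and sample the headroom.
UIScreen.main.brightness = 1.0
print("potential: \(UIScreen.main.potentialEDRHeadroom), current: \(UIScreen.main.currentEDRHeadroom)")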
Bug or Feature: My impression is that this problem did not exist in iPadOS 16, and neither Apple's documentation nor the sessions in the Developer app mention that the ceiling of currentEDRHeadroom depends on the system brightness at the time the CAMetalLayer is created; they only talk about real-time system brightness. Is this a bug in iPadOS 17.2, or is it intended behavior going forward?