Why VideoMaterial can't show transparency on Apple Vision Pro

https://developer.apple.com/documentation/realitykit/videomaterial
The documentation: "Video materials support transparency if the source video’s file format also supports transparency."

I have a video with transparency (Hand.mov, HEVC with alpha). It displays with a transparent background correctly in the visionOS Simulator, but on a physical device the video has a black background. I'm sure the video format is fine, because I can extract the texture from the video and display it correctly on an UnlitMaterial.

How can I show the transparent video correctly with RealityKit's VideoMaterial?
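For context, here is a minimal sketch of the setup in question: loading an HEVC-with-alpha video into a VideoMaterial and attaching it to an entity. The asset name "Hand.mov" comes from the question; the plane dimensions are arbitrary placeholders.

```swift
import AVFoundation
import RealityKit

// Minimal sketch: attach an HEVC-with-alpha video to an entity via VideoMaterial.
// "Hand" / "mov" is the asset from the question; adjust to your bundle contents.
func makeVideoEntity() -> ModelEntity? {
    guard let url = Bundle.main.url(forResource: "Hand", withExtension: "mov") else {
        return nil
    }
    let player = AVPlayer(url: url)

    // VideoMaterial is documented to support transparency when the source
    // file format carries an alpha channel (e.g. HEVC with Alpha).
    let material = VideoMaterial(avPlayer: player)

    // A plane sized roughly to the video's aspect ratio (placeholder values).
    let entity = ModelEntity(
        mesh: .generatePlane(width: 0.4, height: 0.3),
        materials: [material]
    )
    player.play()
    return entity
}
```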

Answered by Vision Pro Engineer in 869000022

Hey @XanderXu,

I'm not able to replicate what you are seeing. A VideoMaterial provided with an MPEG-4 (HEVC with Alpha, AAC audio) video should display with transparency on both the simulator and the device.

I'd greatly appreciate it if you could open a bug report, include a sample that replicates the issue and a sysdiagnose from your device, and post the FB number here once you do.

Bug Reporting: How and Why? has tips on creating your bug report.

Thanks,
Michael


VideoCustomCompositor.swift:

import Foundation
import AVFoundation
import RealityKit

enum CustomCompositorError: Int, Error, LocalizedError {
    case noSourceTrackIDs = -1_000_001
    case sourceFrameUnavailable
    case notSupportingMoreThanOneSources

    var errorDescription: String? {
        switch self {
        case .noSourceTrackIDs:
            return "The composition instruction contains no source track IDs."
        case .sourceFrameUnavailable:
            return "No source frame is available for the required track ID."
        case .notSupportingMoreThanOneSources:
            return "This custom compositor does not support blending more than one source."
        }
    }
}

nonisolated
final class VideoCustomCompositor: NSObject, AVVideoCompositing, @unchecked Sendable {

    private var isCancelled = false
    private var request: AVAsynchronousVideoCompositionRequest?

    // 32BGRA preserves the alpha channel decoded from HEVC with Alpha.
    var sourcePixelBufferAttributes: [String: any Sendable]? = [
        String(kCVPixelBufferPixelFormatTypeKey): [kCVPixelFormatType_32BGRA],
        // Metal compatibility is required so RealityKit can sample the buffers.
        String(kCVPixelBufferMetalCompatibilityKey): true
    ]
    var requiredPixelBufferAttributesForRenderContext: [String: any Sendable] = [
        String(kCVPixelBufferPixelFormatTypeKey): [kCVPixelFormatType_32BGRA],
        String(kCVPixelBufferMetalCompatibilityKey): true
    ]

    func renderContextChanged(_ newRenderContext: AVVideoCompositionRenderContext) {
        // No state depends on the render context.
    }

    func cancelAllPendingVideoCompositionRequests() {
        isCancelled = true
        request?.finishCancelledRequest()
        request = nil
    }

    func startRequest(_ request: AVAsynchronousVideoCompositionRequest) {
        self.request = request
        isCancelled = false // Reset cancellation status for the new request.

        guard let requiredTrackIDs = request.videoCompositionInstruction.requiredSourceTrackIDs,
              !requiredTrackIDs.isEmpty else {
            request.finish(with: CustomCompositorError.noSourceTrackIDs)
            return
        }

        guard requiredTrackIDs.count == 1 else {
            request.finish(with: CustomCompositorError.notSupportingMoreThanOneSources)
            return
        }

        guard let trackID = requiredTrackIDs[0].value(of: Int32.self),
              let sourceBuffer = request.sourceFrame(byTrackID: trackID) else {
            request.finish(with: CustomCompositorError.sourceFrameUnavailable)
            return
        }

        // Pass the source frame through untouched, preserving its alpha channel.
        request.finish(withComposedVideoFrame: sourceBuffer)
    }
}
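A sketch of how this compositor could be wired into a player item feeding a VideoMaterial. The asset name "Hand.mov" is taken from the question, and the async `videoComposition(withPropertiesOf:)` factory assumes a recent SDK (iOS 16 / visionOS era):

```swift
import AVFoundation
import RealityKit

// Sketch: route "Hand.mov" through VideoCustomCompositor so the frames
// reaching VideoMaterial are the 32BGRA pass-through buffers above.
func makeComposedVideoEntity() async throws -> ModelEntity {
    // Assumes the asset from the question is bundled with the app.
    guard let url = Bundle.main.url(forResource: "Hand", withExtension: "mov") else {
        throw CocoaError(.fileNoSuchFile)
    }
    let asset = AVURLAsset(url: url)

    // Build a video composition from the asset's properties, then install
    // the custom compositor class.
    let composition = try await AVMutableVideoComposition.videoComposition(withPropertiesOf: asset)
    composition.customVideoCompositorClass = VideoCustomCompositor.self

    let item = AVPlayerItem(asset: asset)
    item.videoComposition = composition

    let player = AVPlayer(playerItem: item)
    let material = VideoMaterial(avPlayer: player)
    let entity = ModelEntity(
        mesh: .generatePlane(width: 0.4, height: 0.3), // placeholder size
        materials: [material]
    )
    player.play()
    return entity
}
```

Note that even with a 32BGRA pass-through, whether VideoMaterial renders the alpha channel still depends on the source file actually carrying alpha, per the documentation quoted at the top of the thread.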