Data Buffers and Analog video

I am working with a device that sends analog video to an iOS device in the YUY2 format, which is kCVPixelFormatType_422YpCbCr8_yuvs in Core Video. The video is sent over HTTP one frame at a time. This is similar to another device I implemented in the past, which delivers MPEG video one frame at a time.


What I have done historically is fill a (NS)Data object with each frame as it arrives and then render the frame from that Data object as a UIImage. This works great with MPEG but does not work with the YUY2 data.


After doing a bit of research, I think I need to treat the data as a CVPixelBufferPool and render from that, but I don't know how to convert the Data to a CVPixelBufferPool. Of course, I could be barking up the wrong tree. I would greatly appreciate guidance on this matter, either converting Data to a CVPixelBufferPool or an alternative approach. Thanks in advance.

You don’t need to convert the data at all (or if there’s a conversion, you can have Core Video do it for you).


If you do something similar to the post you linked, you can just copy the raw bytes out of the Data object into the pixel buffer’s storage, then pass the pixel buffer on to whatever is writing out frames. You just have to make sure the pixel buffer is configured with the same pixel format (and any other relevant bitmap parameters, such as color space) as the incoming data.


Note that, unlike that code sample, you probably should not create a new pixel buffer for each incoming frame, but use a pool of pixel buffers. The writer should return the pixel buffers to the pool after use, and you should wait for and use an available pixel buffer for the next incoming frame.
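A minimal sketch of that pool setup (assuming 640×480 'yuvs' frames purely for illustration; the attribute choices here are mine, not from any linked post):

```swift
import CoreVideo

// Create a pool that keeps a few 640x480 YUY2 ('yuvs') buffers around.
var pool: CVPixelBufferPool?
let poolAttributes: [CFString: Any] = [
    kCVPixelBufferPoolMinimumBufferCountKey: 3
]
let pixelBufferAttributes: [CFString: Any] = [
    kCVPixelBufferWidthKey: 640,
    kCVPixelBufferHeightKey: 480,
    kCVPixelBufferPixelFormatTypeKey: kCVPixelFormatType_422YpCbCr8_yuvs
]
CVPixelBufferPoolCreate(kCFAllocatorDefault,
                        poolAttributes as CFDictionary,
                        pixelBufferAttributes as CFDictionary,
                        &pool)

// For each incoming frame, take a buffer from the pool. The buffer goes
// back to the pool automatically when the last reference to it is released,
// so the writer "returns" it simply by letting go of it.
var frameBuffer: CVPixelBuffer?
CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, pool!, &frameBuffer)
```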

Thanks! So I am trying the following in a test app, in the ViewController, but it's not working. Any ideas?


    var url: URL!
    var buffer: Data!

    override func viewDidLoad() {
        super.viewDidLoad()
        let path = Bundle.main.path(forResource: "gst-video", ofType: "yuy2")!
        self.url = URL(fileURLWithPath: path)

        // According to a hex editor, each frame is 614400 long
        let data = try! Data(contentsOf: self.url)
        let sub = data.subdata(in: 0..<614400)
        self.buffer = sub
        
    }
    
    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        
        var pixelBufferOut: CVPixelBuffer?
        let attributes : [NSObject:AnyObject] = [
            kCVPixelBufferCGImageCompatibilityKey : true as AnyObject,
            kCVPixelBufferCGBitmapContextCompatibilityKey : true as AnyObject
        ]
        CVPixelBufferCreateWithBytes(kCFAllocatorDefault, 640, 480, kCVPixelFormatType_422YpCbCr8_yuvs, UnsafeMutablePointer(&buffer), 8, nil, nil, attributes as CFDictionary, &pixelBufferOut)
        
        guard let pixelBuffer = pixelBufferOut else {
            return
        }
        
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
        let ciContext = CIContext(options: nil)
        let frameImage = ciContext.createCGImage(ciImage, from: ciImage.extent)
        
        let frame = UIImage(cgImage: frameImage!)
    }

The frame is not a valid video frame; it's just a bunch of color bands.

614400 / (640*480) comes out as 2, or two bytes per pixel in the input. kCVPixelFormatType_422YpCbCr8_yuvs looks to be a 1-byte-per-pixel format.
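As a quick sanity check of that arithmetic (my own numbers; note also that the CVPixelBufferCreateWithBytes call above passes 8 as its bytesPerRow argument, where a packed 2-bytes-per-pixel format would want 640 × 2 = 1280):

```swift
// Frame geometry implied by the 614400-byte frame size seen in the hex editor.
let frameBytes = 614_400
let width = 640
let height = 480
let bytesPerPixel = frameBytes / (width * height)  // 2 bytes per pixel
let bytesPerRow = width * bytesPerPixel            // 1280 bytes per row
print(bytesPerPixel, bytesPerRow)
```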


In any case, I was suggesting that you should not create an image at all. Instead, if there's no data conversion needed, you should simply lock the pixel buffer base address, copy the data bytes into it, and pass the pixel buffer on to the output writer.
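A hedged sketch of that copy, assuming a packed (non-planar) buffer and that `pixelBuffer` and `frameData` already exist (the helper name is mine):

```swift
import CoreVideo
import Foundation

// Copy one frame of packed YUY2 bytes into an existing pixel buffer.
func fill(_ pixelBuffer: CVPixelBuffer, with frameData: Data) {
    CVPixelBufferLockBaseAddress(pixelBuffer, [])
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, []) }

    guard let dest = CVPixelBufferGetBaseAddress(pixelBuffer) else { return }
    let destRowBytes = CVPixelBufferGetBytesPerRow(pixelBuffer)
    let height = CVPixelBufferGetHeight(pixelBuffer)
    let srcRowBytes = frameData.count / height  // e.g. 614400 / 480 = 1280

    frameData.withUnsafeBytes { (src: UnsafeRawBufferPointer) in
        // Copy row by row, in case Core Video padded each row of the buffer
        // (destRowBytes may be larger than the tightly packed source rows).
        for row in 0..<height {
            memcpy(dest + row * destRowBytes,
                   src.baseAddress! + row * srcRowBytes,
                   min(srcRowBytes, destRowBytes))
        }
    }
}
```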


However, if you want to avoid creating a pixel buffer for every frame (which is likely to be a bit slow, if that matters), you will need to create a pixel buffer pool of 2 or more pixel buffers, using them as they become available.

Thanks for the further reply, Quincey. I found the reference identifying kCVPixelFormatType_422YpCbCr8_yuvs as YUY2 at a random link, so it may be incorrect. I'm not sure how to determine the correct one, though.


Could you perhaps demonstrate the code? I'm relatively junior, and I'm having trouble picturing exactly what you are recommending, though I am extremely eager to learn. The methods for creating a CVPixelBuffer all require a pixel format type like kCVPixelFormatType_422YpCbCr8_yuvs, and I can't see how to copy bytes into a CVPixelBuffer other than at creation. Or do I lock the base address and use memcpy?

There's no simple code sample, because the details depend on the format of the data.


>> Or do i lock the base address and use memcpy?


Yes, you can do exactly that. There are two things to think about to get this right:


1. Avoid buffer overflow. You can easily get the rowBytes and height from the pixel buffer, and multiply them to find out how big it is. If the size doesn't exactly match the input, you need to think about why, since that suggests the format doesn't match. The only complication here is if you discover that the pixel buffer is planar, in which case you might have to do 2 or 3 separate copies, one per plane.


2. Match pixel formats. This is a bit harder, because it requires you to know what pixel format you have in the input, and what pixel format is expected in the output. You might find some help by working through the Wikipedia page on the subject: en.wikipedia.org/wiki/YUV
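For packed 4:2:2 specifically, Core Video has two constants that differ only in component order; if the geometry is right but the colors are scrambled, the other ordering is the usual suspect (the orderings below are from the CoreVideo headers):

```swift
import CoreVideo

// Both are packed 4:2:2, averaging 2 bytes per pixel.
let yuy2 = kCVPixelFormatType_422YpCbCr8_yuvs  // 'yuvs': Y0 Cb Y1 Cr (YUY2 / YUYV)
let uyvy = kCVPixelFormatType_422YpCbCr8       // '2vuy': Cb Y0 Cr Y1 (UYVY)
```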


It's fairly normal that the hard part of a problem like this is figuring out the naming. Once you've done that, copying the data should be relatively easy.

Thanks! I'll see what I can do. I appreciate your taking the time.
