I am working with a device that sends analog video to an iOS device in the YUY2 format, which is kCVPixelFormatType_422YpCbCr8_yuvs in Core Video. The video is sent over HTTP one frame at a time. This signal is similar to that of another device I have supported in the past, which delivers MPEG video one frame at a time.
What I have done historically is fill an (NS)Data object with each frame as it arrives and then render that Data object as a UIImage. This works great with MPEG but does not work with the YUY2 data.
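For context, the working MPEG path is essentially this (a sketch; the function name is mine):

```swift
import UIKit

// The approach that works for the MPEG device: UIImage can decode
// compressed image data directly. It has no initializer for raw
// interleaved pixel data, though, so this returns nil for YUY2 bytes.
func renderFrame(_ frameData: Data) -> UIImage? {
    return UIImage(data: frameData)
}
```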
After doing a bit of research, I think I need to treat the data as a CVPixelBufferPool and render from that, but I don't know how to convert the Data to a CVPixelBufferPool.
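For what it's worth, here is the direction I have been experimenting with: copying each frame's bytes into a single CVPixelBuffer and handing it to Core Image. This is only a sketch of my attempt; the function names, dimensions, and the assumption of tightly packed source rows are mine, and I am not sure Core Image even accepts this pixel format.

```swift
import CoreImage
import CoreVideo
import UIKit

// Sketch of my attempt: copy the raw YUY2 bytes into a CVPixelBuffer.
// For 4:2:2 interleaved, each pixel is 2 bytes, so a tightly packed
// source row is width * 2 bytes (an assumption about my stream).
func makePixelBuffer(from frameData: Data, width: Int, height: Int) -> CVPixelBuffer? {
    var pixelBuffer: CVPixelBuffer?
    let status = CVPixelBufferCreate(kCFAllocatorDefault,
                                     width,
                                     height,
                                     kCVPixelFormatType_422YpCbCr8_yuvs,
                                     nil,
                                     &pixelBuffer)
    guard status == kCVReturnSuccess, let buffer = pixelBuffer else { return nil }

    CVPixelBufferLockBaseAddress(buffer, [])
    defer { CVPixelBufferUnlockBaseAddress(buffer, []) }

    guard let dest = CVPixelBufferGetBaseAddress(buffer) else { return nil }
    let destBytesPerRow = CVPixelBufferGetBytesPerRow(buffer)
    let srcBytesPerRow = width * 2

    // Copy row by row, since Core Video may pad each row of the buffer.
    frameData.withUnsafeBytes { (src: UnsafeRawBufferPointer) in
        guard let srcBase = src.baseAddress else { return }
        for row in 0..<height {
            memcpy(dest + row * destBytesPerRow,
                   srcBase + row * srcBytesPerRow,
                   srcBytesPerRow)
        }
    }
    return buffer
}

// Then I try to render the buffer through Core Image; I am not certain
// CIImage supports the 'yuvs' pixel format, which may be my problem.
func makeImage(from buffer: CVPixelBuffer) -> UIImage {
    return UIImage(ciImage: CIImage(cvPixelBuffer: buffer))
}
```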
Of course, I could be barking up the wrong tree. I would greatly appreciate guidance on this matter, either on converting Data to a CVPixelBufferPool or on an alternative approach. Thanks in advance.