How to render multiple videos simultaneously using 'AVPlayerItemVideoOutput'?

I have a 3D scene rendered with Metal on an iOS device. The goal is to have multiple videos playing simultaneously, each mapped onto a surface in the scene. I am using 'AVPlayerItemVideoOutput' to extract video frames, and everything works as expected when a single video is playing.

The problem is that as soon as a second video starts playing simultaneously with the first, using the exact same method (i.e. extracting frames with 'AVPlayerItemVideoOutput'), the first 'AVPlayerItemVideoOutput' object returns false when its 'hasNewPixelBufferForItemTime:' method is called. I am creating completely separate 'AVPlayer', 'AVPlayerItem', 'AVPlayerItemVideoOutput', etc. instances for each video.
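For reference, here is a minimal sketch of the per-video setup described above. The class name 'VideoSurfacePlayer', the BGRA pixel-format choice, and the display-link polling helper are illustrative assumptions, not taken from the original project:

```swift
import AVFoundation
import QuartzCore

// One self-contained playback pipeline per video surface.
// The question reports that creating several of these and playing
// them at once makes the first output stop reporting new frames.
final class VideoSurfacePlayer {
    let player: AVPlayer
    let videoOutput: AVPlayerItemVideoOutput

    init(url: URL) {
        // Request BGRA buffers, which map directly to a Metal texture format.
        // (Assumed format; any CVPixelBuffer format Metal accepts would do.)
        let attributes: [String: Any] = [
            kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
        ]
        videoOutput = AVPlayerItemVideoOutput(pixelBufferAttributes: attributes)

        let item = AVPlayerItem(url: url)
        item.add(videoOutput)
        player = AVPlayer(playerItem: item)
    }

    /// Call once per display-link tick; returns the latest frame, if any.
    func copyLatestFrame() -> CVPixelBuffer? {
        let itemTime = videoOutput.itemTime(forHostTime: CACurrentMediaTime())
        guard videoOutput.hasNewPixelBuffer(forItemTime: itemTime) else {
            // With two players running, this guard starts failing
            // permanently for the first player's output.
            return nil
        }
        return videoOutput.copyPixelBuffer(forItemTime: itemTime,
                                           itemTimeForDisplay: nil)
    }
}
```

Each video surface gets its own instance (e.g. `VideoSurfacePlayer(url: videoURL)` followed by `player.play()`), and the Metal render loop calls `copyLatestFrame()` per surface to update its texture.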

Is this a limitation, or is there something wrong with my setup? Is there an alternative way to achieve this?
This is unexpected. It would be helpful if you created a ticket via Feedback Assistant with a project reproducing this problem, including any media necessary to reproduce it. If you post the FB number you get here, we can forward it to our CoreMedia Playback team to investigate.