AVPlayer & AVPlayerItemVideoOutput playback problem on El Capitan

Hi,


We have an OpenGL-based slideshow application (Boinx FotoMagico) that uses AVFoundation for movie playback. Our existing code worked fine up to the last release of Yosemite, but with the release of El Capitan it broke and is now really unreliable. Movie playback sometimes works, depending on the machine: on some Macs it always works, on others almost never, while on yet other machines it works about 50% of the time. This leads me to believe we have a timing-related problem. Has anybody encountered this problem and can shed some light on it? Am I doing something fundamentally wrong? Is this a known issue in 10.11? Are there any workarounds?

Here's part of the relevant code. We have a class that wraps movie playback and access to the OpenGL textures. To prepare for playback, the -load method is called on a background queue:


- (void) load
{
     // Create a player...

     NSURL* url = [NSURL fileURLWithPath:self.path];
     self.player = [AVPlayer playerWithURL:url];
     [self.player setVolume:0.0];


     // Get metadata from its asset...

     self.playerItem = self.player.currentItem;
     AVAsset* asset = self.playerItem.asset;

     if (asset)
     {
          // Duration...

          CMTime duration = asset.duration;
          _duration = CMTimeGetSeconds(duration);
          _timescale = duration.timescale;

          NSArray* tracks = [asset tracksWithMediaType:AVMediaTypeVideo];

          if (tracks.count > 0)
          {
               AVAssetTrack* track = [tracks objectAtIndex:0];

               // Size...

               CGSize size = track.naturalSize;
              _movieSize = NSMakeSize(size.width,size.height);
              _physicalWidth = _logicalWidth = _movieSize.width;
              _physicalHeight = _logicalHeight = _movieSize.height;

               // Orientation: find out if the video was shot vertically and needs to be rotated to be viewed correctly...

               CGAffineTransform transform = [track preferredTransform];
               double radians = atan2(transform.b,transform.a);
               double degrees = radians * 180.0 / M_PI;
               _rotationOffset = -degrees;
          }
     }

     // Create a video output, and add it to the playerItem...

     NSDictionary* attributes =
     @{
          (NSString*)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32ARGB),
//        (NSString*)kCVPixelBufferBytesPerRowAlignmentKey: @1,
          (NSString*)kCVPixelBufferOpenGLCompatibilityKey: @YES
     };

     self.output = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:attributes];
     [self.playerItem addOutput:self.output];

     // Make sure we are notified once the player is ready for playback...

     [self.playerItem addObserver:self forKeyPath:kStatusKey options:NSKeyValueObservingOptionInitial context:&kStatusKey];
     [self.player addObserver:self forKeyPath:kStatusKey options:NSKeyValueObservingOptionInitial context:&kStatusKey];

     self.isPrepared = YES;
}


In the -observeValueForKeyPath: method we check whether the AVPlayer and AVPlayerItem are ready for playback. Once they are, we seek to the correct starting time and set a flag that is observed by our engine. Once the flag is YES, the engine calls -play on the AVPlayer.


// Check if enough of the movie has loaded so that we can start playing...

- (void) observeValueForKeyPath:(NSString*)inKeyPath ofObject:(id)inObject change:(NSDictionary*)inChange context:(void*)inContext
{
     if (inContext == &kStatusKey)
     {
          __weak typeof(self) weakSelf = self;
          AVPlayer* player = self.player;
          AVPlayerItem* playerItem = self.playerItem;

          if (player.status == AVPlayerStatusReadyToPlay && playerItem.status == AVPlayerItemStatusReadyToPlay)
          {
               CMTime time = CMTimeMakeWithSeconds(_inPoint,_timescale);

               [player seekToTime:time completionHandler:^(BOOL inFinished)
               {
                    weakSelf.isReadyToPlay = YES;
               }];
          }
     }
     else
     {
          [super observeValueForKeyPath:inKeyPath ofObject:inObject change:inChange context:inContext];
     }
}


Our rendering engine uses a display link to draw OpenGL content. For movies, the render method is called to get the movie frame textures. This is where the problem occurs: when movie playback doesn't work, we always get NO from hasNewPixelBufferForItemTime:, and it never recovers from this situation. As I mentioned before, it sometimes works and sometimes doesn't. It seems to be completely random and doesn't depend on the movie file itself.


// If a new image is available then copy it. Use CoreVideo to get the frames...

- (BOOL) render
{
     BOOL isNewImageAvailable = NO;
     AVPlayerItem* playerItem = self.playerItem;
     AVPlayerItemVideoOutput* output = self.output;

     if (playerItem != nil && output != nil && self.isStarted)
     {
          CFTimeInterval t = CACurrentMediaTime();
          CMTime itemTime = [output itemTimeForHostTime:t];

          if ([output hasNewPixelBufferForItemTime:itemTime])
          {
               CMTime presentationTime = kCMTimeZero;
               CVPixelBufferRef buffer = [output copyPixelBufferForItemTime:itemTime itemTimeForDisplay:&presentationTime];

               if (buffer)
               {
                    CVOpenGLTextureRef texture = NULL;
                    CVPixelBufferLockBaseAddress(buffer,kCVPixelBufferLock_ReadOnly); 
                    CVReturn err = CVOpenGLTextureCacheCreateTextureFromImage(kCFAllocatorDefault,self.textureCache,buffer,0,&texture);

                    if (texture)
                    {
                         if (err == noErr)
                         {
                              self.texture = texture;
                              self.tilesAreLoaded = YES;
                              isNewImageAvailable = YES;
                         }

                         CVOpenGLTextureRelease(texture);
                    }

                    CVOpenGLTextureCacheFlush(self.textureCache, 0);
                    CVPixelBufferUnlockBaseAddress(buffer,kCVPixelBufferLock_ReadOnly);
                    CVPixelBufferRelease(buffer);
               }
          }
     }

     return isNewImageAvailable;
}


Hope that somebody can shed light on this issue. Any help is greatly appreciated.


Peter


P.S. Radar Bug ID is 23593655

Having the exact same problem.


Edit: As for workarounds, AVPlayerLayer seems to work reliably but there is a constant stream of error messages to the console about the FBO being incomplete (which it's not).

Are you suggesting that I attach an AVPlayerLayer even though it's useless in my OpenGL based rendering? Will attaching this AVPlayerLayer make the original AVPlayerItemVideoOutput behave correctly then? Should any side effects be expected?

Sorry, no. What I'm saying is that you can use AVPlayerLayer instead of AVPlayerItemVideoOutput to get video frames into OpenGL.

Did you happen to make any progress with this?

This probably isn't a great workaround, but it's maybe a little better and could help shed some light on the issue. When the problem happens, I remove the output, wait a frame, and re-add it. Some percentage of the time, the problem goes away. It typically takes between 1 and a dozen frames before I start seeing video, and as awful as this is, it's better than nothing.


I suppose it's too much to hope that Apple decides to address this issue. Was there ever any movement on your radar bug report?


  CMTime outputItemTime = [output itemTimeForHostTime:CACurrentMediaTime()];
  if ([output hasNewPixelBufferForItemTime:outputItemTime])
  {
       CVPixelBufferRef pix = [output copyPixelBufferForItemTime:outputItemTime itemTimeForDisplay:nil];
       last = [CIImage imageWithCVPixelBuffer:pix];
       CVPixelBufferRelease(pix);
  }
  else if (last == nil)     // Keep trying until we get a frame
  {
       if (movie.currentItem.outputs.count == 0)
            [movie.currentItem addOutput:output];
       else
            [movie.currentItem removeOutput:output];
  }

We have a very similar setup to baumgartner's (an AVPlayer outputting to an AVPlayerItemVideoOutput, with frames driven by a display link).


This has been working perfectly for about 2 years now, on 10.8, 10.9 and 10.10. With the release of 10.11, we are experiencing the exact same problems with hasNewPixelBufferForItemTime: returning NO, even though the AVPlayerItem status is AVPlayerItemStatusReadyToPlay.


Our workaround is to release the player and output resources and reload the asset until we get a valid frame (similar to the workaround mentioned above). Obviously this causes noticeable stalls when loading assets.


We have been forced to recommend that our users not upgrade to 10.11 until this gets sorted out.


Thanks for any advice.

Hello all,


Got a similar problem: it's very random, but from our experience (after heavy testing), I can say it happens on all Macs, and only on OS X El Capitan.

The best workaround we found is the same as Lagnat's: if the problem happens, remove the AVPlayerItemVideoOutput, create a new one, then add it to the AVPlayerItem.


I created the report #24725691 on bugreport.apple.com (not sure if everyone can access it)


Best. Philippe

Hello all.


I got the same problem with my implementation. After trying the solutions proposed here, I think I finally found the reliable way to do things.


The AVPlayerItemVideoOutput must be created AFTER the AVPlayerItem status is ready to play.

So

  1. Create player & player item, dispatch queue and display link
  2. Register observer for AVPlayerItem status key
  3. On AVPlayerItemStatusReadyToPlay, create the AVPlayerItemVideoOutput, add it to the player item, and start the display link
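
In code, the reordering above might look roughly like this (just a sketch, reusing the kStatusKey observer context from the first post; -startDisplayLink is a hypothetical helper):

  - (void) observeValueForKeyPath:(NSString*)inKeyPath ofObject:(id)inObject change:(NSDictionary*)inChange context:(void*)inContext
  {
       if (inContext == &kStatusKey)
       {
            // Only create and attach the output once the item reports ready...
            if (self.playerItem.status == AVPlayerItemStatusReadyToPlay && self.output == nil)
            {
                 NSDictionary* attributes =
                 @{
                      (NSString*)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32ARGB),
                      (NSString*)kCVPixelBufferOpenGLCompatibilityKey: @YES
                 };

                 self.output = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:attributes];
                 [self.playerItem addOutput:self.output];
                 [self startDisplayLink];   // hypothetical: start the CVDisplayLink only now
            }
       }
       else
       {
            [super observeValueForKeyPath:inKeyPath ofObject:inObject change:inChange context:inContext];
       }
  }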


Thanks to all for the inspiration

Renaud

Awesome! This worked! Actually, we can still create the AVPlayerItemVideoOutput at the beginning and just make sure that it is added to the AVPlayerItem after the status is ready. Thank you Renaud!

Oh wow. Thank you Renaud.


This is an issue that has plagued one of our iOS applications for months, affecting about 2% of all of our video plays, and this is the first non-hack fix that has actually worked (many others just recreated everything after 100 NOs, or similar hacks).


As Nandiin said, as long as the output is added after the status is AVPlayerItemStatusReadyToPlay, everything works absolutely fine.

I believe there is a problem in the texturing that stems from a misunderstanding of who owns the GL textures created by CVOpenGLTextureCacheCreateTextureFromImage. I believe the texture cache is the owner of the GL texture object. Therefore, when you call CVOpenGLTextureRelease shortly after taking the textureName, the GL texture is in fact returned to the pool of textures, so CoreVideo can immediately reuse it. You must hold on to the reference count of the CVOpenGLTextureRef until you are done with the GL texture it owns.


Also, I believe the call to CVPixelBufferLockBaseAddress is unnecessary because for an OpenGL compatible CVPixelBuffer, the backing will be an IOSurfaceRef which means the pixel data is already on the GPU and the GL texture will be created via a GPU copy (blit). The call to CVPixelBufferLockBaseAddress will cause the IOSurfaceRef to be mapped into system memory which is a very expensive operation on some GPU models. If CVOpenGLTextureCacheCreateTextureFromImage needs to lock the pixel buffer I'm sure it will do so. In my implementations which use XXXTextureCacheCreateFromImage (iOS,Mac,Metal) I simply pass the CVPixelBufferRef without locking the base address and it's working fine.
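
Putting those two points together, the texturing part of the -render method from the first post could be restructured along these lines (just a sketch under my assumptions; I'm assuming self.texture is managed so that the previous frame's reference is released only when replaced):

       CVPixelBufferRef buffer = [output copyPixelBufferForItemTime:itemTime itemTimeForDisplay:NULL];

       if (buffer)
       {
            // No CVPixelBufferLockBaseAddress: the IOSurface-backed buffer stays
            // on the GPU and the GL texture is created via a GPU copy (blit).
            CVOpenGLTextureRef texture = NULL;
            CVReturn err = CVOpenGLTextureCacheCreateTextureFromImage(kCFAllocatorDefault, self.textureCache, buffer, NULL, &texture);

            if (err == kCVReturnSuccess && texture)
            {
                 // Don't release the new texture here; release the previous frame's
                 // texture instead, now that we are done drawing with it.
                 if (self.texture) CVOpenGLTextureRelease(self.texture);
                 self.texture = texture;
                 isNewImageAvailable = YES;
            }

            CVOpenGLTextureCacheFlush(self.textureCache, 0);
            CVPixelBufferRelease(buffer);
       }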


There is a similar misuse of the CoreVideo APIs in this Metal sample code, which uses the Metal equivalent API CVMetalTextureCacheCreateTextureFromImage. This code sorta works on iOS but is totally broken on Mac, because the id<MTLTexture> returned from the call to CVMetalTextureGetTexture(textureRef) is really only valid until the CVMetalTextureCache decides to recycle the underlying storage, which can be very soon since on the Mac it looks like the pool is only two textures.

https://developer.apple.com/library/content/samplecode/MetalVideoCapture/Listings/MetalVideoCapture_AAPLRenderer_mm.html


I ported AVBasicVideoOutput to Metal on both Mac & iOS but am having some initial problems with YCrCb texturing under OpenGL on the desktop (which is how I found this thread). I believe Apple is preferring Metal internally, and if you can switch to Metal I'm sure you'll be pleased with the increased performance and simplified macOS/iOS unified rendering code.

https://developer.apple.com/library/content/samplecode/AVBasicVideoOutput/Listings/AVBasicVideoOutput_APLViewController_m.html

Regarding my recent adventures to port AVBasicVideoOutput to Metal and Desktop GL, I believe the fastest texturing path on OSX is to use kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange and use IOSurface backed CVPixelBuffers. Then, create two textures, one for each of the two planes in the IOSurfaceRef and use a GLSL shader to combine them.


  NSDictionary* pixelBufferAttributesBiPlanarYCrCb = [NSDictionary dictionaryWithObjectsAndKeys:
       @(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange), (id)kCVPixelBufferPixelFormatTypeKey,
       @(YES), (id)kCVPixelBufferOpenGLCompatibilityKey,
       @(YES), (id)kCVPixelBufferIOSurfaceOpenGLTextureCompatibilityKey,
       [NSDictionary dictionary], (id)kCVPixelBufferIOSurfacePropertiesKey,
       nil];


I'm fairly new to Core Profile GL, but it appears that CGLTexImageIOSurface2D only works with rectangle textures:

// luma/luminance texture, full resolution

CGLTexImageIOSurface2D(glContext,GL_TEXTURE_RECTANGLE,GL_R8,(GLsizei)textureSize.width,(GLsizei)textureSize.height,GL_RED,GL_UNSIGNED_BYTE,surfaceRef, 0);

// chroma texture, subsampled

CGLTexImageIOSurface2D(glContext,GL_TEXTURE_RECTANGLE, GL_RG8, (GLsizei)planeSize.width, (GLsizei)planeSize.height, GL_RG, GL_UNSIGNED_BYTE, surfaceRef, 1);

This is much closer conceptually to the working code on EAGL that apple provides in AVBasicVideoOutput.
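
For completeness, the "combine them" step can be a fragment shader along these lines (a sketch using video-range Rec. 709 coefficients; the sampler names and the luma-pixel texCoord convention are my own assumptions):

  #version 150
  uniform sampler2DRect lumaTexture;    // GL_R8, full resolution (plane 0)
  uniform sampler2DRect chromaTexture;  // GL_RG8, half resolution (plane 1)
  in vec2 texCoord;                     // luma-plane pixel coordinates
  out vec4 fragColor;

  void main()
  {
       float y    = texture(lumaTexture, texCoord).r;
       vec2  cbcr = texture(chromaTexture, texCoord * 0.5).rg;   // chroma is subsampled 2x

       // Expand video-range Y (16..235) and CbCr (16..240, centered at 128)
       float yv = (y - 16.0/255.0) * (255.0/219.0);
       vec2  c  = (cbcr - 128.0/255.0) * (255.0/224.0);

       // Rec. 709 YCbCr -> RGB
       fragColor = vec4(yv + 1.5748 * c.y,
                        yv - 0.1873 * c.x - 0.4681 * c.y,
                        yv + 1.8556 * c.x,
                        1.0);
  }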

Hi DrXibber,


Were you able to get AVPlayer + OpenGL interop working?


I've been trying to figure out how to use AVPlayer and render/share OpenGL (not ES) textures, but I haven't found a way to get this working.


I have some code which starts playback (I can hear audio) but the textures are all "black" (I guess they are simply invalid).


UPDATE:

Just after posting this I realised that I was using GL_TEXTURE_2D instead of GL_TEXTURE_RECTANGLE. When I changed to GL_TEXTURE_RECTANGLE I saw decoded frames. Though, performance is still way behind what I would expect. But this might be because I'm requesting RGBA32 frames. I'll experiment with YUV.


Best

N

I've just started getting this problem too, on iOS 11.0, but only in a particular circumstance I can't figure out: when I render to my main view it works, but when I render to another view it doesn't.


I know this is a macOS question, but the setup code is almost the same. I wrote a test case.


#import <XCTest/XCTest.h>
#import <AVFoundation/AVFoundation.h>
#import <GLKit/GLKit.h>
@interface AVPlayerTestTests : XCTestCase
//
@property (nonatomic)   EAGLContext *context;
@end
@implementation AVPlayerTestTests
- (void)setUp {
    [super setUp];
    self.context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
    [EAGLContext setCurrentContext:self.context];
}
- (void)tearDown {
    //
    [super tearDown];
}
- (void) testAVPlayer
{
    NSDictionary *pbOptions = @{
                                (NSString *)kCVPixelBufferPixelFormatTypeKey        : [NSNumber numberWithInt:kCVPixelFormatType_32BGRA],
                                (NSString *)kCVPixelBufferOpenGLESCompatibilityKey  : @YES
                                };
    AVPlayerItemVideoOutput *output = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:pbOptions];
    XCTAssertNotNil(output);
    NSURL *fileURL = [[NSBundle bundleForClass:self.class] URLForResource:@"SampleVideo_1280x720_10mb" withExtension:@"mp4"];
    AVPlayerItem *playerItem = [AVPlayerItem playerItemWithURL:fileURL];
    AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];
    if (playerItem.status != AVPlayerItemStatusReadyToPlay) {
        [self keyValueObservingExpectationForObject:playerItem
                                            keyPath:@"status"
                                            handler:
         ^BOOL(id  _Nonnull observedObject, NSDictionary * _Nonnull change) {
             AVPlayerItem *oPlayerItem = (AVPlayerItem *)observedObject;
             switch (oPlayerItem.status) {
                 case AVPlayerItemStatusFailed:
                 {
                     XCTFail(@"Video failed");
                     return YES;
                 }
                     break;
                 case AVPlayerItemStatusUnknown:
                     return NO;
                     break;
                 case AVPlayerItemStatusReadyToPlay:
                     return YES;
                     break;
             }
             return NO;    // unreachable for the declared enum values, but keeps the block well-formed
         }];
        [self waitForExpectationsWithTimeout:100 handler:nil];
    }
    if (playerItem.status == AVPlayerItemStatusReadyToPlay) {
        [playerItem addOutput:output];
        player.rate = 1.0;
        player.muted = YES;
        [player play];
        CMTime vTime = [output itemTimeForHostTime:CACurrentMediaTime()];
        //
        BOOL foundFrame = [output hasNewPixelBufferForItemTime:vTime];
        XCTAssertTrue(foundFrame);
        if (!foundFrame) {
            //
            for (int i = 0; i < 10; i++) {
                sleep(1);
                vTime = [output itemTimeForHostTime:CACurrentMediaTime()];
                foundFrame = [output hasNewPixelBufferForItemTime:vTime];
                if (foundFrame) {
                    NSLog(@"Got frame at %i", i);
                    break;
                } else {
                    NSLog(@"Current time = %f", CACurrentMediaTime());
                    NSLog(@"Calculate time = %lld", vTime.value);
                }
                if (i == 9) {
                    XCTFail(@"Failed to acquire");
                }
            }
        }
    }
}
@end


I noticed that when it's not working, you get an unusual time from itemTimeForHostTime:, either 0 or very large, so I think that's the key to the problem: why is it not returning a sensible time? (EDIT: this was a red herring, I am getting correct times.)

I made a GitHub repo with a demo project for iOS (it wouldn't take much to add a macOS target).


It has a single test that currently fails.


https://github.com/seriouscyrus/AVPlayerTest


I also raised a bug back in February, but it has had zero feedback: 30555311
