AVPlayer & AVPlayerItemVideoOutput playback problem on El Capitan

Hi,


We have an OpenGL-based slideshow application (Boinx FotoMagico) that uses AVFoundation for movie playback. Our existing code worked fine up to the last release of Yosemite, but with the release of El Capitan it broke and is really unreliable now. Movie playback sometimes works, depending on the machine. On some Macs it always works, on others almost never, while on yet other machines it works about 50% of the time. This leads me to believe we have a timing-related problem. Has anybody encountered this problem and can shed some light on it? Am I doing something fundamentally wrong? Is this a known issue in 10.11? Are there any workarounds?

Here's part of the relevant code. We have a class that wraps movie playback and access to the OpenGL textures. To prepare for playback, the -load method is called on a background queue:


- (void) load
{
     // Create a player...

     NSURL* url = [NSURL fileURLWithPath:self.path];
     self.player = [AVPlayer playerWithURL:url];
     [self.player setVolume:0.0];


     // Get metadata from its asset...

     self.playerItem = self.player.currentItem;
     AVAsset* asset = self.playerItem.asset;

     if (asset)
     {
          // Duration...

          CMTime duration = asset.duration;
          _duration = CMTimeGetSeconds(duration);
          _timescale = duration.timescale;

          NSArray* tracks = [asset tracksWithMediaType:AVMediaTypeVideo];

          if (tracks.count > 0)
          {
               AVAssetTrack* track = [tracks objectAtIndex:0];

               // Size...

               CGSize size = track.naturalSize;
              _movieSize = NSMakeSize(size.width,size.height);
              _physicalWidth = _logicalWidth = _movieSize.width;
              _physicalHeight = _logicalHeight = _movieSize.height;

               // Orientation: find out if the video was shot vertically and needs to be rotated to be viewed correctly...

               CGAffineTransform transform = [track preferredTransform];
               double radians = atan2(transform.b,transform.a);
               double degrees = radians * 180.0 / M_PI;
               _rotationOffset = -degrees;
          }
     }

     // Create a video output, and add it to the playerItem...

     NSDictionary* attributes =
     @{
          (NSString*)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32ARGB),
//        (NSString*)kCVPixelBufferBytesPerRowAlignmentKey: @1,
          (NSString*)kCVPixelBufferOpenGLCompatibilityKey: @YES
     };

     self.output = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:attributes];
     [self.playerItem addOutput:self.output];

     // Make sure we are notified once the player is ready for playback...

     [self.playerItem addObserver:self forKeyPath:kStatusKey options:NSKeyValueObservingOptionInitial context:&kStatusKey];
     [self.player addObserver:self forKeyPath:kStatusKey options:NSKeyValueObservingOptionInitial context:&kStatusKey];

     self.isPrepared = YES;
}


In the -observeValueForKeyPath: method we check whether the AVPlayer and AVPlayerItem are ready for playback. Once they are, we seek to the correct starting time and set a flag that is observed by our engine. Once the flag is YES, the engine calls -play on the AVPlayer.


// Check if enough of the movie has loaded so that we can start playing...

- (void) observeValueForKeyPath:(NSString*)inKeyPath ofObject:(id)inObject change:(NSDictionary*)inChange context:(void*)inContext
{
     if (inContext == &kStatusKey)
     {
          __weak typeof(self) weakSelf = self;
          AVPlayer* player = self.player;
          AVPlayerItem* playerItem = self.playerItem;

          if (player.status == AVPlayerStatusReadyToPlay && playerItem.status == AVPlayerItemStatusReadyToPlay)
          {
               CMTime time = CMTimeMakeWithSeconds(_inPoint,_timescale);

               [player seekToTime:time completionHandler:^(BOOL inFinished)
               {
                    weakSelf.isReadyToPlay = YES;
               }];
          }
     }
     else
     {
          [super observeValueForKeyPath:inKeyPath ofObject:inObject change:inChange context:inContext];
     }
}
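
One thing that might be worth trying (an assumption on my part, not a confirmed fix) is to let the output itself signal when media data is about to become available, via the AVPlayerItemOutputPullDelegate protocol, instead of starting to poll as soon as the status flips to ready. A rough sketch, assuming the wrapper class adopts AVPlayerItemOutputPullDelegate:

// In -load, after [self.playerItem addOutput:self.output]:

     [self.output setDelegate:self queue:dispatch_get_main_queue()];
     [self.output requestNotificationOfMediaDataChangeWithAdvanceInterval:0.05];

// Delegate callback, invoked on the queue passed above:

- (void) outputMediaDataWillChange:(AVPlayerItemOutput*)sender
{
     // Only from this point on should the display link start pulling
     // pixel buffers; before this, -hasNewPixelBufferForItemTime:
     // legitimately returns NO.
     self.isStarted = YES;
}

This at least rules out the race where the display link starts asking for frames before the output has any.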


Our rendering engine uses a display link to draw OpenGL content. For movies, the render method is called to get the movie frame textures. This is where the problem occurs: when movie playback doesn't work, we always get NO from -hasNewPixelBufferForItemTime: and it never recovers from this situation. As I mentioned before, it sometimes works and sometimes it doesn't. It seems to be completely random and doesn't depend on the movie file itself.


// If a new image is available then copy it. Use CoreVideo to get the frames...

- (BOOL) render
{
     BOOL isNewImageAvailable = NO;
     AVPlayerItem* playerItem = self.playerItem;
     AVPlayerItemVideoOutput* output = self.output;

     if (playerItem != nil && output != nil && self.isStarted)
     {
          CFTimeInterval t = CACurrentMediaTime();
          CMTime itemTime = [output itemTimeForHostTime:t];

          if ([output hasNewPixelBufferForItemTime:itemTime])
          {
               CMTime presentationTime = kCMTimeZero;
               CVPixelBufferRef buffer = [output copyPixelBufferForItemTime:itemTime itemTimeForDisplay:&presentationTime];

               if (buffer)
               {
                    CVOpenGLTextureRef texture = NULL;
                    CVPixelBufferLockBaseAddress(buffer,kCVPixelBufferLock_ReadOnly); 
                     CVReturn err = CVOpenGLTextureCacheCreateTextureFromImage(kCFAllocatorDefault,self.textureCache,buffer,NULL,&texture);

                    if (texture)
                    {
                          if (err == kCVReturnSuccess)
                         {
                              self.texture = texture;
                              self.tilesAreLoaded = YES;
                              isNewImageAvailable = YES;
                         }

                         CVOpenGLTextureRelease(texture);
                    }

                    CVOpenGLTextureCacheFlush(self.textureCache, 0);
                    CVPixelBufferUnlockBaseAddress(buffer,kCVPixelBufferLock_ReadOnly);
                    CVPixelBufferRelease(buffer);
               }
          }
     }

     return isNewImageAvailable;
}
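
One workaround that might help with the symptom above (this is a sketch based on an assumption, not a confirmed Apple-sanctioned fix): if the output stops reporting new pixel buffers for an extended stretch while the player is supposedly playing, detach it and attach a freshly created one. The _stallCount ivar and -recoverFromStalledOutput method here are hypothetical names; _stallCount would be incremented each time -render comes up empty and reset whenever a buffer arrives:

- (void) recoverFromStalledOutput
{
     if (_stallCount > 30) // roughly half a second of display link ticks with no frames
     {
          [self.playerItem removeOutput:self.output];

          NSDictionary* attributes =
          @{
               (NSString*)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32ARGB),
               (NSString*)kCVPixelBufferOpenGLCompatibilityKey: @YES
          };

          self.output = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:attributes];
          [self.playerItem addOutput:self.output];

          _stallCount = 0;
     }
}

Crude, but if the stall really is a one-time race inside AVFoundation on 10.11, re-adding the output gives it a second chance to connect.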


Hope that somebody can shed light on this issue. Any help is greatly appreciated.


Peter


P.S. Radar Bug ID is 23593655

Well, I got it working in my app through a combination of the above and by fixing a bug in my own code that was pausing the player when I wasn't expecting it.


However, my test still does not work. I can't see anything wrong with the code above; it just doesn't seem to work in a unit test environment.

I think I'm having this issue on OS X with CVMetalTextureCacheCreate. It's unclear to me from your explanation what the fix for this would be.
