Selectively reading sample buffers from specific time ranges and then writing them to an asset writer - why is the AVPlayer stuck loading?

Given an AVAsset, I'm performing a Vision trajectory request on it and would like to write out a video asset that only contains frames with trajectories (filtering out the downtime in sports footage where there's no ball moving).

I'm unsure what would be a good approach, but as a starting point I tried the following pipeline:

  1. Copy sample buffer from the source AVAssetReaderOutput.
  2. Perform trajectory request on a vision handler parameterized by the sample buffer.
  3. For each resulting VNTrajectoryObservation (trajectory detected), use its associated CMTimeRange to configure a new AVAssetReader set to that time range.
  4. Append the time-range-constrained sample buffer to a single AVAssetWriterInput until the forEach completes.

In code:

private func transferSamplesAsynchronously(from readerOutput: AVAssetReaderOutput,
                                       to writerInput: AVAssetWriterInput,
                                       onQueue queue: DispatchQueue,
                                       sampleBufferProcessor: SampleBufferProcessor,
                                       completionHandler: @escaping () -> Void) {
    /*
     The writerInput continuously invokes this closure until it's finished or
     cancelled. Calling requestMediaDataWhenReady(on:using:) more than once on
     the same writer input throws an NSInternalInconsistencyException.
    */
    writerInput.requestMediaDataWhenReady(on: queue) {
        var isDone = false

        /*
         While the writerInput accepts more data, process the sampleBuffer
         and then transfer the processed sample to the writerInput.
        */
        while writerInput.isReadyForMoreMediaData {
            if self.isCancelled {
                isDone = true
                break
            }

            // Get the next sample from the asset reader output.
            guard let sampleBuffer = readerOutput.copyNextSampleBuffer() else {
                // The asset reader output has no more samples to vend.
                isDone = true
                break
            }

            let visionHandler = VNImageRequestHandler(cmSampleBuffer: sampleBuffer, orientation: self.orientation, options: [:])

            do {
                try visionHandler.perform([self.detectTrajectoryRequest])
                if let results = self.detectTrajectoryRequest.results {
                    try results.forEach { result in
                        let assetReader = try AVAssetReader(asset: self.asset)
                        assetReader.timeRange = result.timeRange

                        let trackOutput = AVTrackOutputs.firstTrackOutput(ofType: .video, fromTracks: self.asset.tracks,
                                                                          withOutputSettings: nil)

                        assetReader.add(trackOutput)

                        assetReader.startReading()

                        guard let sampleBuffer = trackOutput.copyNextSampleBuffer() else {
                            // The asset reader output has no more samples to vend.
                            isDone = true
                            return
                        }

                        // Append the sample to the asset writer input.
                        guard writerInput.append(sampleBuffer) else {
                            /*
                             The writer could not append the sample buffer.
                             The `readingAndWritingDidFinish()` function handles any
                             error information from the asset writer.
                            */
                            isDone = true
                            return
                        }
                    }
                }
            } catch {
                print(error)
            }
        }

        if isDone {
            /*
             Calling `markAsFinished()` on the asset writer input does the
             following:
             1. Unblocks any other inputs needing more samples.
             2. Cancels further invocations of this "request media data"
             callback block.
            */
            writerInput.markAsFinished()

            /*
             Tell the caller the reader output and writer input finished
             transferring samples.
            */
            completionHandler()
        }
    }
}

private func readingAndWritingDidFinish(assetReaderWriter: AVAssetReaderWriter,
                                    completionHandler: @escaping FinishHandler) {
    if isCancelled {
        completionHandler(.success(.cancelled))
        return
    }

    // Handle any error during processing of the video.
    guard sampleTransferError == nil else {
        assetReaderWriter.cancel()
        completionHandler(.failure(sampleTransferError!))
        return
    }

    // Evaluate the result of reading the samples.
    let result = assetReaderWriter.readingCompleted()
    if case .failure = result {
        completionHandler(result)
        return
    }

    /*
     Finish writing, and asynchronously evaluate the results from writing
     the samples.
    */
    assetReaderWriter.writingCompleted { result in
        completionHandler(result)
        return
    }
}

When I run this, no error is caught in the first catch clause, no error is reported in readingAndWritingDidFinish(assetReaderWriter:completionHandler:), and the completion handler is called. Yet when I try to play the output, AVPlayer appears to be stuck loading indefinitely.

Help with any of the following questions would be appreciated:

  • What is causing what appears to be indefinite loading?
  • How might I isolate the problem further?
  • Am I misusing or misunderstanding how to selectively read from time ranges of AVAssetReader objects?
  • Should I forgo the AVAssetReader / AVAssetWriter route entirely and use the time ranges with AVAssetExportSession instead (see the rough sketch after this list)? I don't know how the two approaches compare, or what to consider when choosing between them.
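
For reference, here's a rough sketch of the AVAssetExportSession approach I have in mind. The output URL and preset are just placeholders, and the time range would come from a VNTrajectoryObservation:

import AVFoundation

private func exportTrajectoryClip(from asset: AVAsset,
                                  timeRange: CMTimeRange,
                                  to outputURL: URL,
                                  completionHandler: @escaping (AVAssetExportSession.Status) -> Void) {
    guard let exportSession = AVAssetExportSession(asset: asset,
                                                   presetName: AVAssetExportPresetHighestQuality) else {
        completionHandler(.failed)
        return
    }
    exportSession.outputURL = outputURL
    exportSession.outputFileType = .mov
    // Constrain the export to the observation's time range.
    exportSession.timeRange = timeRange
    exportSession.exportAsynchronously {
        completionHandler(exportSession.status)
    }
}

As I understand it, each export session writes one file per time range, so stitching the ranges back into a single video would still need something like an AVMutableComposition.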

Replies

It looks like you may be missing a call to AVAssetWriter.finishWriting
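
For example, something along these lines after the input is marked finished, assuming assetWriter is the AVAssetWriter that owns writerInput (you'd likely also move your completionHandler() call into the finish block):

writerInput.markAsFinished()

assetWriter.finishWriting {
    switch assetWriter.status {
    case .completed:
        // The output movie is now finalized and playable.
        completionHandler()
    default:
        // Inspect assetWriter.error for details on why writing failed.
        print(assetWriter.error ?? "Unknown asset writer error")
    }
}

Until finishWriting(completionHandler:) runs and the writer reaches .completed, the output file isn't finalized, which could explain a player that never finishes loading.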