I’ve been struggling with a very frustrating issue using the new iOS 26 Swift Concurrency APIs for video processing. My pipeline reads frames using AVAssetReader, processes them via CIContext (Lanczos upscale), and then appends the result to an AVAssetWriter using the new PixelBufferReceiver.
The Problem: The execution randomly stops at the await append(...) call. The task suspends and never resumes.
It is completely unpredictable: It might hang on the very first run, or it might work fine for 4-5 runs and then hang on the next one.
It is independent of video duration: It happens with 5-second clips just as often as with long videos.
No feedback from the system: There is no crash, no error thrown, and CPU usage drops to zero. The task just stays suspended indefinitely.
If I manually cancel the operation and restart the VideoEngine, it usually starts working again for a few more attempts, which makes me suspect some internal resource exhaustion or a deadlock between the GPU context and the writer's input.
The Code: Here is a simplified version of my processing loop:
private func processVideoPipeline(
    readerOutputProvider: AVAssetReaderOutput.Provider<CMReadySampleBuffer<CMSampleBuffer.DynamicContent>>,
    pixelBufferReceiver: AVAssetWriterInput.PixelBufferReceiver,
    nominalFrameRate: Float,
    targetSize: CGSize
) async throws {
    while !Task.isCancelled, let payload = try await readerOutputProvider.next() {
        // Pull the source pixel buffer and timestamp out of the sample buffer.
        let sampleBufferInfo: (imageBuffer: CVPixelBuffer?, presentationTimeStamp: CMTime) = payload.withUnsafeSampleBuffer { sampleBuffer in
            return (sampleBuffer.imageBuffer, sampleBuffer.presentationTimeStamp)
        }
        guard let currentPixelBuffer = sampleBufferInfo.imageBuffer else {
            throw AsyncFrameProcessorError.missingImageBuffer
        }
        guard let pixelBufferPool = pixelBufferReceiver.pixelBufferPool else {
            throw NSError(domain: "PixelBufferPool", code: -1, userInfo: [NSLocalizedDescriptionKey: "No pixel buffer pool available"])
        }

        // Allocate a destination buffer from the writer's pool and upscale into it.
        let newPixelBuffer = try pixelBufferPool.makeMutablePixelBuffer()
        let newCVPixelBuffer = newPixelBuffer.withUnsafeBuffer({ $0 })
        try upscale(currentPixelBuffer, outputPixelBuffer: newCVPixelBuffer, targetSize: targetSize)

        // This is where the task sometimes suspends and never resumes.
        let presentationTime = sampleBufferInfo.presentationTimeStamp
        try await pixelBufferReceiver.append(.init(unsafeBuffer: newCVPixelBuffer), with: presentationTime)
    }
}
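For reference, the upscale step is essentially a CILanczosScaleTransform render into the pooled destination buffer. Here is a simplified sketch (the real code keeps one shared CIContext and does more color-space handling; the error thrown for a nil filter output is just illustrative):

import CoreImage
import CoreImage.CIFilterBuiltins
import CoreVideo

// Shared context; creating a CIContext per frame would be expensive.
private let ciContext = CIContext()

private func upscale(
    _ source: CVPixelBuffer,
    outputPixelBuffer: CVPixelBuffer,
    targetSize: CGSize
) throws {
    let inputImage = CIImage(cvPixelBuffer: source)

    // Lanczos scale to the target height; aspectRatio corrects the width.
    let scale = targetSize.height / inputImage.extent.height
    let filter = CIFilter.lanczosScaleTransform()
    filter.inputImage = inputImage
    filter.scale = Float(scale)
    filter.aspectRatio = Float(targetSize.width / (inputImage.extent.width * scale))

    guard let outputImage = filter.outputImage else {
        throw NSError(domain: "Upscale", code: -1, userInfo: [NSLocalizedDescriptionKey: "Lanczos filter produced no output"])
    }
    // Render directly into the buffer that will be appended to the writer.
    ciContext.render(outputImage, to: outputPixelBuffer)
}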
Does anyone know how to fix it?
I received my approval for FairPlay Streaming (FPS) and was getting things organized, and then my computer crashed. So...
Yes, I lost the 32-digit Account Security Key (ASk) that I was repeatedly warned not to lose...
I understand that I can't query Apple for the existing ASk, and I don't see where I can delete the existing cert to request another one. So I assume I'll need to start from scratch with another FPS approval process.
Can someone please direct me to the next steps for this boneheaded situation?
Thank you
With the help of Claude and Codex, I've tried upgrading a screen magnification, capture, and pixel-peeping app (SnoopX, from Stanford's software repository) to be HDR-aware. For images that are truly HDR (displayed in Apple Photos, Google Photos in Chrome, or Mac Preview), and on an HDR monitor with plenty of EDR headroom (Apple XDR display), I should see pixel values well above 1.0. Apple's Digital Color Meter does. However, my app does not. I think the code is doing all the right things with ScreenCaptureKit, but it never returns values above 1.0. Has anybody gotten this to work? Here is what Codex says about the code it helped me build in my app (a simplified sketch of the resulting configuration is included after the list):
• Set captureDynamicRange:
We set config.captureDynamicRange = SCCaptureDynamicRangeHDRLocalDisplay (when available).
Note: Gemini’s SCCaptureModeHDRLocalDisplay isn’t a real symbol — the actual API is SCCaptureDynamicRange.
• Use preset:
We try initWithPreset: SCStreamConfigurationPresetCaptureHDRStreamLocalDisplay, but your runtime crashes on initWithPreset: (unrecognized selector), so we fall back to manual config. That means your system SDK headers mention the preset, but the runtime doesn’t implement it.
• Pixel format:
We explicitly set kCVPixelFormatType_64RGBAHalf. Your logs show the stream buffer is 64RGBAHalf.
• Enable HDR on display:
EDR headroom logs show current=6.40, so HDR is active.
• Color space:
We set Extended Linear Display P3 for stream, and we forced the screenshot sampler to convert into extended‑linear P3 before reading. The screenshot itself reports Extended sRGB, but conversion doesn’t yield >1.0 either.
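In code, that configuration boils down to roughly the following (a simplified sketch; the function name is mine, and the availability check stands in for the runtime checks the real code does):

import ScreenCaptureKit
import CoreVideo

// Simplified sketch of the HDR stream configuration described above.
// makeHDRConfiguration is an illustrative name; the real code has more checks.
func makeHDRConfiguration(width: Int, height: Int) -> SCStreamConfiguration {
    let config = SCStreamConfiguration()
    config.width = width
    config.height = height

    // Half-float pixels in extended-linear Display P3 so components can exceed 1.0.
    config.pixelFormat = kCVPixelFormatType_64RGBAHalf
    config.colorSpaceName = CGColorSpace.extendedLinearDisplayP3

    // Request HDR capture of the local display where the API exists.
    if #available(macOS 15.0, *) {
        config.captureDynamicRange = .hdrLocalDisplay
    }
    return config
}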
Despite all of that, the OS is still delivering tone‑mapped values (max 1.0000) for both the stream and the screenshot path.
So I don’t think there’s a missing knob here; it looks like a system‑level limitation on Tahoe for desktop capture.
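For completeness, the value check on the stream buffers is just a scan of the half-float components. Roughly (an illustrative helper; it assumes a non-planar 64RGBAHalf buffer and Float16 support, i.e. Apple silicon):

import CoreVideo

// Illustrative helper: scan a kCVPixelFormatType_64RGBAHalf buffer and return
// the largest component value. Alpha is included but is normally 1.0.
func maxComponent(in pixelBuffer: CVPixelBuffer) -> Float {
    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

    guard let base = CVPixelBufferGetBaseAddress(pixelBuffer) else { return 0 }
    let width = CVPixelBufferGetWidth(pixelBuffer)
    let height = CVPixelBufferGetHeight(pixelBuffer)
    let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)

    var maxValue: Float = 0
    for row in 0..<height {
        // Each pixel is four Float16 components (R, G, B, A).
        let rowPtr = (base + row * bytesPerRow).assumingMemoryBound(to: Float16.self)
        for i in 0..<(width * 4) {
            maxValue = max(maxValue, Float(rowPtr[i]))
        }
    }
    return maxValue
}

This scan never reports anything above 1.0 for the captured frames, which matches the tone-mapped values in the logs.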