AVAssetResourceLoaderDelegate and TS files error: Error Domain=CoreMediaErrorDomain Code=-12881 "custom url not redirect"


I've used AVAssetResourceLoaderDelegate and successfully implemented ways to store mp3 and mp4 files.

The problem starts when I try to implement the same thing for HLS. The master playlists (m3u8) are working perfectly fine, but when I try the same for .ts files (video segments), I get this error: Error Domain=CoreMediaErrorDomain Code=-12881 "custom url not redirect"

I've tried everything and even using the redirect approach to a filepath URL fails.

The only possible solution seems to be running an internal HTTP server, but that doesn't make sense to me. If I'm implementing AVAssetResourceLoaderDelegate, then I should be able to feed AVFoundation the data needed to play a video, right?

Thanks in advance. Any help would be much appreciated.


Apple won't allow you to redirect a segment to a custom protocol; only the m3u8 playlist can use one.

I'm not sure whether you can handle the HTTP protocol manually and filter the segment requests yourself, but any HTTP request will come to you first.

I am also facing the same issue. I want to play a cached .ts segment if it is available, and otherwise fetch it from the server, but I'm getting the same error mentioned above.

Any solution would be helpful.
AVFoundation requires the ability to load HLS media segments directly in order to drive its bit rate adaptation. What is your use case for loading media data yourself? There might be a different way to accomplish what you are trying to do.
Thanks for your reply. I am creating a custom playlist (m3u8) and passing it to the player, and it is working fine with the custom scheme.
In shouldWaitForLoadingOfRequestedResource, when there is a request for a .ts segment with the custom scheme, I replace the custom scheme with https:

    NSURLRequest *redirect = nil;
    NSString *url = loadingRequest.request.URL.absoluteString;
    NSRange tldr = [url rangeOfString:@"//"];

    if (![url hasPrefix:@"http"] && tldr.location != NSNotFound) {
        // Swap the custom scheme for https and redirect the request.
        url = [url stringByReplacingCharactersInRange:NSMakeRange(0, tldr.location) withString:@"https:"];
        redirect = [NSURLRequest requestWithURL:[NSURL URLWithString:url]];
        [loadingRequest setRedirect:redirect];
        NSHTTPURLResponse *response = [[NSHTTPURLResponse alloc] initWithURL:[redirect URL]
                                                                  statusCode:302
                                                                 HTTPVersion:nil
                                                                headerFields:nil];
        [loadingRequest setResponse:response];
        [loadingRequest finishLoading];
    }

and it is working fine.

My question is how I can serve an already-downloaded .ts file: if the segment has been downloaded, return that; otherwise do the redirect above.

For offline (downloaded) cases you should use AVFoundation to download and play your HLS assets (AVAssetDownloadTask, AVAggregateAssetDownloadTask, and friends).

There will be a WWDC2020 session posted on Friday, "Discover how to download and play HLS offline" that you may wish to check out.
Thanks for your quick reply. Suppose I have multiple .ts files for a video; some of them are pre-cached and some need to be fetched from the server. I want to play the video like that.

I need help with this.

AVAssetDownloadTask and friends support play-while-download. You can ask it to begin downloading TS files and then start playback before they are all downloaded. You can also specify different media selections to download different languages etc.
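
The play-while-download flow described above can be sketched roughly as follows. This is a minimal illustration, not a complete implementation: the HLS URL, session identifier, asset title, and delegate wiring are all placeholder assumptions.

    // Sketch: download an HLS asset with AVAssetDownloadTask and start
    // playback before the download completes, by playing the task's URLAsset.
    #import <AVFoundation/AVFoundation.h>

    - (void)startDownloadAndPlay {
        NSURL *hlsURL = [NSURL URLWithString:@"https://example.com/master.m3u8"]; // placeholder
        AVURLAsset *asset = [AVURLAsset assetWithURL:hlsURL];

        NSURLSessionConfiguration *config =
            [NSURLSessionConfiguration backgroundSessionConfigurationWithIdentifier:@"hls-download"];
        AVAssetDownloadURLSession *session =
            [AVAssetDownloadURLSession sessionWithConfiguration:config
                                          assetDownloadDelegate:self
                                                  delegateQueue:[NSOperationQueue mainQueue]];

        AVAssetDownloadTask *task =
            [session assetDownloadTaskWithURLAsset:asset
                                        assetTitle:@"My Video"
                                  assetArtworkData:nil
                                           options:nil];
        [task resume];

        // Playing the task's URLAsset serves already-downloaded segments from
        // disk and fetches the remainder from the network.
        AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:task.URLAsset];
        self.player = [AVPlayer playerWithPlayerItem:item];
        [self.player play];
    }

In this approach AVFoundation manages the segment cache itself, so no resource loader delegate is needed for the segments at all.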

Is there a particular reason you want to control your own download?
Thanks for your quick reply. I am creating the manifest file myself and also downloading all the .ts files in advance. When the user taps a video to play it, I pass the manifest data to the player through the asset delegate, along with the .ts data if the file has been downloaded, or a valid URL otherwise so the player can fetch that segment from the server rather than the cache.
I am using custom schemes so that I get callbacks in the asset delegate.

I already have the manifest file data and a dummy m3u8 URL with a custom scheme.

I want to pre-cache some portion of the video.

Okay, I understand all that. What I'm missing is why you can't do all that using AVAssetDownloadTask, including downloading the .ts files in advance.
Okay, I will try AVAssetDownloadTask.
One question: can we do something like this using AVAssetResourceLoadingRequest for a .ts segment?

    NSString *cachedPath = [[DownloadManager sharedManager] fileFullPathOfURL:contentURL];
    if ([[NSFileManager defaultManager] fileExistsAtPath:cachedPath] && loadingRequest.dataRequest) {
        NSData *tsData = [NSData dataWithContentsOfURL:[NSURL fileURLWithPath:cachedPath]];
        [loadingRequest.dataRequest respondWithData:tsData];
        [loadingRequest finishLoading];
        return TRUE;
    }

If I am returning the cached .ts file data then I am getting the error: Error Domain=CoreMediaErrorDomain Code=-12881 "custom url not redirect"

The only response to an AVAssetResourceLoadingRequest for a segment (.ts) file that AVPlayer will accept is a redirect to an HTTP URL. respondWithData will be rejected.
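
In practice, that means a locally cached segment can only be served by redirecting its request to an HTTP URL handled on the device. A rough sketch of what that delegate method could look like, assuming a hypothetical local server on port 8080 and reusing the DownloadManager lookup from earlier in the thread:

    // Sketch: redirect cached .ts requests to a local HTTP server instead of
    // calling respondWithData (which AVPlayer rejects for media segments).
    // The 127.0.0.1:8080 server is an assumption for illustration.
    - (BOOL)resourceLoader:(AVAssetResourceLoader *)resourceLoader
        shouldWaitForLoadingOfRequestedResource:(AVAssetResourceLoadingRequest *)loadingRequest {
        NSURL *requested = loadingRequest.request.URL;
        if (![requested.pathExtension isEqualToString:@"ts"]) {
            return NO; // only intercept segment requests here
        }
        NSString *cachedPath = [[DownloadManager sharedManager] fileFullPathOfURL:requested.absoluteString];
        BOOL cached = [[NSFileManager defaultManager] fileExistsAtPath:cachedPath];

        NSURLComponents *c = [NSURLComponents componentsWithURL:requested resolvingAgainstBaseURL:NO];
        if (cached) {
            c.scheme = @"http";
            c.host = @"127.0.0.1";
            c.port = @8080;          // local server vending the cached bytes
        } else {
            c.scheme = @"https";     // fall through to the origin server
            c.port = nil;
        }
        [loadingRequest setRedirect:[NSURLRequest requestWithURL:c.URL]];
        NSHTTPURLResponse *resp = [[NSHTTPURLResponse alloc] initWithURL:c.URL
                                                              statusCode:302
                                                             HTTPVersion:nil
                                                            headerFields:nil];
        [loadingRequest setResponse:resp];
        [loadingRequest finishLoading];
        return YES;
    }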
Thanks for the support. That is exactly what I wanted to know: we cannot use respondWithData for a .ts file. Now I will try AVAssetDownloadTask.

First off, this thread has been very helpful!

The only response to an AVAssetResourceLoadingRequest for a segment (.ts) file that AVPlayer will accept is a redirect to an HTTP URL. respondWithData will be rejected.

I am wondering if the same limitation exists with .fmp4 files?

I want to feed real-time video delivered using WebRTC into an AVPlayerLayer that is backing AVPictureInPictureController. I am doing this since AVPictureInPictureController does not support AVSampleBufferDisplayLayer, which is how I would think to solve this problem. After watching the WWDC 2020 sessions, I am using AVAssetWriter's segmentation APIs to transcode the input and then generating the m3u8.

Users can tolerate some delay if the content is a presentation screen share, but I don't want to run a local webserver when I already have the fmp4s and m3u8 in memory. The transfer of the segments is going to be very quick since it will be happening on device.

Thanks again for the helpful discussion.
Yes, the same limitation exists with any media segment (TS, fmp4, packed audio, WebVTT, whatever).

You should file a bug describing your use case to ask for AVSampleBufferDisplayLayer support in AVPictureInPictureController.

Until you get that you can look into using LL-HLS vended from an HTTP server running on the device. HLS isn't really suited to low-latency applications but LL-HLS is much better.

Note that there's no particular reason that a web server needs to hit the disk. At the end of the day it's just an application that is listening on a TCP port and responding to HTTP GET requests.
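
To illustrate that point, an in-memory HTTP responder can be very small. This is a deliberately simplistic single-request sketch (no threading, no error handling, naive request parsing) that answers GETs from an NSDictionary of path-to-NSData without ever touching the disk:

    // Sketch: bind a loopback TCP port, read one HTTP GET, and answer it
    // from an in-memory dictionary of path -> segment bytes.
    #import <Foundation/Foundation.h>
    #include <netinet/in.h>
    #include <sys/socket.h>
    #include <unistd.h>

    static void serveOnce(uint16_t port, NSDictionary<NSString *, NSData *> *segments) {
        int listenFD = socket(AF_INET, SOCK_STREAM, 0);
        struct sockaddr_in addr = {0};
        addr.sin_family = AF_INET;
        addr.sin_port = htons(port);
        addr.sin_addr.s_addr = htonl(INADDR_LOOPBACK);   // only reachable on-device
        bind(listenFD, (struct sockaddr *)&addr, sizeof(addr));
        listen(listenFD, 8);

        int fd = accept(listenFD, NULL, NULL);
        char buf[4096];
        ssize_t n = read(fd, buf, sizeof(buf) - 1);
        buf[n > 0 ? n : 0] = '\0';

        // Parse "GET /path HTTP/1.1" and look the path up in memory.
        NSArray<NSString *> *parts =
            [[NSString stringWithUTF8String:buf] componentsSeparatedByString:@" "];
        NSData *body = parts.count > 1 ? segments[parts[1]] : nil;

        NSString *head = body
            ? [NSString stringWithFormat:@"HTTP/1.1 200 OK\r\nContent-Length: %lu\r\n\r\n",
                                         (unsigned long)body.length]
            : @"HTTP/1.1 404 Not Found\r\nContent-Length: 0\r\n\r\n";
        write(fd, head.UTF8String, strlen(head.UTF8String));
        if (body) write(fd, body.bytes, body.length);
        close(fd);
        close(listenFD);
    }

A production version would accept connections on a background queue and loop, but the shape stays the same: the segment bytes go straight from memory onto the socket.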
I have an open bug report here: FB7747223 AVPictureInPictureController does not work with AVSampleBufferDisplayLayer