HTTP Live Streaming


Send audio and video over HTTP from an ordinary web server for playback on Mac, iOS, and tvOS devices using HTTP Live Streaming (HLS).

Posts under the HTTP Live Streaming tag

49 Posts


How to consume video from an RTSP service?
Hi, it seems pretty easy to consume HTTP Live Streaming content in an iOS app. Unfortunately, I need to consume media from an RTSP server. The two seem closely related, and all of the underpinnings for doing it ought to be present in iOS, but I'm having a devil of a time figuring out how to make it work without doing a lot of programming.

For starters, I know there are web-based services that can consume an RTSP stream and rebroadcast it as an HTTP Live Stream that the iOS media players can consume easily. That won't work for me, because my application needs to function in an environment with no internet access (a private Wi-Fi network where the only other thing on the network is the device serving the RTSP stream).

Having read everything I can get my hands on and explored third-party and open-source solutions, I've compiled the following list of ideas:

1. Using an iOS build of the open-source ffmpeg library, which supports RTSP, I've come up with a test app that can receive the RTSP packets, decode them, create UIImages out of the frames, and display those frames on screen. This provides a crude player, but performance is poor, most likely because ffmpeg can't take advantage of any hardware acceleration. It also doesn't give me any way to integrate the video stream into AVFoundation, so I'm on my own as far as saving the stream to a file, transcoding it, and so on.

2. I know that the AVURLAsset class doesn't directly support the RTSP scheme. Since I have access to the undecoded RTSP packets via ffmpeg, I've thought it should be possible to implement RTSP support myself via a custom NSURLProtocol, essentially fooling AVFoundation into reading those packets as if they originated in a file. I'm not sure this would work, since the raw packets coming from the RTSP server might lack the headers that would otherwise be present in data read from a file. I'm not even sure AVFoundation would recognize my custom protocol. (A sketch of the related resource-loader pattern follows this post.)

3. If a protocol doesn't work, I've considered implementing my own local HTTP Live Streaming server that converts the RTSP packets into an HTTP stream the media players can read. This sounds like a terribly convoluted solution at best, and very difficult at worst.

4. Going back to idea (1), if I could speed up the decoding by using some iOS CoreVideo function instead of ffmpeg, that approach might be acceptable. However, I can't find any documentation for CoreVideo on iOS (Apple only documents it for OS X).

5. I'm certainly willing to license a third-party solution if it works well and performs well. Unfortunately, everything I've found so far is pretty crummy and mostly just leverages ffmpeg and/or VLC. What disappoints me most is that nobody seems able or willing to provide a solution that integrates neatly with AVFoundation. I really want my RTSP stream available as an AVAsset so I can use it with AVFoundation players and other classes -- I don't want to build an app that relies on custom third-party code for everything.

Any ideas, tips, or advice would be greatly appreciated.

Thanks,
Frank
9 replies · 1 boost · 16k views · 2d
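For reference, the closest supported relative of idea 2 today is AVAssetResourceLoaderDelegate rather than NSURLProtocol: AVFoundation consults the delegate for URLs whose scheme it does not recognize. A minimal Swift sketch of the hookup follows; the scheme and URL are hypothetical, the mechanism suits playlist- and key-style resources, and it is not a supported path for feeding raw RTSP media into AVFoundation:

    import AVFoundation

    final class CustomSchemeLoader: NSObject, AVAssetResourceLoaderDelegate {
        func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                            shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
            // Hypothetical handling: fetch or synthesize the bytes for the
            // requested custom-scheme URL, then hand them back:
            //   loadingRequest.dataRequest?.respond(with: data)
            //   loadingRequest.finishLoading()
            return true
        }
    }

    let loaderDelegate = CustomSchemeLoader() // keep a strong reference; the loader holds it weakly
    let asset = AVURLAsset(url: URL(string: "custom-rtsp://camera.local/stream")!) // hypothetical URL
    asset.resourceLoader.setDelegate(loaderDelegate, queue: DispatchQueue(label: "loader.queue"))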
tvOS 26 - AVPlayer.preventsDisplaySleepDuringVideoPlayback not working
Hi guys, after updating to tvOS 26 it is no longer possible to disable the screensaver using AVPlayer.preventsDisplaySleepDuringVideoPlayback. We stream TV programs continuously. When the player is full screen we disable the screensaver by setting preventsDisplaySleepDuringVideoPlayback = true. When leaving the player screen and navigating to Home or the EPG, where the player continues playing in the background, we set preventsDisplaySleepDuringVideoPlayback = false so that the screensaver can activate after the set time.

Disabling the screensaver only works when the app is first started. After the first transition out of the player, once preventsDisplaySleepDuringVideoPlayback = false has been set for the first time, any subsequent change back to true has no effect. The result is that the player is playing full screen, but after the set period (e.g. 2 minutes) the screensaver activates, which is a very bad user experience and makes a TV app unusable.

I tried Xcode 26 targeting tvOS 17 and higher. I also installed Xcode 16.4 and rebuilt the app, but I see the same screensaver problem, and Xcode 16.4 with a tvOS 17 target worked before. So it appears to be related to tvOS 26 itself, not to the SDK.

Is there perhaps a new API to disable the screensaver? Is preventsDisplaySleepDuringVideoPlayback obsolete, or could this be an intentional change to disallow developers from disabling the screensaver? Do we need to tell all users to set their screensaver time to a higher value, or to Never, if they want to watch TV all day without touching the remote? I'm not sure whether this is a tvOS 26 bug or a deliberate behavior change. Does anybody know? Thanks. (A sketch of the relevant properties follows this post.)
3 replies · 0 boosts · 96 views · 6d
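For reference, the property in question plus the app-level idle-timer switch as a possible stopgap; whether isIdleTimerDisabled still inhibits the tvOS 26 screensaver is untested here:

    import UIKit
    import AVFoundation

    func setScreensaverBlocked(_ blocked: Bool, for player: AVPlayer) {
        // The documented per-player switch (reported broken on tvOS 26 above):
        player.preventsDisplaySleepDuringVideoPlayback = blocked
        // App-level idle-timer inhibitor; a possible stopgap, not a confirmed workaround:
        UIApplication.shared.isIdleTimerDisabled = blocked
    }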
On iOS 26, in our video playback app (using AVPlayer), sound and video are out of sync when playing after seeking
Our app plays TS files on an iPhone. The app fragments the TS files, creates an M3U8 playlist, converts them to HLS (HTTP Live Streaming), and then uses AVPlayer to play the video content. On a device running iOS 26, after starting playback and then seeking, resuming playback leaves the video and audio out of sync (by about 2-3 seconds depending on the situation). This also occurs on iPadOS and macOS 26. The issue did not occur on iOS 18 or earlier.

We are trying to fix this on the app side, and we have the following questions:

- The behavior of AVPlayer differs between iOS 26 and previous versions. Has there been a change that could explain this, or is it a bug?
- We tried pausing before seeking, but it didn't seem to have any effect. Are there any APIs or workarounds that can improve this? (A seek sketch follows this post.)

We would appreciate pointers to any other helpful documents or URLs.
0 replies · 0 boosts · 139 views · 2w
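One thing worth trying for the seek desync, as a sketch rather than a confirmed fix: pause, seek with zero tolerance so audio and video resume from the exact same point, and resume playback only in the completion handler:

    import AVFoundation

    func seekPrecisely(_ player: AVPlayer, to seconds: Double) {
        let target = CMTime(seconds: seconds, preferredTimescale: 600)
        player.pause()
        // Zero tolerance forces a frame-accurate seek instead of the nearest keyframe.
        player.seek(to: target, toleranceBefore: .zero, toleranceAfter: .zero) { finished in
            if finished { player.play() }
        }
    }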
Getting CoreMediaErrorDomain -15628 playback failure in iOS 26 (AVPlayer, HLS stream)
Hi, after updating to iOS 26, our app is experiencing playback failures with AVPlayer. The same code and streams work fine on iOS 18 and earlier.

Error:
    Domain [CoreMediaErrorDomain] Code [-15628] Description [The operation couldn't be completed.]
    Underlying Error: Domain [(null)] Code [0] Description [(null)]

Environment:
    iOS version: iOS 26
    Stream type: HLS (m3u8) with segment (.ts) files

Observed behaviour: we don't have concrete steps to reproduce the issue, but so far we have observed that this error tends to occur under poor network conditions. (An error-log observation sketch follows this post.)
0 replies · 0 boosts · 83 views · 2w
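A diagnostic sketch for errors like this one: low-level codes such as CoreMediaErrorDomain -15628 usually surface in the player item's error log, which can be watched like so:

    import AVFoundation

    func observeErrorLog(of item: AVPlayerItem) -> NSObjectProtocol {
        NotificationCenter.default.addObserver(
            forName: AVPlayerItem.newErrorLogEntryNotification,
            object: item,
            queue: .main
        ) { _ in
            // Each entry carries the domain, code, and the URI that failed.
            guard let event = item.errorLog()?.events.last else { return }
            print("domain:", event.errorDomain,
                  "code:", event.errorStatusCode,
                  "comment:", event.errorComment ?? "-",
                  "uri:", event.uri ?? "-")
        }
    }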
EXT-X-DISCONTINUITY misalignment
We encounter an issue with AVPlayer in the case of an EXT-X-DISCONTINUITY misalignment between audio and video produced after insertion of gaps. The objective is to introduce an EXT-X-DISCONTINUITY in the audio playlist after some missing segments (EXT-X-GAP) whose durations are aligned to the video segment durations, to handle irregular audio durations. Below is an example of the corresponding video and audio playlists:

video:

    #EXTM3U
    #EXT-X-VERSION:7
    #EXT-X-MEDIA-SEQUENCE:872524632
    #EXT-X-INDEPENDENT-SEGMENTS
    #EXT-X-TARGETDURATION:2
    #USP-X-TIMESTAMP-MAP:MPEGTS=7096045027,LOCAL=2025-05-09T12:38:32.369100Z
    #EXT-X-MAP:URI="hls/StreamingBasic-video=979200.m4s"
    #EXT-X-PROGRAM-DATE-TIME:2025-05-09T12:38:32.369111Z
    #EXTINF:2.002, no desc
    hls/StreamingBasic-video=979200-872524632.m4s
    #EXTINF:2.002, no desc
    hls/StreamingBasic-video=979200-872524633.m4s
    #EXTINF:2.002, no desc
    hls/StreamingBasic-video=979200-872524634.m4s
    #EXTINF:2.002, no desc
    hls/StreamingBasic-video=979200-872524635.m4s
    #EXTINF:2.002, no desc
    hls/StreamingBasic-video=979200-872524636.m4s
    ## Media sequence discontinuity
    #EXT-X-GAP
    #EXTINF:2.002, no desc
    hls/StreamingBasic-video=979200-872524637.m4s
    ## Media sequence discontinuity
    #EXT-X-GAP
    #EXTINF:2.002, no desc
    hls/StreamingBasic-video=979200-872524638.m4s
    #EXT-X-PROGRAM-DATE-TIME:2025-05-09T12:38:46.383111Z
    #EXTINF:2.002, no desc
    hls/StreamingBasic-video=979200-872524639.m4s
    #EXTINF:2.002, no desc
    hls/StreamingBasic-video=979200-872524640.m4s

audio:

    #EXTM3U
    #EXT-X-VERSION:7
    #EXT-X-MEDIA-SEQUENCE:872524632
    #EXT-X-INDEPENDENT-SEGMENTS
    #EXT-X-TARGETDURATION:2
    #USP-X-TIMESTAMP-MAP:MPEGTS=7096045867,LOCAL=2025-05-09T12:38:32.378400Z
    #EXT-X-MAP:URI="hls/StreamingBasic-audio_99500_eng=98800.m4s"
    #EXT-X-PROGRAM-DATE-TIME:2025-05-09T12:38:32.378444Z
    #EXTINF:2.0053, no desc
    hls/StreamingBasic-audio_99500_eng=98800-872524632.m4s
    #EXTINF:2.0053, no desc
    hls/StreamingBasic-audio_99500_eng=98800-872524633.m4s
    #EXTINF:2.0053, no desc
    hls/StreamingBasic-audio_99500_eng=98800-872524634.m4s
    #EXTINF:1.984, no desc
    hls/StreamingBasic-audio_99500_eng=98800-872524635.m4s
    #EXTINF:2.0053, no desc
    hls/StreamingBasic-audio_99500_eng=98800-872524636.m4s
    ## Media sequence discontinuity
    #EXT-X-GAP
    #EXTINF:2.002, no desc
    hls/StreamingBasic-audio_99500_eng=98800-872524637.m4s
    ## Media sequence discontinuity
    #EXT-X-GAP
    #EXTINF:2.002, no desc
    hls/StreamingBasic-audio_99500_eng=98800-872524638.m4s
    #EXT-X-DISCONTINUITY
    #EXT-X-PROGRAM-DATE-TIME:2025-05-09T12:38:46.778444Z
    #EXTINF:1.6213, no desc
    hls/StreamingBasic-audio_99500_eng=98800-872524639.m4s
    #EXTINF:2.0053, no desc
    hls/StreamingBasic-audio_99500_eng=98800-872524640.m4s

In this case playback is broken with AVPlayer. Is this playlist pair conformant with HTTP Live Streaming? Is it an AVPlayer bug? What are the guidelines for handling such gaps?
0 replies · 0 boosts · 130 views · Jul ’25
Race issue - Custom AVAssetResourceLoaderDelegate cannot work with EXT-X-SESSION-KEY
TL;DR: how do I solve a possible race between the EXT-X-SESSION-KEY request and the encrypted media segment requests?

I'm having trouble using a custom AVAssetResourceLoaderDelegate with a video manifest containing a VideoProtectionKey (VPK). My master manifest contains the rendition manifest URL and the VPK URL. When not using the custom resource delegate, everything works fine.

My custom resource loader delegate works by first appending a prefix to the scheme of the master manifest URL before creating the asset. When handling the master manifest, it restores the original scheme, makes the request, and then modifies the scheme of the rendition manifest URL in the response content by appending the same prefix again, so that the rendition manifest request also goes through the custom resource loader delegate. The same applies to the VPK request. The AES-128 key is stored in memory within the custom resource loader delegate object. So far so good: the VPK is requested before the segment requests.

The problem comes when the media segment requests happen. The media segment URLs from the rendition manifest also go through the custom resource loader, and those segments are encrypted. I can see a segment request finish first, and the related VPK request only kicks in a few seconds later. The previous VPK value is cached in memory, so it is not the network causing the delay but some mechanism I'm not aware of.

Could anyone tell me the proper way to handle this situation? The native library handles it well, so I just want to know how. Thanks in advance! (A key-serving sketch follows this post.)
2 replies · 0 boosts · 83 views · Jun ’25
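Not an answer to the ordering question, but one way to take latency out of the key path is to answer key requests synchronously from the cached value inside the delegate. A sketch, with the URL matching purely hypothetical:

    import AVFoundation

    final class KeyCachingLoader: NSObject, AVAssetResourceLoaderDelegate {
        // AES-128 session key held in memory, as described in the post.
        var cachedKey: Data?

        func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                            shouldWaitForLoadingOfRequestedResource request: AVAssetResourceLoadingRequest) -> Bool {
            guard let url = request.request.url else { return false }
            // Hypothetical marker for key URLs; match however your URLs are shaped.
            if url.lastPathComponent.hasSuffix(".key"), let key = cachedKey {
                request.dataRequest?.respond(with: key)
                request.finishLoading()
                return true
            }
            // Playlist/segment requests: restore the original scheme and fetch as before...
            return true
        }
    }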
FairPlay HLS Downloaded Asset Fails to Play on First Attempt When Offline, Works on Retry
Hello, we're seeing an intermittent issue when playing back FairPlay-protected HLS downloads while the device is offline.

Assets are downloaded using AVAggregateAssetDownloadTask with FairPlay protection. After download, asset.assetCache.isPlayableOffline == true. On the first offline playback attempt, ~8% of downloads fail; retrying playback always works. We recreate the asset and player on each attempt.

During playback setup, we try to load variants via:

    try await asset.load(.variants)

This call sometimes fails with:

    Error Domain=NSURLErrorDomain Code=-1009 "The Internet connection appears to be offline."
    UserInfo={NSUnderlyingError=0x105654a00 {Error Domain=NSURLErrorDomain Code=-1009 "The Internet connection appears to be offline." UserInfo={NSDescription=The Internet connection appears to be offline.}},
    NSErrorFailingURLStringKey=file:///private/var/mobile/Containers/Data/Application/2DDF9D7C-9197-46BE-8690-C23EE75C9E90/Library/com.apple.UserManagedAssets.XVvqfh/Baggage_9DD4E2D3F9C0E68F.movpkg/,
    NSErrorFailingURLKey=file:///private/var/mobile/Containers/Data/Application/2DDF9D7C-9197-46BE-8690-C23EE75C9E90/Library/com.apple.UserManagedAssets.XVvqfh/Baggage_9DD4E2D3F9C0E68F.movpkg/,
    NSURL=file:///private/var/mobile/Containers/Data/Application/2DDF9D7C-9197-46BE-8690-C23EE75C9E90/Library/com.apple.UserManagedAssets.XVvqfh/Baggage_9DD4E2D3F9C0E68F.movpkg/,
    AVErrorFailedDependenciesKey=("assetProperty_HLSAlternates"),
    NSLocalizedDescription=The Internet connection appears to be offline.}

This variant load is used to determine available audio tracks, check for Dolby support, and apply user language preferences. After this step, the AVPlayerItem also fails via Combine's publisher for .status. However, retrying the entire process immediately afterwards (same offline conditions, same asset path, new AVURLAsset) results in successful playback.

Assets are represented using the following class:

    public class DownloadedAsset: AVURLAsset {
        public let id: String
        public let localFileUrl: URL
        public let fairplayLicenseUrlString: String?
        public let drmToken: String?

        var isProtected: Bool {
            return fairplayLicenseUrlString != nil
        }

        public init(id: String,
                    localFileUrl: URL,
                    fairplayLicenseUrlString: String?,
                    drmToken: String?) {
            self.id = id
            self.localFileUrl = localFileUrl
            self.fairplayLicenseUrlString = fairplayLicenseUrlString
            self.drmToken = drmToken
            super.init(url: localFileUrl, options: nil)
        }
    }

We use user-selected quality levels to control bitrate and multichannel (e.g. Dolby 5.1) downloads:

    let downloadQuality = UserDefaults.standard.downloadVideoQuality
    let bitrate: Int
    let shouldDownloadMultichannelTracks: Bool

    switch downloadQuality {
    case .dataSaver:
        shouldDownloadMultichannelTracks = false
        bitrate = 596564
    case .standard:
        shouldDownloadMultichannelTracks = false
        bitrate = 1503844
    case .best:
        shouldDownloadMultichannelTracks = true
        bitrate = 7038970
    }

    var selections = multichannelIdentifiedMediaSelections
    if !shouldDownloadMultichannelTracks {
        selections = selections.filter { !$0.isMultichannel }
    }

    let task = session.aggregateAssetDownloadTask(
        with: asset,
        mediaSelections: selections.map { $0.mediaSelection },
        assetTitle: title,
        assetArtworkData: nil,
        options: [AVAssetDownloadTaskMinimumRequiredMediaBitrateKey: bitrate]
    )

Seen on devices running iOS 16, iOS 17, and iOS 18.

What could cause the initial failure of an otherwise valid, offline-ready FairPlay HLS asset? Could .load(.variants) internally trigger a failed network resolution, even when offline? Is there an internal caching or initialization behavior in AVFoundation that might explain why the second attempt works? Any guidance would be appreciated. (A pre-flight sketch follows this post.)
1 reply · 0 boosts · 151 views · Jun ’25
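A defensive pre-flight sketch for this situation: consult the local asset cache before calling load(.variants), and skip the variant query when offline playback is already guaranteed. Whether variants need to be loaded at all for an offline item is an assumption to verify:

    import AVFoundation

    func preparePlayback(for asset: AVURLAsset) async throws -> [AVAssetVariant] {
        if asset.assetCache?.isPlayableOffline == true {
            // Offline-ready: skip the variant resolution that intermittently
            // fails with NSURLErrorDomain -1009, per the post above.
            return []
        }
        return try await asset.load(.variants)
    }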
CoreMediaErrorDomain -12888: bandwidth down-stepping when using 2-second segment duration
The problem: when using an HLS live stream with AVPlayer on iOS/tvOS, the player first chooses the highest bandwidth, then slowly steps down to the lowest (within 1-3 minutes), eventually steps up again, and then repeats the cycle. The AVPlayer error log reports:

    errorStatusCode: -12888, errorDomain: Optional("CoreMediaErrorDomain"),
    errorComment: Optional("The operation couldn't be completed. (CoreMediaErrorDomain error -12888 - Playlist File unchanged for longer than 1.5 * target duration)")

We use standard segments in CMAF format with a 2-second duration:

    #EXTM3U
    #EXT-X-VERSION:6
    #EXT-X-TARGETDURATION:2
    #EXT-X-MEDIA-SEQUENCE:147065903
    #EXT-X-MAP:URI="video_1_4660000_init.mp4?device_profile=cmaf_cbcs_verimatrix_cei%26seg_size=2%26cmaf=2"
    #EXT-X-PROGRAM-DATE-TIME:2025-04-30T12:51:07
    #EXTINF:2.000,
    video_1_4660000_t17460174670001555.mp4?device_profile=cmaf_cbcs_verimatrix_cei%26seg_size=2%26cmaf=2
    #EXTINF:2.000,
    video_1_4660000_t17460174690001555.mp4?device_profile=cmaf_cbcs_verimatrix_cei%26seg_size=2%26cmaf=2
    #EXTINF:2.000,
    video_1_4660000_t17460174710001555.mp4?device_profile=cmaf_cbcs_verimatrix_cei%26seg_size=2%26cmaf=2

When using 6-second segments the player stays stable at the highest bandwidth. Is there a way to avoid this error, in AVPlayer or in the HLS configuration? (A playlist-freshness sketch follows this post.)
2 replies · 0 boosts · 192 views · May ’25
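Since the -12888 comment points at a stale live playlist rather than at the client, a small standalone check of the server's publishing cadence may help narrow it down; the URL and target duration are placeholders:

    import Foundation

    // Fetch a live media playlist twice, 1.5 x target duration apart, and flag
    // it as stale - the exact condition the -12888 error comment describes.
    func checkPlaylistFreshness(url: URL, targetDuration: TimeInterval) async throws {
        let (first, _) = try await URLSession.shared.data(from: url)
        try await Task.sleep(nanoseconds: UInt64(targetDuration * 1.5 * 1_000_000_000))
        let (second, _) = try await URLSession.shared.data(from: url)
        if first == second {
            print("Playlist unchanged after \(targetDuration * 1.5)s; the server publishes too slowly for a \(targetDuration)s target duration.")
        } else {
            print("Playlist updated in time.")
        }
    }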
AVAssetWriterInputTaggedPixelBufferGroupAdaptor Hanging With Tagged Buffers
We've successfully implemented an AVAssetWriter to produce HLS streams (all code is Objective-C++ for interop with an existing codebase) but are struggling to extend the operation to use tagged buffers. We're starting to wonder whether the tagged buffers required for an MV-HEVC signal are fully supported when producing HLS segments in a live-stream setting.

We generate a live stream of data using something like:

    UTType *t = [UTType typeWithIdentifier:AVFileTypeMPEG4];
    m_writter = [[AVAssetWriter alloc] initWithContentType:t];

    // - videoHint describes HEVC and width/height
    // - m_videoConfig includes compression settings and, when using MV-HEVC,
    //   the correct keys are added (i.e. kVTCompressionPropertyKey_MVHEVCVideoLayerIDs).
    //   The app was throwing an exception without these, which was useful to
    //   know when we got the configuration right.
    m_video = [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeVideo
                                             outputSettings:m_videoConfig
                                           sourceFormatHint:videoHint];

For either path we produce CVPixelBufferRefs that contain the raw pixel information (i.e. 32BGRA), so we use an adaptor to keep that as simple as possible. With a single view and an AVAssetWriterInputPixelBufferAdaptor, things work out very well: we produce segments and the delegate is called. However, with the AVAssetWriterInputTaggedPixelBufferGroupAdaptor, as exampled in the SideBySideToMVHEVC demo project, things go poorly.

We create the tagged buffers with something like:

    CMTagCollectionRef collections[2];

    CMTag leftTags[] = {
        CMTagMakeWithSInt64Value(kCMTagCategory_VideoLayerID, (int64_t)0),
        CMTagMakeWithSInt64Value(kCMTagCategory_StereoView, kCMStereoView_LeftEye)
    };
    CMTagCollectionCreate(kCFAllocatorDefault, leftTags, 2, &(collections[0]));

    CMTag rightTags[] = {
        CMTagMakeWithSInt64Value(kCMTagCategory_VideoLayerID, (int64_t)1),
        CMTagMakeWithSInt64Value(kCMTagCategory_StereoView, kCMStereoView_RightEye)
    };
    CMTagCollectionCreate(kCFAllocatorDefault, rightTags, 2, &(collections[1]));

    CFArrayRef tagCollections = CFArrayCreate(
        kCFAllocatorDefault, (const void **)collections, 2, &kCFTypeArrayCallBacks);

    // Array of the two pixel buffers (left/right eye).
    CVPixelBufferRef buffers[] = {*b, *alt};
    CFArrayRef bufferArray = CFArrayCreate(
        kCFAllocatorDefault, (const void **)buffers, 2, &kCFTypeArrayCallBacks);

    CMTaggedBufferGroupRef bufferGroup;
    OSStatus res = CMTaggedBufferGroupCreate(
        kCFAllocatorDefault, tagCollections, bufferArray, &bufferGroup);

Perhaps there's something about this Objective-C code that I've buggered up? Hopefully! Anyway, when I submit this tagged buffer group to the adaptor:

    if (![mvVideoAdapter appendTaggedPixelBufferGroup:bufferGroup withPresentationTime:pts]) {
        // report error...
    }

Appending does not raise any errors - eventually it just hangs and we never return from the call.

Real issue: so either

- the delegate assigned to the AVAssetWriter doesn't fire its assetWriter callback, which should produce the segments; or
- the adaptor hangs on appendTaggedPixelBufferGroup before a segment is ready to be completed (but succeeds for a number of buffer groups before this happens).

This is the same delegate class assigned to the non-multi-view code path when MV-HEVC is turned off, and that path works perfectly. (A readiness-driven feeding sketch follows this post.)
1 reply · 0 boosts · 68 views · Apr ’25
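One hypothesis worth ruling out for the hang: appending while the input is not ready. A Swift sketch of the pull model follows; the Swift spelling of the append call mirrors the Objective-C selector used above, and nextBufferGroup is a placeholder:

    import AVFoundation
    import CoreMedia

    func startFeeding(input: AVAssetWriterInput,
                      adaptor: AVAssetWriterInputTaggedPixelBufferGroupAdaptor,
                      queue: DispatchQueue,
                      nextBufferGroup: @escaping () -> (CMTaggedBufferGroup, CMTime)?) {
        input.requestMediaDataWhenReady(on: queue) {
            // Append only while the input can accept data; pushing past this
            // point is a classic source of stalls.
            while input.isReadyForMoreMediaData {
                guard let (group, pts) = nextBufferGroup() else {
                    input.markAsFinished()
                    return
                }
                if !adaptor.appendTaggedPixelBufferGroup(group, withPresentationTime: pts) {
                    // Inspect the writer's status/error here.
                    return
                }
            }
        }
    }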
Issues with "AVMetricEventStreamPublisher Discover Media Performance Metrics in AVFoundation" Example Code
Hi everyone! I’ve been working with AVFoundation and trying to use the AVMetricEventStreamPublisher to discover media performance metrics, as described in the Apple documentation. https://developer.apple.com/cn/videos/play/wwdc2024/10113/?time=508 However, when following the example code, I’m not getting the expected results. The performance metrics for both audio and video don’t seem to be captured properly. Has anyone successfully used this example code? If so, could you share your experience or any solutions you’ve found? Any tips or insights would be greatly appreciated. Thanks in advance! Ps. the example code: AVPlayerItem *item = ... AVMetricEventStream *eventStream = [AVMetricEventStream eventStream]; id subscriber = [[MyMetricSubscriber alloc] init]; [eventStream setSubscriber:subscriber queue:mySerialQueue] [eventStream subscribeToMetricEvent:[AVMetricPlayerItemLikelyToKeepUpEvent class]]; [eventStream subscribeToMetricEvent:[AVMetricPlayerItemPlaybackSummaryEvent class]]; [eventStream addPublisher:item];
1 reply · 0 boosts · 65 views · Apr ’25
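The same WWDC24 session also shows a Swift-concurrency flavor of this API, in which AVPlayerItem acts as an AVMetricEventStreamPublisher. A sketch follows; the exact spellings are worth verifying against the iOS 18 SDK:

    import AVFoundation

    func watchStalls(of item: AVPlayerItem) {
        Task {
            // metrics(forType:) yields an async sequence of metric events.
            for await event in item.metrics(forType: AVMetricPlayerItemStallEvent.self) {
                print("Stall at", event.date)
            }
        }
    }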
Video on Safari iOS - UI/UX of Shadow Content User Agent
Hi, when I display an HTML page with a video element on Safari iOS, I get a nice UI. Great! At first look I see a video frame with an arrow-in-a-circle button in the middle. Very nice. I tap the arrow and get a fullscreen view while the video begins to play. I watch the video, pause it, then tap the top-left x button. I go back to my HTML page and the video is still there as it was before.

But there is an annoying new detail: the video frame is now really dark, it still shows all the controls, and a "different" arrow button to play it again. In other words, that nice video frame, that nice picture, is no longer visible on the page. A page full of nice pictures now has an almost-black rectangle. Too bad.

Sure, I can tap the video (outside the controls), and the controls and the dark overlay disappear, so I can see that nice picture again. But then the arrow-in-a-circle play button is gone too, so the user can no longer tell it's a video to play; it looks like any other static picture.

Is there a way to get back the initial look of the video, the clear one with the current frame and the arrow-in-a-circle button?
0 replies · 0 boosts · 114 views · Apr ’25
AVPlayer periodic time observer stops notifying when switching back from AirPlay to local playback
The app registers a periodic time observer with the AVPlayer when playback starts, and it works fine. When switching to AirPlay during playback, the periodic time observation continues working as expected. However, when switching back to local playback, the periodic time observer no longer fires until a seek is performed. The app removes the periodic time observer only when playback stops.

When switching back to local playback, I can see timeControlStatus successively change to .waitingToPlayAtSpecifiedRate (reason: .evaluatingBufferingRate), then to .waitingToPlayAtSpecifiedRate (reason: .toMinimizeStalls), and finally to .playing. But the time observation no longer works.

The issue is systematic with Live and VOD streams that provide a program date (HLS tag #EXT-X-PROGRAM-DATE-TIME), with or without DRM, and is never reproduced with other VOD streams. (A re-registration sketch follows this post.)
2 replies · 0 boosts · 79 views · Apr ’25
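A workaround sketch rather than a confirmed fix: reinstall the periodic observer whenever playback returns to .playing, so an observer that went silent across the AirPlay-to-local switch gets replaced:

    import AVFoundation
    import Combine

    final class PlaybackProgressObserver {
        private let player: AVPlayer
        private var timeObserver: Any?
        private var cancellables = Set<AnyCancellable>()

        init(player: AVPlayer) {
            self.player = player
            // Re-register each time the player settles back into .playing.
            player.publisher(for: \.timeControlStatus)
                .filter { $0 == .playing }
                .sink { [weak self] _ in self?.reinstallObserver() }
                .store(in: &cancellables)
        }

        private func reinstallObserver() {
            if let observer = timeObserver { player.removeTimeObserver(observer) }
            timeObserver = player.addPeriodicTimeObserver(
                forInterval: CMTime(seconds: 1, preferredTimescale: 600),
                queue: .main
            ) { time in
                print("progress:", time.seconds)
            }
        }
    }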
FairPlay-Protected HLS Files Not Transferred via Quick Start
I have an iOS app that downloads HLS files, which are protected by FairPlay. These files are stored locally, and their locations are managed using Core Data. When playing these tracks, I use AVURLAsset to access the stored file paths.

Recently, a client upgraded to a new iPhone and used Quick Start to transfer data from his old device. While all other app data was successfully transferred, including Core Data records and UserDefaults, the actual HLS files were missing. As a result, the app retained metadata about the downloaded content, but the files themselves were gone, causing playback failures.

Does Quick Start exclude certain types of locally stored files, especially DRM-protected HLS downloads? Or is the issue related to how FairPlay-protected content is handled during the transfer of locally stored files? (A validation sketch follows this post.)
2 replies · 0 boosts · 127 views · Mar ’25
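A defensive sketch for apps in this position: at launch, verify that each Core Data record's package still exists on disk and is offline-playable, and flag the record for re-download otherwise. (That downloaded HLS packages are excluded from device transfer is an assumption to confirm with Apple.)

    import AVFoundation

    func validateDownload(at localFileUrl: URL) -> Bool {
        // Package missing entirely: mark the record for re-download.
        guard FileManager.default.fileExists(atPath: localFileUrl.path) else {
            return false
        }
        // Package present: confirm it is still playable without a network.
        let asset = AVURLAsset(url: localFileUrl)
        return asset.assetCache?.isPlayableOffline ?? false
    }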
Playback Issues for DRM content when sending CMCD
Since iOS and tvOS 18, AVPlayer can send CMCD automatically (https://developer.apple.com/streaming/Whats-new-HLS.pdf). However, after enabling CMCD, our streams occasionally fail with the following error:

    CoreMediaErrorDomain Error -17383

So far this issue appears to affect only DRM-protected (FairPlay) streams. We activate CMCD via the resource loader of an AVURLAsset before assigning the item to an AVPlayer (see the sketch after this post). Unfortunately, we haven't found a reliable way to reproduce the issue, and we've been unable to gather any useful diagnostic information. Has anyone else observed this behavior when enabling CMCD on FairPlay streams?
2 replies · 0 boosts · 168 views · Mar ’25
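For reference, a sketch of how we understand CMCD is switched on (iOS/tvOS 18 and later), matching the post's description of going through the AVURLAsset resource loader; the property name is worth verifying against the SDK, and the URL is a placeholder:

    import AVFoundation

    let asset = AVURLAsset(url: URL(string: "https://example.com/main.m3u8")!) // placeholder URL
    asset.resourceLoader.sendsCommonMediaClientData = true // opt in to automatic CMCD
    let player = AVPlayer(playerItem: AVPlayerItem(asset: asset))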
How to detect HDCP support in Safari.
I am playing FairPlay + multi-key content (fMP4) in the Safari browser. I want to distinguish between SD and HD video quality and play HD when HDCP is supported, SD when it is not. I have already confirmed that requiring HDCP is the default behavior and that a black screen is output in non-HDCP environments. The goal is to improve the user experience by switching to SD or HD appropriately, depending on HDCP support, when playing DRM content.

Question: is there an API or function that can detect HDCP support in Safari through JavaScript or other means? Or is there a way to infer it indirectly?
0 replies · 0 boosts · 138 views · Mar ’25
Apple FPS Package Request – No Response Received. How Long Does It Usually Take?
It has been quite some time since I requested the Apple FPS (FairPlay Streaming) package, yet I haven't received it, and I haven't received any email either. Is there a developer support inquiry center where I can check the status of the request? Alternatively, could you share approximately how long it took for you to receive a response email?
0 replies · 0 boosts · 316 views · Mar ’25
macOS Sequoia 'Cannot Decode' HLS Video
I use AVPlayer to play HLS video successfully on macOS Sonoma, but I encounter this error on macOS Sequoia. Please help me:

    Error Domain=AVFoundationErrorDomain Code=-11833 'Cannot Decode'
    UserInfo={NSUnderlyingError=0x600001e57330 {Error Domain=CoreMediaErrorDomain Code=-12906 '(null)'},
    NSLocalizedFailureReason=The decoder required for this media cannot be found.,
    AVErrorMediaTypeKey=vide, NSLocalizedDescription=Cannot Decode}

Thanks! (A decoder-support check follows this post.)
2 replies · 1 boost · 610 views · Mar ’25
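Since the underlying -12906 means the required decoder cannot be found, a quick check of hardware decode support for the stream's codec on the affected machine is a reasonable first step; HEVC here is an assumption, so substitute the actual codec:

    import VideoToolbox

    // Returns true when the current machine has a hardware decoder for the codec.
    let hevcSupported = VTIsHardwareDecodeSupported(kCMVideoCodecType_HEVC)
    print("Hardware HEVC decode supported:", hevcSupported)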