Hi,
I have a use case where I'd like to handle certain errors during FairPlay content key requests and prevent the automatic retries that follow.
Here's the current flow:
1. The FairPlay certificate is requested and obtained from my server.
2. makeStreamingContentKeyRequestData is called on the keyRequest.
3. The license server returns a 403, along with a response body containing JSON with a detailed code and message.
4. The error is caught and handled properly by calling AVContentKeyRequest.processContentKeyResponseError.
5. The AVContentKeySession automatically retries up to 8 times, each time providing a new key request through public func contentKeySession(_ session: AVContentKeySession, didProvide keyRequest: AVContentKeyRequest).
6. My license server gets hit with 8 requests that will always result in a 403; these retries are useless.
My custom error is successfully caught later down the line through AVPlayerItem.observe(\.status), which is great.
The thing is, I'd like to catch the 403 and prevent any retry from being made at step 5, ideally through:
public func contentKeySession(_ session: AVContentKeySession, contentKeyRequest keyRequest: AVContentKeyRequest, didFailWithError err: Error)
I've looked for quite a while and just can't seem to find any way of achieving this. Is this not supported at all?
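For reference, here is a rough sketch of the workaround I'm experimenting with: remembering which key identifiers already failed with a 403 and failing repeat requests immediately instead of hitting the license server again. The bookkeeping and error domain are made up for illustration, and I'm not sure this actually stops the session from retrying; it only avoids the redundant network round-trips.

import AVFoundation

final class KeyRequestDelegate: NSObject, AVContentKeySessionDelegate {

    // Hypothetical bookkeeping: key identifiers whose license requests already got a 403.
    private var failedKeyIdentifiers = Set<String>()

    func contentKeySession(_ session: AVContentKeySession,
                           didProvide keyRequest: AVContentKeyRequest) {
        if let id = keyRequest.identifier as? String, failedKeyIdentifiers.contains(id) {
            // Fail the retry locally instead of hitting the license server again.
            // Note: the session may still treat this as another failed attempt.
            let error = NSError(domain: "com.example.fairplay", code: 403, userInfo: nil)
            keyRequest.processContentKeyResponseError(error)
            return
        }
        // ...normal path: makeStreamingContentKeyRequestData, POST the SPC,
        // then processContentKeyResponse(_:) with the returned CKC...
    }

    func contentKeySession(_ session: AVContentKeySession,
                           contentKeyRequest keyRequest: AVContentKeyRequest,
                           didFailWithError err: Error) {
        // Record the identifier so subsequent automatic retries can be short-circuited.
        if let id = keyRequest.identifier as? String {
            failedKeyIdentifiers.insert(id)
        }
    }
}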
HLS live streaming in 4K is not displaying any video, but it is streaming audio.
I'm getting the following errors in the console, showing that it is failing to decode every frame.
Can I get some help on what these error codes refer to and why decoding would fail?
08:30:42.675879-0800 videocodecd AppleAVD: AppleAVDDecodeFrameInternal(): avdDec - Frame# 3588, DecodeFrame failed with error: 0x196
08:30:42.675908-0800 videocodecd AppleAVD: AppleAVDDisplayCallback(): Asking fig to drop frame # 3588 with err -12909 - internalStatus: 315
08:30:42.697412-0800 videocodecd AppleAVD: AppleAVDDecodeFrameResponse(): Frame# 3589 DecodeFrame failed with error 0x00000196
08:30:42.697876-0800 videocodecd AppleAVD: AppleAVDDecodeFrameInternal(): failed - error: 406
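In case it helps with diagnosis, this is roughly how I'm dumping the player item's error log to correlate with these console messages (a sketch; the item is my existing AVPlayerItem):

import AVFoundation

// Sketch: print whatever AVPlayer has recorded about playback errors for an item.
func dumpErrorLog(for item: AVPlayerItem) {
    guard let log = item.errorLog() else { return }
    for event in log.events {
        print("status: \(event.errorStatusCode)",
              "domain: \(event.errorDomain)",
              "comment: \(event.errorComment ?? "none")",
              "uri: \(event.uri ?? "none")")
    }
}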
Hello,
I use AVPlayer in my project to play movies over the network.
Most movies play normally, but I found that the sound sometimes disappears when I play one specific 4K network video stream.
The video continues playing, but the audio stops after the video has played for a while.
If I pause the player and then resume, the sound comes back, but it disappears again after several seconds.
Checking the AVPlayerItem status:
isPlaybackLikelyToKeepUp == true
isPlaybackBufferEmpty == false
player.volume > 0
Based on the values above, the issue doesn't seem to be caused by an empty playback buffer or a volume problem, so I'm quite confused by this situation.
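For reference, this is roughly how I capture those values when the dropout happens, including the state of the item's audio tracks (a sketch; the function name is illustrative):

import AVFoundation

// Sketch: snapshot the playback state at the moment the audio drops out.
func logAudioState(of player: AVPlayer) {
    guard let item = player.currentItem else { return }
    print("likelyToKeepUp: \(item.isPlaybackLikelyToKeepUp)",
          "bufferEmpty: \(item.isPlaybackBufferEmpty)",
          "volume: \(player.volume)",
          "muted: \(player.isMuted)")
    for track in item.tracks where track.assetTrack?.mediaType == .audio {
        print("audio track enabled: \(track.isEnabled)")
    }
}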
Movie information
Video
Format : AVC
Format/Info : Advanced Video Codec
Format profile : High L5.1
Codec ID : avc1
Codec ID/Info : Advanced Video Coding
Bit rate mode : Variable
Bit rate : 100.0 Mb/s
Width : 3 840 pixels
Height : 2 160 pixels
Display aspect ratio : 16:9
Frame rate mode : Constant
Frame rate : 29.970 (30000/1001) FPS
Audio
Format : AAC LC
Format/Info : Advanced Audio Codec Low Complexity
Codec ID : mp4a-40-2
Duration : 5 min 19 s
Bit rate mode : Constant
Bit rate : 192 kb/s
Nominal bit rate : 48.0 kb/s
Channel(s) : 2 channels
Channel layout : L R
Sampling rate : 48.0 kHz
Frame rate : 46.875 FPS (1024 SPF)
Does anyone know whether AVPlayer has limitations when playing high-bitrate movie streams, and are there any solutions?
We're integrating a web-based group-calling application within a native iOS application and finding that every time a CallKit session gets fully established, the web-based media streams break, rendering as gray with no audio.
Up to iOS 18 we worked around it by not fulfilling the call start action, but that's no longer an option because audio stopped being automatically redirected to the speaker. We now need the CXProvider's didActivateAudioSession callback, but fulfilling the action breaks the video.
The sample project loads a simple webpage in a WKWebView, which contains a video tag streaming media from the device's camera.
At the same time, it sets up a new CallKit session by requesting and fulfilling a CXStartCallAction transaction.
You will notice that the media doesn't render and, if you follow the warnings we left, you will find that not fulfilling the CXStartCallAction fixes it.
Unfortunately that's not a workaround we can use, as we need the CXProvider delegate to inform us about audio session changes so we can redirect audio to the speaker (so the proximity sensor doesn't activate and locking the screen doesn't end the call).
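For context, the speaker redirection we need that callback for is just the standard CXProviderDelegate shape; a minimal sketch (CallManager is a placeholder name, not code from the sample project):

import CallKit
import AVFoundation

final class CallManager: NSObject, CXProviderDelegate {

    func providerDidReset(_ provider: CXProvider) {
        // Tear down any active calls here; required by the protocol.
    }

    func provider(_ provider: CXProvider, didActivate audioSession: AVAudioSession) {
        // Route audio to the speaker so the proximity sensor stays inactive
        // and locking the screen doesn't end the call.
        try? audioSession.overrideOutputAudioPort(.speaker)
    }
}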
Any insights or workarounds would be greatly appreciated.
We are experiencing an issue with our HLS MPEG-TS streams on Apple devices: AVPlayer in our iOS app and in Safari jumps back to the start when the player automatically changes quality. This occurs even though the stream still indicates that it is live and there is no change in the seek bar. After testing our streams with the Apple HLS Validator, the only problem reported was a "Measured peak bitrate compared to multivariant playlist declared value exceeds error tolerance" error.
This playback bug does not happen on Chrome or in our Android app. Has anyone else experienced similar issues with AVPlayer?
Hello, we have an HLS streaming app that uses AVPlayer, and we want to implement a fixed-resolution feature based on the user's selection. For example, if the user selects 1080p, they should get only 1080p. We have tried the preferredMaximumResolution and preferredPeakBitRate parameters, but AVPlayer does not enforce them as a fixed rendition: even with preferredMaximumResolution = CGSize(width: 1920, height: 1080), the player does not stick to the 1080p profile and drops to 720p, which we do not want when the user has selected 1080p. Is there any method to force it, even if the stream stalls? Thank you in advance.
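For reference, this is roughly how we apply the caps today (the bitrate value below is illustrative, not from our config):

import AVFoundation
import CoreGraphics

// Sketch: cap an item at 1080p and a rough bitrate ceiling.
func makeCappedItem(url: URL) -> AVPlayerItem {
    let item = AVPlayerItem(url: url)
    // These are upper bounds only: the player may still pick a lower rendition.
    item.preferredMaximumResolution = CGSize(width: 1920, height: 1080)
    item.preferredPeakBitRate = 8_000_000 // bits per second; illustrative ceiling
    return item
}

As far as we can tell these properties are documented as preferences (upper bounds), so the only approach we've found that truly pins a rendition is serving a multivariant playlist filtered down to that single rendition on the server side.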
iOS 18 added support for the WebRTC HEVC RFC 7789 RTP payload format (112001659). How can I determine whether iOS 18 WebRTC uses hardware or software decoding for HEVC?
I cannot mirror or extend my screen from my Mac mini M2 to my 10th-generation iPad. Whenever I click "mirror or extend screen", my Mac's external display shows "no signal", refreshes, and comes back, while the iPad locks, and mirroring or extending fails. However, I can mirror my iPad screen to the Mac mini M2. Everything was working earlier; suddenly it is not.
Hello,
I submitted a request for a Deployment Package. However, I have not received any email yet. Could you please let me know when I can expect a confirmation message?
Thank you.
Case-ID: 9391388
Our application uses timed metadata as part of a rating control system.
We noticed a problem in production, and diagnosis shows that we stop receiving timed metadata on iOS 18 only.
Our live streams are primed with metadata at least once per second, but we are seeing extended gaps in receiving this content, in excess of 10 minutes.
We have also observed that this happens more as the player climbs the bitrate ladder, and it doesn't happen if we cap to a low resolution,
i.e. a preferredMaximumResolution of 768x432.
Furthermore, if we throttle network conditions after we stop receiving metadata, we start receiving it again.
The following simple example demonstrates the behaviour above. Unfortunately I cannot share the metadata-primed live stream endpoint publicly, but I can provide it privately to Apple to reproduce the problem.
import UIKit
import AVKit

class ViewController: UIViewController, AVPlayerItemMetadataOutputPushDelegate {

    var player: AVPlayer?
    var itemMetadataOutput: AVPlayerItemMetadataOutput?

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        guard let url = URL(string: "endpoint redacted") else {
            return
        }
        let player = AVPlayer(url: url)
        let controller = AVPlayerViewController()
        controller.player = player
        self.player = player
        present(controller, animated: true) {
            player.play()
            // Attach a metadata output so timed metadata is pushed to the delegate.
            let currentItem = player.currentItem
            let itemMetadataOutput = AVPlayerItemMetadataOutput(identifiers: nil)
            self.itemMetadataOutput = itemMetadataOutput
            self.itemMetadataOutput?.setDelegate(self, queue: .main)
            currentItem?.add(itemMetadataOutput)
        }
    }

    public func metadataOutput(_ output: AVPlayerItemMetadataOutput,
                               didOutputTimedMetadataGroups groups: [AVTimedMetadataGroup],
                               from track: AVPlayerItemTrack?) {
        print("received metadata \(Date())")
    }
}
I use AVPlayer to play HLS video successfully on macOS Sonoma, but I encounter this error on macOS Sequoia. Please help me:
Error Domain=AVFoundationErrorDomain Code=-11833 "Cannot Decode" UserInfo={NSUnderlyingError=0x600001e57330 {Error Domain=CoreMediaErrorDomain Code=-12906 "(null)"}, NSLocalizedFailureReason=The decoder required for this media cannot be found., AVErrorMediaTypeKey=vide, NSLocalizedDescription=Cannot Decode}
Thanks!
Hi there,
After upgrading to iOS 18, I noticed that ApplicationMusicPlayer.Queue behavior breaks if at least one song added to the queue is also in the Apple Music library on the device.
The resulting behavior is that the queue does not accept all the items, and only the items that are in the library are playable in the queue.
The expected behavior, and the previous behavior on iOS 17, was that all the items would be added to the queue successfully. I confirmed this behavior on a separate test device running iOS 17.7.
The items added are all fetched via MusicCatalogResourceRequest<Song>, so I would expect that a requested song being present in the library would have no effect.
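For reference, this is roughly how the queue is built (a sketch; the function name and the way IDs are passed in are illustrative):

import MusicKit

// Sketch: fetch catalog songs by ID and hand them to the application player.
func playCatalogSongs(ids: [MusicItemID]) async throws {
    let request = MusicCatalogResourceRequest<Song>(matching: \.id, memberOf: ids)
    let response = try await request.response()
    let player = ApplicationMusicPlayer.shared
    player.queue = ApplicationMusicPlayer.Queue(for: response.items)
    try await player.play()
}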
Hi,
I am writing to seek help or a workaround for an issue I encountered while implementing Low-Latency HLS playback using AVAssetResourceLoaderDelegate.
I had been successfully loading playlists during HLS live playback using AVAssetResourceLoaderDelegate. However, after introducing Low-Latency HLS, I ran into a problem.
When AVPlayer loads the low-latency content natively, everything works fine. But when I load it through the delegate, I encounter the following error from AVPlayer's status observer:
CoreMediaErrorDomain -15410
Low Latency: Server must support http2 ECN and SACK
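For reference, a trimmed sketch of the delegate-based loading that triggers this (the custom scheme is a placeholder that my real code maps back to https before fetching):

import AVFoundation

final class PlaylistLoader: NSObject, AVAssetResourceLoaderDelegate {

    private let loaderQueue = DispatchQueue(label: "playlist-loader")

    // A custom-scheme URL is required so AVPlayer consults the delegate at all.
    func makePlayerItem(from url: URL) -> AVPlayerItem {
        let asset = AVURLAsset(url: url)
        asset.resourceLoader.setDelegate(self, queue: loaderQueue)
        return AVPlayerItem(asset: asset)
    }

    func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                        shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        // Fetch the playlist ourselves, then hand it back via
        // loadingRequest.dataRequest?.respond(with:) and loadingRequest.finishLoading().
        return true
    }
}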
Playback does not stop, so it might seem like there is no problem, but a critical part is missing: playback does not achieve the expected low latency and behaves like standard HLS.
Additionally, this behavior only occurs on iOS 16 devices and simulators. On iOS 17 simulators and devices, the error message does not appear and the latency remains low as expected.
I therefore suspect a misjudgment in the verification process inside AVPlayer's internal implementation.
Since our app needs to support iOS 16, I would appreciate any solutions, methods to try, or workarounds that you could share regarding this issue.
Thank you.
Hi
I'm trying to stream an H.264 video feed coming from a Uniview IP camera in a browser, but the stream just doesn't display; I either get a single frame or a black screen. I get the same issue in Safari on the Mac and in any browser on an iPhone. However, the video stream works just fine using hls.js on Windows or Android.
We are grabbing the RTSP stream from the camera and using nginx to serve the .m3u8 URL. However, even if we save the stream to a file and try to play it on the iPhone, it has the same issue (unless we use a separate media player like VLC).
I know that if we use ffmpeg to re-encode the video as H.264 rather than copy it, it will play. My guess is that there is an incompatibility between how Uniview encodes the video and what Apple can accept.
I've asked Uniview and they are not sure what the problem is either.
Is there a way to get more debug information on why a particular HLS stream is failing in Safari on the Mac or iPhone?
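Two things that have surfaced more detail for us than Safari does: Apple's mediastreamvalidator tool on macOS, which reports stream conformance problems, and watching AVPlayer's error log from a small native test harness. A sketch of the latter, assuming an existing AVPlayerItem:

import AVFoundation

// Sketch: log each new HLS error-log entry for an existing player item.
func watchErrorLog(of item: AVPlayerItem) -> NSObjectProtocol {
    NotificationCenter.default.addObserver(
        forName: .AVPlayerItemNewErrorLogEntry,
        object: item,
        queue: .main
    ) { _ in
        guard let event = item.errorLog()?.events.last else { return }
        print("HLS error \(event.errorStatusCode) \(event.errorDomain): \(event.errorComment ?? "")")
    }
}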
The list of certificates in the Apple Developer web console shows the expiry of my FairPlay Streaming certificate as "Never".
However, if I download the same certificate and import it into my Keychain, the certificate details show the listed expiry as 11 Oct 2023.
Which of these is correct? If the expiry in the certificate is correct, how do I renew it safely?
In my app, the lines below fail during the streamingContentKeyRequestData call:

guard
    let contentIdData = (loadingRequest.request.url?.host ?? "").data(using: .utf8),
    let spcData = try? loadingRequest.streamingContentKeyRequestData(
        forApp: certificate!, // This certificate is expired
        contentIdentifier: contentIdData,
        options: nil
    )
else {
    print("Error: Failed to generate SPC data due to expired certificate.")
    loadingRequest.finishLoading(with: NSError(domain: "com.example.error", code: -3, userInfo: nil))
    return false
}
After updating to iOS 18, FairPlay content does not play.
We get an error: CoreMediaErrorDomain Code=-12891.
This error occurs after a CKC message is sent. What does this error mean? Everything works fine on iOS < 18.
I have an HDR10+ encoded video that plays back on the Apple Vision Pro if loaded as a .mov, but when that video is encoded using the latest (1.23b) Apple HLS tools to generate an fMP4, the resulting m3u8 cannot be played back on the Apple Vision Pro; I only get a "Cannot Open" error.
To generate the m3u8, I'm just calling mediafilesegmenter (with -iso-fragmented) and then variantplaylistcreator. This completes with no errors, but while the m3u8 plays back on the Mac using VLC, it does not on the Apple Vision Pro.
The relevant part of the m3u8 is:
#EXT-X-STREAM-INF:AVERAGE-BANDWIDTH=40022507,BANDWIDTH=48883974,VIDEO-RANGE=PQ,CODECS="ec-3,hvc1.1.60000000.L180.B0",RESOLUTION=4096x4096,FRAME-RATE=24.000,CLOSED-CAPTIONS=NONE,AUDIO="audio1",REQ-VIDEO-LAYOUT="CH-STEREO"
{{url}}
Has anyone been able to use the HLS tools to generate fMP4s of MV-HEVC videos with HDR10?
When playing several short HLS clips using AVPlayer connected to a TV through Apple's Lightning-to-HDMI adapter (A1438), we often fail with these unknown errors:
CoreMediaErrorDomain -12034
and
CoreMediaErrorDomain -12158
Does anyone have a clue what these errors mean?
Environment:
iPhone 8
iOS 15.4
Lightning-to-HDMI adapter (A1438)