Post not yet marked as solved
I've been encountering a substantial increase in the following error log and am eager to find its root cause. These logs appear predominantly when attempting to play downloaded FairPlay Streaming (FPS) DRM files (.movpkg files). Except for a few rare instances, most occurrences involve content downloaded on previous OS versions, leading to playback issues after recent OS updates.
The error log I've been encountering is as follows:
Error Domain=CoreMediaErrorDomain Code=-16845 "HTTP 400: (unhandled)"
Even after searching, I could find hardly any related cases; the only thing I turned up is these issues:
https://github.com/jhomlala/betterplayer/issues?q=is%3Aissue+16845+is%3Aclosed
I've been advising users to delete and re-download the affected content, which, in all cases, results in successful playback.
I'm seeking advice from anyone who might have experienced similar issues. If you've encountered a comparable situation or have any suggestions, I would greatly appreciate your input.
My project is a TV player app for HLS streams with FairPlay encryption. It is built with SwiftUI for iPhone and iPad, and it is in production.
I have enabled the target "Mac (Designed for iPad)" in the project settings, and it works perfectly on Macs with M1 chips when the app is installed from the Mac App Store.
The Mac version has never been the main focus, but it is nice to have it working so easily.
However, when I run the app from Xcode by selecting "My Mac (Designed for iPad)", every time AVPlayer wants to start playback I am ejected from the app, and the only thing I get in the console is:
Message from debugger: Terminated due to signal 9
Why? And why does it work when running the app published on the App Store?
I was able to debug a bit and identify which line of code triggers the issue but I am still stuck:
I am using an AVAssetResourceLoaderDelegate to load the FairPlay keys instead of the default mechanism (because I need some authentication parameters in the HTTP headers to communicate with the DRM proxy).
So, in the process I am able to request the SPC data and the CKC (I have verified the data), and then when loadingRequest.finishLoading() is called... BOOM, the app is terminated and the log Message from debugger: Terminated due to signal 9 appears.
I am sharing the delegate method from the AVAssetResourceLoaderDelegate where it happens. It was written a while ago and runs fine on all devices. If you are not familiar with this delegate: AVPlayer invokes it whenever a new media item is set with AVPlayer.replaceCurrentItem(with: mediaItem).
func resourceLoader(_ resourceLoader: AVAssetResourceLoader, shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
    guard let dataRequest = loadingRequest.dataRequest else { return false }
    getFairplaycertificate { data, _ in
        // Request Server Playback Context (SPC) data
        guard
            let certificate = data,
            let contentIdData = (loadingRequest.request.url?.host ?? "").data(using: String.Encoding.utf8),
            let spcData = try? loadingRequest.streamingContentKeyRequestData(
                forApp: certificate,
                contentIdentifier: contentIdData,
                options: [AVContentKeyRequestProtocolVersionsKey: [1]]
            )
        else {
            loadingRequest.finishLoading(with: NSError(domain: "tvplayer", code: -1, userInfo: nil))
            print("⚠️", #function, "Unable to get SPC data.")
            return false
        }
        // Now the CKC can be requested
        let networkId = loadingRequest.request.url?.host ?? ""
        self.requestCKC(spcData: spcData, contentId: networkId) { ckc, error in
            if error == nil && ckc != nil {
                // The CKC is correctly returned and is sent to AVPlayer. Stream is decrypted.
                dataRequest.respond(with: ckc!)
                loadingRequest.contentInformationRequest?.contentType = AVStreamingKeyDeliveryContentKeyType
                loadingRequest.finishLoading() // <--- THIS LINE IS GUILTY!!!
            } else {
                print("⚠️", #function, "Unable to get CKC.")
                loadingRequest.finishLoading(with: error)
            }
        }
        return true
    }
    return true
}
If I comment out loadingRequest.finishLoading(), or if I replace it with loadingRequest.finishLoading(with: error), the app is not terminated, but my decryption keys are not loaded. That's the only clue I have so far.
Any help would be appreciated.
What should I look for?
Is there a way to get a detailed error stacktrace?
Thanks.
Post not yet marked as solved
Hey Apple!
I'm just wondering if there are any recommendations on best practices for supporting AV experiences in SwiftUI?
As far as I know, VideoPlayer is the only API in AVKit (https://developer.apple.com/documentation/avkit) directly supported in SwiftUI, without the need for UIViewRepresentable / UIViewControllerRepresentable bridging of AVPlayer into SwiftUI.
However, there are many core video and audio experiences that a modern audience expects that are not supported in VideoPlayer, e.g. Picture in Picture (PiP).
Is there a roadmap for support in SwiftUI directly?
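In the meantime, a common workaround for features like PiP is to bridge AVPlayerViewController into SwiftUI with UIViewControllerRepresentable. A minimal sketch (the PlayerView name is mine):

```swift
import SwiftUI
import AVKit

// Minimal bridge: AVPlayerViewController supports PiP, which VideoPlayer does not expose.
struct PlayerView: UIViewControllerRepresentable {
    let player: AVPlayer

    func makeUIViewController(context: Context) -> AVPlayerViewController {
        let controller = AVPlayerViewController()
        controller.player = player
        controller.allowsPictureInPicturePlayback = true
        return controller
    }

    func updateUIViewController(_ uiViewController: AVPlayerViewController, context: Context) {}
}
```

PiP additionally requires an AVAudioSession in the .playback category and the "Audio, AirPlay, and Picture in Picture" background mode capability.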
Thanks!
Post not yet marked as solved
We get just a black screen, with no audio or video!
Post not yet marked as solved
Hard/Software: iPad 7, iPadOS 17.0.3, MobileVLCKit, SDP/RTSP stream, USB/Ethernet cable.
Description: I have an iPad application for live video streaming. For security reasons, it should work via cable (USB/Ethernet). The application uses the MobileVLCKit player to play video. Videos are available via URLs with an IP address and use the SDP/RTSP format (URL example: http://172.20.129.69/stream/sdp).
Code example:
let camUri = "http://172.20.129.69/stream/sdp"
let options = ["--network-caching=100"]
var players: [VLCMediaPlayer] = []
players.append(VLCMediaPlayer(options: options))
let media = VLCMedia(url: URL(string: camUri)!) // VLCMedia(url:) takes a URL, not a String
players[0].media = media
players[0].drawable = centerView
players[0].play()
Problem: The application worked on various iPads with iPad OS 16 and lower (using cable or Wi-Fi). After updating to iPad OS 17, the application no longer works via cable.
What could be the problem and how to fix it?
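One thing that may be worth ruling out (a guess, since the symptom appeared with an OS update): on recent iPadOS versions, connecting to a device by a private IP such as 172.20.x.x triggers the Local Network privacy check, which requires a usage description in Info.plist:

```xml
<key>NSLocalNetworkUsageDescription</key>
<string>This app plays live video from cameras on the local network.</string>
```

Without it, local network traffic may fail silently while the same stream works over other routes.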
Post not yet marked as solved
We have a web application which has a stalled HTML video element issue on iOS Safari, iOS version 17.0.3.
The HTML page contains an inline script tag, for example
<script> ... </script>, which runs immediately when the page loads.
The script does following
videoElement.src = url;
videoElement.load();
The url is a HLS manifest url.
After the DOM elements are created, we attach the videoElement to a <div>, expecting the video to eventually reach canplaythrough and start playing.
Actual Behavior:
The video never plays.
videoElement readyState and networkState are both stuck at a value of 1.
We found that a "suspend" event was triggered on the video element, and we are not sure what is triggering it.
Temporary Mitigation:
When the video is stalled, if we call videoElement.load() manually in the Safari JS console, readyState and networkState increase, HLS video segments start being fetched, and the video eventually reaches canplaythrough.
This happens only on iOS Safari, not macOS Safari.
We suspect it's because, at the beginning, when videoElement.src was set and .load() was called, the videoElement was not attached to any div, so iOS decided to stop it to save battery. But it's a completely uneducated guess, and any help would be appreciated.
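If the detached-element theory is right, one low-risk experiment is to attach the element to the DOM before setting src and calling load(), and to retry the load when a suspend arrives. A sketch (the container id and manifest URL are placeholders):

```javascript
// Sketch: insert the <video> into the DOM before starting the HLS load,
// so iOS Safari does not treat it as a detached element it may suspend.
const url = 'https://example.com/stream.m3u8'; // placeholder HLS manifest URL
const videoElement = document.createElement('video');
videoElement.playsInline = true; // required for inline playback on iOS
document.getElementById('player-container').appendChild(videoElement);

videoElement.addEventListener('suspend', () => {
  // If Safari suspends the load anyway, nudging it with load() resumes
  // fetching, mirroring the manual console mitigation described above.
  if (videoElement.readyState < HTMLMediaElement.HAVE_FUTURE_DATA) {
    videoElement.load();
  }
});

videoElement.src = url;
videoElement.load();
```

Retrying on every suspend can loop if the element is genuinely idle, so in practice you may want to guard the handler with a retry counter.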
Thanks !
Post not yet marked as solved
I am trying to play multiple HLS streams with AVPlayer. Starting with 4 cameras it works fine, but whenever I change cameras, playback gets stuck in
AVPlayer.TimeControlStatus.waitingToPlayAtSpecifiedRate. It works fine on iOS 15 but has issues on iOS 16 and 17. I have checked with different network speeds, so bandwidth is fine.
Post not yet marked as solved
Hello community,
I'm encountering a perplexing issue with my React Native app on iOS, specifically related to network connectivity when making API calls using Axios. The problem manifests as intermittent network errors, and what's even more puzzling is that the issue seems to be specific to iOS devices and Apple browsers.
Here's a brief overview of the problem:
Intermittent Network Errors: Occasionally, when making API calls using Axios in my React Native app on iOS, I receive network errors. The strange part is that this issue is sporadic and doesn't occur consistently.
Works on Cellular Network: When the app encounters these network issues on WiFi, I've observed that switching to a cellular network resolves the problem, and the API calls start working again.
Android and Other Devices Are Unaffected: Interestingly, the app works flawlessly on Android devices and other platforms. The issue appears to be isolated to iOS and Apple browsers.
Has anyone else in the community faced a similar problem or have any insights into what might be causing this? I've already ruled out general connectivity issues, as the app works perfectly on other devices and networks.
Any suggestions, tips, or shared experiences would be greatly appreciated. I'm open to trying out different approaches or debugging techniques to get to the bottom of this issue.
Thanks in advance for your assistance!
Post not yet marked as solved
Hi there, I want to start playing a stream of audio as soon as the first chunk arrives. The data stream looks like this:
Audio_data, Delimiter, Json_data.
Currently I handle all chunks before the delimiter and add them to the queue of the AVQueuePlayer. However, playing this audio while it streams produces many glitches and does not work well. Waiting until all chunks have arrived and then playing the audio works well, so I assume there is no problem with the audio data itself, but with handling the chunks as they arrive and playing them immediately.
Happy about any advice you have! I am pretty lost right now.
Thank you so much.
import SwiftUI
import AVFoundation

struct AudioStreamView: View {
    @State private var players: [AVAudioPlayer] = []
    @State private var jsonString: String = ""
    @State private var queuePlayer = AVQueuePlayer()
    var streamDelegate = AudioStreamDelegate()

    var body: some View {
        VStack(spacing: 20) {
            Button("Fetch Stream") {
                fetchDataFromServer()
            }
            .padding()
            TextEditor(text: $jsonString)
                .disabled(true)
                .border(Color.gray)
                .padding()
                .frame(minHeight: 200, maxHeight: .infinity)
        }
    }

    func fetchDataFromServer() {
        guard let url = URL(string: "https://dev-sonia.riks0trv4c6ns.us-east-1.cs.amazonlightsail.com/voiceMessage") else { return }
        var request = URLRequest(url: url)
        request.httpMethod = "POST" // Specify the request type as POST
        let parameters: [String: Any] = [
            "message_uuid": "value1",
            "user_uuid": "68953DFC-B9EA-4391-9F32-0B36A34ECF56",
            "session_uuid": "value3",
            "timestamp": "value4",
            "voice_message": "Whats up?"
        ]
        request.httpBody = try? JSONSerialization.data(withJSONObject: parameters, options: .fragmentsAllowed)
        request.addValue("application/json", forHTTPHeaderField: "Content-Type")
        // Note: `[weak self]` is not valid here, since AudioStreamView is a struct (a value type).
        let task = URLSession.shared.dataTask(with: request) { data, response, error in
            if let error = error {
                print("Error occurred: \(error)")
                return
            }
            // You might want to handle the server's response more effectively based on the API's design.
            // For now, I'll assume that the server returns the audio URL in the response JSON.
            if let data = data {
                do {
                    if let jsonResponse = try JSONSerialization.jsonObject(with: data, options: []) as? [String: Any],
                       let audioURLString = jsonResponse["audioURL"] as? String,
                       let audioURL = URL(string: audioURLString) {
                        DispatchQueue.main.async {
                            playAudioFrom(url: audioURL)
                            jsonString = String(data: data, encoding: .utf8) ?? "Invalid JSON"
                        }
                    } else {
                        print("Invalid JSON structure.")
                    }
                } catch {
                    print("JSON decoding error: \(error)")
                }
            }
        }
        task.resume()
    }

    func playAudioFrom(url: URL) {
        let playerItem = AVPlayerItem(url: url)
        queuePlayer.replaceCurrentItem(with: playerItem)
        queuePlayer.play()
    }
}
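As an alternative to queueing one AVPlayerItem per chunk (which tends to glitch at item boundaries), the chunks can be scheduled back-to-back on an AVAudioEngine player node. This is only a sketch and assumes each chunk can be decoded to an AVAudioPCMBuffer (compressed chunks such as AAC would first go through AVAudioConverter):

```swift
import AVFoundation

// Sketch: scheduled buffers play in order with no gap between them,
// unlike separate AVQueuePlayer items.
final class ChunkedAudioPlayer {
    private let engine = AVAudioEngine()
    private let playerNode = AVAudioPlayerNode()

    init(format: AVAudioFormat) throws {
        engine.attach(playerNode)
        engine.connect(playerNode, to: engine.mainMixerNode, format: format)
        try engine.start()
        playerNode.play()
    }

    // Call as each chunk is decoded; playback starts with the first buffer
    // and continues seamlessly through subsequently scheduled ones.
    func enqueue(_ buffer: AVAudioPCMBuffer) {
        playerNode.scheduleBuffer(buffer, completionHandler: nil)
    }
}
```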
Post not yet marked as solved
I've been working with the eligibleForHDRPlayback property to determine whether HDR playback is supported. However, I've noticed an inconsistency: when the video format is switched from HDR to SDR in the Settings menu on Apple TV, the property still returns true, indicating HDR is playable even when it's not (this seems to contradict what is mentioned around the [20:40] mark of this WWDC video).
I've tried using the eligibleForHDRPlaybackDidChangeNotification and even restarting the app, but I still encounter the same issue.
Are there alternative approaches to accurately determine if the app can play HDR content on Apple TV?
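On tvOS there is also AVPlayer.availableHDRModes, which reflects what the connected display currently supports; cross-checking it against the eligibility flag may catch the HDR-to-SDR switch. A sketch (I have not verified that it tracks the Settings toggle either):

```swift
import AVFoundation

// Sketch (tvOS): if the Apple TV is switched to SDR output, the set of
// available HDR modes should come back empty even if the eligibility
// flag still reports true.
func canPlayHDRNow() -> Bool {
    let hasHDRMode = !AVPlayer.availableHDRModes.isEmpty
    return AVPlayer.eligibleForHDRPlayback && hasHDRMode
}
```

There is a matching AVPlayer.availableHDRModesDidChangeNotification to observe changes.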
Post not yet marked as solved
Hello! I'm getting reports of multiple users unable to play HLS content in Safari after upgrading to iOS 17.
Post not yet marked as solved
For live content, if the user pauses playback and their current playhead falls out of the valid DVR window, AVPlayer will stop reporting seekableTimeRanges. Why does this happen? Is there a workaround?
For example:
Suppose we are streaming content with a 5-minute DVR window
User pauses playback for 6 minutes
Their current position is now outside of the valid seekable time range
AVPlayer stops reporting seekableTimeRanges altogether
This is problematic for two reasons:
We have observed that AVPlayer generally becomes unresponsive when this happens, i.e. any seek action will cause the player to freeze up
Without knowing the seekable range, we don't know how to return the user to the live edge when they resume playback
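One workaround sketch: cache the last valid live edge while seekableTimeRanges is still being reported, so a seek target exists after a long pause. The class and its logic are mine, not a documented remedy:

```swift
import AVFoundation

// Sketch: remember the last reported live edge so we can still return
// there once AVPlayer stops reporting seekableTimeRanges.
final class LiveEdgeTracker {
    private var observation: NSKeyValueObservation?
    private(set) var lastKnownLiveEdge: CMTime = .invalid

    func track(_ item: AVPlayerItem) {
        observation = item.observe(\.seekableTimeRanges, options: [.new]) { [weak self] item, _ in
            guard let range = item.seekableTimeRanges.last?.timeRangeValue else { return }
            self?.lastKnownLiveEdge = CMTimeRangeGetEnd(range)
        }
    }

    // On resume, seek back toward the cached edge; adding the paused
    // wall-clock time would approximate the new live edge.
    func returnToLiveEdge(with player: AVPlayer) {
        guard lastKnownLiveEdge.isValid else { return }
        player.seek(to: lastKnownLiveEdge)
    }
}
```

This does not address the reported freeze on seek, but it at least restores a known target for resuming near live.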
Seems to be the same issue described in this thread:
https://developer.apple.com/forums/thread/45850
Post not yet marked as solved
Hi guys,
Setting AVPlayerViewController.transportBarCustomMenuItems is not working on tvOS. I still see the 2 default icons for Audio and Subtitles.
let menuItemAudioAndSubtitles = UIMenu(
    image: UIImage(systemName: "heart")
)
playerViewController.transportBarCustomMenuItems = [menuItemAudioAndSubtitles]
The WWDC 2021 video is insufficient to make this work.
https://developer.apple.com/videos/play/wwdc2021/10191/
The video doesn't say what exactly I need to do.
Do I need to disable subtitle options?
viewController.allowedSubtitleOptionLanguages = []
This didn't work and I still see the default icon loaded by the player.
Do I need to create subclass of AVPlayerViewController?
I just want to replace those 2 default icons with 1 icon as a test, but I was unsuccessful after many hours of work.
Is it mandatory to define child menu items to the main item?
Or do I perhaps need to define UIAction?
The documentation and video are insufficient in providing guidance how to do that.
I did something like this before, but that was more than 3 years ago, and audio and subtitles were showing at the top of the player screen as tabs, if I remember correctly.
Is transportBarCustomMenuItems perhaps deprecated?
Is it possible that when AVPlayerItem loads and detects audio and subtitles in the stream, it automatically resets the AVPlayerViewController menu? How do I suppress this behavior?
I'm currently loading AVPlayerViewController into SwiftUI interface. Is that perhaps the problem? Should I write SwiftUI player overlay from scratch?
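For what it's worth, one thing to try before rewriting the overlay: give the UIMenu explicit UIAction children, since a menu with no children may have nothing to render. A minimal sketch (titles, images, and the action body are placeholders):

```swift
import AVKit
import UIKit

// Sketch: a UIMenu with UIAction children should render as a single icon
// in the transport bar whose tap reveals the child actions.
func configureTransportBar(for playerViewController: AVPlayerViewController) {
    let favorite = UIAction(title: "Favorite", image: UIImage(systemName: "heart")) { _ in
        // placeholder action
    }
    let menu = UIMenu(title: "Options",
                      image: UIImage(systemName: "heart"),
                      children: [favorite])
    playerViewController.transportBarCustomMenuItems = [menu]
}
```

As far as I know, this adds an icon alongside the built-in audio/subtitle items rather than replacing them.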
Thanks,
Robert
Post not yet marked as solved
Hello, I'm trying to create a working clear-key HLS stream using Shaka Packager.
Shaka Packager supports only SAMPLE-AES.
When implemented, it looks like the resulting stream is not playable in the Safari browser.
This is how the example M3U8 file starts:
#EXTM3U
#EXT-X-VERSION:6
## Generated with https://github.com/google/shaka-packager version v2.6.1-634af65-release
#EXT-X-TARGETDURATION:13
#EXT-X-PLAYLIST-TYPE:VOD
#EXT-X-MAP:URI="audio_und_2c_128k_aac_init.mp4"
#EXT-X-KEY:METHOD=SAMPLE-AES,URI="https://www.httpstest.com:771/key.key",IV=0x00000000000000000000000000000000,KEYFORMAT="identity"
#EXTINF:10.008,
audio_und_2c_128k_aac_1.mp4
#EXTINF:10.008,
audio_und_2c_128k_aac_2.mp4
#EXTINF:9.985,
audio_und_2c_128k_aac_3.mp4
#EXTINF:10.008,
audio_und_2c_128k_aac_4.mp4
#EXTINF:10.008,
audio_und_2c_128k_aac_5.mp4
#EXTINF:9.985,
audio_und_2c_128k_aac_6.mp4
#EXTINF:0.093,
audio_und_2c_128k_aac_7.mp4
#EXT-X-ENDLIST
I'm looking for:
A. a working example of a SAMPLE-AES clear-key encrypted HLS stream (so I can learn from it how it should be defined for iOS)
B. help on how to create a working clear-key HLS stream for iOS/macOS (Safari)
Post not yet marked as solved
I'm working on a live streaming encoder, but I cannot start playing a live stream correctly if the stream type (#EXT-X-PLAYLIST-TYPE) is EVENT.
When I try to play it, the stream starts from the beginning, not the current position.
When I tested on an iPhone 7 (iOS 15.7.8), live streams start correctly, but on an iPhone 8 Plus (iOS 16.6), live streams start from the beginning. (I tested on other iPhones with iOS 16 or later and the result is the same.)
I also tried adding the #EXT-X-START:TIME-OFFSET tag, but it didn't work.
Is this behavior a bug, or do I have to add some tag to make playback start at the live position?
Post not yet marked as solved
In our application, we play video-on-demand (VOD) content and display subtitles in different languages.
Post not yet marked as solved
Hi,
I have HLS content, i.e., an .m3u8 manifest file, but the segments are encoded with MPEG-2 video.
Is such encoding supported by HLS? Or does it only support H.264/AVC and HEVC/H.265?
Stream #0:0[0x281]: Video: mpeg2video (Main) ([2][0][0][0] / 0x0002), yuv420p(tv, bt470bg, top first), 720x576 [SAR 16:15 DAR 4:3], 3125 kb/s, 25 fps, 25 tbr, 90k tbn
Stream #0:1[0x201]: Audio: mp2 ([3][0][0][0] / 0x0003), 48000 Hz, stereo, fltp, 128 kb/s
Thanks.
Post not yet marked as solved
Hi, I am using HLS playback for live broadcasting, with AVPlayerItem.
For live streams, I found that seekableDuration always has some offset from the latest moment compared to the same playback on Chrome or Android. As far as I have dug into it, the difference approximately matches recommendedTimeOffsetFromLive (usually 6 to 9 seconds in my tests).
The problem is, I tried to minimize configuredTimeOffsetFromLive, but it does not have any effect. Even if I set it to 1 or 2 seconds, the offset is always the same as recommendedTimeOffsetFromLive. I tried changing automaticallyPreservesTimeOffsetFromLive as well, but nothing seems to work.
How do these properties work, and how can I minimize the time offset?
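For reference, a minimal sketch of how I understand these AVPlayerItem properties combine; whether the server can actually honor a small offset is the open question:

```swift
import AVFoundation

// Sketch: request a ~2 second live offset. If the stream's own constraints
// (target duration, EXT-X-SERVER-CONTROL HOLD-BACK, etc.) imply a larger
// minimum, the player may ignore the configured value, which could explain
// why the effective offset always matches recommendedTimeOffsetFromLive.
func configureLowLatency(_ item: AVPlayerItem) {
    item.automaticallyPreservesTimeOffsetFromLive = true
    item.configuredTimeOffsetFromLive = CMTime(seconds: 2, preferredTimescale: 600)
}
```

Checking the manifest for HOLD-BACK / PART-HOLD-BACK directives may show where the 6 to 9 second floor is coming from.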
I am facing an issue with video content that I have converted to an HLS playlist (using ffmpeg), added to an S3 bucket, and shared through a CloudFront distribution. My scenario is the following:
I have a bucket called bucket-a, with a "folder" video-1 which contains the following files:
output.m3u8
output0.ts
...
output15.ts
audio/
audio.aac
image.jpg
All items in bucket-a are blocked from public access through S3. Content is only vended through a CloudFront distribution which has origin bucket-a. I am able to access https://.cloudfront.net/path/output.m3u8 in a desktop browser without fail, and no errors are thrown. But the file output.m3u8 and all .ts files are not available in iPhone mobile browsers. The peculiar part is that this is not true for all playlist content in bucket-a. For example, I have a "folder" video-2 within bucket-a that has the same file structure as video-1 and is completely accessible through all mobile browsers.
Here is an example master playlist error: https://dbs3s11vyxuw0.cloudfront.net/bottle-promo/script_four/output.m3u8
Even more head-scratching is that I am able to access all the playlists that are within this playlist.
What I've tried:
Initially, I believed the issue was due to the way the video was transcoded, so I standardized the transcoding.
Then I believed the issue was due to CloudFront permissions, though those seem to be fine.
I've validated my stream here: https://ott.dolby.com/OnDelKits_dev/StreamValidator/Start_Here.html
I'm not sure which way to turn.
Post not yet marked as solved
Hi there,
I'm currently making a web application using WebRTC.
Even though all the SDP info and ICE candidates of the caller and callee are transmitted correctly, the connection keeps failing.
Other devices are functioning well; it's only on the iPhone (13) that it's not working.
I tried connecting on the same network, and that works. Therefore I think it's a problem with ICE candidates.
I read a similar post about avoiding this issue: when Safari's advanced option called "WebRTC platform UDP sockets" is disabled, it works.
Is there a way I can connect without tuning Safari's options?
Thanks.
FYI, this is one of my iPhone's ICE candidates:
candidate:842163049 1 udp 1685921535 118.235.10.100 50750 typ srflx raddr 0.0.0.0 rport 50750 generation 0 ufrag 7e7f network-id 3 network-cost 900