Post not yet marked as solved
Hello! We're seeing issues where multiple users are unable to play HLS content in Safari after upgrading to iOS 17.
Post not yet marked as solved
I've been working with the eligibleForHDRPlayback property to determine whether HDR playback is supported. However, I've noticed an inconsistency. When the video format is switched from HDR to SDR in the Settings menu on Apple TV, the property still returns true, indicating HDR is playable even when it's not. (This seems to contradict what was mentioned around the [20:40] mark of this WWDC video.)
I've tried using the eligibleForHDRPlaybackDidChangeNotification and even restarted the app, but I still encounter the same issue.
Are there alternative approaches to accurately determine if the app can play HDR content on Apple TV?
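For context, the API surface in question is small; here is a minimal sketch of checking eligibility and observing changes (this mirrors what was already tried above, so it restates the problem rather than solving it — the class name is my own):

```swift
import AVFoundation

// Minimal sketch: check HDR eligibility once, then re-check whenever
// AVPlayer posts eligibleForHDRPlaybackDidChangeNotification.
// Whether this reflects the Apple TV Settings change is exactly the
// open question above.
final class HDREligibilityObserver {
    private var token: NSObjectProtocol?

    var isHDRPlayable: Bool { AVPlayer.eligibleForHDRPlayback }

    func startObserving(onChange: @escaping (Bool) -> Void) {
        token = NotificationCenter.default.addObserver(
            forName: AVPlayer.eligibleForHDRPlaybackDidChangeNotification,
            object: nil,
            queue: .main
        ) { _ in
            // Re-read the class property when the system says it changed.
            onChange(AVPlayer.eligibleForHDRPlayback)
        }
    }

    deinit {
        if let token { NotificationCenter.default.removeObserver(token) }
    }
}
```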
Post not yet marked as solved
Hi there, I want to play a stream of audio immediately after the first chunk of the audio arrives. The data stream looks like this
Audio_data, Delimiter, Json_data.
Currently I handle all chunks before the delimiter and add them to the queue of the AVQueuePlayer. However, when playing the audio during the stream there are many glitches and it does not work well. Waiting until all chunks have arrived and then playing the audio works well. So I assume there is no problem with the audio data itself, but with how the chunks are handled and played as they arrive.
Happy about any advice you have! I am pretty lost right now.
Thank you so much.
import SwiftUI
import AVFoundation

struct AudioStreamView: View {
    @State private var players: [AVAudioPlayer] = []
    @State private var jsonString: String = ""
    @State private var queuePlayer = AVQueuePlayer()
    var streamDelegate = AudioStreamDelegate()

    var body: some View {
        VStack(spacing: 20) {
            Button("Fetch Stream") {
                fetchDataFromServer()
            }
            .padding()
            TextEditor(text: $jsonString)
                .disabled(true)
                .border(Color.gray)
                .padding()
                .frame(minHeight: 200, maxHeight: .infinity)
        }
    }

    func fetchDataFromServer() {
        guard let url = URL(string: "https://dev-sonia.riks0trv4c6ns.us-east-1.cs.amazonlightsail.com/voiceMessage") else { return }
        var request = URLRequest(url: url)
        request.httpMethod = "POST" // Specify the request type as POST
        let parameters: [String: Any] = [
            "message_uuid": "value1",
            "user_uuid": "68953DFC-B9EA-4391-9F32-0B36A34ECF56",
            "session_uuid": "value3",
            "timestamp": "value4",
            "voice_message": "Whats up?"
        ]
        request.httpBody = try? JSONSerialization.data(withJSONObject: parameters)
        request.addValue("application/json", forHTTPHeaderField: "Content-Type")
        // Note: a SwiftUI View is a value type, so a `[weak self]` capture does
        // not compile here; the view is captured by value instead.
        let task = URLSession.shared.dataTask(with: request) { data, response, error in
            if let error = error {
                print("Error occurred: \(error)")
                return
            }
            // Assumes the server returns the audio URL in the response JSON.
            if let data = data {
                do {
                    if let jsonResponse = try JSONSerialization.jsonObject(with: data, options: []) as? [String: Any],
                       let audioURLString = jsonResponse["audioURL"] as? String,
                       let audioURL = URL(string: audioURLString) {
                        DispatchQueue.main.async {
                            playAudioFrom(url: audioURL)
                            jsonString = String(data: data, encoding: .utf8) ?? "Invalid JSON"
                        }
                    } else {
                        print("Invalid JSON structure.")
                    }
                } catch {
                    print("JSON decoding error: \(error)")
                }
            }
        }
        task.resume()
    }

    func playAudioFrom(url: URL) {
        let playerItem = AVPlayerItem(url: url)
        queuePlayer.replaceCurrentItem(with: playerItem)
        queuePlayer.play()
    }
}
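In case it helps: one possible direction (a sketch under assumptions, not a verified fix) is to enqueue each arriving chunk as its own AVPlayerItem instead of calling replaceCurrentItem(with:), which discards whatever is currently playing and can itself cause audible glitches. The writeChunkToTemporaryFile helper and the file extension are assumptions:

```swift
import AVFoundation

// Sketch: append each chunk to the AVQueuePlayer rather than replacing
// the current item, so earlier chunks finish playing.
final class ChunkedAudioQueue {
    let queuePlayer = AVQueuePlayer()

    func enqueue(chunk: Data) throws {
        let url = try writeChunkToTemporaryFile(chunk) // hypothetical helper
        let item = AVPlayerItem(url: url)
        // Passing the last item (or nil for an empty queue) appends at the end.
        queuePlayer.insert(item, after: queuePlayer.items().last)
        if queuePlayer.timeControlStatus != .playing {
            queuePlayer.play()
        }
    }

    private func writeChunkToTemporaryFile(_ data: Data) throws -> URL {
        let url = FileManager.default.temporaryDirectory
            .appendingPathComponent(UUID().uuidString)
            .appendingPathExtension("mp3") // assumed container; adjust to your stream
        try data.write(to: url)
        return url
    }
}
```

Two caveats: transitions between queued items are not guaranteed to be gapless, and arbitrary byte-splits of a compressed stream are not necessarily independently decodable. If the chunks are not self-contained files, decoding to PCM and scheduling buffers on an AVAudioPlayerNode is often the more reliable route for glitch-free streaming.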
Post not yet marked as solved
Hello community,
I'm encountering a perplexing issue with my React Native app on iOS, specifically related to network connectivity when making API calls using Axios. The problem manifests as intermittent network errors, and what's even more puzzling is that the issue seems to be specific to iOS devices and Apple browsers.
Here's a brief overview of the problem:
Intermittent Network Errors: Occasionally, when making API calls using Axios in my React Native app on iOS, I receive network errors. The strange part is that this issue is sporadic and doesn't occur consistently.
Works on Cellular Network: When the app encounters these network issues on WiFi, I've observed that switching to a cellular network resolves the problem, and the API calls start working again.
Android and Other Devices Are Unaffected: Interestingly, the app works flawlessly on Android devices and other platforms. The issue appears to be isolated to iOS and Apple browsers.
Has anyone else in the community faced a similar problem or have any insights into what might be causing this? I've already ruled out general connectivity issues, as the app works perfectly on other devices and networks.
Any suggestions, tips, or shared experiences would be greatly appreciated. I'm open to trying out different approaches or debugging techniques to get to the bottom of this issue.
Thanks in advance for your assistance!
Post not yet marked as solved
I am trying to play multiple HLS streams with AVPlayer. Starting with 4 cameras it works fine, but whenever I change cameras, playback gets stuck in
AVPlayer.TimeControlStatus.waitingToPlayAtSpecifiedRate. It works fine on iOS 15 but has issues on iOS 16 and 17. I have checked with different network speeds, so bandwidth is not the problem.
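For anyone debugging a similar stall, a small diagnostic sketch (standard AVFoundation KVO; the class name is my own) that logs why the player is waiting, which at least distinguishes buffering from other causes:

```swift
import AVFoundation

// Diagnostic sketch: log the reason a player sits in
// .waitingToPlayAtSpecifiedRate (e.g. .toMinimizeStalls,
// .evaluatingBufferingRate, .noItemToPlay).
final class PlayerStallDiagnostics {
    private var observations: [NSKeyValueObservation] = []

    func attach(to player: AVPlayer) {
        observations.append(player.observe(\.timeControlStatus, options: [.new]) { player, _ in
            if player.timeControlStatus == .waitingToPlayAtSpecifiedRate {
                print("Waiting, reason:", player.reasonForWaitingToPlay?.rawValue ?? "unknown")
            }
        })
        observations.append(player.observe(\.currentItem?.isPlaybackLikelyToKeepUp, options: [.new]) { player, _ in
            print("likelyToKeepUp:", player.currentItem?.isPlaybackLikelyToKeepUp ?? false)
        })
    }
}
```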
Post not yet marked as solved
We have a web application that has a stalled HTML video element issue in iOS Safari, iOS version 17.0.3.
The HTML page contains an inline script tag, for example
<script> ... </script>, which runs immediately when the page loads.
The script does following
videoElement.src = url;
videoElement.load();
The url is an HLS manifest URL.
After the DOM elements are created, we attach the videoElement to a <div> and expect the video to eventually reach canplaythrough and start playing.
Actual Behavior:
The video never plays
videoElement readyState and networkState are both stuck at a value of 1
we found that a "suspend" event was fired on the video element, and we are not sure what triggers it.
Temporary Mitigation:
When the video is stalled, if we call videoElement.load() manually in the Safari JS console, the readyState and networkState increase, we see HLS video segments being fetched, and the video eventually reaches canplaythrough.
This happens only in iOS Safari, not macOS Safari.
We suspect it's because, at the beginning, when videoElement.src is set and .load() is called, the videoElement is not attached to any div, so iOS decides to stop it to save battery. But that's a completely uneducated guess, and any help would be appreciated.
Thanks !
Post not yet marked as solved
Hardware/software: iPad 7, iPadOS 17.0.3, MobileVLCKit, SDP/RTSP stream, USB/Ethernet cable.
Description: I have an iPad application for live video streaming. For security reasons, it must work over cable (USB/Ethernet). The application uses the MobileVLCKit player to play video. Videos are available via URLs containing an IP address and use the SDP/RTSP format (URL example: http://172.20.129.69/stream/sdp).
Code example:
let camUri = URL(string: "http://172.20.129.69/stream/sdp")! // VLCMedia(url:) expects a URL, not a String
let options = ["--network-caching=100"]
var players: [VLCMediaPlayer] = []
players.append(VLCMediaPlayer(options: options))
let media = VLCMedia(url: camUri)
players[0].media = media
players[0].drawable = centerView
players[0].play()
Problem: The application worked on various iPads with iPad OS 16 and lower (using cable or Wi-Fi). After updating to iPad OS 17, the application no longer works via cable.
What could be the problem and how to fix it?
Post not yet marked as solved
We just get a black screen, with no audio or video!
Post not yet marked as solved
Hey Apple!
I'm just wondering if there are any recommendations on best practices for supporting AV experiences in SwiftUI?
As far as I know, VideoPlayer is the only API available directly supported in SwiftUI in AVKit (https://developer.apple.com/documentation/avkit) without the need for UIViewRepresentable / UIViewControllerRepresentable bridging of AVPlayer into SwiftUI.
However, there are many core video and audio experiences that a modern audience expects which are not supported in VideoPlayer, e.g. Picture in Picture (PiP).
Is there a roadmap for support in SwiftUI directly?
Thanks!
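Not an official answer, but the common workaround today is to bridge AVPlayerViewController (which does support PiP) into SwiftUI via UIViewControllerRepresentable; a minimal sketch:

```swift
import SwiftUI
import AVKit

// Sketch: wrap AVPlayerViewController so SwiftUI views get AVKit features
// (PiP, transport controls) that VideoPlayer does not expose directly.
struct PlayerView: UIViewControllerRepresentable {
    let player: AVPlayer

    func makeUIViewController(context: Context) -> AVPlayerViewController {
        let controller = AVPlayerViewController()
        controller.player = player
        controller.allowsPictureInPicturePlayback = true
        return controller
    }

    func updateUIViewController(_ uiViewController: AVPlayerViewController, context: Context) {
        // Nothing to update for this minimal example.
    }
}
```

Note that PiP also requires the audio session to be configured for playback and the background-audio capability to be enabled in the target.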
My project is a TV player app for HLS streams with FairPlay encryption. It is built with SwiftUI for iPhone and iPad, and it is in production.
I have enabled the target "Mac (Designed for iPad)" in the project settings, and it works perfectly on Macs with M1 chips when running the app from the Mac App Store.
The Mac version has never been the main focus, but it is nice to have it working so easily.
However, when I run the app from Xcode by selecting "My Mac (Designed for iPad)", every time AVPlayer wants to start playback I am ejected from the app, and the only thing I get in the console is:
Message from debugger: Terminated due to signal 9
Why? And why does it work when running the app published on the App Store?
I was able to debug a bit and identify which line of code triggers the issue but I am still stuck:
I am using an AVAssetResourceLoaderDelegate to load the FairPlay keys instead of the default mechanism (because I need some authentication parameters in the HTTP headers to communicate with the DRM proxy).
So, in the process I am able to request the SPC data and the CKC (I have verified the data), and then, when loadingRequest.finishLoading() is called, BOOM: the app is terminated and the log Message from debugger: Terminated due to signal 9 appears.
I am sharing the delegate method from the AVAssetResourceLoaderDelegate where it happens. This was written a while ago and runs fine on all devices. If you are not familiar with this delegate, it is called by AVPlayer whenever a new media item is set with AVPlayer.replaceCurrentItem(with: mediaItem).
func resourceLoader(_ resourceLoader: AVAssetResourceLoader, shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
    guard let dataRequest = loadingRequest.dataRequest else { return false }
    getFairplaycertificate { data, _ in
        // Request Server Playback Context (SPC) data
        guard
            let certificate = data,
            let contentIdData = (loadingRequest.request.url?.host ?? "").data(using: String.Encoding.utf8),
            let spcData = try? loadingRequest.streamingContentKeyRequestData(
                forApp: certificate,
                contentIdentifier: contentIdData,
                options: [AVContentKeyRequestProtocolVersionsKey: [1]]
            )
        else {
            loadingRequest.finishLoading(with: NSError(domain: "tvplayer", code: -1, userInfo: nil))
            print("⚠️", #function, "Unable to get SPC data.")
            return false
        }
        // Now the CKC can be requested
        let networkId = loadingRequest.request.url?.host ?? ""
        self.requestCKC(spcData: spcData, contentId: networkId) { ckc, error in
            if error == nil && ckc != nil {
                // The CKC is correctly returned and is sent to AVPlayer. Stream is decrypted
                dataRequest.respond(with: ckc!)
                loadingRequest.contentInformationRequest?.contentType = AVStreamingKeyDeliveryContentKeyType
                loadingRequest.finishLoading() // <--- THIS LINE IS GUILTY!!!
            } else {
                print("⚠️", #function, "Unable to get CKC.")
                loadingRequest.finishLoading(with: error)
            }
        }
        return true
    }
    return true
}
If I comment out loadingRequest.finishLoading(), or replace it with loadingRequest.finishLoading(with: error), the app is not terminated, but my decryption keys are not loaded. That's the only clue I have so far.
Any help would be appreciated.
What should I look for?
Is there a way to get a detailed error stacktrace?
Thanks.
Post not yet marked as solved
I've been encountering a substantial increase in the following error log and am eager to find its root cause. The pattern emerges predominantly when attempting to play downloaded FPS DRM files (MOVPKG files). Except for a few rare instances, most occurrences involve content downloaded on previous OS versions, leading to playback issues after recent OS updates.
The error log I've been encountering is as follows:
Error Domain=CoreMediaErrorDomain Code=-16845 "HTTP 400: (unhandled)"
Even after searching, there are hardly any cases available; the only related reports I found are these issues:
https://github.com/jhomlala/betterplayer/issues?q=is%3Aissue+16845+is%3Aclosed
I've been advising users to delete and re-download the affected content, which, in all cases, results in successful playback.
I'm seeking advice from anyone who might have experienced similar issues. If you've encountered a comparable situation or have any suggestions, I would greatly appreciate your input.
Post not yet marked as solved
What version of draft-pantos-hls-rfc8216bis does Apple currently support?
Post not yet marked as solved
I have an m3u8 like this:
#EXTM3U
#EXT-X-STREAM-INF:AVERAGE-BANDWIDTH=190000,BANDWIDTH=240000,RESOLUTION=240x160,FRAME-RATE=24.000,CODECS="avc1.42c01e,mp4a.40.2",CLOSED-CAPTIONS=NONE
tracks-v1a1/mono.m3u8?thumbnails=10
#EXT-X-IMAGE-STREAM-INF:BANDWIDTH=10000,RESOLUTION=240x160,CODECS="jpeg",URI="images-240x160/tpl-0-60-10.m3u8?thumbnails=10"
and I have no thumbnails in the Safari native player. Could you please tell me why?
Post not yet marked as solved
I'm encountering an issue with live video streaming on iOS 17 using AVPlayer with AVMutableMovie. I'm using a wss URL to stream video by capturing data in chunks (e.g., 5 seconds) and playing it. Upon completion of the 5-second segment, I load another 5 seconds using self.player.replaceCurrentItem(with: nextPlayerItem).
Despite listening to events via self.player.currentItem?.observe, the functionality works well on iOS 16 but consistently displays a blank video on iOS 17.
private func playNext() {
    let nextSet = self.dataCollector.getNextItem(length: self.configuration.frameDelay)
    if nextSet.count == self.configuration.frameDelay {
        var playerTime: Int = self.player.currentItem != nil ? Int(CMTimeGetSeconds(player.currentTime())) : 0
        var allData = Data()
        allData.appendAll(dataSet: dataCollector.getFileType())
        nextSet.forEach { data in
            playerTime += 1
            allData.append(data.getFragmentData())
            self.currentFragmentTimes.updateValue(data.getFragmentTime(), forKey: playerTime)
        }
        if allData.count > 0 {
            self.player.replaceCurrentItem(with: AVPlayerItem(asset: AVMutableMovie(data: allData, options: nil)))
            self.playerInitializedTime = nil
            self.player.play()
        }
    }
}
Post not yet marked as solved
I'm using mediafilesegmenter with a fragmented MP4 (hvc1) file as input and got this error:
Nov 23 2023 17:48:25.948: Fragmented MP4 is the only supported container format for the segmentation of HEVC content
Nov 23 2023 17:48:25.948: Unsupported media type 'hvc1' in track 0
Nov 23 2023 17:48:25.948: Unable to find any valid tracks to segment.
Segmenting failed (-12780).
Post not yet marked as solved
Hi Team,
Offline playback with AES-128 encryption
I'm downloading HLS content that is AES-128 encrypted and using the AVAssetResourceLoaderDelegate method shouldWaitForLoadingOfRequestedResource to parse the manifest to fetch the AES key URL. After fetching the key URL, I'll download and save the AES key locally. I will use the locally saved key to start the offline playback.
Since AVContentKeySession has been there for quite some time, is it okay to use the resource loader delegate method to parse and download the AES key?
Is there any chance that Apple will deprecate the downloading keys through the resource loader delegate?
Thanks,
Deepak.N
Post not yet marked as solved
Dear Apple Engineers,
First of all, thank you for this wonderful and very necessary native solution.
Question: Is it possible to use this API when processing HLS?
Thank you.
Post not yet marked as solved
I am reaching out to you as I am currently trying to solve an issue involving AVPlayer, and I have encountered a challenge related to handling errors for video segments.
In our implementation, we have noticed that AVPlayer makes continuous calls to fetch video segments when the segment requests return errors. AVPlayer retries for approximately 30 seconds before throwing an error. We observed this when video segments return 404 or 5xx errors. Please find the screenshot below for reference.
Is there any recommended approach or configuration setting that can restrict the number of calls AVPlayer makes in such failure scenarios?
I look forward to hearing from you soon and appreciate your support in resolving this matter.
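While waiting for an official answer: as far as I know, AVPlayer's segment retry behavior is not directly configurable, but you can monitor the item's error log and give up early yourself. A hedged sketch (the threshold and the tear-down policy are assumptions, not a documented mechanism):

```swift
import AVFoundation

// Sketch: count error-log entries on the AVPlayerItem and bail out after a
// few failures instead of waiting ~30 s for AVPlayer to fail on its own.
final class SegmentErrorWatcher {
    private var token: NSObjectProtocol?

    func watch(item: AVPlayerItem, maxErrors: Int = 3, onGiveUp: @escaping () -> Void) {
        token = NotificationCenter.default.addObserver(
            forName: AVPlayerItem.newErrorLogEntryNotification,
            object: item,
            queue: .main
        ) { _ in
            let count = item.errorLog()?.events.count ?? 0
            print("error log entries:", count)
            if count >= maxErrors {
                // e.g. replaceCurrentItem(with: nil) and surface an error to the UI
                onGiveUp()
            }
        }
    }

    deinit { if let token { NotificationCenter.default.removeObserver(token) } }
}
```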
Post not yet marked as solved
We're experimenting with a stream that has a large (10-minute) clear portion in front of the FairPlay-protected section.
We're noticing that AVPlayer/Safari trigger calls to fetch the license key even while playing the clear part, and once we provide the key, playback fails with:
name = AVPlayerItemFailedToPlayToEndTimeNotification, object = Optional(<AVPlayerItem: 0x281ff2800> I/NMU [No ID]), userInfo = Optional([AnyHashable("AVPlayerItemFailedToPlayToEndTimeErrorKey"): Error Domain=CoreMediaErrorDomain Code=-12894 "(null)"])
- name : "AVPlayerItemFailedToPlayToEndTimeNotification"
- object : <AVPlayerItem: 0x281ff2800> I/NMU [No ID]
▿ userInfo : 1 element
▿ 0 : 2 elements
▿ key : AnyHashable("AVPlayerItemFailedToPlayToEndTimeErrorKey")
- value : "AVPlayerItemFailedToPlayToEndTimeErrorKey"
- value : Error Domain=CoreMediaErrorDomain Code=-12894 "(null)"
It seems like AVPlayer is trying to decrypt the clear portion of the stream...and I'm wondering if it's because we've set up our manifest incorrectly.
Here it is:
#EXTM3U
#EXT-X-VERSION:8
#EXT-X-TARGETDURATION:20
#EXT-X-MEDIA-SEQUENCE:0
#EXT-X-INDEPENDENT-SEGMENTS
#EXT-X-PLAYLIST-TYPE:VOD
#EXT-X-MAP:URI="clear-asset.mp4",BYTERANGE="885@0"
#EXT-X-DEFINE:NAME="path0",VALUE="clear-asset.mp4"
#EXTINF:9.98458,
#EXT-X-BYTERANGE:81088@885
{$path0}
#EXTINF:19.96916,
#EXT-X-BYTERANGE:159892@81973
{$path0}
#EXTINF:19.96916,
#EXT-X-BYTERANGE:160245@241865
{$path0}
#EXT-X-DISCONTINUITY
#EXT-X-MAP:URI="secure-asset.mp4",BYTERANGE="788@0"
#EXT-X-DEFINE:NAME="path1",VALUE="secure-asset.mp4"
#EXT-X-KEY:METHOD=SAMPLE-AES,URI="skd://guid",KEYFORMAT="com.apple.streamingkeydelivery",KEYFORMATVERSIONS="1"
#EXTINF:19.96916,
#EXT-X-BYTERANGE:159928@5196150
{$path1}
#EXT-X-ENDLIST
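One thing that may be worth trying (an assumption on my part, not a confirmed fix): explicitly declare the clear portion as unencrypted with #EXT-X-KEY:METHOD=NONE, so the player does not have to infer the encryption state of the leading segments:

```
#EXT-X-MAP:URI="clear-asset.mp4",BYTERANGE="885@0"
#EXT-X-DEFINE:NAME="path0",VALUE="clear-asset.mp4"
#EXT-X-KEY:METHOD=NONE
#EXTINF:9.98458,
#EXT-X-BYTERANGE:81088@885
{$path0}
```

Per RFC 8216, METHOD=NONE states that subsequent segments are not encrypted until the next EXT-X-KEY tag, which here would be the SAMPLE-AES tag after the discontinuity.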
Post not yet marked as solved
Can we confirm that as of iOS 16.3.1, key frames for MPEGTS via HLS are mandatory now?
I've been trying to figure out why https://chaney-field3.click2stream.com/ shows "Playback Error" across Safari, Chrome, Firefox, etc.. I ran the diagnostics against one of the m3u8 files that is generated via Developer Tools (e.g. mediastreamvalidator "https://e1-na7.angelcam.com/cameras/102610/streams/hls/playlist.m3u8?token=" and then hlsreport validation_data.json) and see this particular error:
Video segments MUST start with an IDR frame
Variant #1, IDR missing on 3 of 3
Do Safari and iOS devices explicitly block playback when they don't find one? From what I understand, AngelCam simply acts as a passthrough for the video/audio packets and does no transcoding, but converts the RTSP packets into HLS for web browsers. But IP cameras are constantly streaming their data, and a user connecting to the site may be receiving the video between key frames, so it would likely violate this expectation.
From my investigation it also seems like this problem also started happening in iOS 16.3? I'm seeing similar reports for other IP cameras here:
https://ipcamtalk.com/threads/blue-iris-ui3.23528/page-194#post-754082
https://www.reddit.com/r/BlueIris/comments/1255d78/ios_164_breaks_ui3_video_decode/
For what it's worth, when I re-encoded the MPEG-TS files (e.g. ffmpeg -i /tmp/streaming-master-m4-na3.bad/segment-375.ts -c:v h264 /tmp/segment-375.ts), it strips the non-key frames at the beginning, and playback then works properly if I host the same files on a static site and have the iOS device connect to it.
It seems like Chrome, Firefox, VLC, and ffmpeg are much more forgiving of missing key frames. I'm wondering what the reason is for enforcing this requirement, and can I confirm it's been a recent change?