Post not yet marked as solved
While trying to download an encrypted HLS stream, we observed the following behaviour:
Setting the requestCachePolicy and/or urlCache properties of the URLSessionConfiguration used to create the AVAssetDownloadURLSession seems to have no effect at all.
This is important for us, since we need the HTTP caching policy to be applied to the stream's .m3u8 manifest file.
Is this the intended behaviour of the download process, or some kind of issue?
Post not yet marked as solved
While trying to download an encrypted HLS stream, we ran into the following issue:
When we start a new download by calling the resume() function of AVAssetDownloadTask, the download process sometimes gets stuck, and neither the urlSession(_:assetDownloadTask:didFinishDownloadingTo:) nor the urlSession(_:task:didCompleteWithError:) delegate function (AVAssetDownloadDelegate) is called.
In some cases, not even the urlSession(_:assetDownloadTask:didLoad:totalTimeRangesLoaded:timeRangeExpectedToLoad:) delegate function is called.
Any suggestions on how to troubleshoot?
Post not yet marked as solved
I have a simple Node.js + node-media-server RTMP server running, and I am studying Swift and trying to create a live-streaming app. However, I am having trouble figuring out how to send non-HTTP requests in Swift.
Here's my LiveStream class code:
import Foundation
final class LiveStream: NSObject {
    private let streamName: String
    private var outputStream: OutputStream!
    private var inputStream: InputStream!
    private var streamTask: URLSessionStreamTask!

    private lazy var netService: NetService = {
        let service = NetService(domain: "streaming-course.herokuapp.com", type: "_rtmp._tcp.", name: streamName, port: 1935)
        service.delegate = self
        return service
    }()

    init(streamName: String) {
        self.streamName = streamName
        super.init()
        configure()
    }

    func write(data: Data) {
        var bytesWrittenLength = 0
        data.withUnsafeBytes { unsafeRawPointer in
            guard let uint8Pointer = unsafeRawPointer.bindMemory(to: UInt8.self).baseAddress else { return }
            bytesWrittenLength += outputStream.write(uint8Pointer, maxLength: data.count)
        }
        print("Bytes written:", bytesWrittenLength)
    }

    func write(pointer: UnsafeRawPointer?) {
        var bytesWrittenLength = 0
        guard let uint8Pointer = pointer?.bindMemory(to: UInt8.self, capacity: 1024) else { return }
        bytesWrittenLength += outputStream.write(uint8Pointer, maxLength: 1024)
        print("Bytes written:", bytesWrittenLength)
    }
}

private extension LiveStream {
    func configure() {
        let session = URLSession.shared
        streamTask = session.streamTask(with: netService)
        streamTask.resume()
        streamTask.captureStreams()
    }
}

extension LiveStream: NetServiceDelegate {
    func netService(_ sender: NetService, didAcceptConnectionWith inputStream: InputStream, outputStream: OutputStream) {
        print("Connection accepted")
    }
}

extension LiveStream: URLSessionStreamDelegate {
    func urlSession(_ session: URLSession, streamTask: URLSessionStreamTask, didBecome inputStream: InputStream, outputStream: OutputStream) {
        self.outputStream = outputStream
        self.inputStream = inputStream
        inputStream.schedule(in: .main, forMode: .default)
        outputStream.schedule(in: .main, forMode: .default)
        inputStream.open()
        outputStream.open()
    }
}
My app currently crashes right at initialisation with this logged error message:
'NSInvalidArgumentException', reason: '-[NSNetService _internalNetService]: unrecognized selector sent to instance 0x2801e6640'
So I am not initialising NetService correctly, and I have no idea whether URLSessionStreamTask is really the way to go when sending an RTMP request.
Does anyone have an idea why I am getting this error, and whether I am heading in the right direction with this class setup?
I appreciate your attention. Thank you in advance.
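Incidentally, the `write(data:)` pattern in a class like this can be sanity-checked in isolation against an in-memory stream, with no socket or NetService involved. This is only an illustrative sketch (the helper name and the loop are mine, not from the project); it shows the retry loop one generally wants, since `OutputStream.write` may consume fewer bytes than requested:

```swift
import Foundation

// Write a whole Data buffer to an OutputStream, looping because a single
// write(_:maxLength:) call may accept fewer bytes than offered.
func writeAll(_ data: Data, to stream: OutputStream) -> Int {
    var totalWritten = 0
    data.withUnsafeBytes { (raw: UnsafeRawBufferPointer) in
        guard let base = raw.bindMemory(to: UInt8.self).baseAddress else { return }
        while totalWritten < data.count {
            let written = stream.write(base + totalWritten,
                                       maxLength: data.count - totalWritten)
            if written <= 0 { break } // 0 = no space, -1 = stream error
            totalWritten += written
        }
    }
    return totalWritten
}

// Exercise the write path against an in-memory stream instead of a socket.
let payload = Data("hello rtmp".utf8)
let memoryStream = OutputStream(toMemory: ())
memoryStream.open()
let bytesWritten = writeAll(payload, to: memoryStream)
memoryStream.close()
let captured = memoryStream.property(forKey: .dataWrittenToMemoryStreamKey) as? Data
```

The same helper would work against a stream obtained from a real connection, but it does not address the NetService crash itself.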
Post not yet marked as solved
We are experiencing audio sync issues during playback on fMP4 HLS live streams (HLS and LL-HLS) on Apple devices only (iOS and macOS) and we're not sure what's causing the problem. The issue does not occur during playback on Windows or Android platforms.
During playback in Safari, everything is fine until the sync suddenly gets lost, usually 5-10 minutes after playback begins. The extent of the desync varies, but it is very noticeable when it occurs - usually in the 15-30 frame range. Sync is always restored by restarting the player, until it is lost again some minutes later.
We are capturing the streams on iPhone devices and encoding HEVC / AAC-LC at 30fps locally on the device, and then sending to a media server for further processing. We then transcode the source stream and create multiple variations at different bitrates (HEVC). Because we are streaming from mobile devices in the field, during our server-side transcoding we set a constant 30fps frame rate in case of drops due to network issues. I should add that the issue occurs just as much with h264 as HEVC (we've tested many different combinations of input/output formats and protocols).
Regardless of whether we playback the source stream, the individual transcoded variations, or the ABR playlist with all variations, the sync problem appears in the same manner.
One interesting note. The issue seldom occurs on one of our older devices, an iPhone 6s Plus running a slightly older iOS version (14.4.1).
We suspect it has something to do with discontinuities inherent in our input streams that are not being corrected during our normalization/transcoding process. The Apple player is not compensating for them the way players on other platforms do.
We've run Apple's MediaStreamValidator tool and discovered multiple "must fix" issues - but it's not clear which of these, if any, are causing our problems. See the output attached.
MediaStreamValidator output
Also, here is the full HLS report from the validator tool (in PNG format due to file restrictions here):
Happy to share more details or run more tests. We've been trying to debug this for weeks now. Thanks for your help.
Post not yet marked as solved
hi all,
I tried to create an .m3u8 stream to send my screen over IP to a TV with Chromecast, but it doesn't work as expected: it's laggy and the picture doesn't display. Looking forward to hearing from you.
Post not yet marked as solved
Hi All!
I'm trying to use AVPlayer for an IPTV live stream (macOS app). I use a remote server with an iptv.m3u file. The remote iptv.m3u file contains entries for many TV channel streams:
#EXTM3U url-tvg="http://iptv.myserver.net/epg.xml" m3uautoload="1" deinterlace="7" aspect-ratio="none" cache="2000" tvg-shift="0"
#EXTINF:0 tvg-id="9929" tvg-name="9929" audio-track="en" group-title="World News" id="vsetv_9929" tvg_logo="http://iptv.myserver.net/myiptv/icons/9929.png", Channel 1
http://iptv.myserver.net:8081/Channel1
#EXTINF:0 tvg-id="9930" tvg-name="9930" audio-track="en" group-title="World News" id="vsetv_9930" tvg_logo="http://iptv.myserver.net/myiptv/icons/9930.png", Channel 2
http://iptv.myserver.net:8081/Channel2
#EXTINF:0 tvg-id="9932" tvg-name="9932" audio-track="en" group-title="World News" id="vsetv_9932" tvg_logo="http://iptv.myserver.net/myiptv/icons/9932.png", Channel 3
http://iptv.myserver.net:8081/Channel3
...
...
#EXTINF:0 tvg-id="3542" tvg-name="3542" audio-track="en" group-title="Cartoons" id="vsetv_3542" tvg_logo="http://iptv.myserver.net/myiptv/icons/3542.png", Channel 254
http://iptv.myserver.net:8081/Channel254
I use AVPlayer, initialized with this simple code:
import AVFoundation
import Combine

class VideoItem: ObservableObject {
    @Published var player: AVPlayer = AVPlayer()
    @Published var playerItem: AVPlayerItem?

    init() {
        let myURL = URL(string: "http://iptv.myserver.net/iptv.m3u")! // not a real URL
        let asset = AVURLAsset(url: myURL)
        let playerItem = AVPlayerItem(asset: asset)
        self.playerItem = playerItem
        player.replaceCurrentItem(with: playerItem)
        player.play()
    }
}
and the player works nicely, but I only see "Channel 1" and cannot select any other channel! I have the asset, playerItem, and player, but I don't understand how to create a channel list (playlist) and select a channel from it to play. Where is the information about all the TV channels stored after the remote iptv.m3u file is downloaded? Please help!
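For context, AVPlayer does not expose the channel list: an IPTV .m3u file like this is a plain-text playlist, and the player simply plays the first stream it resolves. One approach (a sketch, not the only way - the `Channel` type and parser are mine) is to download the .m3u yourself, parse the `#EXTINF` entries into a channel list, and hand the selected channel's URL to the player:

```swift
import Foundation

struct Channel {
    let name: String
    let url: URL
}

// Parse an IPTV .m3u playlist: each "#EXTINF:..." line carries the channel
// name after its last comma, and the next non-comment line is the stream URL.
func parseM3U(_ text: String) -> [Channel] {
    var channels: [Channel] = []
    var pendingName: String?
    for rawLine in text.split(separator: "\n") {
        let line = rawLine.trimmingCharacters(in: .whitespaces)
        if line.hasPrefix("#EXTINF:") {
            if let comma = line.range(of: ",", options: .backwards) {
                pendingName = String(line[comma.upperBound...]).trimmingCharacters(in: .whitespaces)
            }
        } else if !line.hasPrefix("#"), let name = pendingName, let url = URL(string: line) {
            channels.append(Channel(name: name, url: url))
            pendingName = nil
        }
    }
    return channels
}

let sample = """
#EXTM3U
#EXTINF:0 tvg-id="9929" group-title="World News", Channel 1
http://iptv.myserver.net:8081/Channel1
#EXTINF:0 tvg-id="9930" group-title="World News", Channel 2
http://iptv.myserver.net:8081/Channel2
"""
let channels = parseM3U(sample)
// To switch channels, one would then do something like:
// player.replaceCurrentItem(with: AVPlayerItem(url: channels[1].url))
```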
Post not yet marked as solved
iPhone 12 Pro (14.7.1)
The HLS codec is H.264 High Profile (Level 3).
During streaming playback, a kernel log message is output repeatedly, and then a hardware reset occurs.
hardware reset kernel log
The hardware reset does not occur on iPhone 6s devices.
Post not yet marked as solved
We have an HLS video stream which fails to play in AVPlayer but plays in ExoPlayer. AVPlayer throws the above error code, for which we cannot find any reference, nor any help in understanding what exactly AVPlayer is doing (or what it finds unacceptable in the stream).
We'd like to know what causes this error to be thrown. We see it when the HLS segments are authored from an MP4 with some questionable I- and P-frame data near the end of the container.
The error is thrown on iOS 14.7.1 AVPlayer.
Post not yet marked as solved
hi all
I am new to Swift, and I want to read a string sent from a Java server to a Swift client through a stream, but the read is giving me -1. Here is my server code in Java:
try {
    out = new PrintWriter(
            new BufferedWriter(new OutputStreamWriter(tempClientSocket.getOutputStream())),
            true);
} catch (IOException e) {
    e.printStackTrace();
}
out.println("hello world");
Here is my client code in Swift:
let dataCount = data.count
let bytesWritten = data.withUnsafeBytes { (p: UnsafePointer<UInt8>) -> Int in
    return streams.outputStream.write(p, maxLength: dataCount)
}
When I print the stream delegate event codes, outputOpenCompleted: and outputHasSpaceAvailable: are printed, but nothing is printed for the input stream.
On inputStream.read the message is: ** SocketStream read error [0x28173c000]: 1 60**. Please help me as soon as possible.
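For what it's worth, a read result of -1 signals a stream error (while 0 means end-of-stream), so distinguishing the two in the read loop helps narrow this down. Here is a sketch of such a loop, exercised against an in-memory InputStream so it runs without a server (the helper name and 1024-byte buffer size are my own choices):

```swift
import Foundation

// Read everything from an InputStream, distinguishing EOF (0) from error (-1).
func readAll(from stream: InputStream) -> Data? {
    var result = Data()
    var buffer = [UInt8](repeating: 0, count: 1024)
    stream.open()
    defer { stream.close() }
    while true {
        let n = stream.read(&buffer, maxLength: buffer.count)
        if n > 0 {
            result.append(contentsOf: buffer[0..<n])
        } else if n == 0 {
            return result          // end of stream reached cleanly
        } else {
            return nil             // -1: a stream error; inspect stream.streamError
        }
    }
}

// Simulate the server's "hello world" line with an in-memory stream.
let input = InputStream(data: Data("hello world\n".utf8))
let received = readAll(from: input).flatMap { String(data: $0, encoding: .utf8) }
```

With a real socket, a nil result here would be the point to log `stream.streamError` for the underlying POSIX error.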
Post not yet marked as solved
While trying to download an encrypted HLS stream, we observed the following behaviour:
Setting the requestCachePolicy and/or urlCache properties of the URLSessionConfiguration used to create the AVAssetDownloadURLSession seems to have no effect at all.
In our application the user can add multiple encrypted HLS streams to a queue. Before adding them to the queue, we make sure the manifest gets cached using the shared URLSession, like this:
URLSession.shared.configuration.urlCache = .shared
let task = URLSession.shared.dataTask(with: media.url) { _, _, _ in
    self.addMediaToQueue(media)
}
task.resume()
and we setup our AVAssetDownloadURLSession like this:
// Create the configuration for the AVAssetDownloadURLSession.
let backgroundConfiguration = URLSessionConfiguration.background(withIdentifier: "AAPL-Identifier")
backgroundConfiguration.urlCache = .shared
backgroundConfiguration.requestCachePolicy = .returnCacheDataElseLoad
// Create the AVAssetDownloadURLSession using the configuration.
assetDownloadURLSession = AVAssetDownloadURLSession(
    configuration: backgroundConfiguration,
    assetDownloadDelegate: self,
    delegateQueue: .main
)
Here is an example of the caching headers that we use:
Last-Modified: Thu, 11 Mar 2021 02:23:57 GMT
Cache-Control: max-age=604800
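For reference, the freshness window these headers imply can be computed directly. The sketch below is illustrative arithmetic only - it is not how URLCache itself decides freshness, and the function name is mine - deciding whether a cached copy fetched at a given date is still fresh under `max-age`:

```swift
import Foundation

// Given a Cache-Control header value and the date the response was fetched,
// decide whether a cached copy is still fresh at `now` (max-age only).
func isCachedResponseFresh(cacheControl: String, fetchedAt: Date, now: Date) -> Bool {
    for directive in cacheControl.split(separator: ",") {
        let trimmed = directive.trimmingCharacters(in: .whitespaces).lowercased()
        if trimmed.hasPrefix("max-age="),
           let seconds = TimeInterval(String(trimmed.dropFirst("max-age=".count))) {
            return now.timeIntervalSince(fetchedAt) < seconds
        }
    }
    return false // no max-age directive: treat as stale in this sketch
}

let fetched = Date()
let fresh = isCachedResponseFresh(cacheControl: "max-age=604800",
                                  fetchedAt: fetched,
                                  now: fetched.addingTimeInterval(3600))     // 1 hour later
let stale = isCachedResponseFresh(cacheControl: "max-age=604800",
                                  fetchedAt: fetched,
                                  now: fetched.addingTimeInterval(700_000))  // more than 7 days later
```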
This is important for us, since our manifest URL is signed and expires after 12 hours.
Example of manifest URL:
https://example.host.gr/v1/791/888/773923397316/773923397316.ism/.m3u8[…]~hmac=ee37a750b8238745b5c8cf153ebcd0b693dd5d83
If the client honoured the HTTP cache policy and didn't re-request the .m3u8 manifest file over the network, the download would start despite the 12-hour limit.
Is this the intended behaviour of the download process, or some kind of issue? Could you suggest a workaround?
Post not yet marked as solved
Situation:
I have an HLS audio-only stream composed of AAC files. I've confirmed with ffprobe that timed metadata is attached to the stream. Unfortunately, I'm unable to access the timed metadata from the AVPlayer.
Output from FFProbe
~ ffprobe index_1_296.aac
....
Input #0, aac, from 'index_1_296.aac':
Metadata:
id3v2_priv.com.apple.streaming.transportStreamTimestamp: \x00\x00\x00\x00.\x00\x05\xc0
Duration: 00:00:06.02, bitrate: 96 kb/s
Stream #0:0: Audio: aac (LC), 48000 Hz, stereo, fltp, 96 kb/s
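Incidentally, that `com.apple.streaming.transportStreamTimestamp` PRIV payload is an 8-byte big-endian value whose low 33 bits are an MPEG-TS presentation timestamp on a 90 kHz clock, so the bytes shown by ffprobe can be decoded to seconds. A small sketch (the function name is mine):

```swift
import Foundation

// Decode Apple's HLS PRIV payload: an 8-byte big-endian integer whose low
// 33 bits are an MPEG-TS presentation timestamp on a 90 kHz clock.
func decodeTransportStreamTimestamp(_ bytes: [UInt8]) -> Double? {
    guard bytes.count == 8 else { return nil }
    var value: UInt64 = 0
    for b in bytes { value = (value << 8) | UInt64(b) }
    let pts = value & 0x1_FFFF_FFFF   // keep the 33-bit PTS
    return Double(pts) / 90_000.0     // 90 kHz ticks to seconds
}

// The payload printed by ffprobe above is \x00\x00\x00\x00.\x00\x05\xc0,
// where "." is 0x2E - i.e. 0x2E0005C0 ticks.
let seconds = decodeTransportStreamTimestamp([0x00, 0x00, 0x00, 0x00, 0x2E, 0x00, 0x05, 0xC0])
```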
What I've done:
In my class containing the AVPlayer, I've adopted AVPlayerItemMetadataOutputPushDelegate and implemented the metadataOutput method.
Code
I followed an example I found here: https://dcordero.medium.com/hls-timed-metadata-with-avplayer-9e20806ef92f. Below is my implementation of the metadataOutput method:
func metadataOutput(_ output: AVPlayerItemMetadataOutput, didOutputTimedMetadataGroups groups: [AVTimedMetadataGroup], from track: AVPlayerItemTrack?) {
    if let item = groups.first?.items.first {
        let metadataValue = item.value(forKeyPath: #keyPath(AVMetadataItem.value))!
        print("Metadata value: \n \(metadataValue)")
    } else {
        print("MetaData Error")
    }
}
What I'm seeing:
When playing manifests containing .ts files this metadataOutput method is triggered with timed metadata. However when I'm playing a manifest containing only .aac files the metadataOutput method is never triggered.
Question:
Does AVPlayer support extracting timed metadata from AAC files?
If it does, are there any examples of this working?
Post not yet marked as solved
Hello everyone!
Recently our backend team integrated video streaming via HLS; before that we had plain HTTP streaming.
With HTTP streaming, this exporting code worked fine:
private func cacheFile(from asset: AVURLAsset) {
    guard asset.isExportable,
          let fileName = asset.url.pathComponents.last,
          let outputURL = self.cacheDirectory?.appendingPathComponent(fileName),
          !FileManager.default.fileExists(atPath: outputURL.path)
    else { return }

    asset.resourceLoader.setDelegate(self, queue: backgroundQueue.underlyingQueue)

    let exporter = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetHighestQuality)
    exporter?.outputURL = outputURL
    exporter?.determineCompatibleFileTypes(completionHandler: { types in
        guard let type = types.first else { return }
        exporter?.outputFileType = type
        exporter?.exportAsynchronously(completionHandler: {
            if let error = exporter?.error {
                print(error)
            }
        })
    })
}
This code works great with HTTP streaming, but for HLS, asset.isExportable is false.
After removing the asset.isExportable check, exporter?.determineCompatibleFileTypes passes an empty array into its closure. If I set outputFileType to .mp4 or .mov, I receive an error inside the exportAsynchronously completion handler:
Error Domain=AVFoundationErrorDomain Code=-11838 "Operation Stopped" UserInfo={NSLocalizedFailureReason=The operation is not supported for this media., NSLocalizedDescription=Operation Stopped, NSUnderlyingError=0x6000025abd50 {Error Domain=NSOSStatusErrorDomain Code=-16976 "(null)"
Why does this happen? Can AVAssetExportSession not combine all the parts of an .m3u8 into an .mp4? Is there an alternative way to cache video streamed via HLS?
I'm having some issues with the generation of HLS (.ts) files.
We are working on generating an HLS stream from an RTSP source (an IP camera) using FFmpeg.
The issue is this:
If we use FFmpeg directly to generate the HLS, we have no issues at all playing it on macOS, iPadOS, and iOS.
If we use the FFmpeg included inside our software (an .exe running on Windows), we have issues with iOS: the HLS won't play only on iOS devices (iPads and Macs work like a charm).
We've tried to understand the cause, but we haven't been able to dig up anything other than something related to Media Source Extensions.
I've included a link to our repo where you can find both the file generated by our software (streamATC.ts) and the one generated directly by FFmpeg (streamFF.ts), if you want to take a look.
Github forum about this issue
Our repo with .ts files
There is no difference between the two files from a codec standpoint, as you can see:
Can you offer any help in understanding this strange behaviour?
Post not yet marked as solved
When the connection is cut before the track is fully buffered and the player reaches the end of the loaded time ranges, audio stops but the status is not updated:
elapsed time continues to send events
player.timeControlStatus == .playing
player.currentItem!.isPlaybackLikelyToKeepUp == true
player.status == .readyToPlay
player.currentItem!.status == .readyToPlay
But an event is logged in errorLog():
"NSURLErrorDomain"
errorStatusCode -1009
This results in weird behaviour where a progress bar keeps showing progress without any sound. It even continues beyond the total track duration.
Reproducible in the demo app https://github.com/timstudt/PlayerApp:
start playback
let the buffer fill to e.g. 70 s (loadedTimeRanges)
activate flight mode
seek to 60 s (the seek returns as successful)
watch:
when the player reaches the 70 s mark, audio stops, but elapsed time continues.
Note: without seeking, the player stalls correctly at the 70 s mark.
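As a diagnostic, one could cross-check the reported playhead position against `loadedTimeRanges` before trusting `timeControlStatus`. Here is a sketch with the ranges reduced to plain `Double` start/duration pairs for illustration (real code would convert the `CMTimeRange` values; the function name is mine):

```swift
import Foundation

// True if time t (seconds) falls inside any buffered range.
func isTimeBuffered(_ t: Double, loadedRanges: [(start: Double, duration: Double)]) -> Bool {
    loadedRanges.contains { t >= $0.start && t < $0.start + $0.duration }
}

// Scenario from the report: 0-70 s buffered, playhead seeked to 60 s.
let ranges: [(start: Double, duration: Double)] = [(0, 70)]
let atSixty = isTimeBuffered(60, loadedRanges: ranges)   // still inside the buffer
let pastEnd = isTimeBuffered(75, loadedRanges: ranges)   // past the 70 s mark: likely stalled
```

When the reported position moves past the end of every loaded range while the connection is down, the UI can treat playback as stalled even though the player still claims to be playing.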
Post not yet marked as solved
For my app project I want to try the HTTP Live Streaming Tools provided by Apple. I downloaded the .pkg file and followed the install instructions. After a successful installation (the confirmation prompt appeared), I cannot find the tools anywhere.
From the readme: "The HLS tools package requires an Intel-based Mac running macOS 10.15 Catalina or later." So is the problem the M1 chip?
Is there a solution for this?
Post not yet marked as solved
I am trying to insert timed metadata (ID3) into a live HLS stream created with Apple's mediastreamsegmenter tool. I am getting the video from an ffmpeg stream; here is the command I run to test from an existing file:
ffmpeg -re -i vid1.mp4 -vcodec libx264 -acodec aac -f mpegts - | mediastreamsegmenter -f /Users/username/Sites/video -s 10 -y test -m -M 4242 -l log.txt
To inject metadata, I run this command:
id3taggenerator -text '{"x":"data dan","y":"36"}' -a localhost:4242
This setup creates the expected .ts files and I can play back the video/audio with no issues. However, the metadata I am attempting to insert does not work in the final file. I know the metadata is there in some form: when I file-compare a no-metadata version of the video to one I injected metadata into, I can see the ID3 tags within the binary data.
Bad File Analysis
When I analyze the generated files using ffmpeg:
ffmpeg -i video1.ts
the output I get is:
[mpegts @ 0x7fb00a008200] start time for stream 2 is not set in estimate_timings_from_pts
[mpegts @ 0x7fb00a008200] stream 2 : no TS found at start of file, duration not set
[mpegts @ 0x7fb00a008200] Could not find codec parameters for stream 2 (Audio: mp3, 0 channels): unspecified frame size
Consider increasing the value for the 'analyzeduration' (0) and 'probesize' (5000000) options
Input #0, mpegts, from 'video1.ts':
  Duration: 00:00:10.02, start: 0.043444, bitrate: 1745 kb/s
  Program 1
    Stream #0:0[0x100]: Video: h264 (High) ([27][0][0][0] / 0x001B), yuv420p(tv, bt709, progressive), 848x464 [SAR 1:1 DAR 53:29], 30 fps, 30 tbr, 90k tbn, 60 tbc
    Stream #0:1[0x101]: Audio: aac (LC) ([15][0][0][0] / 0x000F), 44100 Hz, stereo, fltp, 130 kb/s
  No Program
    Stream #0:2[0x102]: Audio: mp3, 0 channels
Note how the third stream (stream #0:2) is marked as mp3...this is incorrect! Also it says "No Program", instead of being in "Program 1".
When I analyze a properly encoded video file with inserted ID3 metadata that I created with Apple's mediafilesegmenter tool, the analysis shows a "timed_id3" track and this metadata track works properly in my web browser.
Good File Analysis
ffmpeg -i video1.ts
Input #0, mpegts, from 'video1.ts':
  Duration: 00:00:10.08, start: 19.984578, bitrate: 1175 kb/s
  Program 1
    Stream #0:0[0x101]: Video: h264 (High) ([27][0][0][0] / 0x001B), yuv420p(tv, bt709, progressive), 848x464, 30 fps, 30 tbr, 90k tbn, 180k tbc
    Stream #0:1[0x102]: Audio: aac (LC) ([15][0][0][0] / 0x000F), 44100 Hz, stereo, fltp, 67 kb/s
    Stream #0:2[0x103]: Data: timed_id3 (ID3 / 0x20334449)
I must use mediastreamsegmenter because that is required for live streams. Does anyone know how I can get timed ID3 metadata into a live HLS stream properly?
Post not yet marked as solved
We have live HLS streams with separate aac audio for multi-language tracks.
Previously we had 6-second segments, which led to a small variation in audio segment length, although this played back everywhere on iOS 14/tvOS 14.
In iOS15/tvOS15 these streams fail to play, unless we align audio and video segment lengths, which in turn requires us to run at 4 or 8 seconds for segments.
To get back to 6, we would need to sample at a much higher rate (192kHz), as opposed to the 96kHz we have working today, or the 48kHz we had working previously.
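The sample-rate workaround follows from AAC framing: an AAC packet carries 1024 samples, so a segment duration aligns exactly only when duration × sampleRate is a multiple of 1024. A quick check of the numbers mentioned (assuming 1024-sample frames; the function name is mine):

```swift
import Foundation

// Number of 1024-sample AAC frames in a segment; exact audio/video alignment
// requires this to be an integer.
func aacFrames(segmentSeconds: Double, sampleRate: Double) -> Double {
    segmentSeconds * sampleRate / 1024.0
}

let at48k  = aacFrames(segmentSeconds: 6, sampleRate: 48_000)   // 281.25  -> misaligned
let at96k  = aacFrames(segmentSeconds: 6, sampleRate: 96_000)   // 562.5   -> misaligned
let at192k = aacFrames(segmentSeconds: 6, sampleRate: 192_000)  // 1125.0  -> exact
```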
Anyone else spotted this?
Post not yet marked as solved
Hi,
The term "master" is falling out of favor in the computing world and beyond. To make codebases more inclusive, could the "master manifest file" HLS terminology be changed to "main manifest file"? This would also help us move away from the non-inclusive term in our organization's codebase.
Thanks!
Post not yet marked as solved
Dear Apple Expert,
Our project uses the libUSB library to interact with a USB-based camera device. Our application works fine on macOS Mojave (10.14.6). When the new macOS 12 beta version was made available, we tested our code, but when we try to claim the interface via the "CreateInterfaceIterator" API, we get a "kIOReturnExclusiveAccess" error code and ultimately our application fails. The failure is observed in both libUSB versions 1.0.23 and 1.0.24.
Could you help us by explaining whether there is a change in the new OS with respect to access to USB devices?
Post not yet marked as solved
Hi,
when I use a local .mp4 video file encoded in HEVC with an alpha channel with an AVPlayer as the material of an SCNNode, the transparency is rendered correctly, just as if I had used a .png image with transparency.
The issue is:
when I encode this same .mp4 file into an HLS stream using mediafilesegmenter and try to play it in the same manner, as an SCNNode material with AVPlayer, the transparency is not rendered and instead the transparent zones are filled with opaque black. (The HLS stream has correct transparency, as verified by opening its URL in Safari.)
Sample Test:
import UIKit
import ARKit
import AVFoundation

class ViewController: UIViewController {
    private var arView: ARSCNView!

    lazy var sphere: SCNNode = {
        let node = SCNSphere(radius: 5)
        node.isGeodesic = false
        node.segmentCount = 64
        node.firstMaterial?.lightingModel = .constant
        node.firstMaterial?.diffuse.contents = #colorLiteral(red: 0, green: 0, blue: 0, alpha: 0)
        node.firstMaterial?.cullMode = .front
        return SCNNode(geometry: node)
    }()

    private var avPlayer: AVPlayer!

    override func viewDidLoad() {
        super.viewDidLoad()
        setupArView()
        setupArSession()
        setupButton()
    }

    private func setupButton() {
        let button = UIButton()
        button.setTitle("START", for: .normal)
        button.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(button)
        NSLayoutConstraint.activate([
            button.centerXAnchor.constraint(equalTo: view.centerXAnchor),
            button.centerYAnchor.constraint(equalTo: view.centerYAnchor)
        ])
        button.addTarget(self, action: #selector(createSphere), for: .touchUpInside)
    }

    @IBAction func createSphere() {
        guard avPlayer == nil else { return }
        addSphere()
    }
}

extension ViewController {
    private func setupArView() {
        arView = ARSCNView()
        arView.backgroundColor = .black
        arView.translatesAutoresizingMaskIntoConstraints = false
        view.insertSubview(arView, at: 0)
        NSLayoutConstraint.activate([
            arView.leadingAnchor.constraint(equalTo: view.leadingAnchor),
            arView.topAnchor.constraint(equalTo: view.topAnchor),
            arView.trailingAnchor.constraint(equalTo: view.trailingAnchor),
            arView.bottomAnchor.constraint(equalTo: view.bottomAnchor)
        ])
        arView.preferredFramesPerSecond = 60
    }

    private func setupArSession() {
        let configuration = ARWorldTrackingConfiguration()
        configuration.worldAlignment = .gravityAndHeading
        configuration.environmentTexturing = .none
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
            configuration.frameSemantics.insert(.personSegmentationWithDepth)
        }
        if ARWorldTrackingConfiguration.supportsUserFaceTracking {
            configuration.userFaceTrackingEnabled = true
        }
        arView.session.run(configuration)
    }

    private func addSphere() {
        // let asset = AVURLAsset(url: URL(string: "https://SOMECLOUDSTORAGE.com/hls-bug/prog_index.m3u8")!)
        let asset = AVURLAsset(url: Bundle.main.url(forResource: "puppets", withExtension: "mp4")!)
        let playerItem = AVPlayerItem(asset: asset)
        playerItem.addObserver(self, forKeyPath: #keyPath(AVPlayerItem.status), options: [.old, .new], context: nil)
        playerItem.isAudioSpatializationAllowed = true
        playerItem.allowedAudioSpatializationFormats = .monoStereoAndMultichannel
        avPlayer = AVPlayer()
        sphere.position = SCNVector3(0, 0, 0)
        arView.scene.rootNode.addChildNode(sphere)
        avPlayer.replaceCurrentItem(with: playerItem)
    }

    override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey : Any]?, context: UnsafeMutableRawPointer?) {
        if keyPath == #keyPath(AVPlayerItem.status) {
            let status: AVPlayerItem.Status
            if let statusNumber = change?[.newKey] as? NSNumber {
                status = AVPlayerItem.Status(rawValue: statusNumber.intValue)!
            } else {
                status = .unknown
            }
            switch status {
            case .readyToPlay:
                DispatchQueue.main.async {
                    self.avPlayer.playImmediately(atRate: 1)
                    self.sphere.geometry?.firstMaterial?.diffuse.contents = self.avPlayer
                }
            case .failed, .unknown:
                break
            @unknown default:
                break
            }
        } else {
            super.observeValue(forKeyPath: keyPath, of: object, change: change, context: context)
        }
    }
}
The video file used is the puppets_with_alpha_hevc.mov file from Apple's HEVC-with-alpha demo, which I re-muxed into an .mp4 container using ffmpeg.
To reproduce both scenarios, replace the AVURLAsset with either the local .mp4 file or the HLS stream URL.
The issue was reproduced on an iPhone 11 Pro running iOS 15.
This issue has been unresolved for quite some time now, though I have tried everything to get attention: an unsuccessful TSI ticket, a silent Feedback Assistant bug report; I even discussed this bug during WWDC 2021 with Shiva Sundar, who is in charge of HEVC development and said it would be checked.
Hopes