Create view-level services for media playback, complete with user controls, chapter navigation, and support for subtitles and closed captioning using AVKit.

Posts under AVKit tag

80 Posts
Sort by: Post | Replies | Boosts | Views | Activity
AppleCoreMedia 1.0.0.21B101 always requesting 2 streams from ABR
Hi Team, we see an issue with this version of CoreMedia requesting multiple qualities at all times for a stream. We don't see this issue on 1.0.0.21C62. We are unsure what would be causing this.

[2024-01-05 16:53:51] "GET /live/cinecanal/live/cinecanal_720p/video.m3u8?_HLS_msn=630&_HLS_part=0 HTTP/1.0" 200 1145 2529 1090199 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:52] "GET /live/cinecanal/live/cinecanal_1080p/audio.m3u8?_HLS_msn=630&_HLS_part=0 HTTP/1.0" 200 1146 2396 1013356 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:52] "GET /live/cinecanal/live/cinecanal_1080p/a_1459_2452140264_630_0.fmp4 HTTP/1.0" 200 1139 24975 1013385 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:52] "GET /live/cinecanal/live/cinecanal_720p/video.m3u8?_HLS_msn=630&_HLS_part=1 HTTP/1.0" 200 1145 2603 998670 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:52] "GET /live/cinecanal/live/cinecanal_720p/v_1058_2452140000_630_1.fmp4 HTTP/1.0" 200 1138 40534 998739 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:53] "GET /live/cinecanal/live/cinecanal_720p/video.m3u8?_HLS_msn=630&_HLS_part=2 HTTP/1.0" 200 1145 2677 835327 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:53] "GET /live/cinecanal/live/cinecanal_720p/v_1058_2452140000_630_2.fmp4 HTTP/1.0" 200 1138 57656 835207 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:53] "GET /live/cinecanal/live/cinecanal_1080p/audio.m3u8?_HLS_msn=630&_HLS_part=1 HTTP/1.0" 200 1146 2458 986038 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:53] "GET /live/cinecanal/live/cinecanal_1080p/a_1459_2452140264_630_1.fmp4 HTTP/1.0" 200 1139 24700 986032 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:54] "GET /live/cinecanal/live/cinecanal_720p/video.m3u8?_HLS_msn=630&_HLS_part=3 HTTP/1.0" 200 1145 2751 1013257 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:54] "GET /live/cinecanal/live/cinecanal_720p/v_1058_2452140000_630_3.fmp4 HTTP/1.0" 200 1138 55900 1013324 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:54] "GET /live/cinecanal/live/cinecanal_1080p/audio.m3u8?_HLS_msn=630&_HLS_part=2 HTTP/1.0" 200 1146 2520 1016693 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:54] "GET /live/cinecanal/live/cinecanal_1080p/a_1459_2452140264_630_2.fmp4 HTTP/1.0" 200 1139 25014 1016717 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:55] "GET /live/cinecanal/live/cinecanal_720p/video.m3u8?_HLS_msn=630&_HLS_part=4 HTTP/1.0" 200 1145 2825 917753 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:55] "GET /live/cinecanal/live/cinecanal_720p/v_1058_2452140000_630_4.fmp4 HTTP/1.0" 200 1138 103745 917903 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:55] "GET /live/cinecanal/live/cinecanal_1080p/audio.m3u8?_HLS_msn=630&_HLS_part=3 HTTP/1.0" 200 1146 2582 958102 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:55] "GET /live/cinecanal/live/cinecanal_1080p/a_1459_2452140264_630_3.fmp4 HTTP/1.0" 200 1139 24782 958195 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:56] "GET /live/cinecanal/live/cinecanal_720p/video.m3u8?_HLS_msn=630&_HLS_part=5 HTTP/1.0" 200 1145 2899 931101 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:56] "GET /live/cinecanal/live/cinecanal_720p/v_1058_2452140000_630_5.fmp4 HTTP/1.0" 200 1138 112113 931228 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:56] "GET /live/cinecanal/live/cinecanal_1080p/audio.m3u8?_HLS_msn=630&_HLS_part=4 HTTP/1.0" 200 1146 2644 935550 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:56] "GET /live/cinecanal/live/cinecanal_1080p/a_1459_2452140264_630_4.fmp4 HTTP/1.0" 200 1139 24824 937720 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:57] "GET /live/cinecanal/live/cinecanal_1080p/audio.m3u8?_HLS_msn=630&_HLS_part=5 HTTP/1.0" 200 1146 2706 895680 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:57] "GET /live/cinecanal/live/cinecanal_1080p/a_1459_2452140264_630_5.fmp4 HTTP/1.0" 200 1139 24843 895734 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:57] "GET /live/cinecanal/live/cinecanal_720p/video.m3u8?_HLS_msn=631&_HLS_part=0 HTTP/1.0" 200 1145 2529 907045 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
0
0
532
Jan ’24
Problem when adding AVPlayer to TabView
The following problem has appeared: I need to flip through videos. Imagine you have, say, 3 videos, and you can scroll through them and choose which one to watch. For this I decided to use a TabView with the .page style, but it turned out not to work, and I found myself in a stupor. The TabView itself starts to lag, the scrolling starts to lag, the videos do not start the first time, and sometimes the control panel does not even appear on some videos, which makes it impossible to expand the video to full screen. The code is below. Maybe someone has encountered this problem and solved it, or knows some other options to implement similar logic?

let videos: [String] = ["burpee", "squat", "step-up", "sun-salute"]

var body: some View {
    TabView {
        ForEach(videos, id: \.self) { videoName in
            VideoPlayerView(videoName: videoName)
                .clipShape(RoundedRectangle(cornerRadius: 25))
        }
    }
    .frame(width: 375, height: 230)
}

struct VideoPlayerView: View {
    let videoName: String

    var body: some View {
        if let videoURL = Bundle.main.url(forResource: videoName, withExtension: "mp4") {
            VideoPlayerWrapper(player: AVPlayer(url: videoURL))
        } else {
            Text("No Video \(videoName)")
        }
    }
}

#Preview {
    VideoPlayerView(videoName: "squat")
}

struct VideoPlayerWrapper: UIViewControllerRepresentable {
    let player: AVPlayer

    func makeUIViewController(context: Context) -> AVPlayerViewController {
        let controller = AVPlayerViewController()
        controller.player = player
        controller.showsPlaybackControls = true
        return controller
    }

    func updateUIViewController(_ uiViewController: AVPlayerViewController, context: Context) {}
}
1
0
576
Dec ’23
Crash entering Picture in Picture from webview on Mac Catalyst or Made for iPad
Crash seems to be in a private Apple framework. There are some other reports of this floating around but no solutions so far. Any ideas?

*** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '-[WebAVPlayerLayer startRedirectingVideoToLayer:forMode:]: unrecognized selector sent to instance 0x6000037033c0'
*** First throw call stack:
(
0   CoreFoundation      0x0000000187d56800 __exceptionPreprocess + 176
1   libobjc.A.dylib     0x000000018784deb4 objc_exception_throw + 60
2   CoreFoundation      0x0000000187e083bc -[NSObject(NSObject) __retain_OA] + 0
3   CoreFoundation      0x0000000187cc0a84 forwarding + 1572
4   CoreFoundation      0x0000000187cc03a0 _CF_forwarding_prep_0 + 96
5   AVKit               0x00000001bdc81f30 -[__AVPlayerLayerView startRoutingVideoToPictureInPicturePlayerLayerView] + 156
6   AVKit               0x00000001bdcf1d48 -[AVPictureInPicturePlatformAdapter(Common) _setRoutingVideoToHostedWindow:pictureInPictureViewController:source:] + 84
7   AVKit               0x00000001bdcd952c -[AVPictureInPicturePlatformAdapter startPictureInPicture] + 380
8   AVKit               0x000000022883000c -[AVPictureInPicturePlatformAdapterAccessibility startPictureInPicture] + 44
9   AVKit               0x00000001bdcddea0 -[AVPictureInPictureController startPictureInPicture] + 216
10  WebCore             0x00000001c75277c8 -[WebAVPlayerViewController startPictureInPicture] + 128
11  libdispatch.dylib   0x0000000102c64f14 _dispatch_call_block_and_release + 32
2
0
577
Dec ’23
How to access a directly attached UVC camera with AVPlayer?
On macOS Sonoma I have a SwiftUI app that correctly plays remote video files and local video files from the app bundle. Where I'm having trouble is setting up the AVPlayer URL for a UVC camera device directly connected to the Mac.

let url = URL(string: "https://some-remote-video.mp4")!
player = AVPlayer(url: url)
player.play()

Is there some magic to using a UVC device with AVPlayer, or do I need to access the UVC device differently? Thanks, Grahm
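For what it's worth, AVPlayer plays media at URLs, not live capture devices; a directly attached camera is normally driven through an AVCaptureSession with a preview layer instead. A minimal sketch of that route follows — note the `.external` device type is an assumption (it requires macOS 14), and the function name is illustrative:

```swift
import AVFoundation

// Sketch: discover a directly attached (UVC) camera and wire it into a
// capture session. Display would go through AVCaptureVideoPreviewLayer,
// not AVPlayer/VideoPlayer.
func makeUVCSession() -> AVCaptureSession? {
    let discovery = AVCaptureDevice.DiscoverySession(
        deviceTypes: [.external],          // assumption: macOS 14+ device type
        mediaType: .video,
        position: .unspecified)
    guard let camera = discovery.devices.first,
          let input = try? AVCaptureDeviceInput(device: camera) else {
        return nil
    }
    let session = AVCaptureSession()
    if session.canAddInput(input) {
        session.addInput(input)
    }
    return session
}
```

The returned session would then back an AVCaptureVideoPreviewLayer hosted in an NSViewRepresentable, rather than a VideoPlayer view.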
1
0
554
Dec ’23
Using URLRequest with AVPlayer
I'm creating an app with a video player streaming video from a URL. AVPlayer only takes the URL address of the video, but I'd like to create a custom URLRequest and then use that to stream the video. So the current situation is this:

let videourl = URL(string: "videofoobar.com")
let player = AVPlayer(url: videourl)
player.play()

And I would like to get to something like this:

let videourl = URL(string: "videofoobar.com")
var request = URLRequest(url: videourl)
let player = AVPlayer(request: request) // This obviously fails, any workaround?
player.play()

I know it is not possible to do it like this, as AVPlayer only takes a URL or an AVPlayerItem as a parameter. Is there any workaround to make a URLRequest and then give it to AVPlayer for online streaming?
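One direction sometimes used for the header-customization part of a URLRequest is AVURLAsset's options dictionary. A sketch — with the loud caveat that the "AVURLAssetHTTPHeaderFieldsKey" string is widely circulated but not formally documented, and the URL and header values below are placeholders:

```swift
import AVFoundation

// Sketch (assumption): pass custom HTTP headers via AVURLAsset options.
// "AVURLAssetHTTPHeaderFieldsKey" is an undocumented key; for fully
// custom request handling, AVAssetResourceLoaderDelegate is the
// supported (if heavier) route.
let url = URL(string: "https://videofoobar.com/stream.m3u8")!  // placeholder
let headers = ["Authorization": "Bearer <token>"]              // placeholder
let asset = AVURLAsset(url: url,
                       options: ["AVURLAssetHTTPHeaderFieldsKey": headers])
let player = AVPlayer(playerItem: AVPlayerItem(asset: asset))
player.play()
```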
0
0
464
Dec ’23
When can I check AVPlayerItem’s status?
I’ve got some code that creates an AVPlayerItem from a URL, then creates an AVQueuePlayer from it. If I check the player item's status after that, it's still unknown. According to the docs, it'll remain unknown until it is associated with an AVPlayer, and then it "immediately begins enqueuing the item’s media and preparing it for playback." But checking the status right after that, I still get unknown, which tells me it’s not quite immediate. Is there any way to test if the player item will work immediately after creation? In this case, the problem is that my app doesn't have permission, due to it being a bookmark saved in a sandboxed app.
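One way to avoid the race described above is not to poll at all: status is KVO-observable, so an observer registered at creation time catches the transition whenever it happens. A sketch (the file path is a placeholder):

```swift
import AVFoundation

// Observe the item's status instead of checking it synchronously; the
// transition out of .unknown happens asynchronously after the item is
// associated with a player.
let item = AVPlayerItem(url: URL(fileURLWithPath: "/tmp/example.mp4"))  // placeholder
let player = AVQueuePlayer(items: [item])

// Keep a strong reference to the observation for as long as it's needed.
let observation = item.observe(\.status, options: [.initial, .new]) { item, _ in
    switch item.status {
    case .readyToPlay:
        print("ready to play")
    case .failed:
        // Sandbox/permission problems surface here via item.error.
        print("failed: \(String(describing: item.error))")
    default:
        break  // still .unknown; keep waiting
    }
}
```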
0
0
455
Dec ’23
Create AVPlayer instance from Data (rather than URL)?
In my SwiftUI/SwiftData application, I want to store videos in SwiftData objects (using external storage). To display them, I need to instantiate an AVPlayer (for use in a VideoPlayer view), but AVPlayer expects a URL, not a Data object. Obviously, I can solve this problem via a file-caching scheme (i.e., by creating a local file when needed, using an LRU cache to control its lifetime), but this results in an extra copy of the data (besides the hidden local file managed by SwiftData/CoreData). However, since videos can be quite large, I would prefer not to do that. Has anyone any thoughts on how I can avoid the extra data copy?
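If the extra copy turns out to be unavoidable, here is a sketch of the file-backed workaround that at least makes the copy once per video rather than per playback, keyed by a stable identifier (the function and directory names are invented for illustration):

```swift
import Foundation

// Write the video data to a caches subdirectory once, keyed by a stable
// identifier (e.g. the SwiftData object's ID), and reuse it afterwards.
func cachedURL(for data: Data, key: String) throws -> URL {
    let dir = FileManager.default.urls(for: .cachesDirectory, in: .userDomainMask)[0]
        .appendingPathComponent("VideoCache", isDirectory: true)
    try FileManager.default.createDirectory(at: dir, withIntermediateDirectories: true)
    let file = dir.appendingPathComponent(key).appendingPathExtension("mp4")
    if !FileManager.default.fileExists(atPath: file.path) {
        try data.write(to: file)  // the single extra copy, created lazily
    }
    return file
}
```

The result would feed directly into AVPlayer(url:), and the caches directory is eligible for system cleanup, which doubles as a crude LRU.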
2
0
1.2k
Dec ’23
Playing a specific rectangular ROI of a video?
Is there a way to play a specific rectangular region of interest of a video in an arbitrarily-sized view? Let's say I have a 1080p video but I'm only interested in a sub-region of the full frame. Is there a way to specify a source rect to be displayed in an arbitrary view (SwiftUI view, ideally), and have it play that in real time, without having to pre-render the cropped region? Update: I may have found a solution here: img DOT ly/blog/trim-and-crop-video-in-swift/ (Apple won't allow that URL for some dumb reason)
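One route that avoids pre-rendering, sketched here as an assumption rather than a confirmed answer: apply the crop at playback time through AVPlayerItem.videoComposition. The helper name and the 30 fps frame duration below are illustrative:

```swift
import AVFoundation

// Sketch: crop a source rect at the player level (no pre-render) using a
// video composition. cropRect is in the video's pixel coordinate space.
func makeCroppedItem(asset: AVAsset, track: AVAssetTrack,
                     cropRect: CGRect, duration: CMTime) -> AVPlayerItem {
    let instruction = AVMutableVideoCompositionInstruction()
    instruction.timeRange = CMTimeRange(start: .zero, duration: duration)

    let layer = AVMutableVideoCompositionLayerInstruction(assetTrack: track)
    layer.setCropRectangle(cropRect, at: .zero)
    // Shift the cropped region back to the render origin.
    layer.setTransform(CGAffineTransform(translationX: -cropRect.minX,
                                         y: -cropRect.minY), at: .zero)
    instruction.layerInstructions = [layer]

    let composition = AVMutableVideoComposition()
    composition.renderSize = cropRect.size
    composition.frameDuration = CMTime(value: 1, timescale: 30)  // illustrative
    composition.instructions = [instruction]

    let item = AVPlayerItem(asset: asset)
    item.videoComposition = composition
    return item
}
```

The resulting item plays in a normal VideoPlayer/AVPlayerLayer, which then scales the cropped region to whatever view size SwiftUI gives it.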
0
0
415
Dec ’23
Adding VTT subtitles to a streaming video from an URL
Hi, I've started learning SwiftUI a few months ago, and now I'm trying to build my first app :) I am trying to display VTT subtitles from an external URL on top of a streaming video using AVPlayer and AVMutableComposition. I have been trying for a few days, checking online and Apple's documentation, but I can't manage to make it work. So far, I managed to display the subtitles, but there is no video or audio playing... Could someone help? Thanks in advance, I hope the code is not too confusing.

// EpisodeDetailView.swift
// OroroPlayer_v1
//
// Created by Juan Valenzuela on 2023-11-25.
//

import AVKit
import SwiftUI

struct EpisodeDetailView4: View {
    @State private var episodeDetailVM = EpisodeDetailViewModel()
    let episodeID: Int
    @State private var player = AVPlayer()
    @State private var subs = AVPlayer()

    var body: some View {
        VideoPlayer(player: player)
            .ignoresSafeArea()
            .task {
                do {
                    try await episodeDetailVM.fetchEpisode(id: episodeID)
                    let episode = episodeDetailVM.episodeDetail
                    guard let videoURLString = episode.url else {
                        print("Invalid videoURL or missing data")
                        return
                    }
                    guard let subtitleURLString = episode.subtitles?[0].url else {
                        print("Invalid subtitleURLs or missing data")
                        return
                    }
                    let videoURL = URL(string: videoURLString)!
                    let subtitleURL = URL(string: subtitleURLString)!
                    let videoAsset = AVURLAsset(url: videoURL)
                    let subtitleAsset = AVURLAsset(url: subtitleURL)

                    let movieWithSubs = AVMutableComposition()
                    let videoTrack = movieWithSubs.addMutableTrack(withMediaType: .video, preferredTrackID: kCMPersistentTrackID_Invalid)
                    let audioTrack = movieWithSubs.addMutableTrack(withMediaType: .audio, preferredTrackID: kCMPersistentTrackID_Invalid)
                    let subtitleTrack = movieWithSubs.addMutableTrack(withMediaType: .text, preferredTrackID: kCMPersistentTrackID_Invalid)

                    if let videoTrackItem = try await videoAsset.loadTracks(withMediaType: .video).first {
                        try await videoTrack?.insertTimeRange(CMTimeRangeMake(start: .zero, duration: videoAsset.load(.duration)), of: videoTrackItem, at: .zero)
                    }
                    if let audioTrackItem = try await videoAsset.loadTracks(withMediaType: .audio).first {
                        try await audioTrack?.insertTimeRange(CMTimeRangeMake(start: .zero, duration: videoAsset.load(.duration)), of: audioTrackItem, at: .zero)
                    }
                    if let subtitleTrackItem = try await subtitleAsset.loadTracks(withMediaType: .text).first {
                        try await subtitleTrack?.insertTimeRange(CMTimeRangeMake(start: .zero, duration: videoAsset.load(.duration)), of: subtitleTrackItem, at: .zero)
                    }

                    let playerItem = AVPlayerItem(asset: movieWithSubs)
                    player = AVPlayer(playerItem: playerItem)
                    let playerController = AVPlayerViewController()
                    playerController.player = player
                    playerController.player?.play()
                    // player.play()
                } catch {
                    print("Error: \(error.localizedDescription)")
                }
            }
    }
}

#Preview {
    EpisodeDetailView4(episodeID: 39288)
}
0
0
533
Nov ’23
AVAssetWriter leading to huge kernel_task memory usage
I am currently working on a macOS app which will be creating very large video files with up to an hour of content. However, generating the images and adding them to AVAssetWriter leads to VTDecoderXPCService using 16+ GB of memory and the kernel_task using 40+ GB (the max I saw was 105 GB). It seems like the generated video is not streamed onto the disk but rather written to memory, for it to be written to disk all at once. How can I force it to stream the data to disk while the encoding is happening? Btw. my app itself consistently needs around 300 MB of memory, so I don't think I have a memory leak here. Here is the relevant code:

func analyse() {
    self.videoWritter = try! AVAssetWriter(outputURL: outputVideo, fileType: AVFileType.mp4)
    let writeSettings: [String: Any] = [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: videoSize.width,
        AVVideoHeightKey: videoSize.height,
        AVVideoCompressionPropertiesKey: [
            AVVideoAverageBitRateKey: 10_000_000,
        ]
    ]
    self.videoWritter!.movieFragmentInterval = CMTimeMake(value: 60, timescale: 1)
    self.frameInput = AVAssetWriterInput(mediaType: AVMediaType.video, outputSettings: writeSettings)
    self.frameInput?.expectsMediaDataInRealTime = true
    self.videoWritter!.add(self.frameInput!)
    if self.videoWritter!.startWriting() == false {
        print("Could not write file: \(self.videoWritter!.error!)")
        return
    }
}

func writeFrame(frame: Frame) {
    /* some more code to determine the correct frame presentation time stamp */
    let newSampleBuffer = self.setTimeStamp(frame.sampleBuffer, newTime: self.nextFrameStartTime!)
    self.frameInput!.append(newSampleBuffer)
    /* some more code */
}

func setTimeStamp(_ sample: CMSampleBuffer, newTime: CMTime) -> CMSampleBuffer {
    var timing: CMSampleTimingInfo = CMSampleTimingInfo(
        duration: CMTime.zero,
        presentationTimeStamp: newTime,
        decodeTimeStamp: CMTime.zero
    )
    var newSampleBuffer: CMSampleBuffer?
    CMSampleBufferCreateCopyWithNewTiming(
        allocator: kCFAllocatorDefault,
        sampleBuffer: sample,
        sampleTimingEntryCount: 1,
        sampleTimingArray: &timing,
        sampleBufferOut: &newSampleBuffer
    )
    return newSampleBuffer!
}

My specs: MacBook Pro 2018, 32 GB memory, macOS Big Sur 11.1
5
0
2.1k
Nov ’23
Use AVPlayer to play and AVAssetResourceLoaderDelegate to read data. The following errors occasionally occur during playback.
Use AVPlayer to play and AVAssetResourceLoaderDelegate to read data. The following errors occasionally occur during playback:

-11819: Cannot Complete Action
-11800: The operation could not be completed
-11829: Cannot Open
-11849: Operation Stopped
-11870: This operation could not be completed
-1002: unsupported URL
-11850: Operation has stopped
-1: Unknown error
-17377
0
0
423
Nov ’23
Video orientation doesn't work in AVPlayerViewController
When attempting to present an AVPlayerViewController without animations, the video orientation does not function as expected. However, when the animation parameter is set to true, the video orientation works correctly. The following code does not produce the desired video orientation behavior when animation is disabled: parentViewController.present(playerViewController, animated: false) In contrast, the desired video orientation is achieved with animation enabled: parentViewController.present(playerViewController, animated: true)
0
0
348
Nov ’23
Change picture-in-picture forward/backward values in AVPictureInPictureController
How can the playback (forward and backward) button values be changed in AVPictureInPictureController? My backward and forward buttons have a 15-second value by default (a screenshot from my app is attached), but I've found other apps have 10 seconds (for instance, the Apple TV iOS app). In this Apple forum discussion I've read that AVPlayerViewController adapts its capabilities and controls to the asset being played. But the backward/forward values in PiP seem to stay the same for all videos, independent of duration, in both my app and the apps I've checked, and I can't find a way to change them.
0
0
404
Nov ’23
Quality of Video stream displayed through AVCaptureSession breaks while trying to zoom camera and capture a photo
Hi, we were using the capture systems of AVKit to take photos in our app, and we need to zoom the camera to a certain limit. If we configure a zoomFactor on the AVCaptureDevice, we receive awkward video frames (blurred images) from the camera. Our app works fine on all iPhone/iPad devices except devices that support Center Stage. Looking into Apple's default Camera app, we understood that it was implemented using UIImagePickerController. We tried multiple combinations of AVCaptureDevice.Format/AVCaptureSession.Preset but nothing helped. We want to achieve zoom (front camera) through AVKit. The code snippet we used is below; please help with this.

session.sessionPreset = AVCaptureSession.Preset.photo

var bestFormat: AVCaptureDevice.Format?
var bestFrameRateRange: AVFrameRateRange?
for format in device.formats {
    for range in format.videoSupportedFrameRateRanges {
        if range.maxFrameRate > bestFrameRateRange?.maxFrameRate ?? 0 {
            bestFormat = format
            bestFrameRateRange = range
        }
    }
}

if let bestFormat = bestFormat, let bestFrameRateRange = bestFrameRateRange {
    do {
        try device.lockForConfiguration()
        // Set the device's active format.
        device.activeFormat = bestFormat
        // Set the device's min/max frame duration.
        let duration = bestFrameRateRange.minFrameDuration
        device.activeVideoMinFrameDuration = duration
        device.activeVideoMaxFrameDuration = duration
        device.videoZoomFactor = 2.0
        device.unlockForConfiguration()
    } catch {
        // Handle error.
    }
}
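One variation that may be worth trying — a sketch, not a confirmed fix for the Center Stage devices: ramp the zoom instead of setting videoZoomFactor abruptly, and clamp to the active format's maximum to avoid digital upscaling (the ramp rate of 4.0 is an illustrative value):

```swift
import AVFoundation

// Sketch: smooth zoom ramp, clamped to the active format's maximum.
// An abrupt videoZoomFactor assignment can momentarily show unfocused
// frames while the device refocuses; a ramp gives it time to adjust.
func zoom(_ device: AVCaptureDevice, to factor: CGFloat) {
    do {
        try device.lockForConfiguration()
        let clamped = min(factor, device.activeFormat.videoMaxZoomFactor)
        device.ramp(toVideoZoomFactor: clamped, withRate: 4.0)  // illustrative rate
        device.unlockForConfiguration()
    } catch {
        print("Zoom configuration failed: \(error)")
    }
}
```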
0
0
475
Oct ’23
Modern Video and Audio experiences in Swift UI
Hey Apple! I'm just wondering if there are any recommendations on best practices for supporting AV experiences in SwiftUI? As far as I know, VideoPlayer is the only API available directly supported in SwiftUI in AVKit (https://developer.apple.com/documentation/avkit) without the need for UIViewRepresentable / UIViewControllerRepresentable bridging of AVPlayer into SwiftUI. However there are many core video and audio experiences that a modern audience expect that are not supported in VideoPlayer. e.g. PiP Is there a roadmap for support in SwiftUI directly? Thanks!
0
4
499
Oct ’23
AVPlayer fails to start playback during active VOIP call
In the app, I have VOIP functionality along with AVPlayer for playing videos from remote URLs. Once a VOIP call is established, AVPlayer gets AVPlayerRateDidChangeReasonSetRateFailed right after AVPlayerRateDidChangeReasonSetRateCalled in AVPlayer.rateDidChangeNotification observer when trying to start a video using the play() method. As a result, the video does not start. Checked AVAudioSession.interruptionNotification, it is not getting fired. AVPlayer functionality works as expected before and after the call. Issue observable on iOS 17 only. Any help would be appreciated.
0
0
556
Oct ’23
Create Media Player Application guide
I have designed a media player app in which I need to implement live TV, movies, and series. The URL can be of any type, such as .ts format for live TV, and .mp4, .mov, etc. I am also going to work with m3u, but AVPlayer does not support all of these URLs. Can I get some suggestions and solutions for this? What would be the best practice, and how should I work with all these kinds of URLs?
0
0
427
Oct ’23
Changing the video playing in VideoPlayer programmatically
I'm using SwiftUI to play videos. I'm trying to use a NavigationStack to show a list of videos; when I click on one, it goes to a new view to play that video. I see lots of examples that have the underlying AVPlayer in a @State or @StateObject, but as soon as I reassign it to a new AVPlayer, the VideoPlayer stops working; I just get a black screen. Is there a better way to change the URL that VideoPlayer is playing? I'm trying to avoid creating a bunch of VideoPlayer objects ahead of time and having them all in memory, as there might be a large number of videos. More details: the app is tvOS 17, using Xcode 15.
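A pattern that may avoid the black screen (sketched with placeholder URLs and view names): keep a single AVPlayer alive for the view's lifetime and swap the item with replaceCurrentItem(with:), rather than reassigning the player that VideoPlayer was built around:

```swift
import AVKit
import SwiftUI

// Sketch: one long-lived AVPlayer; only the AVPlayerItem changes.
// VideoPlayer keeps rendering the same player instance, so no
// reassignment-induced black screen.
struct PlayerView: View {
    @State private var player = AVPlayer()
    let url: URL  // the currently selected video

    var body: some View {
        VideoPlayer(player: player)
            .onAppear { load(url) }
            .onChange(of: url) { _, newURL in load(newURL) }
    }

    private func load(_ url: URL) {
        player.replaceCurrentItem(with: AVPlayerItem(url: url))
        player.play()
    }
}
```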
1
0
1.3k
Oct ’23
Video Quality selection in HLS streams
Hello there, in our team we were asked to add the possibility to manually select the video quality. I know that HLS is an adaptive stream and that, depending on the network conditions, it chooses the best quality that fits the current situation. I tried some settings with preferredMaximumResolution and preferredPeakBitRate, but none of them worked once the user was watching the stream. I also tried replacing the currentPlayerItem with the new configuration, but this only allowed me to downgrade the quality of the video: when I wanted to set it to 4K, for example, it did not change to that track even if I set very high values for both params mentioned above. My question is whether there is any method which would allow me to force a certain quality from the manifest file. I already have some parsing which extracts all the available information from the manifest, but I still couldn't figure out how to make the player reproduce a specific stream with my desired quality from the available playlist.
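For reference, a sketch of the closest approximation I know of: there is no public API to pin an exact variant, but capping preferredPeakBitRate just above the BANDWIDTH value parsed from the manifest constrains ABR from above — it cannot force an upswitch beyond what the player deems sustainable (the function names and the 1.1 headroom factor are illustrative):

```swift
import AVFoundation

// Sketch: cap ABR at (roughly) a chosen variant using its manifest
// BANDWIDTH value. This limits the top rendition; it does not force the
// player to hold a variant it considers unsustainable for the network.
func capQuality(of item: AVPlayerItem, toBandwidth bandwidth: Double) {
    item.preferredPeakBitRate = bandwidth * 1.1  // small headroom over the variant
}

func restoreAutomaticQuality(of item: AVPlayerItem) {
    item.preferredPeakBitRate = 0  // 0 = no limit, automatic selection
}
```

This pairs naturally with a manifest parser like the one described above: show the parsed variants in a menu, then cap to the selected one's bandwidth.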
4
0
6.8k
Oct ’23
`displayManager.isDisplayCriteriaMatchingEnabled` arbitrary result (no distinction between frame rate and dynamic range)
displayManager.isDisplayCriteriaMatchingEnabled returns true if one (or both) between refresh rate, and dynamic range, is set to match the content in the AppleTV settings. There's no way to make a distinction between them, and only enable one of them accordingly. Looks like Apple failed to change their APIs to me. What am I missing?
0
0
440
Oct ’23