Create view-level services for media playback, complete with user controls, chapter navigation, and support for subtitles and closed captioning using AVKit.

Posts under AVKit tag

80 Posts
How to access a directly attached UVC camera with AVPlayer?
On macOS Sonoma I have a SwiftUI app that correctly plays remote video files and local video files from the app bundle. Where I'm having trouble is setting up the AVPlayer URL for a UVC camera device directly connected to the Mac.

let url = URL(string: "https://some-remote-video.mp4")!
player = AVPlayer(url: url)
player.play()

Is there some magic to using a UVC device with AVPlayer, or do I need to access the UVC device differently? Thanks, Grahm
1 reply · 0 boosts · 554 views · Dec ’23
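AVPlayer takes asset URLs (files and streams), not capture devices, so a capture session is the usual route for a directly attached UVC camera. A minimal sketch, assuming the goal is a live preview; .external is the macOS 14 device type (earlier systems use .externalUnknown), and the app needs the camera usage description/entitlement:

import AVFoundation

func startUVCPreview() throws -> AVCaptureSession {
    // Discover the externally attached (UVC) camera.
    let discovery = AVCaptureDevice.DiscoverySession(
        deviceTypes: [.external],
        mediaType: .video,
        position: .unspecified
    )
    guard let camera = discovery.devices.first else {
        throw NSError(domain: "UVC", code: 1,
                      userInfo: [NSLocalizedDescriptionKey: "No UVC camera found"])
    }

    let session = AVCaptureSession()
    let input = try AVCaptureDeviceInput(device: camera)
    if session.canAddInput(input) {
        session.addInput(input)
    }
    session.startRunning()   // prefer calling this off the main thread
    return session
}

// Display with AVCaptureVideoPreviewLayer (e.g. inside an NSViewRepresentable)
// rather than with AVPlayer/VideoPlayer.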
When can I check AVPlayerItem’s status?
I’ve got some code that creates an AVPlayerItem from a URL, then creates an AVQueuePlayer from it. If I check the player item's status after that, it's still unknown. According to the docs, it'll remain unknown until it is associated with an AVPlayer, at which point it "immediately begins enqueuing the item’s media and preparing it for playback." But checking the status right after that, I still get unknown, which tells me it’s not quite immediate. Is there any way to test whether the player item will work immediately after creation? In this case, the problem is that my app doesn't have permission, because the URL is a bookmark saved in a sandboxed app.
0 replies · 0 boosts · 455 views · Dec ’23
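The documented pattern is to observe the status rather than poll it right after creation; enqueuing starts when the item is associated with a player, but readiness arrives asynchronously. A minimal sketch using KVO, where url stands in for the poster's bookmarked file URL:

import AVFoundation

let item = AVPlayerItem(url: url)          // `url`: the bookmarked file URL
let player = AVQueuePlayer(items: [item])

// Keep a strong reference to the observation for as long as it is needed.
let statusObservation = item.observe(\.status, options: [.initial, .new]) { item, _ in
    switch item.status {
    case .readyToPlay:
        player.play()
    case .failed:
        // A sandbox/permission problem surfaces here as an error, e.g. when a
        // security-scoped bookmark was not resolved and accessed first.
        print("Item failed: \(String(describing: item.error))")
    default:
        break   // .unknown: still preparing
    }
}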
Create AVPlayer instance from Data (rather than URL)?
In my SwiftUI/SwiftData application, I want to store videos in SwiftData objects (using external storage). To display them, I need to instantiate an AVPlayer (for use in a VideoPlayer view). But AVPlayer expects a URL, not a Data object. Obviously, I can solve this problem via a file-caching scheme (i.e., by creating a local file when needed, using an LRU cache to control its lifetime), but this results in an extra copy of the data (besides the hidden local file managed by SwiftData/CoreData). Since videos can be quite large, I would prefer not to do that. Has anyone any thoughts on how I can avoid the extra data copy?
2 replies · 0 boosts · 1.2k views · Dec ’23
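One avenue that avoids the extra file copy is AVAssetResourceLoaderDelegate with a custom URL scheme, serving byte ranges straight from the Data. A minimal sketch, assuming a single in-memory MP4; memory:// is an arbitrary scheme, and real code should also honor requestsAllDataToEndOfResource and chunk large payloads:

import AVFoundation

final class DataResourceLoader: NSObject, AVAssetResourceLoaderDelegate {
    private let data: Data

    init(data: Data) { self.data = data }

    func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                        shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        if let info = loadingRequest.contentInformationRequest {
            info.contentType = AVFileType.mp4.rawValue   // assumption: MP4 payload
            info.contentLength = Int64(data.count)
            info.isByteRangeAccessSupported = true
        }
        if let dataRequest = loadingRequest.dataRequest {
            let offset = Int(dataRequest.requestedOffset)
            guard offset < data.count else {
                loadingRequest.finishLoading()
                return true
            }
            let length = min(dataRequest.requestedLength, data.count - offset)
            dataRequest.respond(with: data.subdata(in: offset..<(offset + length)))
        }
        loadingRequest.finishLoading()
        return true
    }
}

// Usage: a custom scheme forces AVFoundation through the delegate.
// Keep a strong reference to `loader` for the asset's lifetime.
let loader = DataResourceLoader(data: videoData)   // `videoData`: the SwiftData payload
let asset = AVURLAsset(url: URL(string: "memory://video.mp4")!)
asset.resourceLoader.setDelegate(loader, queue: DispatchQueue(label: "video.loader"))
let player = AVPlayer(playerItem: AVPlayerItem(asset: asset))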
Playing a specific rectangular ROI of a video?
Is there a way to play a specific rectangular region of interest of a video in an arbitrarily-sized view? Let's say I have a 1080p video but I'm only interested in a sub-region of the full frame. Is there a way to specify a source rect to be displayed in an arbitrary view (SwiftUI view, ideally), and have it play that in real time, without having to pre-render the cropped region? Update: I may have found a solution here: img DOT ly/blog/trim-and-crop-video-in-swift/ (Apple won't allow that URL for some dumb reason)
0 replies · 0 boosts · 415 views · Dec ’23
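For reference, a videoComposition on the player item is applied in real time during playback, so a source rect can be shown without pre-rendering. A minimal sketch, assuming a crop rect in pixel coordinates and a 30 fps source:

import AVFoundation

func croppedPlayerItem(asset: AVAsset, cropRect: CGRect) async throws -> AVPlayerItem {
    guard let track = try await asset.loadTracks(withMediaType: .video).first else {
        throw CocoaError(.featureUnsupported)
    }
    let duration = try await asset.load(.duration)

    let videoComposition = AVMutableVideoComposition()
    videoComposition.renderSize = cropRect.size
    videoComposition.frameDuration = CMTime(value: 1, timescale: 30)   // assumption: 30 fps

    let instruction = AVMutableVideoCompositionInstruction()
    instruction.timeRange = CMTimeRange(start: .zero, duration: duration)

    // Translate so the ROI's origin lands at the render origin.
    let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: track)
    layerInstruction.setTransform(
        CGAffineTransform(translationX: -cropRect.minX, y: -cropRect.minY),
        at: .zero
    )
    instruction.layerInstructions = [layerInstruction]
    videoComposition.instructions = [instruction]

    let item = AVPlayerItem(asset: asset)
    item.videoComposition = videoComposition   // applied in real time during playback
    return item
}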
AVPlayer with multiple audio tracks plays audio differently at start
Hi, I'm trying to play multiple video/audio files with AVPlayer using AVMutableComposition. Each video/audio file can play simultaneously, so I put each one in its own track. I use only local files.

let second = CMTime(seconds: 1, preferredTimescale: 1000)
let duration = CMTimeRange(start: .zero, duration: second)
var currentTime = CMTime.zero

for _ in 0...4 {
    let mutableTrack = composition.addMutableTrack(
        withMediaType: .audio,
        preferredTrackID: kCMPersistentTrackID_Invalid
    )
    try mutableTrack?.insertTimeRange(
        duration,
        of: audioAssetTrack,
        at: currentTime
    )
    currentTime = currentTime + second
}

When I set many audio tracks (maybe more than 5), the first part sounds a little different from the original when playback starts; it seems like the front of the audio is skipped. But when I set only two tracks, AVPlayer plays the same as the original file.

avPlayer.play()

How can I fix it? Why do tracks that have nothing to play at the start affect playback? Please let me know.
1 reply · 2 boosts · 787 views · 3w
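Not a confirmed fix, but one thing worth trying under the assumption that the skipped opening audio is a warm-up effect with many tracks: preroll once the item is ready, so every track has buffered its first samples before play(). A minimal sketch:

// Minimal sketch: preroll before starting playback. preroll(atRate:) requires
// the item to be ready and the current rate to be 0, and AVFoundation warns
// against it while automaticallyWaitsToMinimizeStalling is true.
let item = AVPlayerItem(asset: composition)
let avPlayer = AVPlayer(playerItem: item)
avPlayer.automaticallyWaitsToMinimizeStalling = false

// Keep a strong reference to the observation.
let readyObservation = item.observe(\.status, options: [.initial, .new]) { item, _ in
    guard item.status == .readyToPlay else { return }
    avPlayer.preroll(atRate: 1.0) { finished in
        if finished {
            DispatchQueue.main.async { avPlayer.play() }
        }
    }
}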
Adding VTT subtitles to a streaming video from an URL
Hi, I started learning SwiftUI a few months ago, and now I'm trying to build my first app :) I am trying to display VTT subtitles from an external URL in a streaming video using AVPlayer and AVMutableComposition. I have been trying for a few days, checking online and in Apple's documentation, but I can't manage to make it work. So far, I managed to display the subtitles, but there is no video or audio playing... Could someone help? Thanks in advance, I hope the code is not too confusing.

//
// EpisodeDetailView.swift
// OroroPlayer_v1
//
// Created by Juan Valenzuela on 2023-11-25.
//

import AVKit
import SwiftUI

struct EpisodeDetailView4: View {
    @State private var episodeDetailVM = EpisodeDetailViewModel()
    let episodeID: Int
    @State private var player = AVPlayer()
    @State private var subs = AVPlayer()

    var body: some View {
        VideoPlayer(player: player)
            .ignoresSafeArea()
            .task {
                do {
                    try await episodeDetailVM.fetchEpisode(id: episodeID)
                    let episode = episodeDetailVM.episodeDetail
                    guard let videoURLString = episode.url else {
                        print("Invalid videoURL or missing data")
                        return
                    }
                    guard let subtitleURLString = episode.subtitles?[0].url else {
                        print("Invalid subtitleURLs or missing data")
                        return
                    }
                    let videoURL = URL(string: videoURLString)!
                    let subtitleURL = URL(string: subtitleURLString)!
                    let videoAsset = AVURLAsset(url: videoURL)
                    let subtitleAsset = AVURLAsset(url: subtitleURL)

                    let movieWithSubs = AVMutableComposition()
                    let videoTrack = movieWithSubs.addMutableTrack(withMediaType: .video, preferredTrackID: kCMPersistentTrackID_Invalid)
                    let audioTrack = movieWithSubs.addMutableTrack(withMediaType: .audio, preferredTrackID: kCMPersistentTrackID_Invalid)
                    let subtitleTrack = movieWithSubs.addMutableTrack(withMediaType: .text, preferredTrackID: kCMPersistentTrackID_Invalid)

                    if let videoTrackItem = try await videoAsset.loadTracks(withMediaType: .video).first {
                        try await videoTrack?.insertTimeRange(
                            CMTimeRangeMake(start: .zero, duration: videoAsset.load(.duration)),
                            of: videoTrackItem,
                            at: .zero
                        )
                    }
                    if let audioTrackItem = try await videoAsset.loadTracks(withMediaType: .audio).first {
                        try await audioTrack?.insertTimeRange(
                            CMTimeRangeMake(start: .zero, duration: videoAsset.load(.duration)),
                            of: audioTrackItem,
                            at: .zero
                        )
                    }
                    if let subtitleTrackItem = try await subtitleAsset.loadTracks(withMediaType: .text).first {
                        try await subtitleTrack?.insertTimeRange(
                            CMTimeRangeMake(start: .zero, duration: videoAsset.load(.duration)),
                            of: subtitleTrackItem,
                            at: .zero
                        )
                    }

                    let playerItem = AVPlayerItem(asset: movieWithSubs)
                    player = AVPlayer(playerItem: playerItem)
                    let playerController = AVPlayerViewController()
                    playerController.player = player
                    playerController.player?.play()
                    // player.play()
                } catch {
                    print("Error: \(error.localizedDescription)")
                }
            }
    }
}

#Preview {
    EpisodeDetailView4(episodeID: 39288)
}
0 replies · 0 boosts · 533 views · Nov ’23
Use AVPlayer to play and AVAssetResourceLoaderDelegate to read data. The following errors occasionally occur during playback.
Use AVPlayer to play and AVAssetResourceLoaderDelegate to read data. The following errors occasionally occur during playback:

-11819: Cannot Complete Action
-11800: The operation could not be completed
-11829: Cannot Open
-11849: Operation Stopped
-11870: This operation could not be completed
-1002: unsupported URL
-11850: Operation stopped
-1: unknown error
-17377
0 replies · 0 boosts · 423 views · Nov ’23
Video orientation doesn't work in AVPlayerViewController
When attempting to present an AVPlayerViewController without animations, the video orientation does not function as expected. However, when the animation parameter is set to true, the video orientation works correctly. The following code does not produce the desired video orientation behavior when animation is disabled: parentViewController.present(playerViewController, animated: false) In contrast, the desired video orientation is achieved with animation enabled: parentViewController.present(playerViewController, animated: true)
0 replies · 0 boosts · 348 views · Nov ’23
Change picture-in-picture forward/backward values in AVPictureInPictureController
How can the playback (forward and backward) button values be changed in AVPictureInPictureController? The backward and forward buttons in my app have a 15-second value by default (screenshot from my app is attached), but I've found that other apps have 10 seconds (for instance, the Apple TV iOS app). In an Apple forum discussion I've read that AVPlayerViewController adapts its capabilities and controls to the asset being played. But the backward/forward values in PiP seem to stay the same for all videos, independent of duration, in both my app and the apps I've looked at, and I can't find a way to change them.
0 replies · 0 boosts · 404 views · Nov ’23
Quality of Video stream displayed through AVCaptureSession breaks while trying to zoom camera and capture a photo
Hi, we use AVKit's capture system to take photos in our app, and we need to zoom the camera up to a certain limit. If we configure zoomFactor on the AVCaptureDevice, we receive awkward (blurred) video frames from the camera. Our app works fine on all iPhone/iPad devices except those that support Center Stage. Looking into Apple's built-in Camera app, we understand it is implemented using UIImagePickerController. We tried multiple combinations of AVCaptureDevice.Format / AVCaptureSession.Preset, but nothing helped. We want to achieve zoom (front camera) through AVKit. The code snippet we use is below; please help with this.

session.sessionPreset = AVCaptureSession.Preset.photo

var bestFormat: AVCaptureDevice.Format?
var bestFrameRateRange: AVFrameRateRange?
for format in device.formats {
    for range in format.videoSupportedFrameRateRanges {
        if range.maxFrameRate > bestFrameRateRange?.maxFrameRate ?? 0 {
            bestFormat = format
            bestFrameRateRange = range
        }
    }
}

if let bestFormat = bestFormat, let bestFrameRateRange = bestFrameRateRange {
    do {
        try device.lockForConfiguration()

        // Set the device's active format.
        device.activeFormat = bestFormat

        // Set the device's min/max frame duration.
        let duration = bestFrameRateRange.minFrameDuration
        device.activeVideoMinFrameDuration = duration
        device.activeVideoMaxFrameDuration = duration

        device.videoZoomFactor = 2.0

        device.unlockForConfiguration()
    } catch {
        // Handle error.
    }
}
0 replies · 0 boosts · 475 views · Oct ’23
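Two hedged things a sketch can show here, neither a confirmed fix for the Center Stage issue: ramping the zoom instead of jumping to it, and taking app control of Center Stage so it can be disabled before configuring the device. Both rest on the assumption that the abrupt zoomFactor change is what disturbs the frames:

import AVFoundation

func applyZoom(_ factor: CGFloat, to device: AVCaptureDevice) {
    // Assumption: Center Stage's reframing interacts badly with the zoom jump,
    // so take app control and turn it off (class properties, iOS 14.5+).
    AVCaptureDevice.centerStageControlMode = .app
    AVCaptureDevice.isCenterStageEnabled = false

    do {
        try device.lockForConfiguration()
        // Ramp to the target zoom instead of jumping; the rate is in
        // powers of 2 per second (1.0 doubles the zoom each second).
        device.ramp(toVideoZoomFactor: factor, withRate: 1.0)
        device.unlockForConfiguration()
    } catch {
        print("Could not lock device for configuration: \(error)")
    }
}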
Modern Video and Audio experiences in Swift UI
Hey Apple! I'm just wondering if there are any recommendations on best practices for supporting AV experiences in SwiftUI? As far as I know, VideoPlayer is the only API in AVKit (https://developer.apple.com/documentation/avkit) that is directly supported in SwiftUI, without UIViewRepresentable / UIViewControllerRepresentable bridging of AVPlayer into SwiftUI. However, there are many core video and audio experiences that a modern audience expects that are not supported in VideoPlayer, e.g. PiP. Is there a roadmap for direct SwiftUI support? Thanks!
0 replies · 4 boosts · 499 views · Oct ’23
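Until such support lands, the usual workaround is exactly the bridging the post mentions. A minimal sketch wrapping AVPlayerViewController to get PiP in SwiftUI; note that PiP additionally needs the background-audio capability and a .playback audio session:

import AVKit
import SwiftUI

struct BridgedPlayerView: UIViewControllerRepresentable {
    let player: AVPlayer

    func makeUIViewController(context: Context) -> AVPlayerViewController {
        let controller = AVPlayerViewController()
        controller.player = player
        controller.allowsPictureInPicturePlayback = true
        controller.canStartPictureInPictureAutomaticallyFromInline = true
        return controller
    }

    func updateUIViewController(_ controller: AVPlayerViewController, context: Context) {
        controller.player = player
    }
}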
AVPlayer fails to start playback during active VOIP call
In the app, I have VOIP functionality along with AVPlayer for playing videos from remote URLs. Once a VOIP call is established and I try to start a video with the play() method, the AVPlayer.rateDidChangeNotification observer receives AVPlayerRateDidChangeReasonSetRateFailed right after AVPlayerRateDidChangeReasonSetRateCalled. As a result, the video does not start. I checked AVAudioSession.interruptionNotification; it is not fired. AVPlayer functionality works as expected before and after the call. The issue is observable on iOS 17 only. Any help would be appreciated.
0 replies · 0 boosts · 556 views · Oct ’23
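For anyone reproducing this, a minimal sketch of reading the rate-change reason out of the notification; player is assumed to be the existing AVPlayer, and the audio-session remark is an assumption to investigate, not a diagnosis:

import AVFoundation

// Keep a reference to the returned token to keep the observer alive.
let token = NotificationCenter.default.addObserver(
    forName: AVPlayer.rateDidChangeNotification,
    object: player,
    queue: .main
) { notification in
    guard let reason = notification.userInfo?[AVPlayer.rateDidChangeReasonKey]
            as? AVPlayer.RateDidChangeReason else { return }
    if reason == .setRateFailed {
        // Assumption: the VOIP call may be holding the audio session;
        // inspecting/reactivating the session before play() is one avenue.
        print("setRate failed; session category: \(AVAudioSession.sharedInstance().category)")
    }
}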
Create Media Player Application guide
I have designed a media player app in which I need to implement live TV, movies, and series. The URLs can be of many types, such as .ts for live TV, and .mp4, .mov, etc. I am also going to work with m3u, but AVPlayer does not support all of these URLs. Can I get some suggestions and solutions: what is the best practice for working with all these kinds of URLs?
0 replies · 0 boosts · 426 views · Oct ’23
Changing the video playing in VideoPlayer programmatically
I'm using SwiftUI to play videos. I'm trying to use a NavigationStack to show a list of videos; when I click on one, it goes to a new view to play that video. I see lots of examples that keep the underlying AVPlayer in a @State or @StateObject, but as soon as I reassign it to a new AVPlayer, the VideoPlayer stops working; I just get a black screen. Is there a better way to change the URL that VideoPlayer is playing? I'm trying to avoid creating a bunch of VideoPlayer objects ahead of time and keeping them all in memory, as this might involve a large number of videos. More details: the app is tvOS 17, using Xcode 15.
1 reply · 0 boosts · 1.3k views · Oct ’23
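A common alternative is to keep one AVPlayer alive for the lifetime of the view and swap what it plays, instead of reassigning the @State player that VideoPlayer was initialized with. A minimal sketch:

import AVKit
import SwiftUI

struct PlaybackView: View {
    @State private var player = AVPlayer()   // created once, reused
    let url: URL

    var body: some View {
        VideoPlayer(player: player)
            .onAppear {
                // Swap the item rather than the player itself.
                player.replaceCurrentItem(with: AVPlayerItem(url: url))
                player.play()
            }
            .onDisappear {
                player.replaceCurrentItem(with: nil)   // release the asset
            }
    }
}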
`displayManager.isDisplayCriteriaMatchingEnabled` arbitrary result (no distinction between frame rate and dynamic range)
displayManager.isDisplayCriteriaMatchingEnabled returns true if either (or both) of refresh rate and dynamic range is set to match the content in the Apple TV settings. There's no way to distinguish between them and enable only one accordingly. It looks to me like Apple failed to update their APIs. What am I missing?
0 replies · 0 boosts · 440 views · Oct ’23
Picture in Picture with WebRTC, nothing displayed
Hello 👋 I'm trying to implement Picture in Picture on iOS with WebRTC, but I have some issues. I started by following this Apple article: https://developer.apple.com/documentation/avkit/adopting_picture_in_picture_for_video_calls At least when my app is in the background, the Picture in Picture view appears, but nothing is displayed within it. Searching on the internet, I found this Stack Overflow post (https://stackoverflow.com/questions/71419635/how-to-add-picture-in-picture-pip-for-webrtc-video-calls-in-ios-swift). It's interesting, but unfortunately I don't know what I have to do... Here is my PictureInPictureManager:

final class VideoBufferView: UIView {
    override class var layerClass: AnyClass {
        AVSampleBufferDisplayLayer.self
    }

    var sampleBufferDisplayLayer: AVSampleBufferDisplayLayer {
        layer as! AVSampleBufferDisplayLayer
    }
}

final class PictureInPictureManager: NSObject {
    static let shared: PictureInPictureManager = .init()

    private override init() { }

    private var pipController: AVPictureInPictureController?
    private var bufferView: VideoBufferView = .init()

    func configure(for videoView: UIView) {
        if AVPictureInPictureController.isPictureInPictureSupported() {
            let bufferView: VideoBufferView = .init()
            let pipVideoCallViewController: AVPictureInPictureVideoCallViewController = .init()
            pipVideoCallViewController.preferredContentSize = CGSize(width: 108, height: 192)
            pipVideoCallViewController.view.addSubview(bufferView)
            let pipContentSource: AVPictureInPictureController.ContentSource = .init(
                activeVideoCallSourceView: videoView,
                contentViewController: pipVideoCallViewController
            )
            pipController = .init(contentSource: pipContentSource)
            pipController?.canStartPictureInPictureAutomaticallyFromInline = true
            pipController?.delegate = self
        } else {
            print("❌ PIP not supported...")
        }
    }
}

With this code, the Picture in Picture view appears empty. I read multiple articles that talk about using the buffer, but I'm not sure how to do it with WebRTC... I tried adding this function to my PictureInPictureManager:

func updateBuffer(with pixelBuffer: CVPixelBuffer) {
    if let sampleBuffer = createSampleBufferFrom(pixelBuffer: pixelBuffer) {
        bufferView.sampleBufferDisplayLayer.enqueue(sampleBuffer)
    } else {
        print("❌ Sample buffer error...")
    }
}

private func createSampleBufferFrom(pixelBuffer: CVPixelBuffer) -> CMSampleBuffer? {
    var presentationTime = CMSampleTimingInfo()

    // Create a format description for the pixel buffer
    var formatDescription: CMVideoFormatDescription?
    let formatDescriptionError = CMVideoFormatDescriptionCreateForImageBuffer(
        allocator: kCFAllocatorDefault,
        imageBuffer: pixelBuffer,
        formatDescriptionOut: &formatDescription
    )
    guard formatDescriptionError == noErr else {
        print("❌ Error creating format description: \(formatDescriptionError)")
        return nil
    }

    // Create a sample buffer
    var sampleBuffer: CMSampleBuffer?
    let sampleBufferError = CMSampleBufferCreateReadyWithImageBuffer(
        allocator: kCFAllocatorDefault,
        imageBuffer: pixelBuffer,
        formatDescription: formatDescription!,
        sampleTiming: &presentationTime,
        sampleBufferOut: &sampleBuffer
    )
    guard sampleBufferError == noErr else {
        print("❌ Error creating sample buffer: \(sampleBufferError)")
        return nil
    }
    return sampleBuffer
}

but by doing that, I get an error message. Any help is welcome! 🙏 Thanks, Alexandre
2 replies · 0 boosts · 1.1k views · Oct ’23
NSToolbar Draws On Top of "Full Screen" Video Played in WKWebView in Mac Catalyst app
I have a Mac Catalyst app configured like so: the root view controller on the window is a triple-column UISplitViewController, and the secondary view controller in the split view controller is a view controller that uses WKWebView. Load a website in the WKWebView that has a video, then expand the video to "full screen" (on Mac Catalyst this is only "full window," because the window does not enter full screen the way AppKit apps do). The NSToolbar overlaps the "full screen" video. On a triple split view controller, only the portions of the toolbar in the secondary and supplementary columns show through (the video actually covers the toolbar area in the primary column). Expected results: the video covers the entire window, including the NSToolbar. Actual results: the NSToolbar draws on top of the video. Anyone know of a workaround? I filed FB13229032.
0 replies · 0 boosts · 429 views · Oct ’23
Timestamps in AVPlayer
I want to show the user the actual start and end times of the video on the AVPlayer time slider, instead of the video duration. I would like to show something like this: 09:00:00 ... 12:00:00 (indicating that the video started at 09:00:00 CET and ended at 12:00:00 CET), instead of: 00:00:00 ... 02:59:59. I would appreciate any pointers in this direction.
1 reply · 1 boost · 526 views · Sep ’23
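One direction, assuming a custom control (the system slider's labels aren't configurable this way): derive wall-clock labels from a known start date. A sketch where recordingStart is an assumed value and currentTimeLabel is a hypothetical label; for HLS streams carrying EXT-X-PROGRAM-DATE-TIME, player.currentItem?.currentDate() can supply the date directly instead:

import AVFoundation

// Assumed recording start; with program-date-time in the stream,
// player.currentItem?.currentDate() provides this directly.
let recordingStart = ISO8601DateFormatter().date(from: "2024-01-01T09:00:00+01:00")!

let formatter = DateFormatter()
formatter.dateFormat = "HH:mm:ss"

let interval = CMTime(seconds: 1, preferredTimescale: 600)
let token = player.addPeriodicTimeObserver(forInterval: interval, queue: .main) { time in
    let wallClock = recordingStart.addingTimeInterval(time.seconds)
    currentTimeLabel.text = formatter.string(from: wallClock)   // hypothetical UILabel
}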
Crash removing time observer from player
Hi, We have a tvOS app with a custom player and we're getting some crashes trying to remove a periodicTimeObserver on a player instance:

Incident Identifier: 3FE68C1C-126D-4A16-BBF2-9F8D1E395548
Hardware Model:      AppleTV6,2
Process:             MyApp [2516]
Path:                /private/var/containers/Bundle/Application/B99FEAB0-0753-48FE-A7FC-7AEB8E2361C1/MyApp.app/MyApp
Identifier:          pt.appletv.bundleid
Version:             4.9.5 (2559)
AppStoreTools:       15A240a
AppVariant:          1:AppleTV6,2:16
Beta:                YES
Code Type:           ARM-64 (Native)
Role:                Foreground
Parent Process:      launchd [1]
Coalition:           pt.appletv.bundleid [317]
Date/Time:           2023-09-21 18:49:39.0241 +0100
Launch Time:         2023-09-21 18:38:34.6957 +0100
OS Version:          Apple TVOS 16.6 (20M73)
Release Type:        User
Report Version:      104

Exception Type:  EXC_CRASH (SIGABRT)
Exception Codes: 0x0000000000000000, 0x0000000000000000
Termination Reason: SIGNAL 6 Abort trap: 6
Terminating Process: MyApp [2516]
Triggered by Thread: 0

Last Exception Backtrace:
0   CoreFoundation      0x1914c12c8 __exceptionPreprocess + 160 (NSException.m:202)
1   libobjc.A.dylib     0x190cfc114 objc_exception_throw + 56 (objc-exception.mm:356)
2   AVFCore             0x1c432b89c -[AVPlayer removeTimeObserver:] + 176 (AVPlayer.m:0)
3   CustomPlayer        0x10549f670 MyPlayerViewController.removePlayerObservers(_:) + 248 (MyPlayerViewController.swift:252)
4   CustomPlayer        0x10549c978 closure #1 in MyPlayerViewController.player.didset + 68 (MyPlayerViewController.swift:98)
5   CustomPlayer        0x10549be60 thunk for @escaping @callee_guaranteed () -> () + 28 (<compiler-generated>:0)
6   libdispatch.dylib   0x190e5eef4 _dispatch_call_block_and_release + 24 (init.c:1518)
7   libdispatch.dylib   0x190e60784 _dispatch_client_callout + 16 (object.m:560)
8   libdispatch.dylib   0x190e6dd34 _dispatch_main_queue_drain + 892 (queue.c:7794)
9   libdispatch.dylib   0x190e6d9a8 _dispatch_main_queue_callback_4CF + 40 (queue.c:7954)
10  CoreFoundation      0x19142b038 __CFRUNLOOP_IS_SERVICING_THE_MAIN_DISPATCH_QUEUE__ + 12 (CFRunLoop.c:1780)
11  CoreFoundation      0x19142569c __CFRunLoopRun + 2080 (CFRunLoop.c:3147)
12  CoreFoundation      0x191424a3c CFRunLoopRunSpecific + 584 (CFRunLoop.c:3418)
13  GraphicsServices    0x1980cab0c GSEventRunModal + 160 (GSEvent.c:2196)
14  UIKitCore           0x1da6fe6ec -[UIApplication _run] + 868 (UIApplication.m:3782)
15  UIKitCore           0x1da702bc4 UIApplicationMain + 148 (UIApplication.m:5372)
16  MyApp               0x104418268 main + 176 (main.swift:12)
17  dyld                0x1ddd81744 start + 1832 (dyldMain.cpp:1165)

Thread 0 name:
Thread 0 Crashed:
0   libsystem_kernel.dylib   0x0000000190fe69a8 __pthread_kill + 8 (:-1)
1   libsystem_pthread.dylib  0x000000019109e440 pthread_kill + 208 (pthread.c:1670)
2   libsystem_c.dylib        0x0000000190f5f8dc __abort + 124 (abort.c:155)
3   libsystem_c.dylib        0x0000000190f5f860 abort + 132 (abort.c:126)
4   libc++abi.dylib          0x0000000190da1fe0 abort_message + 128 (:-1)
5   libc++abi.dylib          0x0000000190d92be8 demangling_terminate_handler() + 300
6   libobjc.A.dylib          0x0000000190cda7d4 _objc_terminate() + 124 (objc-exception.mm:498)
7   FirebaseCrashlytics      0x0000000105118754 FIRCLSTerminateHandler() + 340 (FIRCLSException.mm:452)
8   libc++abi.dylib          0x0000000190da15c0 std::__terminate(void (*)()) + 12 (:-1)
9   libc++abi.dylib          0x0000000190da1570 std::terminate() + 52
10  libdispatch.dylib        0x0000000190e60798 _dispatch_client_callout + 36 (object.m:563)
11  libdispatch.dylib        0x0000000190e6dd34 _dispatch_main_queue_drain + 892 (queue.c:7794)
12  libdispatch.dylib        0x0000000190e6d9a8 _dispatch_main_queue_callback_4CF + 40 (queue.c:7954)
13  CoreFoundation           0x000000019142b038 __CFRUNLOOP_IS_SERVICING_THE_MAIN_DISPATCH_QUEUE__ + 12 (CFRunLoop.c:1780)
14  CoreFoundation           0x000000019142569c __CFRunLoopRun + 2080 (CFRunLoop.c:3147)
15  CoreFoundation           0x0000000191424a3c CFRunLoopRunSpecific + 584 (CFRunLoop.c:3418)
16  GraphicsServices         0x00000001980cab0c GSEventRunModal + 160 (GSEvent.c:2196)
17  UIKitCore                0x00000001da6fe6ec -[UIApplication _run] + 868 (UIApplication.m:3782)
18  UIKitCore                0x00000001da702bc4 UIApplicationMain + 148 (UIApplication.m:5372)
19  MyApp                    0x0000000104418268 main + 176 (main.swift:12)
20  dyld                     0x00000001ddd81744 start + 1832 (dyldMain.cpp:1165)

The code is:

@objc public dynamic var player: AVPlayer? {
    willSet {
        removeThumbnails()
    }
    didSet {
        DispatchQueue.main.async { [weak self] in
            guard let self else { return }
            self.removePlayerObservers(oldValue)
            self.addPlayerObservers(self.player)
        }
    }
}

func removePlayerObservers(_ player: AVPlayer?) {
    if let periodicTimeObserver = periodicTimeObserver {
        player?.removeTimeObserver(periodicTimeObserver)
        self.periodicTimeObserver = nil
    }
}

What could be the problem? Thank you
0 replies · 0 boosts · 355 views · Sep ’23
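That exception is what AVPlayer raises when removeTimeObserver: is called on a player that never registered the token, or when the token is removed twice. A minimal sketch of one defensive shape, pairing the token with the player that created it; this is an assumption about the root cause, not a confirmed diagnosis:

import AVFoundation

// Inside the player view controller: keep the token and its owning player
// together, so removal always targets the right instance, exactly once.
private var timeObservation: (owner: AVPlayer, token: Any)?

func addPlayerObservers(_ player: AVPlayer?) {
    guard let player else { return }
    let interval = CMTime(seconds: 0.5, preferredTimescale: 600)
    let token = player.addPeriodicTimeObserver(forInterval: interval, queue: .main) { _ in
        // Update playback UI here.
    }
    timeObservation = (player, token)
}

func removePlayerObservers() {
    guard let (owner, token) = timeObservation else { return }
    owner.removeTimeObserver(token)
    timeObservation = nil
}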