Create view-level services for media playback, complete with user controls, chapter navigation, and support for subtitles and closed captioning using AVKit.

AVKit Documentation

Posts under AVKit tag

75 Posts
Post not yet marked as solved
0 Replies
471 Views
Situation: I have an HLS audio-only stream composed of AAC files. I've confirmed with ffprobe that timed metadata is attached to the stream. Unfortunately, I'm unable to access the timed metadata from the AVPlayer.

Output from ffprobe:

```
ffprobe index_1_296.aac
...
Input #0, aac, from 'index_1_296.aac':
  Metadata:
    id3v2_priv.com.apple.streaming.transportStreamTimestamp: \x00\x00\x00\x00.\x00\x05\xc0
  Duration: 00:00:06.02, bitrate: 96 kb/s
  Stream #0:0: Audio: aac (LC), 48000 Hz, stereo, fltp, 96 kb/s
```

What I've done: In the class containing my AVPlayer I adopted AVPlayerItemMetadataOutputPushDelegate and implemented the metadataOutput method, following the example here: https://dcordero.medium.com/hls-timed-metadata-with-avplayer-9e20806ef92f. Below is my implementation:

```swift
func metadataOutput(_ output: AVPlayerItemMetadataOutput,
                    didOutputTimedMetadataGroups groups: [AVTimedMetadataGroup],
                    from track: AVPlayerItemTrack?) {
    if let item = groups.first?.items.first,
       let metadataValue = item.value(forKeyPath: #keyPath(AVMetadataItem.value)) {
        print("Metadata value:\n\(metadataValue)")
    } else {
        print("Metadata error")
    }
}
```

What I'm seeing: When playing manifests containing .ts files, this metadataOutput method is triggered with timed metadata. However, when playing a manifest containing only .aac files, the method is never triggered.

Question: Does AVPlayer support extracting timed metadata from AAC files? If it does, are there any examples of this working?
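For reference, a delegate like the one above only fires if an AVPlayerItemMetadataOutput has been attached to the player item. A minimal setup sketch (the URL and `myDelegate` are placeholders; `myDelegate` is assumed to conform to AVPlayerItemMetadataOutputPushDelegate):

```swift
import AVFoundation

// Sketch: attach a metadata output so the push delegate receives
// timed metadata groups. The stream URL is illustrative.
let item = AVPlayerItem(url: URL(string: "https://example.com/stream.m3u8")!)

// Passing nil for identifiers requests all available timed metadata.
let metadataOutput = AVPlayerItemMetadataOutput(identifiers: nil)
metadataOutput.setDelegate(myDelegate, queue: .main)
item.add(metadataOutput)

let player = AVPlayer(playerItem: item)
player.play()
```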
Posted
by
Post not yet marked as solved
1 Reply
341 Views
Good day, community! I took part in Apple's macOS 12 beta testing program and hit a problem with AVPlayer + AVPlayerLayer video playback.

System: macOS 12 Beta 5, Xcode 12.5.1

Problem: If the application starts in fullscreen mode, video is not rendered (while the audio stream plays). If I switch to windowed mode at runtime, the video layer becomes visible. The issue does not happen if the application starts in windowed mode: there are no problems with video visibility, and I can resize, toggle fullscreen, and switch back without issue. There are also no problems with video playback on macOS 11.

I provide the simplest application source where one can reproduce the problem: git clone https://github.com/s-petrovskiy/apple-test and follow the preparation steps in README.md.

Steps:
1) Run the program;
2) Switch to fullscreen (by pressing F);
3) Launch video playback (by pressing P);
4) Observe the problem: the video layer is not visible while audio plays;
5) Switch back to windowed mode (by pressing F);
6) Observe: the video becomes visible.

Alternative:
1) Run the program;
2) Without switching to fullscreen, launch video playback (by pressing P);
3) Observe: the video is visible;
4) Toggle fullscreen (by pressing F) or resize the window manually;
5) Observe: everything works fine.

The simplest playback is roughly implemented as follows, starting from a root NSView:

```objc
// Create the player view as a layer-backed subview of the root view.
NSView *playerView = [[NSView alloc] init];
[playerView setWantsLayer:YES];
[playerView setFrame:[root bounds]];
[root addSubview:playerView];

// Create the player and its layer.
AVPlayer *player = [[AVPlayer alloc] init];
AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
[playerLayer setAutoresizingMask:kCALayerWidthSizable | kCALayerHeightSizable];
[playerLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
[playerLayer setFrame:[playerView bounds]];

// Add the player layer to the backing layer created by wantsLayer:YES.
[playerView.layer addSublayer:playerLayer];

// Attach a playable asset and start playback.
[player replaceCurrentItemWithPlayerItem:[AVPlayerItem playerItemWithURL:assetURL]];
[player play];
```

I have no problems with video playback on macOS 11 Big Sur and earlier. Am I missing something in my code, or might it be a macOS 12 issue? I have read the macOS 12 release notes but haven't found any API or behavior changes that may be related to my question. Any help is highly appreciated. Best regards.
Posted
by
Post not yet marked as solved
2 Replies
904 Views
Hello all, I have an app that plays an MP3 audio URL, with two image buttons: play and stop. Now I would like to improve it a little. I have two .png images (play.png and pause.png), and I would like them to swap with each tap, depending on whether the stream is on or off. Any ideas how to do that? Here is my code:

```swift
import UIKit
import AVKit
import MediaPlayer

class ViewController: UIViewController, AVAudioPlayerDelegate {
    var player: AVPlayer!
    var dict = NSDictionary()

    @IBAction func playButtonPressed(_ sender: UIButton) {
        let url = "https://stream.com/radio.mp3"
        do {
            try AVAudioSession.sharedInstance().setCategory(.playback, mode: .default, options: [.mixWithOthers, .allowAirPlay])
            print("Playback OK")
            try AVAudioSession.sharedInstance().setActive(true)
            print("Session is Active")
        } catch {
            print(error)
        }
        player = AVPlayer(url: URL(string: url)!)
        player.volume = 1.0
        player.play()
    }

    @IBAction func stopButtonStopped(sender: UIButton) {
        player.pause()
    }
}
```
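One possible direction (a sketch, not from the post): use a single toggle button and swap its image based on the player's timeControlStatus. It assumes the ViewController class and player.png/pause.png assets from the question:

```swift
import UIKit
import AVFoundation

// Sketch: a single toggle action whose button image reflects playback state.
// Assumes `player` is the AVPlayer property from the code above and that
// "play.png" / "pause.png" exist in the app bundle.
extension ViewController {
    @IBAction func togglePlayback(_ sender: UIButton) {
        if player.timeControlStatus == .playing {
            player.pause()
            sender.setImage(UIImage(named: "play.png"), for: .normal)
        } else {
            player.play()
            sender.setImage(UIImage(named: "pause.png"), for: .normal)
        }
    }
}
```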
Posted
by
Post not yet marked as solved
1 Reply
696 Views
Hello, I've implemented two functions in my view controller (setupRemoteTransportControls() and setupNowPlaying()) and added one function to the AppDelegate, but I'm still unable to see my app's background audio controls on the lock screen, and the audio interruption function isn't working either. This is a live stream from a URL, as you can see in the code. In the general settings I have enabled background playing. What I would like to do is show the artist, title, and album art in the Remote Command Center, but I was stuck just displaying the command center. I attach a link to my code on GitHub, because it is too long to paste here: https://github.com/pawelzet/promil_new/blob/main/ViewController.swift

Here is the AppDelegate function that I've added:

```swift
func application(_ application: UIApplication, didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
    application.beginReceivingRemoteControlEvents()
    // Override point for customization after application launch.
    return true
}
```
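For context, a minimal setupRemoteTransportControls() typically wires MPRemoteCommandCenter handlers to the player. A sketch (assuming a `player: AVPlayer` property; this is not taken from the linked repository):

```swift
import MediaPlayer

func setupRemoteTransportControls() {
    let commandCenter = MPRemoteCommandCenter.shared()

    // Enable the lock-screen play command and forward it to the player.
    commandCenter.playCommand.addTarget { [unowned self] _ in
        guard self.player.rate == 0 else { return .commandFailed }
        self.player.play()
        return .success
    }

    // Enable the lock-screen pause command.
    commandCenter.pauseCommand.addTarget { [unowned self] _ in
        guard self.player.rate != 0 else { return .commandFailed }
        self.player.pause()
        return .success
    }
}
```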
Posted
by
Post not yet marked as solved
0 Replies
302 Views
Hello, I am building an app that processes video at a higher frame rate (120-160 fps). However, I would also like to combine this with LiDAR for more accurate depth data. It seems that ARKit runs at a maximum of 60 fps. Is it possible to access LiDAR depth data and high-frame-rate video at the same time? I do not need depth data at 120 fps, just every once in a while, to get more accurate measurements.
Posted
by
Post not yet marked as solved
1 Reply
556 Views
I'm parsing some data from my API, and I would like to show the artist and title in MPMediaItemPropertyTitle, along with the album art. Right now I can print a static string, as you can see in the code below, but I would like to use the API data, as I'm already doing for the labels. Thank you in advance for your help. Here is my code:

```swift
import UIKit
import AVKit
import MediaPlayer

class ViewController: UIViewController, AVAudioPlayerDelegate {
    var player: AVPlayer!
    var dict = NSDictionary()

    @IBOutlet weak var artist: UILabel!
    @IBOutlet weak var songtitle: UILabel!
    @IBOutlet weak var artUrl: UIImageView!

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view.
        overrideUserInterfaceStyle = .light
        setupRemoteTransportControls()
        requestNowPlaying()
        setupNowPlaying()
        addInterruptionsObserver()
    }

    // [...] parsing section [...]

    DispatchQueue.main.async {
        self.songtitle.text = radio.nowPlaying.song.title
        self.artist.text = radio.nowPlaying.song.artist
        self.playlist.text = radio.nowPlaying.playlist

        // Album cover art section
        if let artUrl = URL(string: radio.nowPlaying.song.art), artUrl != self.songArtUrl {
            // Loading image from `artUrl`
            let imageDatatask = session.dataTask(with: artUrl) { imageData, imageResponse, imageError in
                if let imageError = imageError {
                    print(imageError)
                    return
                }
                guard let imageData = imageData else {
                    print("image_data is nil")
                    return
                }
                DispatchQueue.main.async {
                    self.songArtUrl = artUrl
                    let albumArt = UIImage(data: imageData)
                    self.artUrl.image = albumArt
                }
            }
            imageDatatask.resume()
        }
    }

    // [...] code adding remote controls [...]

    func setupNowPlaying() {
        // Define Now Playing info.
        var nowPlayingInfo = [String: Any]()
        nowPlayingInfo[MPMediaItemPropertyTitle] = "Here I would like to print artist + title"
        nowPlayingInfo[MPMediaItemPropertyArtist] = "My name as string - nothing to change"
        if let image = UIImage(named: "Deault_albumart") { // Here I would like to add the image from the API
            nowPlayingInfo[MPMediaItemPropertyArtwork] = MPMediaItemArtwork(boundsSize: image.size) { size in
                return image
            }
        }
        nowPlayingInfo[MPNowPlayingInfoPropertyIsLiveStream] = true
        MPNowPlayingInfoCenter.default().nowPlayingInfo = nowPlayingInfo
    }
}
```
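One possible direction (a sketch, not an official answer): pass the parsed values and the downloaded artwork into the now-playing update instead of hard-coding strings. The parameter names are illustrative; the values would come from the parsing code in the question:

```swift
import MediaPlayer
import UIKit

// Sketch: update Now Playing info from parsed API data.
// `title`, `artist`, and `albumArt` would come from the API parsing above.
func updateNowPlaying(title: String, artist: String, albumArt: UIImage?) {
    var nowPlayingInfo = [String: Any]()
    nowPlayingInfo[MPMediaItemPropertyTitle] = "\(artist) - \(title)"
    nowPlayingInfo[MPMediaItemPropertyArtist] = artist
    if let image = albumArt {
        nowPlayingInfo[MPMediaItemPropertyArtwork] = MPMediaItemArtwork(boundsSize: image.size) { _ in image }
    }
    nowPlayingInfo[MPNowPlayingInfoPropertyIsLiveStream] = true
    MPNowPlayingInfoCenter.default().nowPlayingInfo = nowPlayingInfo
}
```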
Posted
by
Post not yet marked as solved
2 Replies
356 Views
Hi. I'm trying to play a stream in my AVPlayer, but I get this error:

```
NSConcreteNotification 0x281b850b0 {name = AVPlayerItemFailedToPlayToEndTimeNotification;
object = <AVPlayerItem: 0x2815aa060, asset = <AVURLAsset: 0x2816d4640, URL = URLSTREAMING>>;
userInfo = {
    AVPlayerItemFailedToPlayToEndTimeErrorKey = "Error Domain=AVFoundationErrorDomain Code=-11800
    \"No se ha podido completar la operaci\U00f3n\"
    UserInfo={NSLocalizedFailureReason=Se ha producido un error desconocido (-12976),
    NSLocalizedDescription=No se ha podido completar la operaci\U00f3n,
    NSUnderlyingError=0x281b85770 {Error Domain=NSOSStatusErrorDomain Code=-12976 "(null)"}}";
}}
```

(The Spanish messages translate to "The operation could not be completed" and "An unknown error occurred (-12976)".) I know it is related to the metadata of the signal, but I don't know the cause of this error. Could somebody help me? Thanks, greetings.
Posted
by
Post not yet marked as solved
1 Reply
473 Views
How can I play video using AVPlayer? I have retrieved the file URL, i.e. file:///Users/admin/Library/Developer/CoreSimulator/Devices/718B08F8-D4DD-44E6-9DFA-0E81D5EDA78C/data/Containers/Shared/AppGroup/D82C51F4-E1B2-4390-9885-296A185ACF16/File%20Provider%20Storage/photospicker/version=1&uuid=BCC39930-E835-4BBE-A6F1-716B21CA10A0&mode=compatible.mov — how do I play it?
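One way to play a local file URL like this (a sketch, not from the post) is to hand it to AVPlayer and present an AVPlayerViewController; `fileURL` and `presenter` stand in for the URL and view controller from the question:

```swift
import AVKit
import UIKit

// Sketch: play a local file URL (such as one returned by the photo picker)
// with AVPlayerViewController.
func playVideo(at fileURL: URL, from presenter: UIViewController) {
    let player = AVPlayer(url: fileURL)
    let controller = AVPlayerViewController()
    controller.player = player
    presenter.present(controller, animated: true) {
        player.play()
    }
}
```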
Posted
by
Post not yet marked as solved
2 Replies
511 Views
I have a music app that can play in the background, using AVQueuePlayer. I'm in the process of adding support for CloudKit sync of the CoreData store, switching from NSPersistentContainer to NSPersistentCloudKitContainer. The initial sync can be fairly large (10,000+ records), depending on how much the user has used the app. The issue I'm seeing is this: ✅ When the app is in the foreground, CloudKit sync uses a lot of CPU, nearly 100% for a long time (this is expected during the initial sync). ✅ If I AM NOT playing music, when I put the app in the background, CloudKit sync eventually stops syncing until I bring the app to the foreground again (this is also expected). ❌ If I AM playing music, when I put the app in the background, CloudKit never stops syncing, which leads the system to terminate the app after a certain amount of time due to high CPU usage. Is there any way to pause the CloudKit sync when the app is in the background or is there any way to mitigate this?
Posted
by
Post not yet marked as solved
0 Replies
288 Views
I'm trying to programmatically take a screenshot of a view controller that has an AVPlayerViewController. The problem is that, when taking the screenshot on the simulator, the video player appears, but on a real device the video player appears blank. Here's the relevant code:

```objc
UIGraphicsBeginImageContextWithOptions(imageSize, NO, 0.0f);
CGContextRef context = UIGraphicsGetCurrentContext();
[window drawViewHierarchyInRect:windowFrame afterScreenUpdates:afterScreenUpdates];
UIImage *screenShot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
```
Posted
by
Post not yet marked as solved
0 Replies
318 Views
I am building a macOS app with SwiftUI. I would like the window to resize according to the media's aspect ratio, like VLC does. But I could not find any function to control the size of the window at runtime. The frame modifier either produces a fixed-size window or does not work at all (maxHeight/maxWidth).
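One workaround (a sketch, not from the post): reach the underlying NSWindow from AppKit and resize it when the media's aspect ratio is known. It assumes a single-window app; `aspectRatio` would come from the loaded asset:

```swift
import SwiftUI
import AppKit

// Sketch: resize the hosting NSWindow to match a media aspect ratio,
// keeping the current width and adjusting the height.
func resizeWindow(toAspectRatio aspectRatio: CGFloat) {
    guard let window = NSApplication.shared.windows.first, aspectRatio > 0 else { return }
    let width = window.frame.width
    let newSize = NSSize(width: width, height: width / aspectRatio)
    window.setContentSize(newSize)
    // Keep the window at this ratio during manual resizing as well.
    window.contentAspectRatio = newSize
}
```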
Posted
by
Post not yet marked as solved
2 Replies
510 Views
Hi, I have an app attempting to do some video playback and editing, but I'm having an issue where, with some videos, operations like AVQueuePlayer/AVPlayerItem.seek cause all players to move to .failed and just show black. The only indication the OS gives the app that this has happened is some logs relating to the haptic engine:

```
[hcln] AVHapticClient.mm:1309 -[AVHapticClient handleServerConnectionInterruption]: [xpc] Entered (due to connection interruption) for client ID 0x100031e
[hapi] CHHapticEngine.mm:614 -[CHHapticEngine finishInit:]_block_invoke: ERROR: Server connection broke with error 'The operation couldn't be completed. (com.apple.CoreHaptics error -4811.)
```

In the device console, the most likely culprit seems to be this (reported by the kernel):

```
EXC_RESOURCE -> mediaserverd[768] exceeded mem limit: ActiveSoft 2500 MB (non-fatal)
```

If my assumption is correct, is it possible to mitigate this issue before mediaserverd and all the associated players get killed? My understanding is that AV memory is separate from the app's memory, so responding to the usual didReceiveMemoryWarningNotification isn't applicable in this case, and that notification doesn't seem to be sent before these failures. Thanks
Posted
by
Post not yet marked as solved
1 Reply
354 Views
RealityKit ARImageAnchor with VideoMaterial problems. When I move the camera closer, sometimes the image from ARResources overlaps the video. What could be the problem?

Links:
https://www.dropbox.com/s/b8yaczq4xjk9v1p/IMG_9429.PNG?dl=0
https://www.dropbox.com/s/59dj4ldf6l3yj4u/RPReplay_Final1637392988.mov?dl=0

VideoEntity class:

```swift
final class VideoEntity {
    var videoPlayer = AVPlayer()

    func videoModelEntity(width: Float?, height: Float?) -> ModelEntity {
        let plane = MeshResource.generatePlane(width: width ?? Float(), height: height ?? Float())
        let videoItem = createVideoItem(with: "Cooperation")
        let videoMaterial = createVideoMaterial(with: videoItem)
        return ModelEntity(mesh: plane, materials: [videoMaterial])
    }

    func placeVideoScreen(videoEntity: ModelEntity, imageAnchor: ARImageAnchor, arView: ARView) {
        let anchorEntity = AnchorEntity(anchor: imageAnchor)
        let rotationAngle = simd_quatf(angle: GLKMathDegreesToRadians(-90), axis: SIMD3<Float>(x: 1, y: 0, z: 0))
        videoEntity.setOrientation(rotationAngle, relativeTo: anchorEntity)
        videoEntity.setPosition(SIMD3<Float>(x: 0, y: 0.015, z: 0), relativeTo: anchorEntity)
        anchorEntity.addChild(videoEntity)
        arView.scene.addAnchor(anchorEntity)
    }

    private func createVideoItem(with filename: String) -> AVPlayerItem {
        guard let url = Bundle.main.url(forResource: filename, withExtension: "mov") else {
            fatalError("Fatal Error: - No file source.")
        }
        return AVPlayerItem(url: url)
    }

    private func createVideoMaterial(with videoItem: AVPlayerItem) -> VideoMaterial {
        let videoMaterial = VideoMaterial(avPlayer: videoPlayer)
        videoPlayer.replaceCurrentItem(with: videoItem)
        videoPlayer.actionAtItemEnd = .none
        videoPlayer.play()
        NotificationCenter.default.addObserver(self, selector: #selector(loopVideo),
                                               name: NSNotification.Name.AVPlayerItemDidPlayToEndTime,
                                               object: videoPlayer.currentItem)
        return videoMaterial
    }

    @objc
    private func loopVideo(notification: Notification) {
        guard let playerItem = notification.object as? AVPlayerItem else { return }
        playerItem.seek(to: CMTime.zero, completionHandler: nil)
        videoPlayer.play()
    }
}
```

ViewModel class:

```swift
func startImageTracking(arView: ARView) {
    guard let arReferenceImage = ARReferenceImage.referenceImages(inGroupNamed: "ARResources", bundle: Bundle.main) else { return }
    let configuration = ARImageTrackingConfiguration().do {
        $0.trackingImages = arReferenceImage
        $0.maximumNumberOfTrackedImages = 1
    }
    let personSegmentation: ARWorldTrackingConfiguration.FrameSemantics = .personSegmentationWithDepth
    if ARWorldTrackingConfiguration.supportsFrameSemantics(personSegmentation) {
        configuration.frameSemantics.insert(personSegmentation)
    }
    arView.session.run(configuration, options: [.resetTracking])
}
```

ARSessionDelegate protocol:

```swift
func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
    for anchor in anchors {
        if let imageAnchor = anchor as? ARImageAnchor {
            let videoEntity = viewModel.videoEntity.videoModelEntity(width: Float(imageAnchor.referenceImage.physicalSize.width),
                                                                     height: Float(imageAnchor.referenceImage.physicalSize.height))
            viewModel.videoEntity.placeVideoScreen(videoEntity: videoEntity, imageAnchor: imageAnchor, arView: arView)
        }
    }
}
```
Posted
by
Post not yet marked as solved
0 Replies
354 Views
My app crashes at first launch after updating the OS to 15.1; after the crash it works fine. In earlier versions such as 15.0 it worked fine. While debugging, I found the app gets stuck on first launch in the audio/video permission code below:

```objc
if ([AVCaptureDevice respondsToSelector:@selector(requestAccessForMediaType:completionHandler:)]) {
    [AVCaptureDevice requestAccessForMediaType:AVMediaTypeAudio completionHandler:^(BOOL granted) {
        if (granted) {
            dispatch_async(dispatch_get_main_queue(), ^{
            });
        } else {
        }
    }];
} else {
}
```
Posted
by
Post not yet marked as solved
0 Replies
355 Views
I want to make a custom UI that integrates each video feed from my FaceTime group activity participants. I found that the app Share+ in the App Store integrates the video from each FaceTime participant into its own UI, so I know it's possible. Can anyone point me to the relevant documentation that shows how I can get the video of each FaceTime group member to put into my own AVSampleBufferDisplayLayer or similar?
Posted
by
Post not yet marked as solved
0 Replies
296 Views
Hello everyone, I'm seeing a weird crash on Bugsnag. It concerns the player on tvOS and happens when I'm exiting the player, and it involves two system classes. Can someone help me understand what's going on here?

```
Unable to activate constraint with anchors <NSLayoutXAxisAnchor:0x2831d5480 "AVFocusProxyView:0x1224b3370.left"> and <NSLayoutXAxisAnchor:0x28356cf40 "AVPlayerLayerView:0x1224bf9b0.left"> because they have no common ancestor. Does the constraint or its anchors reference items in different view hierarchies? That's illegal.
```
Posted
by
Post not yet marked as solved
2 Replies
377 Views
I'm trying to enforce a duration limit for picked videos. It doesn't appear that there's a configuration option in PHPickerConfiguration unless I missed it. I'm able to get the duration from the URL returned by loadFileRepresentation but it appears that this loads the entire file which somewhat defeats the point of limiting video size and can take a long time. It also looks like I can use the local assetIdentifier but this requires initializing PHPickerConfiguration with a PHPhotoLibrary which requires asking the user for permission and complicates the whole flow when I just want them to pick a single file. Is there a way to take advantage of PHPickerViewController and let the user choose a video and enforce a video limit without loading the entire file / requiring photo library permissions? Thanks!
Posted
by
Post not yet marked as solved
0 Replies
279 Views
I know that if you want background audio from AVPlayer you need to detach your AVPlayer from either your AVPlayerViewController or your AVPlayerLayer, in addition to having your AVAudioSession configured correctly. I have that all squared away, and background audio is fine until we introduce AVPictureInPictureController or use the PiP behavior baked into AVPlayerViewController. If you want PiP to behave as expected when you put your app into the background by switching to another app or going to the Home Screen, you can't perform the detachment operation, otherwise the PiP display fails. On an iPad, if PiP is active and you lock your device, you continue to get background audio playback. However, on an iPhone, if PiP is active and you lock the device, the audio pauses. If PiP is inactive and you lock the device, the audio pauses and you have to manually tap play on the lock-screen controls; this behavior is the same on iPad and iPhone. My questions are: Is there a way to keep background-audio playback going when PiP is inactive and the device is locked (iPhone and iPad)? Is there a way to keep background-audio playback going when PiP is active and the device is locked (iPhone)?
Posted
by
Post not yet marked as solved
0 Replies
206 Views
Hi, I'm trying to add custom actions to AVPlayerViewController on iOS. I was able to use transportBarCustomMenuItems on tvOS, but I can't find an iOS equivalent. In Apple's TV app for iOS they use custom menus like this.
Posted
by