Integrate video and other forms of moving visual media into your apps.

Posts under Video tag

82 Posts
Sort by: Post | Replies | Boosts | Views | Activity

How to change the videoGravity of VideoPlayer in SwiftUI
I have the new iOS 14 VideoPlayer:

```swift
private let player = AVPlayer(url: Bundle.main.url(forResource: "TL20_06_Shoba3_4k", withExtension: "mp4")!)

var body: some View {
    VideoPlayer(player: player)
        .aspectRatio(contentMode: .fill)
    ...
}
```

This player setup cannot display 4:3 video on the 16:9 screen of a TV without black bars. The aspectRatio modifier does not work on VideoPlayer. How can I set the videoGravity of the underlying AVPlayerLayer to resizeAspectFill via the SwiftUI API?
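A minimal sketch of a common workaround, not taken from the post: wrap an AVPlayerLayer in a UIViewRepresentable so videoGravity can be set directly. The type names (PlayerLayerView, FillPlayerView) are illustrative, not an Apple API.

```swift
import SwiftUI
import UIKit
import AVFoundation

// A UIView whose backing layer is an AVPlayerLayer.
final class PlayerLayerView: UIView {
    override static var layerClass: AnyClass { AVPlayerLayer.self }
    var playerLayer: AVPlayerLayer { layer as! AVPlayerLayer }
}

// SwiftUI wrapper that exposes videoGravity, which VideoPlayer does not.
struct FillPlayerView: UIViewRepresentable {
    let player: AVPlayer

    func makeUIView(context: Context) -> PlayerLayerView {
        let view = PlayerLayerView()
        view.playerLayer.player = player
        view.playerLayer.videoGravity = .resizeAspectFill  // crop to fill, no black bars
        return view
    }

    func updateUIView(_ uiView: PlayerLayerView, context: Context) {}
}
```

Used in place of `VideoPlayer(player: player)`, this fills the screen at the cost of cropping a 4:3 picture on a 16:9 display; the trade-off is inherent to resizeAspectFill.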
6
0
5.3k
Aug ’23
Video Quality selection in HLS streams
Hello there, our team was asked to add the ability to manually select the video quality. I know that HLS is an adaptive stream and that, depending on network conditions, it chooses the best quality that fits the current situation. I tried some settings with preferredMaximumResolution and preferredPeakBitRate, but neither worked once the user was already watching the stream. I also tried replacing the current player item with a new configuration, but this only allowed me to downgrade the quality; when I tried to force, for example, 4K, it did not switch to that track even with very high values for both parameters mentioned above. My question is whether there is any method that would force a certain quality from the manifest file. I already have an extraction step that parses the manifest and provides all the available information, but I still couldn't figure out how to make the player play a specific stream with my desired quality from the available playlist.
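There is no public API to pin AVPlayer to an exact HLS variant. A hedged sketch of a common approximation (assuming iOS 15+, where AVAssetVariant is available): inspect the variants of the multivariant playlist and clamp preferredPeakBitRate just above the chosen variant's peak bit rate, so the player cannot climb higher. The 1080p filter and 5% headroom below are assumptions for illustration.

```swift
import AVFoundation

// Cap the player item at the best variant no taller than maxHeight.
func applyQualityCap(to item: AVPlayerItem, asset: AVURLAsset, maxHeight: Double) async throws {
    let variants = try await asset.load(.variants)
    // Pick the highest-bandwidth variant at or below the requested height.
    let target = variants
        .filter { ($0.videoAttributes?.presentationSize.height ?? 0) <= maxHeight }
        .max { ($0.peakBitRate ?? 0) < ($1.peakBitRate ?? 0) }
    if let peak = target?.peakBitRate {
        // Slight headroom so the player can actually select this variant.
        item.preferredPeakBitRate = peak * 1.05
    }
}
```

This caps quality reliably; forcing an *upgrade* beyond what the ABR logic considers sustainable is still up to the player, which matches the behaviour described in the post.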
4
0
6.8k
Oct ’23
tvOS: How to avoid fast-forwarding in AVPlayerViewController
Due to legal restrictions I need to prevent my app's users from skipping and fast-forwarding the content played by AVPlayerViewController. I use the playerViewController(_:willResumePlaybackAfterUserNavigatedFrom:to:) and playerViewController(_:timeToSeekAfterUserNavigatedFrom:to:) delegate methods to control the skipping behaviour. However, those delegate methods are only triggered for skip +/- 10, not for fast-forwarding/rewinding. Is there a way to prevent fast-forwarding in addition to skipping in AVPlayerViewController? Here is an example of the code I use:

```swift
class ViewController: UIViewController {
    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        setUpPlayerViewController()
    }

    private func setUpPlayerViewController() {
        let playerViewController = AVPlayerViewController()
        playerViewController.delegate = self
        guard let url = URL(string: "https://devstreaming-cdn.apple.com/videos/streaming/examples/img_bipbop_adv_example_ts/master.m3u8") else {
            debugPrint("URL is not found")
            return
        }
        let playerItem = AVPlayerItem(url: url)
        let player = AVPlayer(playerItem: playerItem)
        playerViewController.player = player
        present(playerViewController, animated: true) {
            playerViewController.player?.play()
        }
    }
}

extension ViewController: AVPlayerViewControllerDelegate {
    public func playerViewController(_ playerViewController: AVPlayerViewController, willResumePlaybackAfterUserNavigatedFrom oldTime: CMTime, to targetTime: CMTime) {
        // Triggered on skip +/- 10, but not on fast-forwarding/rewinding
        print("playerViewController(_:willResumePlaybackAfterUserNavigatedFrom:to:)")
    }

    public func playerViewController(_ playerViewController: AVPlayerViewController, timeToSeekAfterUserNavigatedFrom oldTime: CMTime, to targetTime: CMTime) -> CMTime {
        // Triggered on skip +/- 10, but not on fast-forwarding/rewinding
        print("playerViewController(_:timeToSeekAfterUserNavigatedFrom:to:)")
        return targetTime
    }
}
```
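A sketch of a simpler route than the seek delegates, assuming the goal is to block all seeking rather than filter individual seeks: AVPlayerViewController's requiresLinearPlayback property (iOS/tvOS 11+) disables scrubbing, skipping, and fast-forwarding in the transport UI entirely.

```swift
import AVKit

// Build a player controller whose transport controls forbid any seeking.
func makeLinearPlayerController(player: AVPlayer) -> AVPlayerViewController {
    let controller = AVPlayerViewController()
    controller.player = player
    controller.requiresLinearPlayback = true  // no skip, scrub, or fast-forward
    return controller
}
```

The delegate-based approach remains useful when some seeks must be allowed (e.g. rewinding but not forwarding); requiresLinearPlayback is all-or-nothing.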
2
1
1.2k
Sep ’23
AVPlayer, AVAssetReader, AVAssetWriter - Using MXF files
Hi, I would like to read .mxf files using AVPlayer and also AVAssetReader, and to write out .mxf files using AVAssetWriter. Should this be possible? Are there any examples of how to do it? I found the VTRegisterProfessionalVideoWorkflowVideoDecoders() call, but it did not seem to help. I would be grateful for any suggestions. Regards, Tom
1
0
1.6k
Aug ’23
QuickTime interactivity?
Is it still possible to author a QuickTime movie with hyperlinks? I'm building a website, and I know at one point you could author a QuickTime movie that supported links inside the video, either to other timestamps in the video or to other web pages. I don't want to use a custom player; I'd prefer to use the system-level one. I've seen a really amazing example of this on the mobile version of the Memory Alpha (Star Trek nerds!) website: a movie that plays at the top of pages and is fully interactive. Is that still supported? Is it possible to author that way? I'm not making anything insanely complicated; I just thought it would be a nice way to build a website with tools I'm more comfortable working in.
1
0
1.1k
Aug ’23
Video's sound becomes very low, then returns to normal, when recording with AVCaptureSession
Hi everyone, in my app I use AVCaptureSession to record video. I add a videoDeviceInput and an audioDeviceInput, and as output I use AVCaptureMovieFileOutput. On some iPhones, especially models after the iPhone X (iPhone 11, 12, 13, 14), the result has bad audio quality: the sound is very low and then, after a few seconds (around 7), becomes normal. I have tried setting the audio settings on the movie file output, but it still happens. Does anyone know how to solve this issue?
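A hedged sketch of one thing worth trying, not a confirmed fix: "quiet for a few seconds, then normal" audio on newer iPhones is sometimes tied to the audio session configuration the capture session inherits. Configuring AVAudioSession explicitly for video recording before starting the AVCaptureSession rules that variable out.

```swift
import AVFoundation

// Configure the shared audio session for video recording before
// calling session.startRunning() on the AVCaptureSession.
func configureAudioSessionForRecording() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord,
                            mode: .videoRecording,
                            options: [.allowBluetooth, .defaultToSpeaker])
    try session.setActive(true)
}
```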
0
0
384
Jul ’23
Seeking to specific frame index with AVAssetReader
We're using code based on AVAssetReader to get decoded video frames through AVFoundation. The decoding per se works great, but seeking just doesn't work reliably. For a given H.264 file (in a MOV container) the decoded frames sometimes have presentation time stamps that don't correspond to the actual decoded frames. For example: the decoded frame's PTS is 2002/24000, but the frame's actual PTS is 6006/24000. The frames have burnt-in timecode, so we can clearly tell. Here is our code:

```objc
- (BOOL)setupAssetReaderForFrameIndex:(int32_t)frameIndex
{
    NSError* theError = nil;
    NSDictionary* assetOptions = @{ AVURLAssetPreferPreciseDurationAndTimingKey: @YES };
    self.movieAsset = [[AVURLAsset alloc] initWithURL:self.filePath options:assetOptions];
    if (self.assetReader)
        [self.assetReader cancelReading];
    self.assetReader = [AVAssetReader assetReaderWithAsset:self.movieAsset error:&theError];

    NSArray<AVAssetTrack*>* videoTracks = [self.movieAsset tracksWithMediaType:AVMediaTypeVideo];
    if ([videoTracks count] == 0)
        return NO;
    self.videoTrack = [videoTracks objectAtIndex:0];
    [self retrieveMetadata];

    NSDictionary* outputSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey: @(self.cvPixelFormat) };
    self.videoTrackOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:self.videoTrack
                                                                       outputSettings:outputSettings];
    self.videoTrackOutput.alwaysCopiesSampleData = NO;
    [self.assetReader addOutput:self.videoTrackOutput];

    CMTimeScale timeScale = self.videoTrack.naturalTimeScale;
    CMTimeValue frameDuration = (CMTimeValue)round((float)timeScale / self.videoTrack.nominalFrameRate);
    CMTimeValue startTimeValue = (CMTimeValue)frameIndex * frameDuration;
    CMTimeRange timeRange = CMTimeRangeMake(CMTimeMake(startTimeValue, timeScale), kCMTimePositiveInfinity);
    self.assetReader.timeRange = timeRange;
    [self.assetReader startReading];
    return YES;
}
```

This is then followed by this code to actually decode the frame:

```objc
CMSampleBufferRef sampleBuffer = [self.videoTrackOutput copyNextSampleBuffer];
CVPixelBufferRef imageBuffer = sampleBuffer ? CMSampleBufferGetImageBuffer(sampleBuffer) : NULL;
if (!imageBuffer) {
    if (sampleBuffer)
        CMSampleBufferInvalidate(sampleBuffer);
    AVAssetReaderStatus theStatus = self.assetReader.status;
    NSError* theError = self.assetReader.error;
    NSLog(@"[AVAssetReaderTrackOutput copyNextSampleBuffer] didn't deliver a frame - %@", theError);
    return false;
}
```

Is this method by itself the correct way of seeking, and if not: what is the correct way? Thanks!
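A sketch of a more reliable pattern, under the assumption that the file has a constant frame rate: AVAssetReader's timeRange effectively starts decoding at a sync (key) frame, so for H.264 the first buffers delivered can precede the requested time. The usual fix is to keep the requested start time in the range but discard decoded buffers whose PTS is earlier than the target.

```swift
import AVFoundation
import CoreMedia

// Read forward from the reader's output until we reach the first frame
// whose presentation time stamp is at or after the requested time.
func copyFrame(at target: CMTime, from output: AVAssetReaderTrackOutput) -> CMSampleBuffer? {
    while let buffer = output.copyNextSampleBuffer() {
        let pts = CMSampleBufferGetPresentationTimeStamp(buffer)
        if CMTimeCompare(pts, target) >= 0 {
            return buffer  // first frame at or after the requested time
        }
        // Frame precedes the target (decoded from an earlier sync frame); skip it.
    }
    return nil
}
```

This trades a few extra decoded-and-discarded frames per seek for frame-accurate results, which matches the burnt-in-timecode mismatch described above.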
0
0
585
Jul ’23
Progressively supply media data
I'm trying to use the resourceLoader of an AVAsset to progressively supply media data, but I'm unable to because the delegate asks for the full content (requestsAllDataToEndOfResource = true).

```swift
class ResourceLoader: NSObject, AVAssetResourceLoaderDelegate {
    func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                        shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        if let ci = loadingRequest.contentInformationRequest {
            ci.contentType = // public.mpeg-4
            ci.contentLength = // GBs
            ci.isEntireLengthAvailableOnDemand = false
            ci.isByteRangeAccessSupported = true
        }
        if let dr = loadingRequest.dataRequest {
            if dr.requestedLength > 200_000_000 { // memory pressure
                // dr.requestsAllDataToEndOfResource is true
            }
        }
        return true
    }
}
```

I also tried using a fragmented MP4 created with AVAssetWriter, but that didn't work either. Please let me know if it's possible for the AVAssetResourceLoader to not ask for the full content.
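A minimal sketch of one option (the helper name and chunk size are illustrative): even when requestsAllDataToEndOfResource is true, the delegate is not required to answer in a single buffer. It can feed the request in bounded chunks with respond(with:) and call finishLoading() only after the last chunk, so the whole file never sits in memory at once.

```swift
import AVFoundation

// Serve a loading request incrementally from some backing store.
// chunkProvider returns up to `length` bytes starting at `offset`, or nil at EOF.
func serve(_ loadingRequest: AVAssetResourceLoadingRequest,
           chunkProvider: (Int64, Int) -> Data?) {
    guard let dataRequest = loadingRequest.dataRequest else { return }
    var offset = dataRequest.currentOffset
    let chunkSize = 2 * 1024 * 1024  // 2 MB per chunk (an assumed figure)
    var remaining = Int64(dataRequest.requestedLength) - (offset - dataRequest.requestedOffset)
    while remaining > 0,
          let chunk = chunkProvider(offset, Int(min(Int64(chunkSize), remaining))),
          !chunk.isEmpty {
        dataRequest.respond(with: chunk)  // hand AVFoundation this slice now
        offset += Int64(chunk.count)
        remaining -= Int64(chunk.count)
    }
    loadingRequest.finishLoading()
}
```

Whether the player issues byte-range requests at all also depends on the content information (isByteRangeAccessSupported) and the container; a properly fragmented MP4 generally helps here.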
1
0
645
Aug ’23
Vision Pro Safari WebXR video playback not working
Safari for visionOS (or spatial computing) supports WebXR, as reported here. I am developing a web app that intends to leverage WebXR, so I've tested several code samples in the Safari browser of the Vision Pro simulator to understand the level of support for immersive web content. I am currently facing an issue that looks like a bug: video playback stops working when entering an XR session (i.e. going into VR mode) in a 3D web environment (using Three.js or similar). An example from the Immersive Web Community Group called Stereo Video (https://immersive-web.github.io/webxr-samples/stereo-video.html) lets you easily replicate the issue; the code is available here. It's worth mentioning that video playback has been successfully tested on other VR platforms such as the Meta Quest 2. The issue has been reported in the following forums: https://discourse.threejs.org/t/videotexture-playback-html5-videoelement-apple-vision-pro-simulator-in-vr-mode-not-playing/53374 https://bugs.webkit.org/show_bug.cgi?id=260259
5
1
2.1k
Mar ’24
Black frames in resulting AVComposition.
I have multiple AVAssets that I am trying to merge into a single video track using AVComposition. I iterate over my assets and insert them into a single AVMutableCompositionTrack like so:

```objc
- (AVAsset *)combineAssets
{
    // Create a mutable composition
    AVMutableComposition *composition = [AVMutableComposition composition];
    AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                                preferredTrackID:kCMPersistentTrackID_Invalid];
    AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                                preferredTrackID:kCMPersistentTrackID_Invalid];

    // Keep track of time offset
    CMTime currentOffset = kCMTimeZero;
    for (AVAsset *audioAsset in _audioSegments) {
        AVAssetTrack *audioTrack = [[audioAsset tracksWithMediaType:AVMediaTypeAudio] firstObject];
        // Add the audio track to the composition audio track
        NSError *audioError;
        [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioAsset.duration)
                                       ofTrack:audioTrack
                                        atTime:currentOffset
                                         error:&audioError];
        if (audioError) {
            NSLog(@"Error combining audio track: %@", audioError.localizedDescription);
            return nil;
        }
        currentOffset = CMTimeAdd(currentOffset, audioAsset.duration);
    }

    // Reset offset to do the same with videos.
    currentOffset = kCMTimeZero;
    for (AVAsset *videoAsset in _videoSegments) {
        // Get the video track
        AVAssetTrack *videoTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] firstObject];
        // Add the video track to the composition video track
        NSError *videoError;
        [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoTrack.timeRange.duration)
                                       ofTrack:videoTrack
                                        atTime:currentOffset
                                         error:&videoError];
        if (videoError) {
            NSLog(@"Error combining video track: %@", videoError.localizedDescription);
            return nil;
        }
        // Increment current offset by the track duration
        currentOffset = CMTimeAdd(currentOffset, videoTrack.timeRange.duration);
    }
    return composition;
}
```

The issue is that when I export the composition using an AVAssetExportSession, there's a black frame between the merged segments in the track. In other words, if two 30-second AVAssets are merged into the composition track to create a 60-second video, you see a black frame for a split second at the 30-second mark where the two assets join. I don't really want to re-encode the assets; I just want to stitch them together. How can I fix the black-frame issue?
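A sketch of one common fix, under the assumption that the black frame comes from tiny rounding gaps between segments: instead of accumulating asset durations yourself (which can drift from what was actually inserted), advance the insertion point using the composition track's own accumulated time range, so each segment starts exactly where the previous one ends.

```swift
import AVFoundation
import CoreMedia

// Append one asset's video track so it starts exactly at the current
// end of the composition track, leaving no gap for a black frame.
func append(_ asset: AVAsset, to track: AVMutableCompositionTrack) throws {
    guard let source = asset.tracks(withMediaType: .video).first else { return }
    let insertAt = track.timeRange.duration  // exact end of what is inserted so far
    try track.insertTimeRange(CMTimeRange(start: .zero, duration: source.timeRange.duration),
                              of: source,
                              at: insertAt)
}
```

Using the track's duration (rather than summing asset.duration, which may be rounded in a different timescale than the video track) keeps segments contiguous without re-encoding.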
1
0
473
Aug ’23
Timestamps in AVPlayer
I want to show the user the actual start and end dates of the video played on the AVPlayer time slider, instead of the video duration. I would like to show something like 09:00:00 ... 12:00:00 (indicating that the video started at 09:00:00 CET and ended at 12:00:00 CET) instead of 00:00:00 ... 02:59:59. I would appreciate any pointers in this direction.
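A minimal sketch of one approach, assuming the recording's real-world start date is known to the app (AVPlayer itself does not carry it for arbitrary content): map the player's elapsed time onto wall-clock labels with a periodic time observer and a DateFormatter. The time zone and one-second interval are assumptions.

```swift
import AVFoundation
import Foundation

// Call `update` once per second with the wall-clock label for the
// player's current position. Returns the observer token so the caller
// can remove it later with player.removeTimeObserver(_:).
func addClockLabelObserver(to player: AVPlayer,
                           recordingStart: Date,
                           update: @escaping (String) -> Void) -> Any {
    let formatter = DateFormatter()
    formatter.dateFormat = "HH:mm:ss"
    formatter.timeZone = TimeZone(identifier: "Europe/Paris")  // CET, an assumption
    let interval = CMTime(seconds: 1, preferredTimescale: 600)
    return player.addPeriodicTimeObserver(forInterval: interval, queue: .main) { time in
        let wallClock = recordingStart.addingTimeInterval(time.seconds)
        update(formatter.string(from: wallClock))
    }
}
```

For the slider's end label, the same offset can be added to the item's duration. Note this replaces the labels around a custom slider; the built-in AVPlayerViewController transport bar does not expose its time labels for customization.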
1
1
526
Sep ’23
WebRTC zoom/resize issue in Safari 17
I am learning to develop WebRTC apps and have noticed that, starting with Safari and Safari Mobile 17, there is a noticeable zoom distortion when resizing some WebRTC players. This seems to be Safari-specific and only on version 17. What feature change could cause this? Here is an example of Catalina vs. Sonoma. Sorry, I don't have access to any other versions in between at the moment, but I have only seen this issue since updating to Safari 17.
2
4
606
Oct ’23
HomeKit Video Playback
I have two MacBook Pro computers. On one, an M1 Max, HomeKit video playback will not work on my account; it will work under a guest account on the same computer. On the second, a 13" i7, there are no issues at all. I can also view the playback on my iPhone 14 and my iPad Pro. This started when I first began using HomeKit video, well over a year ago and on previous OS versions. There is a HomePod and several Apple TVs around the house, so there is no issue of there not being a HomeKit hub to function with, and as I mentioned, it's only on the M1 that there is a problem. It shows the recordings, but when I click on one to view, it just zooms in on the scene. To confuse things, it will work on occasion, especially if I am calling Apple about it (yes, it's fickle). Everything is up to date; both machines are running the latest Sonoma, and the 13" has never had a problem with the video. It's going to be something so simple that I will do a "DUH", but what's the answer?
0
0
422
Oct ’23
Can't scroll past a video with playsinline attribute inside an iframe [Safari iOS 17.0.3]
I'm loading a page inside an iframe. The page contains a video with the playsinline attribute. The video is completely unresponsive to touch events, meaning if it fills up the screen I can't scroll past it, thus breaking the page. Note the video also has the autoplay, loop and muted attributes, but these did not cause scrolling issues; only playsinline causes the break. Another tester and I are running Safari on an iPhone 14 with iOS 17.0.3, and it's broken for both of us. Two additional testers confirmed there is no issue on iPhones running iOS 16.6. Test case: https://codepen.io/gem0303/pen/qBgEeaG Repro steps: 1. Load in Safari on an iPhone with iOS 17.0.3. 2. Tap and drag on the white background and dummy scrolling text -- works fine. 3. Tap and drag on the cat video -- scrolling is impossible.
4
1
1.3k
Oct ’23
Create Media Player Application guide
I have designed a media player app in which I need to implement live TV, movies, and series, so the URL can be of any type: .ts format for live TV, and .mp4, .mov, etc. I am also going to work with M3U. But AVPlayer does not support all of these URLs, so I'd like some suggestions and solutions: what would be the best practice, and how should I work with all these kinds of URLs?
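A hedged sketch of the usual split: AVPlayer natively handles HLS (.m3u8) and progressive MP4/MOV, while raw MPEG-TS files and M3U lists of arbitrary streams are not supported and typically need either server-side repackaging to HLS or an FFmpeg-based third-party player. The routing below is a simplification by path extension; real content types should be checked by probing, not extension.

```swift
import Foundation

// Coarse routing decision for an incoming media URL.
enum MediaRoute {
    case avPlayerSupported(URL)   // hand directly to AVPlayer
    case needsRepackaging(URL)    // repackage to HLS or use a third-party engine
}

func route(_ url: URL) -> MediaRoute {
    switch url.pathExtension.lowercased() {
    case "m3u8", "mp4", "mov", "m4v":
        return .avPlayerSupported(url)
    default:  // "ts", "m3u", "mkv", ...
        return .needsRepackaging(url)
    }
}
```

For M3U playlists, a common pattern is to parse the playlist yourself and route each entry through a check like this one, rather than handing the .m3u file to AVPlayer.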
0
0
426
Oct ’23