Integrate video and other forms of moving visual media into your apps.

Posts under Video tag

90 Posts
Post not yet marked as solved
6 Replies
4.8k Views
I have the new iOS 14 VideoPlayer:

```swift
private let player = AVPlayer(url: Bundle.main.url(forResource: "TL20_06_Shoba3_4k", withExtension: "mp4")!)

var body: some View {
    VideoPlayer(player: player)
        .aspectRatio(contentMode: .fill)
    ...
```

This setup cannot display a 4:3 video on a TV's 16:9 screen without black bars, and the aspectRatio modifier has no effect on VideoPlayer. How can I set the videoGravity of the underlying AVPlayerLayer to resizeAspectFill through the SwiftUI API?
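One common workaround (a sketch, not an official SwiftUI API for VideoPlayer) is to skip VideoPlayer and host an AVPlayerLayer in a UIViewRepresentable, where videoGravity can be set directly. The FillVideoPlayer type below is a hypothetical name for such a wrapper:

```swift
import SwiftUI
import UIKit
import AVFoundation

// Minimal sketch: a UIView backed by AVPlayerLayer, exposed to SwiftUI.
struct FillVideoPlayer: UIViewRepresentable {
    let player: AVPlayer

    final class PlayerUIView: UIView {
        override class var layerClass: AnyClass { AVPlayerLayer.self }
        var playerLayer: AVPlayerLayer { layer as! AVPlayerLayer }
    }

    func makeUIView(context: Context) -> PlayerUIView {
        let view = PlayerUIView()
        view.playerLayer.player = player
        view.playerLayer.videoGravity = .resizeAspectFill // crop instead of letterboxing
        return view
    }

    func updateUIView(_ uiView: PlayerUIView, context: Context) {
        uiView.playerLayer.player = player
    }
}
```

The trade-off is losing VideoPlayer's built-in transport controls, so this fits best when the video is decorative or you provide your own controls.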
Posted
by
Post marked as solved
4 Replies
6.1k Views
Hello there, our team was asked to add the ability to manually select the video quality. I know that HLS is an adaptive stream and that, depending on network conditions, it chooses the best quality for the current situation. I tried preferredMaximumResolution and preferredPeakBitRate, but neither of them worked once the user was already watching the stream. I also tried replacing the current player item with a newly configured one, but that only allowed me to downgrade the quality. When I wanted to force it to 4K, for example, it did not switch to that track even when I set very high values for both parameters mentioned above. My question is whether there is any method that would let me force a specific quality from the manifest. I already have a parser that extracts all the available variants from the manifest, but I still couldn't figure out how to make the player play a specific stream with my desired quality from the available playlist.
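For reference, a minimal sketch of the bitrate-capping approach mentioned above, with the caveat the post already describes: preferredPeakBitRate can only cap quality, it cannot force the player up to a higher variant. The variantBandwidth value is assumed to come from your own manifest parser.

```swift
import AVFoundation

// Sketch: limit ABR to variants at or below a chosen bandwidth.
func capQuality(of player: AVPlayer, at variantBandwidth: Double) {
    // 0 means "no limit"; any positive value caps the peak bit rate.
    player.currentItem?.preferredPeakBitRate = variantBandwidth
}
```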
Posted
by
Post not yet marked as solved
1 Reply
1.1k Views
Due to legal restrictions I need to prevent my app's users from skipping and fast-forwarding the content played by AVPlayerViewController. I use the playerViewController(_:willResumePlaybackAfterUserNavigatedFrom:to:) and playerViewController(_:timeToSeekAfterUserNavigatedFrom:to:) delegate methods to control the skipping behaviour. However, those delegate methods are only triggered for skip +/- 10, not for fast-forwarding/rewinding. Is there a way to prevent fast-forwarding in addition to skipping in AVPlayerViewController? Here is an example of the code I use:

```swift
import UIKit
import AVKit
import AVFoundation

class ViewController: UIViewController {

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        setUpPlayerViewController()
    }

    private func setUpPlayerViewController() {
        let playerViewController = AVPlayerViewController()
        playerViewController.delegate = self
        guard let url = URL(string: "https://devstreaming-cdn.apple.com/videos/streaming/examples/img_bipbop_adv_example_ts/master.m3u8") else {
            debugPrint("URL is not found")
            return
        }
        let playerItem = AVPlayerItem(url: url)
        let player = AVPlayer(playerItem: playerItem)
        playerViewController.player = player
        present(playerViewController, animated: true) {
            playerViewController.player?.play()
        }
    }
}

extension ViewController: AVPlayerViewControllerDelegate {

    public func playerViewController(_ playerViewController: AVPlayerViewController, willResumePlaybackAfterUserNavigatedFrom oldTime: CMTime, to targetTime: CMTime) {
        // Triggered on skip +/- 10, but not on fast-forwarding/rewinding
        print("playerViewController(_:willResumePlaybackAfterUserNavigatedFrom:to:)")
    }

    public func playerViewController(_ playerViewController: AVPlayerViewController, timeToSeekAfterUserNavigatedFrom oldTime: CMTime, to targetTime: CMTime) -> CMTime {
        // Triggered on skip +/- 10, but not on fast-forwarding/rewinding
        print("playerViewController(_:timeToSeekAfterUserNavigatedFrom:to:)")
        return targetTime
    }
}
```
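If removing the seeking UI entirely is acceptable for the requirement (an assumption about the use case, not something confirmed in this thread), the requiresLinearPlayback property may be worth trying instead of intercepting individual seeks:

```swift
// Sketch: hide scrubbing/skipping controls so the content can only play linearly.
playerViewController.requiresLinearPlayback = true
```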
Posted
by
Post not yet marked as solved
7 Replies
7.1k Views
I am trying to replace GIFs with mp4s on my website. Currently it works great in Chrome and Firefox, but the behavior is odd in Safari.

```html
<video autoplay loop muted playsinline defaultmuted preload="auto">
  <source src="/path/to/video.mp4" type="video/mp4">
</video>
```

The video is an H.264 mp4 with no audio track.

Firefox and Chrome on my MacBook: works as expected (autoplays as if it were a GIF).
iOS Safari without Low Power Mode: works as expected.
iOS Safari with Low Power Mode: autoplays, but a play button overlay appears and disappears when tapped.
macOS Safari: does not autoplay. A play button appears and the video plays if clicked.

I have been following https://developer.apple.com/documentation/webkit/delivering_video_content_for_safari as well as other guides on the internet, and it still isn't working. I'm fairly sure a recent change is responsible, because it used to work in an older version of desktop Safari.
Posted
by
Post not yet marked as solved
1 Reply
1k Views
The tappable description item does not show up.

```swift
let descriptionItem = AVMutableMetadataItem()
descriptionItem.identifier = .commonIdentifierDescription
descriptionItem.value = "test" as NSString
playerItem.externalMetadata = [descriptionItem]
```
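Not a confirmed fix, but Apple's sample code for info-panel metadata also sets an extended language tag on each item; a sketch of the same snippet with that single addition:

```swift
let descriptionItem = AVMutableMetadataItem()
descriptionItem.identifier = .commonIdentifierDescription
descriptionItem.value = "test" as NSString
descriptionItem.extendedLanguageTag = "und" // "undetermined"; items without a language tag may not be displayed
playerItem.externalMetadata = [descriptionItem]
```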
Posted
by
Post not yet marked as solved
1 Reply
1.5k Views
Hi, I would like to read .mxf files using AVPlayer and AVAssetReader, and write out .mxf files using AVAssetWriter. Should this be possible? Are there any examples of how to do this? I found the VTRegisterProfessionalVideoWorkflowVideoDecoders() call, but it did not seem to help. I would be grateful for any suggestions. Regards, Tom
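For reference, a minimal sketch of the register-then-probe flow the post alludes to. The file path is hypothetical, and whether AVFoundation will accept an .mxf container at all is exactly the open question here:

```swift
import AVFoundation
import VideoToolbox

// Register the professional-video decoders before touching the asset.
VTRegisterProfessionalVideoWorkflowVideoDecoders()

let asset = AVURLAsset(url: URL(fileURLWithPath: "/path/to/clip.mxf")) // hypothetical path
Task {
    // Probe whether the container/codec combination is readable at all.
    let playable = try await asset.load(.isPlayable)
    print("isPlayable:", playable)
}
```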
Posted
by
Post not yet marked as solved
4 Replies
1.9k Views
Hi, I'm trying to add a video to my first iOS app. From the tutorials I've read online, this seemed to be a simple process: create an AVPlayer, give it a URL to the video file, and use onAppear to start playback when the view is shown. Below is a simplified version of the code I'm using in my app:

```swift
struct ContentView: View {
    let avPlayer = AVPlayer(url: Bundle.main.url(forResource: "Intro", withExtension: "mp4")!)

    var body: some View {
        VStack {
            VideoPlayer(player: avPlayer)
                .onAppear {
                    avPlayer.play()
                }
        }
    }
}
```

When I run this code, the video plays, but when it finishes I get the following errors in the Xcode output window:

```
2023-01-27 11:56:39.530526+1100 TestVideo[29859:2475750] [api] -[CIImage initWithCVPixelBuffer:options:] failed because the buffer is nil.
2023-01-27 11:56:39.676462+1100 TestVideo[29859:2475835] [TextIdentificationService] Text LID dominant failure: lidInconclusive
2023-01-27 11:56:39.676822+1100 TestVideo[29859:2475835] [VisualTranslationService] Visual isTranslatable: NO; not offering translation: lidInconclusive
2023-01-27 11:56:40.569337+1100 TestVideo[29859:2476091] Metal API Validation Enabled
```

I have googled each of these messages but have not found anything explaining what they mean or how to eliminate them. I am using Xcode 14.2 and testing on iOS 16.2. If anyone could point me in the right direction to understand and eliminate these errors, I'd really appreciate it. Thanks!
Posted
by
Post not yet marked as solved
2 Replies
1.5k Views
Learn about the usage and requirements of "Ambient Viewing Environment" metadata with Dolby Vision™ Profile 8.4 playback. View Technote TN3145 >
Posted
by
Post not yet marked as solved
1 Reply
964 Views
Is it still possible to author a QuickTime movie with hyperlinks? I'm building a website and I know at one point you could author a QuickTime movie that supported links inside the video, either to other timestamps in the video or to other web pages. I don't want to use a custom player; I'd prefer to use the system-level player. I've seen a really amazing example of this on the mobile version of the Memory Alpha (Star Trek nerds!) website: a movie plays at the top of pages and is fully interactive. Is that still supported? Is it possible to author that way? I'm not making anything insanely complicated, I just thought it would be a nice way to build a website with tools I'm more comfortable working in.
Posted
by
Post not yet marked as solved
1 Reply
714 Views
When playing an mp4 file with QuickTime, an error occurs: the document "XX.mp4" could not be opened; an unknown error occurred (-12842). In Safari, only the first keyframe plays and then the video gets stuck. Both Chrome and Firefox play it normally.
Posted
by
Post marked as solved
2 Replies
1k Views
Hello, I am attempting to simultaneously stream video to a remote client and run inference with a neural network on the same video frames locally. I have done this on other platforms, using GStreamer on Linux and libstreaming on Android for compression and packetization. I've now attempted this on iPhone, using ffmpeg to stream and a capture session to feed the neural network, but I run into the problem of multiple camera access. Most of the posts I see are concerned with receiving RTP streams on iOS, but I need to do the opposite. As I am new to iOS and Swift, I was hoping someone could suggest a method for RTP packetization. Any library recommendations or example code for something similar? Best,
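No recommendation on an RTP library here, but the multiple-camera-access problem can usually be avoided by using a single capture session and fanning frames out to both consumers. A rough sketch under that assumption; the streaming and inference hooks are hypothetical placeholders:

```swift
import AVFoundation

// Sketch: one AVCaptureSession feeds both the encoder that backs the RTP stream
// and the local inference pipeline, so only one camera client is ever open.
final class FrameDistributor: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    private let output = AVCaptureVideoDataOutput()
    private let queue = DispatchQueue(label: "camera.frames")

    // Hypothetical hooks: plug in the RTP/encoder path and the model here.
    var onFrameForStreaming: ((CMSampleBuffer) -> Void)?
    var onFrameForInference: ((CVPixelBuffer) -> Void)?

    func start() {
        guard let device = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: device) else { return }
        session.beginConfiguration()
        if session.canAddInput(input) { session.addInput(input) }
        output.setSampleBufferDelegate(self, queue: queue)
        if session.canAddOutput(output) { session.addOutput(output) }
        session.commitConfiguration()
        session.startRunning()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Fan each frame out to both consumers.
        onFrameForStreaming?(sampleBuffer)
        if let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) {
            onFrameForInference?(pixelBuffer)
        }
    }
}
```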
Posted
by
Post not yet marked as solved
0 Replies
336 Views
Dear community, we run a large streaming farm here, and a customer has reported a very strange phenomenon: when we play an HLS (VOD) stream from our CDN on Apple devices with Apple software (Apple TV, QuickTime or Safari on a MacBook, within our iOS app, the native player in Safari), every 10 seconds (probably at the segment boundaries) the second-to-last frame is played twice and the last one not at all. The phenomenon seems to have nothing to do with the files: on Apple hardware without Apple software, e.g. Chrome on a MacBook, or on all other devices such as Windows PCs, it does not occur. Does anyone have an idea what the problem could be? Thank you and best regards, Thomas
Posted
by
Post not yet marked as solved
1 Reply
952 Views
Background
I am building a web app where users talk into a microphone and watch dynamic video that changes depending on what they say. Here is a sample flow:
User accesses the page.
A mic permission popup appears; the user grants it.
The user presses the start button; the standby video starts playing and the mic turns on.
The user says something; the speech is turned into text and analyzed, and based on that the video src changes.
The new video plays, the mic turns on again, and the loop continues.

Problem
On iOS, if mic access is granted, the volume drops dramatically, to a level where it is hard to hear even at max volume. The iPhone's volume up/down buttons don't help much either, which results in a terrible UX. I think the OS is forcefully keeping the volume down, but I could not find any documentation about it. By contrast, if mic permission is not granted, the volume does not change and the video plays at normal volume.

Question
Why does this happen, and what can I do to prevent the volume from dropping automatically? Any help would be appreciated. This does not happen on PC (macOS, Windows) or Android. Has anyone had a similar experience before?

Context
I have two video elements (position: absolute, width and height 100%) that are switched (by toggling z-index) so one appears on top of the other. This is to hide loading, buffering and black screens from the user for better UX. Once the next video is loaded and can play, the two are switched. Both elements have playsinline to enable inline playback, as required by WebKit. Both start out muted; muted is removed after playback starts. video.play() is initiated after the user grants mic permission.

Tech stack
Next.js with TypeScript, latest versions. Testing on the latest Chrome and Safari on iOS 16, fully updated.
Posted
by
Post not yet marked as solved
1 Reply
643 Views
We noticed the video in our site's banner being distorted. After hours of clearing caches and testing different phones, we narrowed it down to iOS 16.5 being the issue. We replicated it on BrowserStack in seconds. You can see the issue at https://stundesign.com/ if you are on 16.5; it reproduces in Safari, Firefox, and Chrome.
Posted
by
Post not yet marked as solved
1 Reply
592 Views
We provide a mechanism for users to upload video files via mobile Safari, using a standard HTML file input, e.g.:

```html
<input type="file" multiple>
```

As per a StackOverflow answer from a few years back, we've been including the multiple attribute, which worked around whatever was compressing the video and allowed it to be uploaded in its original format. This no longer works, and the video is compressed as part of the upload workflow. We've also noticed this is specific to the Photo Library: if the user copies the video over to Files and then uploads it via the "browse" prompt instead of the Photo Library, it uploads as-is without compression. Is there anything else we can do to prevent this compression of video prior to upload?
Posted
by
Post not yet marked as solved
0 Replies
638 Views
I am currently working on a SwiftUI video app. When I load a slow-motion video shot at 240 fps (239.68), I use asset.loadTracks and then .load(.nominalFrameRate), which returns 30 fps (29.xx), asset being AVAsset(url:). The duration returned by asset.load(.duration) is also 8 times larger than the original duration. Do you know how to get the 239.68 that the Apple Photos app displays? Is it stored somewhere in the video metadata, or is it computed?
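A small diagnostic sketch that may help narrow this down: compare nominalFrameRate with the track's minFrameDuration. If the URL being loaded is already a slowed-down rendition of the clip (an assumption, not something confirmed in the post), both values will describe the 30 fps playback timeline rather than the original 240 fps capture rate that Photos shows.

```swift
import AVFoundation

func inspectFrameRate(of asset: AVAsset) async throws {
    guard let track = try await asset.loadTracks(withMediaType: .video).first else { return }
    let (nominal, minFrameDuration) = try await track.load(.nominalFrameRate, .minFrameDuration)
    print("nominalFrameRate:", nominal)                              // averaged frames per second
    print("1 / minFrameDuration:", 1.0 / minFrameDuration.seconds)   // peak instantaneous frame rate
}
```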
Posted
by
Post not yet marked as solved
0 Replies
888 Views
I watched the WWDC 2023 session titled "Explore media formats for the web". It talked in depth about the Managed Media Source API, but Google is not turning up anything about this API. I'd really like to read more about it.
Posted
by
Post not yet marked as solved
0 Replies
723 Views
I have an Electron app for Mac Catalyst. I have implemented audio/video calling, and that works well. I have also implemented screen sharing using the code below:

```javascript
navigator.mediaDevices.getDisplayMedia(options).then(
  (streams) => {
    var peer_connection = session.sessionDescriptionHandler.peerConnection;
    var video_track = streams.getVideoTracks()[0];
    var sender_kind = peer_connection.getSenders().find((sender) => {
      return sender.track.kind == video_track.kind;
    });
    sender_kind.replaceTrack(video_track);
    video_track.onended = () => {};
  },
  () => {
    console.log("Error occurred while sharing screen");
  }
);
```

But when I hit the button to share the screen using the code above, I get this error:

Uncaught (in promise) DOMException: Not supported

I have also tried navigator.getUserMedia(options, success, error). That is supported in Mac Catalyst desktop apps, but it only gives me the webcam stream. I have also checked whether navigator.mediaDevices.getDisplayMedia(options) is supported in Mac Catalyst; according to what I found it is, but I still get this error. I have also tried Electron's desktopCapturer API, but I don't know how to get the streams from it.

```javascript
// CODE OF 'main.js'
ipcMain.on("ask_permission", () => {
  desktopCapturer
    .getSources({ types: ["window", "screen"] })
    .then(async (sources) => {
      for (const source of sources) {
        // console.log(source);
        if (source.name === "Entire screen") {
          win.webContents.send("SET_SOURCE", source.id);
          return;
        }
      }
    });
});
```

I have tried to get the streams using the code below in preload.js, but I get the error "Cannot read property 'srcObject' of undefined".

```javascript
window.addEventListener("DOMContentLoaded", (event) => {
  ipcRenderer.on("SET_SOURCE", async (event, sourceId) => {
    try {
      const stream = await navigator.mediaDevices.getUserMedia({
        audio: false,
        video: {
          mandatory: {
            chromeMediaSource: "desktop",
            chromeMediaSourceId: sourceId,
            minWidth: 1280,
            maxWidth: 1280,
            minHeight: 720,
            maxHeight: 720,
          },
        },
      });
      handleStream(stream);
    } catch (e) {
      handleError(e);
    }
  });

  let btn = document.getElementById("btnStartShareOutgoingScreens");
  btn.addEventListener("click", () => {
    if (isSharing == false) {
      ipcRenderer.send("ask_permission");
    } else {
      console.error("User is already sharing the screen.");
    }
  });
});

function handleStream(stream) {
  const video = document.createElement("video");
  video.srcObject = stream;
  video.muted = true;
  video.id = "screenShareVideo";
  video.style.display = "none";
  const box = document.getElementById("app");
  box.appendChild(video);
  isSharing = true;
}
```

How can I resolve this? If getDisplayMedia is not supported in Mac Catalyst, is there any other way to share the screen from a Mac Catalyst app using WebRTC?
Posted
by
Post not yet marked as solved
1 Reply
578 Views
Hello, consider this very simple example:

```swift
guard let url = URL(string: "some_url") else { return }
let player = AVPlayer(url: url)
let controller = AVPlayerViewController()
controller.player = player
present(controller, animated: true) {
    player.play()
}
```

When the video URL uses redirection and returns 302 when queried, AVPlayer's internal implementation queries it twice, which we verified by proxying. I'm not sure I can share the actual links, so the screenshot is blurred, but it shows that the redirecting URL, which receives a 302 response, is queried twice, and only after the second attempt does the actual redirection take place. This behavior is a problem for our backend services and we need to remediate it somehow. Do you have any idea how to address this, please?
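One workaround idea (an assumption, not a documented AVPlayer setting for this behavior): resolve the redirect yourself with URLSession and hand AVPlayer the final URL, so the player never touches the 302 endpoint. This only helps if the resolved URL remains valid for the duration of the playback session.

```swift
import AVFoundation

func resolvedPlayerItem(for url: URL) async throws -> AVPlayerItem {
    var request = URLRequest(url: url)
    request.httpMethod = "HEAD"                 // follow the redirect without downloading media
    let (_, response) = try await URLSession.shared.data(for: request)
    let finalURL = response.url ?? url          // URLSession reports the post-redirect URL
    return AVPlayerItem(url: finalURL)
}
```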
Posted
by