Integrate video and other forms of moving visual media into your apps.

Posts under Video tag

91 Posts
Post not yet marked as solved
0 Replies
136 Views
I have an AVPlayer() that loads a video and places it on a screen ModelEntity in the immersive view using a VideoMaterial. This also makes the video untappable, since it is a VideoMaterial. Here's the code:

let screenModelEntity = model.garageScreenEntity as! ModelEntity
let modelEntityMesh = screenModelEntity.model!.mesh

let url = Bundle.main.url(forResource: "<URL>", withExtension: "mp4")!
let asset = AVURLAsset(url: url)
let playerItem = AVPlayerItem(asset: asset)

let player = AVPlayer()
let material = VideoMaterial(avPlayer: player)
screenModelEntity.components[ModelComponent.self] = .init(mesh: modelEntityMesh, materials: [material])
player.replaceCurrentItem(with: playerItem)
return player

I was able to load and play the video. However, I cannot figure out how to show the player controls (AVPlayerViewController) to the user, similar to the DestinationVideo sample app. How can I add the video player controls in this case?
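One possible direction, sketched below and not the DestinationVideo/AVPlayerViewController approach: since VideoMaterial has no built-in controls, some apps drive the same AVPlayer from a separate SwiftUI control surface (for example in a window or attachment next to the immersive content). The view below only assumes the AVPlayer returned by the code above.

import SwiftUI
import AVFoundation

// Minimal custom transport controls for the shared AVPlayer that feeds the VideoMaterial.
// A sketch only; it controls playback but does not reproduce the system player UI.
struct PlayerControls: View {
    let player: AVPlayer
    @State private var isPlaying = false

    var body: some View {
        HStack(spacing: 24) {
            Button(isPlaying ? "Pause" : "Play") {
                if isPlaying { player.pause() } else { player.play() }
                isPlaying.toggle()
            }
            Button("Restart") {
                player.seek(to: .zero)   // jump back to the start of the item
            }
        }
        .padding()
    }
}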
Posted by Prakshal. Last updated.
Post not yet marked as solved
0 Replies
192 Views
<div class="container" style="background-size: contain; user-select: none; pointer-events: none; height: 787.5px; width: 1400px;"> <div class="container__header">header</div> <span> <div class="video-container" style="inset: 17.853% 68% 11.747% 1%; z-index: 2; opacity: 1;"> <div class="video-container__placeholder-image">image</div> <div class="video-container__content"> <div class="some-info"></div> <div class="video-canvas"></div> <div class="other-info"></div> </div> </div> <div class="video-container" style="inset: 17.853% 1% 11.747% 33%; z-index: 1; opacity: 1;"> <div class="video-container__placeholder-image">image</div> <div class="video-container__content"> <div class="video-canvas"> <div class="player" style="width: 100%; height: 100%; position: relative; overflow: hidden; background-color: black;"> <video playsinline="" muted="" style="object-fit: cover; width: 100%; height: 100%; position: absolute; left: 0px; top: 0px;"></video> </div> </div> </div> </div> </span> </div> The page looks like Then, the html changed as follows, <div class="container" style="background-size: contain; user-select: none; pointer-events: none; height: 787.5px; width: 1400px;"> <div class="container__header">header</div> <span> <div class="video-container" style="inset: 100% 100% 0% 0%; z-index: 2; opacity: 0;"> <div class="video-container__placeholder-image">image</div> <div class="video-container__content"> <div class="some-info"></div> <div class="video-canvas"></div> <div class="other-info"></div> </div> </div> <div class="video-container" style="style="inset: 6.106% 5.98719% 0%; z-index: 3; opacity: 1;""> <div class="video-container__placeholder-image">image</div> <div class="video-container__content"> <div class="video-canvas"> <div class="player" style="width: 100%; height: 100%; position: relative; overflow: hidden; background-color: black;"> <video playsinline="" muted="" style="object-fit: cover; width: 100%; height: 100%; position: absolute; left: 0px; top: 0px;"></video> </div> </div> </div> </div> </span> </div> From the mac developer tools, the width of the video is 1400px, but it render like the size is same as before in iOS17+(iOS17.1 and iOS17.3.1). The expected results looks like the actual results are looks like I tried the same operators in iOS 14.6 and 16.4 and it worked as expected, this problem likes only exists in iOS17+. Please help me to resolve this problom. Thanks.
Posted by zcl. Last updated.
Post not yet marked as solved
0 Replies
160 Views
We use AVFragmentedAssetMinder to refresh the player data. Notifications for AVAssetDurationDidChange were consistently received whenever the asset's duration changed. However, since the release of iOS 17, AVAssetDurationDidChange notifications are no longer received. Can anyone advise why this notification is not being triggered, and what we have to change?

NotificationCenter.default.addObserver(self, selector: #selector(self.onVideoUpdate), name: .AVAssetDurationDidChange, object: nil)

#AVPlayer #AVMutableMovie
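For comparison, a minimal sketch of how this observation is usually wired up (the URL of the growing fragmented MP4 is a placeholder). The notification is posted by the fragmented asset itself, so registering with object: asset rather than object: nil is one thing worth double-checking, although I can't say whether that changes the iOS 17 behaviour.

import AVFoundation

// Sketch of the usual AVFragmentedAssetMinder setup.
final class DurationWatcher: NSObject {
    private let asset: AVFragmentedAsset
    private let minder: AVFragmentedAssetMinder

    init(url: URL) {
        asset = AVFragmentedAsset(url: url)
        // The minder periodically re-reads the file and posts AVAssetDurationDidChange
        // when new fragments extend the duration.
        minder = AVFragmentedAssetMinder(asset: asset, mindingInterval: 1.0)
        super.init()

        NotificationCenter.default.addObserver(
            self,
            selector: #selector(durationDidChange(_:)),
            name: .AVAssetDurationDidChange,
            object: asset)   // observe the specific asset rather than all senders
    }

    @objc private func durationDidChange(_ note: Notification) {
        print("New duration:", asset.duration.seconds)
    }
}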
Posted by GVRajeev. Last updated.
Post not yet marked as solved
0 Replies
266 Views
Hi guys, I'm implementing FairPlay support for a video streaming application. I've managed to get as far as generating the SPC and acquiring a license from the license server. However when it comes to parsing the license (CKC) returned from the server, the FPS module returns error code -42671. Has anyone else faced this before and / or knows what the fix is? I thought passing it the license should be enough unless additional data is required?
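For reference, the handoff once the CKC comes back usually looks like the sketch below (assuming the AVContentKeySession flow; the surrounding delegate methods are omitted). One thing worth double-checking is that the raw CKC bytes are what gets passed in, not a base64 string or a JSON/XML wrapper produced by the license server.

import AVFoundation

// Sketch: called from the AVContentKeySessionDelegate once the license server has
// returned the CKC for the SPC we sent. ckcData must be the raw (decoded) CKC bytes.
func handleLicenseResponse(ckcData: Data, for keyRequest: AVContentKeyRequest) {
    let response = AVContentKeyResponse(fairPlayStreamingKeyResponseData: ckcData)
    keyRequest.processContentKeyResponse(response)
}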
Posted by ThetaSeg. Last updated.
Post not yet marked as solved
1 Reply
305 Views
I've encountered an issue with the seek bar time display in the video player on iOS 17, specifically affecting live stream videos using HLS manifests with the time displayed in am/pm format. As the video progresses, the displayed start time appears to shift backwards in time with a peculiar pattern: Displayed Start Time = Normal Start Time - Viewed Duration For instance, if a program begins at 9:00 AM, at 9:30 AM, the start time shown will erroneously be 8:30 AM. Similarly, at 9:40 AM, the displayed start time will be 8:20 AM. This issue is observed with both VideoPlayer and AVPlayerViewController on iOS 17. The same implementation of the video player on iOS 16 displays the duration of the viewed program and doesn’t have any issues. Please advise on any known workarounds or solutions to address this issue on iOS 17.
Posted by Amuron. Last updated.
Post not yet marked as solved
0 Replies
146 Views
How can I update the cookies of a previously set m3u8 video in AVPlayer, without creating a new AVURLAsset and replacing the AVPlayer's current item with it?
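For reference, the only documented attachment point for cookies that I'm aware of is asset creation, via the AVURLAssetHTTPCookiesKey option, which is exactly the rebuild-and-replace step the question is trying to avoid. A sketch of that conventional route, with hypothetical cookie values and stream URL:

import AVFoundation

// Conventional approach: cookies are supplied when the AVURLAsset is created.
func makePlayerItem(url: URL, cookies: [HTTPCookie]) -> AVPlayerItem {
    let asset = AVURLAsset(url: url, options: [AVURLAssetHTTPCookiesKey: cookies])
    return AVPlayerItem(asset: asset)
}

// Hypothetical usage when the session cookie rotates:
// player.replaceCurrentItem(with: makePlayerItem(url: streamURL, cookies: refreshedCookies))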
Posted by doc_aman. Last updated.
Post not yet marked as solved
1 Reply
271 Views
Does the new MV-HEVC Vision Pro spatial video format support having an alpha channel? I've tried converting a side-by-side video with an alpha channel enabled by using this Apple example project, but the alpha channel is being removed. https://developer.apple.com/documentation/avfoundation/media_reading_and_writing/converting_side-by-side_3d_video_to_multiview_hevc
Posted. Last updated.
Post not yet marked as solved
4 Replies
1.6k Views
The Safari version for visionOS (or spatial computing) supports WebXR, as reported here. I am developing a web app that intends to leverage WebXR, so I've tested several code samples in the Safari browser of the Vision Pro Simulator to understand the level of support for immersive web content. I am currently facing an issue that looks like a bug: video playback stops working when entering an XR session (i.e., going into VR mode) in a 3D web environment (using Three.js or similar). There's an example from the Immersive Web Community Group called Stereo Video (https://immersive-web.github.io/webxr-samples/stereo-video.html) that lets you easily replicate the issue; the code is available here. It's worth mentioning that video playback has been successfully tested on other VR platforms such as the Meta Quest 2. The issue has been reported in the following forums: https://discourse.threejs.org/t/videotexture-playback-html5-videoelement-apple-vision-pro-simulator-in-vr-mode-not-playing/53374 https://bugs.webkit.org/show_bug.cgi?id=260259
Posted. Last updated.
Post not yet marked as solved
2 Replies
296 Views
Hi everyone, I need to add a spatial video maker to my app, which was written in Objective-C. I found some reference code in Swift; can you help me convert it to Objective-C?

let left = CMTaggedBuffer(
    tags: [.stereoView(.leftEye), .videoLayerID(leftEyeLayerIndex)],
    pixelBuffer: leftEyeBuffer)
let right = CMTaggedBuffer(
    tags: [.stereoView(.rightEye), .videoLayerID(rightEyeLayerIndex)],
    pixelBuffer: rightEyeBuffer)
let result = adaptor.appendTaggedBuffers(
    [left, right],
    withPresentationTime: leftPresentationTs)
Posted by pinkywon. Last updated.
Post not yet marked as solved
0 Replies
241 Views
Is there any way to play panoramic or 360 videos in an immersive space, without using VideoMaterial on a sphere? I've tried using local videos with 4k and 8k quality and all of them look pixelated using this approach. I tried both simulator as well as the real device, and I can't ever get a high-quality playback. If the video is played on a regular 2D player, on the other hand, it shows the expected quality.
Posted. Last updated.
Post marked as solved
1 Reply
442 Views
This is my HTML5 code:

<video id="myVideo" src="xxxapp://***.***.xx/***/***.mp4" style="object-fit:cover;opacity:1;width:100%;height:100%;display:block;position:absolute;" type="video/mp4"></video>

I want to load a large local video, so I use WKURLSchemeHandler:

- (void)webView:(WKWebView *)webView startURLSchemeTask:(id<WKURLSchemeTask>)urlSchemeTask {
    NSURLRequest *request = [urlSchemeTask request];
    NSURL *url = request.URL;
    NSString *urlString = url.absoluteString;
    NSString *videoPath = [[NSBundle mainBundle] pathForResource:@"***" ofType:@"mp4"];
    NSData *videoData = [NSData dataWithContentsOfFile:videoPath options:0 error:nil];
    NSURLResponse *response = [[NSURLResponse alloc] initWithURL:url
                                                        MIMEType:@"video/mp4"
                                           expectedContentLength:videoData.length
                                                textEncodingName:nil];
    [urlSchemeTask didReceiveResponse:response];
    [urlSchemeTask didReceiveData:videoData];
    [urlSchemeTask didFinish];
}

But it does not work: the data is not nil, yet the video does not play. I would greatly appreciate it if someone could help me find a solution! ps: can make it, but we cannot use it due to some reasons.
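One frequently cited cause for this symptom is that the <video> element requests the media in byte ranges, while the handler above always returns the whole file with a plain NSURLResponse. A sketch of a range-aware handler, written in Swift for brevity (the bundled resource name and the custom scheme are placeholders, and the assumption that missing Range support is the culprit is just that, an assumption):

import WebKit

final class VideoSchemeHandler: NSObject, WKURLSchemeHandler {
    func webView(_ webView: WKWebView, start task: WKURLSchemeTask) {
        guard let url = task.request.url,
              let path = Bundle.main.path(forResource: "sample", ofType: "mp4"),   // placeholder resource
              let data = try? Data(contentsOf: URL(fileURLWithPath: path), options: .mappedIfSafe) else {
            task.didFailWithError(URLError(.fileDoesNotExist))
            return
        }

        // Parse a "bytes=start-end" Range header; default to the whole file.
        var start = 0
        var end = data.count - 1
        if let range = task.request.value(forHTTPHeaderField: "Range"), range.hasPrefix("bytes=") {
            let parts = range.dropFirst("bytes=".count).split(separator: "-", omittingEmptySubsequences: false)
            if let s = Int(parts[0]) { start = s }
            if parts.count > 1, let e = Int(parts[1]) { end = e }
        }
        end = min(end, data.count - 1)
        let chunk = data.subdata(in: start..<(end + 1))

        // Reply with 206 Partial Content and a Content-Range so the media stack keeps streaming.
        let headers = [
            "Content-Type": "video/mp4",
            "Accept-Ranges": "bytes",
            "Content-Length": "\(chunk.count)",
            "Content-Range": "bytes \(start)-\(end)/\(data.count)"
        ]
        let response = HTTPURLResponse(url: url, statusCode: 206, httpVersion: "HTTP/1.1", headerFields: headers)!
        task.didReceive(response)
        task.didReceive(chunk)
        task.didFinish()
    }

    func webView(_ webView: WKWebView, stop task: WKURLSchemeTask) {}
}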
Posted. Last updated.
Post marked as solved
1 Reply
401 Views
Hey devs! I recently started a project, a macOS app, which is like a Remote Desktop app but only on the local network. For this I wanted to use the MultipeerConnectivity framework, and it's my first time using it. I have already done the device discovery side, which is working well, as it is the easier part. Now I just need someone who knows how it works and who has time to explain to me (as I couldn't find much documentation about this) how OutputStream and InputStream work in MultipeerConnectivity, and whether it's a good choice for my needs. It has to be low latency and high resolution... I have also seen other frameworks such as WebRTC that I could combine with a local WebSocket server, but as I'm new to live video streaming and don't know anyone experienced with it, I wanted to ask here for your advice. Thank you in advance, TR-MZ (just an unknown indie dev).
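As a reference point, the stream API in MultipeerConnectivity is a thin wrapper around plain Foundation streams: you open a named OutputStream toward a connected peer and receive the matching InputStream through the session delegate. A minimal sketch (the stream name "screen" is a placeholder); note that for low-latency, high-resolution screen sharing the frames would normally be compressed, e.g. with VideoToolbox, and length-prefixed before being written:

import MultipeerConnectivity

// Sender side: open a named output stream to a connected peer (do this once per peer).
func openScreenStream(session: MCSession, to peer: MCPeerID) throws -> OutputStream {
    let output = try session.startStream(withName: "screen", toPeer: peer)
    output.schedule(in: .main, forMode: .default)
    output.open()
    return output
}

// Write one encoded frame to the open stream.
func send(_ frame: Data, over output: OutputStream) {
    frame.withUnsafeBytes { raw in
        if let base = raw.bindMemory(to: UInt8.self).baseAddress {
            _ = output.write(base, maxLength: frame.count)
        }
    }
}

// Receiver side: MCSessionDelegate hands you the matching InputStream.
// func session(_ session: MCSession, didReceive stream: InputStream,
//              withName streamName: String, fromPeer peerID: MCPeerID) {
//     stream.delegate = self               // read in stream(_:handle:) as bytes arrive
//     stream.schedule(in: .main, forMode: .default)
//     stream.open()
// }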
Posted. Last updated.
Post not yet marked as solved
0 Replies
270 Views
Hey, all! I've been trying to upload a video preview to the AVP storefront for our app, but some of the export requirements seem to contradict one another. For the AVP, a resolution of 4K is needed... which would require H264 level 5.2. Yet, the H264 level can't be any higher than 4... which is 1080p. It seems like a catch-22 where either the H264 level will be too high, or the resolution will be too low. Does anyone have a fix or a way around this issue?
Posted. Last updated.
Post not yet marked as solved
1 Reply
262 Views
Does Video Toolbox's compression session yield data I can decompress on a different device that doesn't have Apple's decompression, i.e., so I can send the data over the network to devices that aren't necessarily Apple? Or is the format proprietary rather than just regular H.264 (for example)? If I can decompress without Video Toolbox, could I have a reference to some examples of how to do this using cross-platform APIs? Maybe FFmpeg has something?
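For what it's worth, a VTCompressionSession emits standard H.264 (or HEVC) bitstream data; the sample buffers just carry it in length-prefixed AVCC form, with the parameter sets (SPS/PPS) stored in the format description rather than inline. A sketch of repackaging one encoded H.264 CMSampleBuffer into Annex-B bytes that FFmpeg or any other standard decoder can consume, assuming the default 4-byte NAL length prefix:

import CoreMedia

// Convert an H.264 sample buffer from a VTCompressionSession callback into Annex-B bytes.
func annexBData(from sampleBuffer: CMSampleBuffer) -> Data? {
    let startCode = Data([0x00, 0x00, 0x00, 0x01])
    var out = Data()

    // 1. Prepend SPS/PPS from the format description (needed ahead of keyframes).
    guard let format = CMSampleBufferGetFormatDescription(sampleBuffer) else { return nil }
    var parameterSetCount = 0
    CMVideoFormatDescriptionGetH264ParameterSetAtIndex(
        format, parameterSetIndex: 0, parameterSetPointerOut: nil,
        parameterSetSizeOut: nil, parameterSetCountOut: &parameterSetCount, nalUnitHeaderLengthOut: nil)
    for index in 0..<parameterSetCount {
        var pointer: UnsafePointer<UInt8>?
        var size = 0
        CMVideoFormatDescriptionGetH264ParameterSetAtIndex(
            format, parameterSetIndex: index, parameterSetPointerOut: &pointer,
            parameterSetSizeOut: &size, parameterSetCountOut: nil, nalUnitHeaderLengthOut: nil)
        if let pointer = pointer { out.append(startCode); out.append(pointer, count: size) }
    }

    // 2. Rewrite each length-prefixed NAL unit with an Annex-B start code.
    guard let blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer) else { return nil }
    var length = 0
    var dataPointer: UnsafeMutablePointer<CChar>?
    CMBlockBufferGetDataPointer(blockBuffer, atOffset: 0, lengthAtOffsetOut: nil,
                                totalLengthOut: &length, dataPointerOut: &dataPointer)
    guard let bytes = dataPointer else { return nil }

    var offset = 0
    while offset + 4 <= length {
        var nalLength: UInt32 = 0
        memcpy(&nalLength, bytes + offset, 4)
        nalLength = UInt32(bigEndian: nalLength)   // AVCC lengths are big-endian
        out.append(startCode)
        bytes.withMemoryRebound(to: UInt8.self, capacity: length) { u8 in
            out.append(u8 + offset + 4, count: Int(nalLength))
        }
        offset += 4 + Int(nalLength)
    }
    return out
}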
Posted. Last updated.
Post not yet marked as solved
0 Replies
507 Views
Help! How can I play a spatial video in my own Vision Pro app the way the official Photos app does? I've used the AVKit API to play a spatial video in the Xcode Vision Pro simulator, following the official developer documentation. The video plays, but it looks different from what is played through the Photos app: in Photos the edge of the video appears soft and fuzzy, while in my own app it has a hard, clear edge. How can I play the spatial video in my own app with the same effect as in Photos?
Posted by alvinLK. Last updated.
Post marked as solved
4 Replies
594 Views
When I try to play video on my Apple Vision Pro simulator using a custom view with an AVPlayerLayer (as seen in my VideoPlayerView below), nothing displays but a black screen, while the audio for the video I'm trying to play plays in the background. I've tried everything I can think of to resolve this issue, but to no avail.

import SwiftUI
import AVFoundation
import AVKit

struct VideoPlayerView: UIViewRepresentable {
    var player: AVPlayer

    func makeUIView(context: Context) -> UIView {
        let view = UIView(frame: .zero)
        let playerLayer = AVPlayerLayer(player: player)
        playerLayer.videoGravity = .resizeAspect
        view.layer.addSublayer(playerLayer)
        return view
    }

    func updateUIView(_ uiView: UIView, context: Context) {
        if let layer = uiView.layer.sublayers?.first as? AVPlayerLayer {
            layer.frame = uiView.bounds
        }
    }
}

I have noticed, however, that if I use the default VideoPlayer (as demonstrated below), and not my custom VideoPlayerView, the video displays just fine, but any modifiers I use on that VideoPlayer (like the ones in my custom struct above) cause the video to display black while the audio plays in the background.

import SwiftUI
import AVKit

struct MyView: View {
    var player: AVPlayer

    var body: some View {
        ZStack {
            VideoPlayer(player: player)

Does anyone know a solution to this problem, so that the video displays properly and doesn't just appear as a black screen with audio playing in the background?
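One likely culprit in the custom wrapper is that the AVPlayerLayer's frame stays at zero: updateUIView often runs before the view has been laid out, and it is not called again when the bounds change. A common pattern (a sketch, not necessarily the only fix) is to back the representable with a UIView subclass whose backing layer is the AVPlayerLayer, so the layer always tracks the view's size:

import SwiftUI
import UIKit
import AVFoundation

// A UIView whose backing layer *is* an AVPlayerLayer, so it resizes with the view.
final class PlayerContainerView: UIView {
    override class var layerClass: AnyClass { AVPlayerLayer.self }
    var playerLayer: AVPlayerLayer { layer as! AVPlayerLayer }
}

struct VideoPlayerView: UIViewRepresentable {
    var player: AVPlayer

    func makeUIView(context: Context) -> PlayerContainerView {
        let view = PlayerContainerView()
        view.playerLayer.player = player
        view.playerLayer.videoGravity = .resizeAspect
        return view
    }

    func updateUIView(_ uiView: PlayerContainerView, context: Context) {
        // Keep the layer pointed at the current player if the binding changes.
        uiView.playerLayer.player = player
    }
}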
Posted. Last updated.