Integrate video and other forms of moving visual media into your apps.

Posts under Video tag

111 Posts
Post not yet marked as solved
0 Replies
74 Views
I am trying to replace GIFs with MP4s on my website. Currently it's working great in Chrome and Firefox, but the behavior is odd in Safari.

```html
<video autoplay loop muted playsinline defaultmuted preload="auto">
  <source src="/path/to/video.mp4" type="video/mp4">
</video>
```

This video is an H.264 MP4 with no audio track.

Firefox and Chrome on my MacBook: works as expected (autoplays as if it were a GIF).
iOS Safari without Low Power Mode: works as expected.
iOS Safari with Low Power Mode: autoplays, but there is a play button on top that disappears when tapped.
macOS Safari: does not autoplay. A play button appears, and it plays if clicked.

I have been following https://developer.apple.com/documentation/webkit/delivering_video_content_for_safari as well as other guides on the internet, and it still isn't working. I'm pretty sure a recent change is responsible for this, because it used to work in an older version of desktop Safari.
Post not yet marked as solved
0 Replies
66 Views
I am trying to develop an app that can choose a video from the iPhone and save the URL (or a string representation of the URL) so the user can select the video from the app (by selecting an assigned video title) without having to use the video picker again. I have implemented the video picker and can print the URL (e.g. file:///private/var/mobile/Containers/Data/PluginKitPlugin/3BBCBF37-7659-439E-A3D6-4390D751F29D/tmp/trim.D973E0CE-468C-4B5F-B5FC-63FA1F647175.MOV), but I can't find a way to use this string to select the video. Can someone help me with this? Thanks.
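A likely culprit, judging from the path: that file:///...tmp/... URL points into a temporary picker container, so the file may no longer exist by the time the saved string is reused. One common approach is to copy the picked file into the app's Documents directory and persist only the file name. A minimal Foundation-only sketch (`persistPickedVideo` and `savedVideoURL` are illustrative names, not API):

```swift
import Foundation

/// Copies a picked video from its temporary location into the app's
/// Documents directory and returns the stored file name.
/// (Sketch only: `pickedURL` stands in for the URL the video picker returns.)
func persistPickedVideo(at pickedURL: URL) throws -> String {
    let documents = try FileManager.default.url(
        for: .documentDirectory, in: .userDomainMask,
        appropriateFor: nil, create: true)
    let destination = documents.appendingPathComponent(pickedURL.lastPathComponent)
    if FileManager.default.fileExists(atPath: destination.path) {
        try FileManager.default.removeItem(at: destination)
    }
    try FileManager.default.copyItem(at: pickedURL, to: destination)
    return destination.lastPathComponent   // store this, not the temp URL
}

/// Rebuilds a playable URL later from the stored file name.
/// (The absolute container path can change between launches, so only
/// the file name should be persisted.)
func savedVideoURL(fileName: String) throws -> URL {
    let documents = try FileManager.default.url(
        for: .documentDirectory, in: .userDomainMask,
        appropriateFor: nil, create: true)
    return documents.appendingPathComponent(fileName)
}
```

The stored file name (or a title-to-file-name mapping) can then go into UserDefaults or a database, and the URL is reconstructed on demand.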
Post not yet marked as solved
0 Replies
80 Views
As the title says CGDisplayCopyAllDisplayModes does not appear to return ALL of the display modes. I've attached a screenshot showing a list of the modes returned by CGDisplayCopyAllDisplayModes. Notice that a CGDisplayMode with the currently used mode ID# 13 is not in the list. My second screen, a non-Retina display, seems to behave as expected. How do you find 'all' of the CGDisplayModes for an Apple Studio Display?
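Not a guaranteed fix, but CGDisplayCopyAllDisplayModes takes an options dictionary, and passing kCGDisplayShowDuplicateLowResolutionModes (the one documented option key) makes it return modes that the parameterless call hides; on Retina-class displays this often surfaces the missing entries. A sketch:

```swift
import CoreGraphics

let displayID = CGMainDisplayID()
// Ask for the full list, including the low-resolution duplicate
// modes that are hidden by default.
let options = [kCGDisplayShowDuplicateLowResolutionModes: kCFBooleanTrue] as CFDictionary
if let modes = CGDisplayCopyAllDisplayModes(displayID, options) as? [CGDisplayMode] {
    for mode in modes {
        print(mode.ioDisplayModeID, mode.width, mode.height, mode.refreshRate)
    }
}
```

Whether mode ID 13 shows up with this option on a Studio Display is something I can't confirm, but it's worth comparing this list against the default one.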
Post not yet marked as solved
1 Reply
118 Views
Hi! I have limited mobility, so I want to create voice command shortcuts to navigate my video settings. How do I create custom shortcuts to: take Cinematic video, take Time-lapse, take Slow motion? Also, is there a way to set a time limit? I find Siri, and even sometimes Voice Control, does not allow voice control when recording because it's using the microphone to record. So I would like to try a shortcut like this: "Take Timelapse" starts a time-lapse immediately and stops recording after 2 min. Thanks!
Post not yet marked as solved
2 Replies
229 Views
I'm trying to understand what exactly is made possible by Media Device Discovery Extensions, what responsibility the containing app has, and what exactly is made available to other apps or the system, if anything. I haven't been able to find any meaningful high-level documentation, and WWDC 2022 session 10096 only mentions these new extensions in passing. The most comprehensive body of information I found is the example project: https://developer.apple.com/documentation/devicediscoveryextension/discovering_a_third-party_media-streaming_device?changes=latest_beta&language=objc

However, I don't think it's working the way it should out of the box:

I've got the Client target app built and running on an iPad Pro running iPadOS 16 beta 2.
I've got the MacServer target running on a Mac mini with macOS 13 Ventura beta 2.
I've got the Server target running on an iPhone with iOS 15.5 (non-beta).

If I tap the AirPlay icon on the Client's video player, I can see the two servers, but selecting one just causes a spinner to show up next to its name. This keeps going for a while; eventually the spinner goes away again, but the device selection tick stays next to 'iPad'. The text "Select route" also doesn't change, which I think it's supposed to, judging by the code. I've tried a variety of combinations of settings on the servers (Bluetooth only, Bonjour only, different protocols, etc.), but I'm always getting the same behaviour.

Has anyone had any success in getting the example to work, and how? Is there any high-level documentation available that I've missed? Can someone explain what exactly we can build with this in more detail than "implementations of custom A/V streaming protocols"? The WWDC session video talks about third-party SDKs, so do these extensions have to be embedded in every app that would be streaming the video, implying that it's not useful for mirroring?
Post not yet marked as solved
0 Replies
105 Views
This video session is essentially a consumer-facing video; there isn't even a single line of code shown. VideoPlayer(player: player) doesn't give the shown "new features" by default; an example or implementation should be expected of a WWDC session.
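For anyone landing here, the baseline usage the session presumably starts from looks like this (a minimal sketch; the URL is a placeholder, and the features shown in the session need additional setup beyond this):

```swift
import SwiftUI
import AVKit

struct CourseVideoView: View {
    // Placeholder URL for illustration; substitute real content.
    @State private var player = AVPlayer(
        url: URL(string: "https://example.com/sample.m3u8")!)

    var body: some View {
        VideoPlayer(player: player)
            .onAppear { player.play() }
    }
}
```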
Post not yet marked as solved
2 Replies
273 Views
Hi, I saw the announcement of the availability of a beta version of the Advanced Video Quality Tool (AVQT) for Linux at WWDC22. However, I am unable to find the AVQT packages for Linux. The AVQT resource page still seems to point to the .dmg file, which is for macOS. Where can I find the Linux version of AVQT? Thanks
Post not yet marked as solved
0 Replies
120 Views
Hello, One of the features of AVKit you list is "performance optimized". Could you confirm that you've made performance improvements in the last 12 months to AVKit? If so, could you share in what areas or what metrics improved? Thanks!
Post not yet marked as solved
0 Replies
145 Views
I'm trying to build an educational SwiftUI iOS app with course videos. I've tried storing these videos on YouTube as private videos, and also on Vimeo, but they both show the video controls, which allows the URL to be extracted, which I don't want. Storing the videos as a local resource is a no-no, otherwise the app would be several GB. I could also store the videos on my web hosting, but again I think those are discoverable, and I don't want to go down the route of creating logins and user accounts. Are there any other solutions? Is it possible to store the videos in Firebase and have the app access them there?
Post not yet marked as solved
0 Replies
171 Views
Hello everyone, everything was working well until I updated to macOS Monterey. Once I finished the update, I checked the app and it started to consume a lot of memory: the app starts at 35 MB and 3 seconds later grows to 345 MB, then 650 MB, then 780, and so on. I don't know if someone is experiencing the same, but your help to solve this will be very appreciated. Thank you. This is the code for that:

```swift
let detector = CIDetector(
    ofType: CIDetectorTypeQRCode,
    context: nil,
    options: [CIDetectorAccuracy: CIDetectorAccuracyLow, CIDetectorTracking: false]
)
guard let features = detector?.features(in: ciimage) else {
    return decode
}
for feature in features as! [CIQRCodeFeature] {
    decode = feature.messageString!
}
```
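If this snippet runs once per video frame, a common pattern (a sketch, assuming per-frame invocation) is to create the CIDetector once and wrap the per-frame work in an autoreleasepool so CoreImage's intermediate buffers are released promptly instead of accumulating; the `QRScanner` type below is illustrative, not API:

```swift
import CoreImage

final class QRScanner {
    // Create the detector once; constructing a CIDetector per frame
    // allocates fresh CoreImage state on every call.
    private let detector = CIDetector(
        ofType: CIDetectorTypeQRCode,
        context: nil,
        options: [CIDetectorAccuracy: CIDetectorAccuracyLow]
    )

    func decode(_ image: CIImage) -> String? {
        // autoreleasepool lets per-frame buffers be freed immediately
        // instead of piling up until the run loop drains.
        return autoreleasepool { () -> String? in
            let features = detector?.features(in: image) ?? []
            return features
                .compactMap { ($0 as? CIQRCodeFeature)?.messageString }
                .first
        }
    }
}
```

Whether this fully explains the Monterey regression I can't say, but reusing the detector and pooling per-frame work usually flattens this kind of growth curve.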
Post not yet marked as solved
1 Reply
196 Views
Hello, we're implementing Search in our Apple TV app and want to show search suggestions. We've followed the WWDC talk https://developer.apple.com/videos/play/wwdc2020/10634/ and so far so good. We've got our suggestions showing, but at the start of our suggestions list is the default built-in suggestion of whatever has already been typed. We want to remove this first suggestion, because below the suggestions are already the results for the typed search term, so offering this suggestion doesn't work for us. An array of two UISearchSuggestionItem objects was added to self.searchController.searchSuggestions, but this seems to have been prepended with another suggestion, which is what the customer has already typed. This is what we want to turn off. This suggestion isn't present in the Apple TV+ app, so it feels like we must somehow be able to turn it off, but we haven't found any way to do so. (For comparison, the Apple TV+ app does NOT show a suggestion of "jo".) Please help. Thanks, Antony
Post not yet marked as solved
0 Replies
179 Views
A playback glitch is observed on iPhone and iPad devices for our encrypted asset, but playback works fine in the Safari browser for the same asset. I have attached the files listed below for your reference:

Manifest files: Index.m3u8, Level(4417274)
Log snippets with decoder errors (captured on iPhone 8 and iPhone 12): iPhone12-Log-Snippet, iPhone8-Log-Snippet
Post not yet marked as solved
0 Replies
184 Views
When I try to build a demo app exploring the PiP swap feature using a custom player view controller, I'm facing an issue where play and pause do not work in the PiP window. But when I use AVPlayerViewController, I can pause and play the PiP window playback. Here is the sample code:

```swift
var nowPlayingSession: MPNowPlayingSession?
var player: AVPlayer? {
    didSet {
        playerLayer = AVPlayerLayer(player: player)
        if player != nil {
            nowPlayingSession = MPNowPlayingSession(players: [player!])
            nowPlayingSession?.remoteCommandCenter.pauseCommand.addTarget(handler: { [weak self] event in
                guard let self = self else { return .commandFailed }
                self.pause()
                return .success
            })
            nowPlayingSession?.remoteCommandCenter.playCommand.addTarget(handler: { [weak self] event in
                guard let self = self else { return .commandFailed }
                self.play()
                return .success
            })
            nowPlayingSession?.remoteCommandCenter.togglePlayPauseCommand.addTarget(handler: { [weak self] event in
                guard let self = self else { return .commandFailed }
                self.togglePlayPause()
                return .success
            })
        }
    }
}

override func viewDidLayoutSubviews() {
    super.viewDidLayoutSubviews()
    playerLayer?.frame = view.bounds
    publishNowPlayingMetadata()
}

func publishNowPlayingMetadata() {
    var nowPlayingInfo = [String: Any]()
    nowPlayingInfo[MPMediaItemPropertyTitle] = "Unknown Content"
    nowPlayingInfo[MPMediaItemPropertyPlaybackDuration] = 15.0
    nowPlayingInfo[MPNowPlayingInfoPropertyDefaultPlaybackRate] = 1.0
    nowPlayingInfo[MPMediaItemPropertyArtist] = "Unknown Artist"
    nowPlayingInfo[MPMediaItemPropertyAlbumArtist] = "Unknown Album Artist"
    nowPlayingInfo[MPMediaItemPropertyAlbumTitle] = "Unknown Album Title"
    nowPlayingSession?.nowPlayingInfoCenter.nowPlayingInfo = nowPlayingInfo
    nowPlayingSession?.becomeActiveIfPossible()
}
```

Ref: https://developer.apple.com/documentation/avkit/adopting_picture_in_picture_playback_in_tvos (the above changes are made on top of it). Please suggest changes.
Post not yet marked as solved
0 Replies
167 Views
I am developing a hybrid app using JavaScript and HTML; to build it for Xcode I use Capacitor. The problem is that my app includes videos, and I cannot stop iOS from taking over with the native fullscreen player; I want to block it. I found webview.allowsInlineMediaPlayback = yes; but the problem is that it only takes effect on iPad, not on iPhone.
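For what it's worth, inline playback on iPhone generally needs two things: allowsInlineMediaPlayback on the WKWebViewConfiguration, set before the web view is created, and the playsinline attribute on each video tag. A minimal sketch of the native side, assuming you can reach the point where the WKWebView is constructed (this is plain WebKit, not Capacitor API):

```swift
import WebKit

// The configuration must be in place before the WKWebView is created;
// changing these flags afterwards has no effect.
let config = WKWebViewConfiguration()
config.allowsInlineMediaPlayback = true
// Optional: allow (e.g. muted) video to start without a user gesture.
config.mediaTypesRequiringUserActionForPlayback = []

let webView = WKWebView(frame: .zero, configuration: config)
```

The HTML side still needs `<video playsinline ...>`; without that attribute, iPhone falls back to the fullscreen player even when the configuration allows inline playback.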
Post not yet marked as solved
1 Reply
174 Views
Hello, a few days ago I updated my iPhone 7 to iOS 15, now 15.4.1. My issue is that when I start playing a video through any external link (opens in Safari), the audio works but the video seems to be frozen; every time the screen remains black.
Post not yet marked as solved
0 Replies
137 Views
Hi everyone. I have a course on Udemy which I think I would like to put into an app (it's about Apple stuff). I have developed a couple of apps, so I know a little bit but not much. Anyway, I was just wondering the best way to do this? Can I store the videos privately on YouTube and then link to them from the app? Maybe include a video or two as part of the build, then put everything else behind a paywall like an In-App Purchase? Any thoughts on the best way to do this would be greatly appreciated! Or if you know of any similar starter projects on GitHub to point me in the right direction with the code, that would be great too. Thank you