Post not yet marked as solved
While building a demo app to explore the PiP swap feature with a custom player view controller, I'm running into an issue: play and pause do not work for playback in the PiP window.
However, when I use AVPlayerViewController, I can pause and play the PiP window playback.
The sample code is attached below.
var nowPlayingSession: MPNowPlayingSession?

var player: AVPlayer? {
    didSet {
        playerLayer = AVPlayerLayer(player: player)
        if let player = player {
            nowPlayingSession = MPNowPlayingSession(players: [player])
            nowPlayingSession?.remoteCommandCenter.pauseCommand.addTarget { [weak self] _ in
                guard let self = self else { return .commandFailed }
                self.pause()
                return .success
            }
            nowPlayingSession?.remoteCommandCenter.playCommand.addTarget { [weak self] _ in
                guard let self = self else { return .commandFailed }
                self.play()
                return .success
            }
            nowPlayingSession?.remoteCommandCenter.togglePlayPauseCommand.addTarget { [weak self] _ in
                guard let self = self else { return .commandFailed }
                self.togglePlayPause()
                return .success
            }
        }
    }
}
override func viewDidLayoutSubviews() {
    super.viewDidLayoutSubviews()
    playerLayer?.frame = view.bounds
    publishNowPlayingMetadata()
}
func publishNowPlayingMetadata() {
    var nowPlayingInfo = [String: Any]()
    nowPlayingInfo[MPMediaItemPropertyTitle] = "Unknown Content"
    nowPlayingInfo[MPMediaItemPropertyPlaybackDuration] = 15.0
    nowPlayingInfo[MPNowPlayingInfoPropertyDefaultPlaybackRate] = 1.0
    nowPlayingInfo[MPMediaItemPropertyArtist] = "Unknown Artist"
    nowPlayingInfo[MPMediaItemPropertyAlbumArtist] = "Unknown Album Artist"
    nowPlayingInfo[MPMediaItemPropertyAlbumTitle] = "Unknown Album Title"
    nowPlayingSession?.nowPlayingInfoCenter.nowPlayingInfo = nowPlayingInfo
    nowPlayingSession?.becomeActiveIfPossible()
}
Ref: https://developer.apple.com/documentation/avkit/adopting_picture_in_picture_playback_in_tvos (the changes above were made on top of this sample).
Any suggestions for changes would be appreciated.
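For reference, in a custom-player setup the PiP controller is normally created from the AVPlayerLayer; a minimal sketch of that part (class and property names here are assumptions, and the delegate methods are omitted):

```swift
import AVKit
import UIKit

final class CustomPlayerViewController: UIViewController, AVPictureInPictureControllerDelegate {
    var playerLayer: AVPlayerLayer?
    private var pipController: AVPictureInPictureController?

    func setUpPictureInPicture() {
        guard let playerLayer = playerLayer,
              AVPictureInPictureController.isPictureInPictureSupported() else { return }
        // The PiP controller must be retained for PiP to keep working.
        pipController = AVPictureInPictureController(playerLayer: playerLayer)
        pipController?.delegate = self
    }
}
```

If the remote commands still don't fire in the PiP window, it may also be worth checking that becomeActiveIfPossible() actually succeeds (its completion-handler variant reports whether the session became active).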
Post not yet marked as solved
I've built a web app that uses WebRTC to allow people to video chat in the browser and not be forced to download an app.
However, it would be really helpful if iOS users could stream their video using their native camera app and not just the RTC element in the browser.
Is this possible?
I've found a way to open the native camera app using this HTML:
<input type="file" accept="video/*" capture="environment">
However, this only allows the user to record and upload a video file, not to stream it.
Post not yet marked as solved
I am developing a hybrid app using JavaScript and HTML, compiled for Xcode with Capacitor. The problem is that my app includes videos, and I cannot stop iOS from opening them in the native full-screen player; I want to block that.
configuration.allowsInlineMediaPlayback = true // on WKWebViewConfiguration, set before creating the web view
I found this, but it only works on iPad, not on iPhone.
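In case it's useful: on iPhone, inline playback usually requires the playsinline attribute on the video element in addition to the allowsInlineMediaPlayback configuration flag. A minimal sketch (file name is a placeholder):

```html
<!-- Without playsinline, iPhone Safari/WKWebView forces the native full-screen player -->
<video src="movie.mp4" playsinline controls></video>
```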
Post not yet marked as solved
Video plays as a black screen in both the Safari and Chrome browsers. I tried to disable GPU Process: Media, but that option isn't showing under Safari's Advanced settings.
Post not yet marked as solved
Hello, a few days ago I updated my iPhone 7 to iOS 15, now 15.4.1. My issue is that when I play a video from any external link (it opens in Safari), the audio works but the video seems frozen; the screen stays black every time.
Post not yet marked as solved
Hi everyone. I have a course on Udemy which I think I would like to put into an app (it's about Apple stuff).
I have developed a couple of apps, so I know a little, but not much.
Anyway, I was just wondering about the best way to do this. Can I store the videos privately on YouTube and link to them from the app? Maybe include a video or two as part of the build and put everything else behind a paywall like In-App Purchase?
Any thoughts on the best way to do this would be greatly appreciated! Or if you know of any similar starter projects on GitHub to point me in the right direction with the code, that would be great too.
Thank you
Post not yet marked as solved
We're looking for the best OTT platform provider or solution to launch our own video-on-demand business, with customized features, functionality, and revenue models. We are focusing on movie-streaming content to broadcast across the web, iOS, Apple TV, and Amazon Fire TV, and to monetize it. Any suggestions regarding this would be welcome.
Thanks in advance.
Post not yet marked as solved
Simple AVPlayer sample in Swift for iOS 15.4.1.
The interstitial is specified via an EXT-X-DATERANGE tag. The interstitial displays as expected, but no notifications are generated for either AVPlayerInterstitialEventMonitor.currentEventDidChangeNotification or .eventsDidChangeNotification.
Tested on both a simulator and a device.
Suggestions?
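For context, the registration I'd expect to work looks roughly like this (a sketch; the stream URL is a placeholder). As I understand it, the monitor has to be created from the primary player and kept alive, and the notifications are posted with the monitor as their object:

```swift
import AVFoundation

let player = AVPlayer(url: URL(string: "https://example.com/master.m3u8")!)
let monitor = AVPlayerInterstitialEventMonitor(primaryPlayer: player)

// Observe changes to the schedule of interstitial events.
NotificationCenter.default.addObserver(
    forName: AVPlayerInterstitialEventMonitor.eventsDidChangeNotification,
    object: monitor,
    queue: .main
) { _ in
    print("events: \(monitor.events)")
}

// Observe the currently playing interstitial event.
NotificationCenter.default.addObserver(
    forName: AVPlayerInterstitialEventMonitor.currentEventDidChangeNotification,
    object: monitor,
    queue: .main
) { _ in
    print("current event: \(String(describing: monitor.currentEvent))")
}
```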
Post not yet marked as solved
Does Apple TV 4K support HLG in any configuration? When I connect my Apple TV 4K to a Samsung TV with HLG support via HDMI,
AVPlayer.availableHDRModes.contains(.hlg) returns false
However,
AVPlayer.availableHDRModes.contains(.hdr10) returns true
https://support.apple.com/en-us/HT208074 only mentions HDR10 and Dolby Vision support.
Is there any way to play HLG video on Apple TV 4K, similar to how it works on newer iPhones and iPads?
Post not yet marked as solved
Hi,
I have set the video attributes x-webkit-wirelessvideoplaybackdisabled, x-webkit-airplay="deny", and disableremoteplayback according to Apple's documentation, so as to opt out of AirPlay on my video player. For Safari versions <= 12 the AirPlay button correctly does not appear, but for versions greater than 12 it does appear! Can you please give an update or guidance on that?
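For reference, the markup under test looks like this (a sketch; the source path is a placeholder):

```html
<video src="movie.mp4"
       controls
       x-webkit-wirelessvideoplaybackdisabled
       x-webkit-airplay="deny"
       disableremoteplayback></video>
```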
Both standard mp4 files and streaming HLS files are experiencing substantial playback and rendering issues on iOS 15.
This includes:
Safari immediately crashes
Video displays only black (occasionally audio can be heard)
Video is frozen on the first frame despite the time updating
Substantial load times (10+ seconds) when playback should be immediate
GPU Process: Media has been disabled, yet the issues persist.
Safari immediately crashes with GPU Process: WebGL enabled.
These videos are rendered via WebGL (three.js).
None of these issues were present on iOS 14.
I’m on an iPad Pro 12.9 2020.
Post not yet marked as solved
We have learned that, in order to serve HEVC videos on iOS, we need to use the HLS+fMP4 approach, per the following link:
https://developer.apple.com/forums/thread/132291
When serving an HLS+fMP4+HEVC video, it fails to play on iOS but plays smoothly on Android. The tool used for transcoding is FFmpeg; the following commands were used.
Pass-1:
ffmpeg -y -i input.mp4 -pix_fmt yuv420p -vcodec libx265 -profile:v main -preset fast -vf scale=w="min(trunc(iw/2)*2\,640)":h=-2 -b:v 1500k -maxrate 1500k -bufsize 1500k -acodec aac -ab 128k -subq 6 -async 2 -tag:v hvc1 -movflags faststart -map 0 -vsync 2 -max_muxing_queue_size 9999 -hls_segment_type fmp4 -force_key_frames expr:gte(t,n_forced*3) -hls_flags single_file -hls_list_size 0 -hls_time 6 -hls_playlist_type vod -hls_segment_filename output.mp4 -x265-params min-keyint=90:keyint=90:pass=1:stats=logfile -f null /dev/null
Pass-2:
ffmpeg -y -i input.mp4 -pix_fmt yuv420p -vcodec libx265 -profile:v main -preset fast -filter_complex [0:v]scale=w="min(trunc(iw/2)*2\,640)":h=-2,unsharp=5:5:0.75,eq=saturation=1.06:contrast=1.05,colortemperature=6900 -b:v 1500k -maxrate 1500k -bufsize 1500k -acodec aac -ab 128k -subq 6 -async 2 -tag:v hvc1 -movflags faststart -map 0 -vsync 2 -max_muxing_queue_size 9999 -hls_segment_type fmp4 -force_key_frames expr:gte(t,n_forced*3) -hls_flags single_file -hls_list_size 0 -hls_time 6 -hls_playlist_type vod -hls_segment_filename output.mp4 -x265-params min-keyint=90:keyint=90:pass=2:stats=logfile output.m3u8
What is the mistake in the approach above?
Post not yet marked as solved
I'm using the new iOS 14 VideoPlayer:
private let player = AVPlayer(url: Bundle.main.url(forResource: "TL20_06_Shoba3_4k", withExtension: "mp4")!)

var body: some View {
    VideoPlayer(player: player)
        .aspectRatio(contentMode: .fill)
...
This setup cannot display 4:3 video on a TV's 16:9 screen without black bars; the aspectRatio modifier has no effect on VideoPlayer.
How can I set the videoGravity of the underlying AVPlayerLayer to .resizeAspectFill via the SwiftUI API?
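One workaround I'm aware of (a sketch, not an official SwiftUI API; the view names are made up) is to wrap an AVPlayerLayer in a representable view and set its videoGravity directly:

```swift
import SwiftUI
import AVFoundation

struct FillPlayerView: UIViewRepresentable {
    let player: AVPlayer

    func makeUIView(context: Context) -> PlayerLayerView {
        let view = PlayerLayerView()
        view.playerLayer.player = player
        view.playerLayer.videoGravity = .resizeAspectFill
        return view
    }

    func updateUIView(_ uiView: PlayerLayerView, context: Context) {}
}

// A UIView whose backing layer is an AVPlayerLayer, so the layer
// automatically tracks the view's bounds.
final class PlayerLayerView: UIView {
    override static var layerClass: AnyClass { AVPlayerLayer.self }
    var playerLayer: AVPlayerLayer { layer as! AVPlayerLayer }
}
```

The trade-off is that this view has no built-in transport controls, unlike VideoPlayer.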
Post not yet marked as solved
Is there a way to disable the default video controls (play/pause/scrubber, etc.) on the new SwiftUI VideoPlayer in iOS 14, so I can create custom ones?
Post not yet marked as solved
Hi,
I have an app that uses AVPlayer to stream and play HLS videos, but I'm struggling to find a way to do the same with plain fMP4.
This is what I use to play my HLS stream. I tried simply replacing the URL with the fMP4 one, but it does not work.
private func connect() {
    let stringUrl = "https://wolverine.raywenderlich.com/content/ios/tutorials/video_streaming/foxVillage.m3u8"
    let url = URL(string: stringUrl)!
    let asset = AVURLAsset(url: url)
    let item = AVPlayerItem(asset: asset)
    if #available(iOS 10.0, *) {
        item.preferredForwardBufferDuration = Double(50000) / 1000
    }
    if #available(iOS 13.0, *) {
        item.automaticallyPreservesTimeOffsetFromLive = true
    }
    self.player = AVPlayer(playerItem: item)

    let playerLayer = AVPlayerLayer(player: self.player)
    playerLayer.frame = self.playerView.bounds
    playerLayer.videoGravity = .resizeAspect
    self.playerView.layer.addSublayer(playerLayer)
    self.videoLayer = playerLayer
    self.videoLayer?.frame = self.playerView.bounds

    player?.play()
}
I haven't had any luck looking for a possible solution and I'm out of ideas. I'd be really grateful if anyone could point me in a good direction.
Post not yet marked as solved
It seems that Safari (desktop and iOS) is not respecting the Cache-Control header for video files like video/mp4 and video/webm.
The response headers for the video includes
Cache-Control: public, max-age=3600
And the response comes from the network every time the page is refreshed.
I've checked that Disable Cache is not enabled in dev tools.
The same video request is cached as expected on Chrome and Firefox.
Post not yet marked as solved
I have downloaded and run this example. The Export command seems to hang when it hits 100%. Unfortunately I'm a noob and can't find the cause of the problem. The dialog processing seems a bit confusing to a new guy; I certainly don't pretend to understand the use of classes, structs, etc. in this part of the example code.
I have been able to understand the Metal processing, custom filters, etc.
I am running macOS 12.2.1 and Xcode 13.3.
Thanks to all for reading!
Somehow I was not able to add the tag for this in the 'search for a tag' field on the web page.
Post not yet marked as solved
As per the title, but let me provide some more context:
On macOS, using the Photos app, I can change the 'poster frame' for a video. This frame is the one displayed as its thumbnail.
In iOS's Photos app this functionality does not (seem to?) exist.
So, to solve my own problem, I would like to build an app that does just that.
I am not sure how the poster frame is implemented, and I have no idea where to start looking (perusing the PhotoKit and AVAssetWriter documentation, I didn't find any hints about it).
I am just looking for some pointers at this point, to understand whether this is possible at all.
Post not yet marked as solved
Hi Team,
In tvOS, the keyboard layout differs depending on whether the Siri Remote or an IR remote is used. Is there a way to identify which remote the user is using, so that we can update the UI accordingly?
We are using React Native for tvOS development, and the search-results UI is built in React Native, so we need to know which remote is in use.
Post not yet marked as solved
I want to allow the user to record video only in portrait orientation and restrict landscape recording. I'm using UIImagePickerController and couldn't find any orientation options in it. Could anyone help me out with this?
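For what it's worth, UIImagePickerController doesn't expose an orientation lock; the usual workaround is a custom AVCaptureSession pipeline where the video connection's orientation is fixed to portrait. A minimal sketch (the function name is mine):

```swift
import AVFoundation

func configurePortraitRecording(session: AVCaptureSession,
                                output: AVCaptureMovieFileOutput) {
    session.beginConfiguration()
    if session.canAddOutput(output) {
        session.addOutput(output)
    }
    // Lock the recorded video's orientation to portrait,
    // regardless of how the device is held.
    if let connection = output.connection(with: .video),
       connection.isVideoOrientationSupported {
        connection.videoOrientation = .portrait
    }
    session.commitConfiguration()
}
```

You would still need to add inputs (camera and microphone) to the session and, if you also want to stop the UI from rotating, constrain your view controller's supported interface orientations.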