Post not yet marked as solved
Simple AVPlayer sample in swift for iOS 15.4.1
The interstitial is specified via an EXT-X-DATERANGE tag. The interstitial is displayed as expected, but no notifications are generated for either AVPlayerInterstitialEventMonitor.currentEventDidChangeNotification or .eventsDidChangeNotification.
Tested on both a simulator and a device.
Suggestions?
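For reference, this is roughly the observer setup in question, a minimal sketch assuming `player` is the AVPlayer already playing the primary item:

```swift
import AVFoundation

// Assumed: `player` is the AVPlayer playing the primary HLS item.
let monitor = AVPlayerInterstitialEventMonitor(primaryPlayer: player)

NotificationCenter.default.addObserver(
    forName: AVPlayerInterstitialEventMonitor.eventsDidChangeNotification,
    object: monitor,   // observe the monitor itself, not the player
    queue: .main
) { _ in
    print("events:", monitor.events)
}

NotificationCenter.default.addObserver(
    forName: AVPlayerInterstitialEventMonitor.currentEventDidChangeNotification,
    object: monitor,
    queue: .main
) { _ in
    print("current event:", monitor.currentEvent as Any)
}
```

One thing worth checking (an assumption, not a confirmed cause): the monitor must be kept alive. If it is created as a local variable, it can be deallocated before any interstitial fires, and the notifications silently never arrive; storing it in a property avoids that.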
Hi,
I have an app that uses AVPlayer to stream and play videos (HLS) but I'm struggling to find a way to do the same with fmp4.
This is what I use to play my HLS stream. I tried just replacing the URL with the fmp4 one, but it does not work.
private func connect() {
    let stringUrl = "https://wolverine.raywenderlich.com/content/ios/tutorials/video_streaming/foxVillage.m3u8"
    let url = URL(string: stringUrl)!
    let asset = AVURLAsset(url: url)
    let item = AVPlayerItem(asset: asset)
    if #available(iOS 10.0, *) {
        item.preferredForwardBufferDuration = Double(50000) / 1000
    }
    if #available(iOS 13.0, *) {
        item.automaticallyPreservesTimeOffsetFromLive = true
    }
    self.player = AVPlayer(playerItem: item)
    let playerLayer = AVPlayerLayer(player: self.player)
    playerLayer.frame = self.playerView.bounds
    playerLayer.videoGravity = .resizeAspect
    self.playerView.layer.addSublayer(playerLayer)
    self.videoLayer = playerLayer
    self.videoLayer?.frame = self.playerView.bounds
    player?.play()
}
I haven't had any luck finding a solution and I'm out of ideas. I'd be really grateful if anyone could point me in the right direction.
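For what it's worth, AVPlayer generally does not play a bare fMP4 URL the way it plays a plain MP4 or an HLS playlist; fragmented MP4 is usually delivered as segments referenced from an HLS playlist, with the initialization segment declared via EXT-X-MAP. A minimal sketch of such a playlist (all file names hypothetical):

```
#EXTM3U
#EXT-X-VERSION:7
#EXT-X-TARGETDURATION:6
#EXT-X-PLAYLIST-TYPE:VOD
#EXT-X-MAP:URI="init.mp4"
#EXTINF:6.0,
segment0.m4s
#EXTINF:6.0,
segment1.m4s
#EXT-X-ENDLIST
```

If the server only exposes one monolithic fmp4 file, it would need to be repackaged into an init segment plus media segments first; this is an assumption about the setup rather than a confirmed fix.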
As per the title, but let me provide some more context:
On macOS, using the Photos app, I can change the 'poster frame' for a video. This frame is the one displayed as the thumbnail.
On iOS' Photos app this functionality does not (seem to?) exist.
So, to solve my own problem, I would like to build an app that does just that.
I am not sure how 'poster frame' is implemented, and have no idea where to start looking (perusing the PhotoKit and AVAssetWriter documentation, I didn't find any hints on 'poster frame').
I am just looking for some pointers at this point, to understand if this is at all possible.
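One starting point, if no PhotoKit API for the poster frame turns up: generating a thumbnail from an arbitrary time yourself with AVAssetImageGenerator. A sketch, assuming `url` points at a local video file:

```swift
import AVFoundation
import UIKit

// Sketch: grab the frame at `seconds` to use as a custom thumbnail.
func posterFrame(for url: URL, at seconds: Double) throws -> UIImage {
    let asset = AVURLAsset(url: url)
    let generator = AVAssetImageGenerator(asset: asset)
    generator.appliesPreferredTrackTransform = true // respect rotation metadata
    let time = CMTime(seconds: seconds, preferredTimescale: 600)
    let cgImage = try generator.copyCGImage(at: time, actualTime: nil)
    return UIImage(cgImage: cgImage)
}
```

Whether this can be written back so the Photos app itself shows it as the thumbnail is a separate question; the sketch only covers extracting the frame.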
I have downloaded and run this example. The Export command seems to hang when it hits 100%. Unfortunately I'm a noob and can't find the cause of the problem. The dialog processing seems a bit confusing for a new guy; I certainly don't pretend to understand the use of classes, structs, etc. in this part of the example code.
I have been able to understand the metal processing, custom filters, etc.
I am running macOS 12.2.1 and Xcode 13.3.
Thanks to all for reading!
Somehow I was not able to add the tag for this in the 'search for a tag' on the web page.
Hi Team,
In tvOS, the keyboard layout differs depending on whether the Siri Remote or an IR remote is used. Is there a way to identify whether the user is using a Siri Remote or an IR remote, so that we can update the UI accordingly?
We are using React Native for tvOS development, and the search results UI is built in React Native. So we want to know which remote the user is using.
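One possible avenue on the native side (an assumption, not confirmed for a React Native setup): the GameController framework reports connected controllers, and the Siri Remote appears as a controller with a micro gamepad profile, while basic IR remotes generally do not register there. A sketch:

```swift
import GameController

// Sketch: true if a Siri Remote (micro gamepad profile) is currently connected.
func siriRemoteConnected() -> Bool {
    GCController.controllers().contains { $0.microGamepad != nil }
}

// Re-check whenever a controller connects or disconnects.
NotificationCenter.default.addObserver(
    forName: .GCControllerDidConnect, object: nil, queue: .main
) { _ in
    print("Siri Remote connected:", siriRemoteConnected())
}
```

The result would then need to be bridged to the React Native layer via a native module.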
I want to allow the user to record video only in portrait orientation and prevent recording in landscape. I'm using UIImagePickerController and I couldn't find any orientation options in it. Could anyone help me out with this?
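UIImagePickerController indeed exposes no orientation option. One workaround sketch (an assumption, not a documented approach): subclass it and restrict the supported interface orientations so the camera UI stays in portrait:

```swift
import UIKit

// Sketch: a picker whose UI is locked to portrait.
final class PortraitImagePickerController: UIImagePickerController {
    override var supportedInterfaceOrientations: UIInterfaceOrientationMask { .portrait }
    override var shouldAutorotate: Bool { false }
}
```

Note that this only locks the UI; the recorded video's rotation metadata still reflects how the device was physically held, so enforcing true portrait capture may require dropping to AVCaptureSession.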
How can I get the width and height of a video?
For example, while the video is playing and the screen is rotated.
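A sketch of one common way: read the first video track's naturalSize and apply its preferredTransform, since a 90°-rotated video swaps width and height in the raw size:

```swift
import AVFoundation

// Sketch: display size of the first video track, accounting for
// rotation metadata stored in the track's preferredTransform.
func displaySize(of asset: AVAsset) -> CGSize? {
    guard let track = asset.tracks(withMediaType: .video).first else { return nil }
    let size = track.naturalSize.applying(track.preferredTransform)
    return CGSize(width: abs(size.width), height: abs(size.height))
}
```

This is the video's intrinsic size; the on-screen size after a device rotation is a layout question (e.g. the AVPlayerLayer's videoRect), which is a separate value.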
let time1cmt = CMTimeGetSeconds(playerOK.currentTime())
time1 = Double(time1cmt)
let time2cmt = CMTimeGetSeconds(playerOK.currentTime())
time2 = Double(time2cmt)
videoDif = (time2 - time1)
Now I'm using .currentTime() to get the start point and end point, but they don't seem to be accurate. Is there another way to get the time more accurately?
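One small improvement to sketch, reusing the names above: keep the values as CMTime and subtract in the timescale domain, converting to seconds only once at the end, rather than converting each sample separately:

```swift
import AVFoundation

// Sketch, assuming `playerOK` is the AVPlayer from the snippet above.
let start = playerOK.currentTime()
// ... playback runs ...
let end = playerOK.currentTime()
let videoDif = CMTimeSubtract(end, start).seconds
```

If frame-accurate values are needed, an AVPlayerItemVideoOutput's itemTime(forHostTime:) can map host clock time to item time; whether that helps depends on what the two points are meant to mark.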
It seems that Safari (desktop and iOS) is not respecting the Cache-Control header for video files like video/mp4 and video/webm.
The response headers for the video includes
Cache-Control: public, max-age=3600
And the response comes from the network any time the page is refreshed.
I've checked that the Disable Cache is not enabled in dev tools.
The same video request is cached as expected on Chrome and Firefox.
I am playing a video with AVPlayer. How can I get the FPS of the video as a float value?
I read that it's possible with AVAssetTrack, but I don't know how to implement it.
In Swift/SwiftUI.
Thanks!
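A minimal sketch of the AVAssetTrack route: the first video track's nominalFrameRate is the FPS as a Float:

```swift
import AVFoundation

// Sketch: frame rate of the item currently loaded in the player.
func frameRate(of item: AVPlayerItem) -> Float? {
    item.asset.tracks(withMediaType: .video).first?.nominalFrameRate
}
```

Note that nominalFrameRate is the track's stated rate, not a measured one, and on newer OS versions the async `load(.tracks)` API is preferred over the synchronous `tracks` accessor.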
This question pertains to both wwdc21-10159 and wwdc20-10009.
Let's say I have a simple Core Image kernel that does some simple operation as shown below.
#include <metal_stdlib>
#include <CoreImage/CoreImage.h> // includes CIKernelMetalLib.h
using namespace metal;
extern "C" float4 HDRHighlight(coreimage::sample_t s, float time, coreimage::destination dest)
{
return float4(2.0, 0.0, 0.0, 1.0);
}
In my build rules, I have the appropriate command to compile and link the .ci.metal source into the ci.metallib library.
I can confirm internally in the resources that this library file is generated.
However, when I import the kernel using CIColorKernel in Swift, I am given an error that states the kernel function failed to load.
In the logs, I see that it says:
[api] reflect Function 'HDRHighlight' does not exist.
Fatal error: Unable to load the kernel.
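For context, the loading side looks roughly like this (the library file name is an assumption; it must match the .ci.metallib the build rule actually produces):

```swift
import CoreImage

// Sketch: load a CI kernel from a bundled metallib.
let url = Bundle.main.url(forResource: "MyKernels", withExtension: "ci.metallib")!
let data = try Data(contentsOf: url)
let kernel = try CIColorKernel(functionName: "HDRHighlight",
                               fromMetalLibraryData: data)
```

One common cause of the "does not exist" reflection error, worth checking though not certain here: the .ci.metal source being compiled without the `-fcikernel` flag (and the library linked without `-cikernel`), which produces a metallib that Core Image cannot reflect.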
I'm using a photo picker to pick and play a video, and that part works. To show the video I use this line:
PhotoPickerResultView(result: photoPickerService.results[0])
and this part works fine too; it comes from this:
struct PhotoPickerResultView: View {
    var result: PHPickerResult

    enum MediaType {
        case loading, error, video
    }

    @State private var loaded = false
    @State private var url: URL?
    @State private var mediaType: MediaType = .loading
    @State private var latestErrorDescription = ""

    var body: some View {
        Group {
            switch mediaType {
            case .loading:
                ProgressView()
            case .error:
                VStack {
                    Image(systemName: "exclamationmark.triangle.fill")
                    Text(latestErrorDescription).font(.caption)
                }
                .foregroundColor(.gray)
            case .video:
                if url != nil {
                    VideoPlayer(player: AVPlayer(url: url!))
                    ...
My question is: how can I use or implement the custom buttons of the AVPlayer? Like this:
@State private var player1 = AVPlayer(url: URL(string: "https...mp4")!)

VideoPlayer(player: player1)
Button {
    player1.play()
} label: {
    Text(" PLAY ")
}
Button {
    player1.pause()
} label: {
    Text(" PAUSE ")
}
from this line:
PhotoPickerResultView(result: photoPickerService.results[0])
Or what do I have to change so I can use those custom AVPlayer buttons?
Thanks
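One way to sketch this (the URL resolution from the PHPickerResult is assumed to happen as in the code above): create the AVPlayer once in state and hand the same instance to both the VideoPlayer and the buttons, instead of constructing a new player inline:

```swift
import SwiftUI
import AVKit

// Sketch: one shared player instance drives both the video view and the buttons.
struct PickedVideoView: View {
    let url: URL                      // URL resolved from the PHPickerResult
    @State private var player: AVPlayer?

    var body: some View {
        VStack {
            VideoPlayer(player: player)
            HStack {
                Button(" PLAY ")  { player?.play() }
                Button(" PAUSE ") { player?.pause() }
            }
        }
        .onAppear { player = AVPlayer(url: url) }
    }
}
```

The key point is that `AVPlayer(url:)` created inline inside `VideoPlayer(...)` is a fresh instance on every body evaluation, so external buttons have nothing stable to control.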
For example, when Apple engineers design something new like PHPickerViewController, I imagine they test all its functionality, and to test all those functions they create applications.
Where can I download the code of those applications? I mean code from Apple engineers that is already tested and works.
I'm sure they have tons of tested code, which would be very useful for us.
Thanks!
I have a Xamarin.Forms application that implements video chat. For now, a video call is initiated by sending push notifications using UIKit, and is answered by tapping the push notification.
But a push notification can easily be missed. I would like to use a native iOS telephony feature that would let the app ring like a real phone call.
I heard that this could be done using CallKit, but I can't find any example or explanation of how. Or maybe there is some way other than CallKit?
Basically, answering the call (as with a common phone call) should replace tapping the notification.
So in the method that handles the incoming push notification (in the AppDelegate class), something should start the ringing and perhaps show buttons to take/reject the call. When a button is pressed, I should be able to handle that event in code.
Is this possible?
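A native-side sketch of the CallKit flow (names hypothetical; in practice this is usually triggered from a PushKit VoIP push rather than a regular notification, and would need to be bound into Xamarin):

```swift
import CallKit

// Sketch: report an incoming call so iOS shows the native ringing UI.
final class CallManager: NSObject, CXProviderDelegate {
    let provider = CXProvider(configuration: CXProviderConfiguration())

    override init() {
        super.init()
        provider.setDelegate(self, queue: nil)
    }

    func reportIncomingCall(from caller: String) {
        let update = CXCallUpdate()
        update.remoteHandle = CXHandle(type: .generic, value: caller)
        update.hasVideo = true
        provider.reportNewIncomingCall(with: UUID(), update: update) { error in
            if let error = error { print("report failed:", error) }
        }
    }

    // Called when the user taps Answer in the system call UI.
    func provider(_ provider: CXProvider, perform action: CXAnswerCallAction) {
        // start the video chat session here
        action.fulfill()
    }

    func providerDidReset(_ provider: CXProvider) {}
}
```

The system UI handles the ringing and the answer/decline buttons; the delegate callbacks are where the app reacts, which matches the "handle that event in the code" requirement.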
I noticed users are unable to capture screenshots of Netflix content in Safari. I wonder which WebKit API they use to make screen captures come out as a black screen?
Hi,
We have a video player app which seems to crash randomly when the player is closed.
Crash Info:
crash_info_entry_0 : BUG IN CLIENT OF LIBDISPATCH: dispatch_sync called on queue already owned by current thread
Stack trace:
0 libdispatch.dylib 0x12dd0 __DISPATCH_WAIT_FOR_QUEUE__ + 484
1 libdispatch.dylib 0x12900 _dispatch_sync_f_slow + 144
2 MediaToolbox 0x1f6374 FigCaptionRendererSessionSetPlayer + 68
3 MediaToolbox 0x2a7b30 setPlayerDo + 184
4 libdispatch.dylib 0x3950 _dispatch_client_callout + 20
5 libdispatch.dylib 0x12a70 _dispatch_lane_barrier_sync_invoke_and_complete + 56
6 MediaToolbox 0x2a7a6c -[FigSubtitleCALayer setPlayer:] + 64
7 AVFCore 0x6e2c0 -[AVPlayer _removeLayer:videoLayer:closedCaptionLayer:subtitleLayer:interstitialLayer:] + 572
8 AVFCore 0x30dc8 -[AVPlayerLayer dealloc] + 324
9 Foundation 0x2cf78 NSKVODeallocate + 216
10 QuartzCore 0x68c54 CA::Layer::free_transaction(CA::Transaction*) + 404
11 QuartzCore 0x4f284 CA::Transaction::commit() + 952
12 MediaToolbox 0x3375bc setBounds + 376
13 MediaToolbox 0x200160 UpdateLayoutContext + 892
14 MediaToolbox 0x1feda8 onCaptionInputDo + 212
15 libdispatch.dylib 0x3950 _dispatch_client_callout + 20
16 libdispatch.dylib 0xb0ac _dispatch_lane_serial_drain + 664
17 libdispatch.dylib 0xbc10 _dispatch_lane_invoke + 392
18 libdispatch.dylib 0x16318 _dispatch_workloop_worker_thread + 656
19 libsystem_pthread.dylib 0x11b0 _pthread_wqthread + 288
20 libsystem_pthread.dylib 0xf50 start_wqthread + 8
May I ask, is there a sample app for what Brad demonstrated in WWDC 2019 session 249?
Any lead will be greatly appreciated. Thank you. (https://developer.apple.com/videos/play/wwdc2019/249/)
I've built a web app that uses WebRTC to allow people to video chat in the browser and not be forced to download an app.
However, it would be really helpful if iOS users could stream their video using their native camera app and not just the RTC element in the browser.
Is this possible?
I've found a way to open the native camera app using this HTML:
<input type="file" accept="video/*" capture="environment">
However, this only allows the user to upload their video and not stream it.
When I use the CapCut Pro APK on an iOS device, it does not work. I tried many times but failed. Could anyone tell me the reason, or why I should not use it? https://apkpure7.com/
AVPlayer started to throw an unknown error:
The operation couldn’t be completed. (CoreMediaErrorDomain error -16190.)
Is there any exhaustive list of CoreMediaErrorDomain's errors?
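I'm not aware of a public exhaustive list. For diagnosis, a sketch of pulling the domain, code, and any underlying error out of the failed item (function name hypothetical):

```swift
import AVFoundation

// Sketch: log the error chain when an AVPlayerItem fails.
func logPlaybackError(of item: AVPlayerItem) {
    guard let error = item.error as NSError? else { return }
    print("domain:", error.domain, "code:", error.code) // e.g. CoreMediaErrorDomain, -16190
    if let underlying = error.userInfo[NSUnderlyingErrorKey] as? NSError {
        print("underlying:", underlying.domain, underlying.code)
    }
}
```

Observing AVPlayerItem.failedToPlayToEndTimeNotification (its userInfo carries the error) can also surface failures that never flip the item's status.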