Integrate video and other forms of moving visual media into your apps.

Posts under Video tag

111 Posts
Post marked as solved
2 Replies
458 Views
This question pertains to both wwdc21-10159 and wwdc20-10009. Let's say I have a simple Core Image kernel that does some simple operation as shown below.

```metal
#include <metal_stdlib>
#include <CoreImage/CoreImage.h> // includes CIKernelMetalLib.h
using namespace metal;

extern "C" float4 HDRHighlight(coreimage::sample_t s, float time, coreimage::destination dest) {
    return float4(2.0, 0.0, 0.0, 1.0);
}
```

In my build rules, I have the appropriate command to compile and link the .ci.metal source into the ci.metallib library, and I can confirm that this library file is generated in the resources. However, when I load the kernel using CIColorKernel in Swift, I get an error stating that the kernel function failed to load. In the logs, I see:

[api] reflect Function 'HDRHighlight' does not exist.
Fatal error: Unable to load the kernel.
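For reference, a minimal loader sketch for a kernel like the one above (the metallib resource name is an assumption; it must match whatever the build rule actually emits). The "does not exist" reflection error usually indicates the .ci.metal source was compiled without the Core Image flags (`-fcikernel` for the compiler, `-cikernel` for the linker), so no Core Image reflection data was embedded in the library:

```swift
import CoreImage

// Sketch only: the resource name "HDRHighlight.ci.metallib" is an assumption.
func loadHDRHighlightKernel() throws -> CIColorKernel {
    guard let url = Bundle.main.url(forResource: "HDRHighlight",
                                    withExtension: "ci.metallib") else {
        fatalError("metallib not found in bundle")
    }
    let data = try Data(contentsOf: url)
    // Throws if the named function is missing from the library,
    // which is the symptom described above.
    return try CIColorKernel(functionName: "HDRHighlight",
                             fromMetalLibraryData: data)
}
```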
Posted by GaganBhat. Last updated.
Post marked as solved
2 Replies
1.4k Views
Hello, I'm currently stuck trying to load a video previously picked with a PHPicker. In the photos you can see the current views. The VideoPlayer view stays unresponsive, but in the first frames after the picker disappears you can see the thumbnail and a play button. What am I doing wrong? Should I load the file differently? This is my picker:

```swift
struct VideoPicker: UIViewControllerRepresentable {
    @Binding var videoURL: String?

    func makeUIViewController(context: Context) -> PHPickerViewController {
        var config = PHPickerConfiguration()
        config.filter = .videos
        let picker = PHPickerViewController(configuration: config)
        picker.delegate = context.coordinator
        return picker
    }

    func updateUIViewController(_ uiViewController: PHPickerViewController, context: Context) {}

    func makeCoordinator() -> Coordinator {
        Coordinator(self)
    }

    class Coordinator: NSObject, PHPickerViewControllerDelegate {
        let parent: VideoPicker

        init(_ parent: VideoPicker) {
            self.parent = parent
        }

        func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
            picker.dismiss(animated: true) {
                // do something on dismiss
            }
            guard let provider = results.first?.itemProvider else { return }
            provider.loadFileRepresentation(forTypeIdentifier: "public.movie") { url, error in
                guard let url = url else { return }
                self.parent.videoURL = url.absoluteString
                print(url)
                print(FileManager.default.fileExists(atPath: url.path))
            }
        }
    }
}
```

I'm totally able to get the URL (a local URL, e.g. file:///private/var/mobile/Containers/Data/Application/22126131-CBF4-4CAF-B943-22540F1096E1/tmp/.com.apple.Foundation.NSItemProvider. ), but for the life of me the VideoPlayer won't play it:

```swift
struct VideoView: View {
    @Binding var videoURL: String?
    @Binding var showVideoPicker: Bool

    var body: some View {
        if let videoURL = videoURL {
            VideoPlayer(player: AVPlayer(url: URL(fileURLWithPath: videoURL)))
                .frame(width: 100, height: 100, alignment: .center)
                .clipShape(RoundedRectangle(cornerRadius: 16))
                .onLongPressGesture {
                    generator.feedback.notificationOccurred(.success)
                    showVideoPicker.toggle()
                }
        } else {
            Text("...")
        }
    }
}
```

Maybe somebody can point me in the right direction, because every tutorial plays a video that's bundled with the app. I want to use videos from the Apple Photos app. The videoURL is a @State in my ContentView; it gets updated through the VideoPicker. Sorry for the formatting, this is my first post.
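Two things worth checking here (editorial observations, not from the post). First, `videoURL` stores `url.absoluteString` (a `file://...` string), but the player is built with `URL(fileURLWithPath:)`, which expects a plain path, so the resulting URL is mangled; `URL(string:)` or storing `url.path` would match. Second, the URL that `loadFileRepresentation` hands to its completion handler points at a temporary file the system deletes once the handler returns, so the file must be copied somewhere stable before playback. A minimal copy helper (the name `persistPickedVideo` is hypothetical):

```swift
import Foundation

// Hypothetical helper: copy the picker's temporary file to a location
// that outlives the completion handler, returning the stable URL.
func persistPickedVideo(at temporaryURL: URL) throws -> URL {
    let destination = FileManager.default.temporaryDirectory
        .appendingPathComponent(temporaryURL.lastPathComponent)
    // Overwrite any previous copy with the same name.
    if FileManager.default.fileExists(atPath: destination.path) {
        try FileManager.default.removeItem(at: destination)
    }
    try FileManager.default.copyItem(at: temporaryURL, to: destination)
    return destination
}
```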
Posted by brunzbus. Last updated.
Post not yet marked as solved
2 Replies
585 Views
Regarding the new SwiftUI VideoPlayer in iOS 14, I have the following questions. How can I change the playback speed? How do I pause or stop the video? How can I listen for the end of playback and fire an action once the video has finished? And where can I find more information about this new VideoPlayer control? So far the information on the web is very limited. Thanks in advance.
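A sketch of one way to do all of the above (editorial suggestion, assuming AVKit; `demo.mp4` is a placeholder): `VideoPlayer` is only a rendering surface, so speed, pause, and end-of-playback handling all go through the `AVPlayer` instance you hand it.

```swift
import AVKit
import Combine
import SwiftUI

// Sketch: keep the AVPlayer yourself instead of constructing it inline,
// then drive it directly. Setting rate starts playback at that speed;
// pause() (or rate = 0) stops it.
struct ControlledPlayerView: View {
    @State private var player = AVPlayer(url: URL(fileURLWithPath: "demo.mp4"))

    var body: some View {
        VideoPlayer(player: player)
            .onAppear {
                player.rate = 2.0   // play at 2x
            }
            .onReceive(NotificationCenter.default.publisher(
                for: .AVPlayerItemDidPlayToEndTime,
                object: player.currentItem)) { _ in
                print("finished playing")   // fire your action here
            }
    }
}
```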
Posted by Jason2050. Last updated.
Post not yet marked as solved
0 Replies
273 Views
```swift
let time1cmt = CMTimeGetSeconds(playerOK.currentTime())
time1 = Double(time1cmt)
let time2cmt = CMTimeGetSeconds(playerOK.currentTime())
time2 = Double(time2cmt)
videoDif = (time2 - time1)
```

Right now I'm using .currentTime() to get the start point and end point, but they don't seem to be accurate. Is there another way to get the time more accurately?
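One thing worth trying (editorial suggestion, not from the post): subtract in the CMTime domain first and convert to seconds once, so the two independent Double conversions can't each round on their own timescale.

```swift
import CoreMedia

// Sketch: CMTimeSubtract keeps the rational (value/timescale)
// representation until the final conversion.
func elapsedSeconds(from start: CMTime, to end: CMTime) -> Double {
    CMTimeGetSeconds(CMTimeSubtract(end, start))
}
```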
Posted by RayOV. Last updated.
Post not yet marked as solved
2 Replies
715 Views
I'm trying to use the sample code associated with the talk Author fragmented MPEG-4 content with AVAssetWriter, which can be found here. It works well when I run it on macOS, but after adapting it to run on iOS (basically moving the code in the main file to a view controller), it doesn't work. The problem is that the function assetWriter(_:didOutputSegmentData:segmentType:segmentReport:) is never called for the last segment. On macOS, the last segment is reported after calling AVAssetWriter.finishWriting(completionHandler:), but before the completionHandler block is invoked. On iOS, nothing happens at that point. Is there anything I could do from my side to fix this problem? Thanks in advance!
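Something that might be worth trying (an editorial assumption, not a confirmed fix): explicitly flushing the in-progress segment before finishing, so the writer doesn't depend on finishWriting to emit it.

```swift
import AVFoundation

// Sketch: flushSegment() (iOS/macOS 14+) forces the writer to emit the
// current segment data via the delegate callback.
func finish(writer: AVAssetWriter) {
    writer.inputs.forEach { $0.markAsFinished() }
    writer.flushSegment()
    writer.finishWriting {
        print("finished, status: \(writer.status.rawValue)")
    }
}
```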
Posted by rlaguilar. Last updated.
Post not yet marked as solved
1 Reply
654 Views
I'm working on Group Activities for our video app. When I start a video from Apple TV, it syncs fine with other users' iPhones, but the inverse case doesn't work. And in some cases I saw "Unsupported Activity: The active SharePlay activity is not supported on this Apple TV". What did I miss, or what did I do wrong?
Posted by Wontai. Last updated.
Post not yet marked as solved
3 Replies
1k Views
So my timeline is this: I got a 16" MBP in March with these graphics options: AMD Radeon Pro 5500M 4 GB, Intel UHD Graphics 630 1536 MB. Up until 10.15.5 came out, I had zero problems/crashes, and I always have the laptop closed with an external display connected through an official Apple A/V adapter using HDMI. As soon as I installed 10.15.5 the panics started happening:

Reason: (1 monitored services unresponsive): checkin with service: WindowServer returned not alive with context: unresponsive work processor(s): WindowServer main thread, 40 seconds since last successful checkin

Literally right after the update ended, I didn't touch the laptop for some time, the external monitor went to sleep, and the laptop panicked and rebooted. I installed apps like Caffeine to prevent the external monitor from going to sleep and managed to continue working. Some days after this, the crashes started happening even when the monitor was not going to sleep, usually when using apps that put some strain on the video, such as video conferencing apps. These crashes started to become more frequent. The display froze for about 2 minutes, the laptop got very warm while the fans would not go faster; then after 2 minutes the fans went into turbo mode for about 1 second and the laptop rebooted. After this I reverted to 10.15.4 and reset the SMC, etc., and the panics when the display goes to sleep are gone, but the crashes while I'm using the computer continue. I tried ditching the adapter and using a USB-C DisplayPort cable, but the problem remained. As a final test, I unplugged everything from the laptop and disabled "automatic graphics switching" to force the AMD card to be used even with no external display. Sure enough, I was able to reproduce the issue. So it seems not related to an external display, but to the AMD card itself (which is always used when an external display is connected). Sad times.
Posted by EDevil. Last updated.
Post not yet marked as solved
1 Reply
411 Views
I am playing a video with AVPlayer; how can I get the FPS of the video as a float value? I read that it's possible with AVAssetTrack, but I don't know how to implement it, in Swift/SwiftUI. Thanks!
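A sketch of one way to do it (editorial suggestion): the frame rate lives on the video track's `nominalFrameRate` property, which is a `Float`.

```swift
import AVFoundation

// Sketch, using the async loading API available on recent SDKs (iOS 15+).
// On older SDKs the synchronous equivalent is
// asset.tracks(withMediaType: .video).first?.nominalFrameRate.
func frameRate(of url: URL) async throws -> Float? {
    let asset = AVURLAsset(url: url)
    guard let track = try await asset.loadTracks(withMediaType: .video).first else {
        return nil   // no video track in this asset
    }
    return try await track.load(.nominalFrameRate)
}
```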
Posted by RayOV. Last updated.
Post marked as solved
2 Replies
369 Views
For example, when Apple Engineers design something new like PHPickerViewController, I imagine that they test all the functionalities and to test all these functions, they create applications. Where can I download the code of those applications? I mean, code that is already tested and works from Apple Engineers. I'm sure they have tons of tested code, which would be very useful for us. Thanks!
Posted by RayOV. Last updated.
Post not yet marked as solved
1 Reply
470 Views
Related to this question: loading slow-mo videos with PHPicker via NSItemProvider.loadFileRepresentation seems to ignore PHPickerConfiguration.preferredAssetRepresentationMode and always re-encodes (presumably to bake the slow-mo time segments into the video file). This re-encode appears to run at roughly realtime, so it can take a minute for a 1-minute video. Normal videos are near instant. Is there a faster way to import slow-mo videos than this API? One nuance: if I use loadInPlaceFileRepresentation I get no error and inPlace is set to true, but the file is 0 bytes/nonexistent. This seems like exactly the case mentioned in the docs where it should set inPlace to false and take time to make a local copy, so this looks like a bug. Interestingly, on a slow-mo for which I have previously used loadFileRepresentation, loadInPlaceFileRepresentation then works instantly, but I think it's just using a cached version.

```swift
var configuration = PHPickerConfiguration(photoLibrary: PHPhotoLibrary.shared())
configuration.filter = .videos
configuration.selectionLimit = 1
configuration.preferredAssetRepresentationMode = .current

let photoPicker = PHPickerViewController(configuration: configuration)
photoPicker.delegate = self
present(photoPicker, animated: true, completion: nil)

...

func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
    guard let provider = results.first?.itemProvider,
          provider.hasItemConformingToTypeIdentifier(UTType.movie.identifier) else {
        print("Failed to get photo asset")
        picker.dismiss(animated: true, completion: nil)
        return
    }

    provider.loadFileRepresentation(forTypeIdentifier: UTType.movie.identifier) { url, error in
        DispatchQueue.main.async {
            picker.dismiss(animated: true, completion: nil)
        }
        guard error == nil, let url = url,
              let copyTo = DocumentsHelper.getCachedVideoDirectory()?
                  .appendingPathComponent(url.lastPathComponent) else {
            print("Failed to load video")
            return
        }
        do {
            if FileManager.default.fileExists(atPath: copyTo.path) {
                try FileManager.default.removeItem(at: copyTo)
            }
            try FileManager.default.copyItem(at: url, to: copyTo)
        } catch {
            print("error: \(error)")
            return
        }
        // load video into a player
    }
}
```
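An alternative worth trying (an editorial suggestion, not a confirmed fix): since the picker is initialized with `PHPickerConfiguration(photoLibrary:)`, each result carries an `assetIdentifier`, so the backing asset can be fetched directly through PHImageManager. For slow-mo clips this returns an AVComposition with the time mapping applied, without a re-encode, at the cost of requiring photo-library authorization.

```swift
import Photos

// Sketch: fetch the picked asset by identifier and request a playable
// AVAsset for it. The caller must hold photo-library read access.
func requestPlayableAsset(for assetIdentifier: String,
                          completion: @escaping (AVAsset?) -> Void) {
    let fetch = PHAsset.fetchAssets(withLocalIdentifiers: [assetIdentifier], options: nil)
    guard let phAsset = fetch.firstObject else {
        completion(nil)
        return
    }
    let options = PHVideoRequestOptions()
    options.version = .current   // keep the slow-mo edit applied
    PHImageManager.default().requestAVAsset(forVideo: phAsset, options: options) { avAsset, _, _ in
        completion(avAsset)
    }
}
```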
Posted by ryan204. Last updated.
Post not yet marked as solved
0 Replies
372 Views
I have an easy way to pick and play a video, and that part works. To show the video I use this line:

PhotoPickerResultView(result: photoPickerService.results[0])

and this part works fine too; it comes from this:

```swift
struct PhotoPickerResultView: View {
    var result: PHPickerResult

    enum MediaType {
        case loading, error, video
    }

    @State private var loaded = false
    @State private var url: URL?
    @State private var mediaType: MediaType = .loading
    @State private var latestErrorDescription = ""

    var body: some View {
        Group {
            switch mediaType {
            case .loading:
                ProgressView()
            case .error:
                VStack {
                    Image(systemName: "exclamationmark.triangle.fill")
                    Text(latestErrorDescription).font(.caption)
                }
                .foregroundColor(.gray)
            case .video:
                if url != nil {
                    VideoPlayer(player: AVPlayer(url: url!))
                    ...
```

My question is: how can I use or implement custom buttons for the AVPlayer? Like this:

```swift
@State private var player1 = AVPlayer(url: URL(string: "https...mp4")!)

VideoPlayer(player: player1)

Button {
    player1.play()
} label: {
    Text(" PLAY ")
}
Button {
    player1.pause()
} label: {
    Text(" PAUSE ")
}
```

starting from this line: PhotoPickerResultView(result: photoPickerService.results[0])? Or what do I have to change to more easily use those custom buttons with the AVPlayer? Thanks
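A sketch of one way to do it (editorial suggestion; the names are hypothetical): hold the AVPlayer in the view's state instead of constructing it inline, so the buttons and the VideoPlayer talk to the same instance.

```swift
import AVKit
import SwiftUI

// Sketch: a view that owns a single AVPlayer for a picked video URL
// and drives it from custom buttons.
struct PickedVideoPlayer: View {
    let url: URL
    @State private var player: AVPlayer? = nil

    var body: some View {
        VStack {
            VideoPlayer(player: player)   // VideoPlayer accepts an optional player
            HStack {
                Button(" PLAY ") { player?.play() }
                Button(" PAUSE ") { player?.pause() }
            }
        }
        .onAppear {
            // Create the player once, when the URL is known.
            if player == nil { player = AVPlayer(url: url) }
        }
    }
}
```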
Posted by RayOV. Last updated.
Post not yet marked as solved
0 Replies
220 Views
I have a Xamarin.Forms application that implements video chat. For now, a video call is initiated by sending push notifications using UIKit and answered by tapping the push notification. But the push notification can easily be missed. I would like to use the native iOS telephony feature that would let the app make the phone actually ring, etc. I've heard this can be done using CallKit, but I can't find any example or explanation of how. Or maybe there is some way other than CallKit? Basically, answering the call (like a common phone call) should replace tapping the notification. So in the method that handles the incoming push notification (in the AppDelegate class), something should ring and maybe show buttons to take or reject the call, and when a button is pressed I should be able to handle that event in code. Is this possible?
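Yes, this is the standard CallKit flow: report the incoming call from the push handler, CallKit presents the native incoming-call UI (with ringing), and a delegate method fires when the user answers or declines. A minimal sketch in Swift (the same classes are bound in Xamarin.iOS; names like `CallManager` are hypothetical). Note that with VoIP pushes delivered via PushKit, iOS requires the call to be reported immediately on receipt.

```swift
import CallKit

// Sketch: report an incoming call and handle the answer action.
final class CallManager: NSObject, CXProviderDelegate {
    private let provider: CXProvider

    override init() {
        let config = CXProviderConfiguration()
        config.supportsVideo = true
        provider = CXProvider(configuration: config)
        super.init()
        provider.setDelegate(self, queue: nil)
    }

    // Call this from the push-notification handler.
    func reportIncomingCall(from caller: String) {
        let update = CXCallUpdate()
        update.remoteHandle = CXHandle(type: .generic, value: caller)
        update.hasVideo = true
        provider.reportNewIncomingCall(with: UUID(), update: update) { error in
            if let error = error { print("report failed: \(error)") }
        }
    }

    // Fired when the user taps "Answer" in the system call UI.
    func provider(_ provider: CXProvider, perform action: CXAnswerCallAction) {
        // start the video session here
        action.fulfill()
    }

    func providerDidReset(_ provider: CXProvider) {}
}
```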
Posted by dreznik. Last updated.
Post not yet marked as solved
1 Reply
1k Views
We are experiencing audio sync issues during playback on fMP4 HLS live streams (HLS and LL-HLS) on Apple devices only (iOS and macOS) and we're not sure what's causing the problem. The issue does not occur during playback on Windows or Android platforms. During playback in Safari, everything is fine until the sync gets lost suddenly, usually 5-10 minutes after playback begins. The extent of the desync varies but is very noticeable when it does - usually in the 15-30 frame range. Sync is always restored when restarting the player, until it becomes lost again some minutes later. We are capturing the streams on iPhone devices and encoding HEVC / AAC-LC at 30fps locally on the device, and then sending to a media server for further processing. We then transcode the source stream and create multiple variations at different bitrates (HEVC). Because we are streaming from mobile devices in the field, during our server-side transcoding we set a constant 30fps frame rate in case of drops due to network issues. I should add that the issue occurs just as much with h264 as HEVC (we've tested many different combinations of input/output formats and protocols). Regardless of whether we playback the source stream, the individual transcoded variations, or the ABR playlist with all variations, the sync problem appears in the same manner. One interesting note. The issue seldom occurs on one of our older devices, an iPhone 6s Plus running a slightly older iOS version (14.4.1). We suspect it has something to do with discontinuities inherent in our input streams that are not being corrected during our normalization/transcoding process. The Apple player is not compensating as other players are doing on other platforms. We've run the Apple MediaStreamValidator validator tool and discovered multiple "must fix" issues - but it's not clear which of these, if any, are causing our problems. See output attached. 
MediaStreamValidator output (attached). Also, here is the full HLS report from the validator tool (in PNG format due to file restrictions here). Happy to share more details or run more tests. We've been trying to debug this for weeks now. Thanks for your help.
Posted by ricklive. Last updated.
Post not yet marked as solved
2 Replies
744 Views
Hi all, I'm back to scripting QuickTime with AppleScript, and I find the following line now fails:

open image sequence seq1 frames per second 24

What I want is to automate .mov generation from numerous large JPEG image sets in QuickTime, through pure AppleScript or Automator. I have spent hours on Google trying to sort this out, and there seem to be no recent discussions of this issue. What is the problem? Any help or scriptlet will be much appreciated.
Posted by jrs@adc. Last updated.
Post not yet marked as solved
3 Replies
758 Views
Hello, I'm facing a problem with the new VideoPlayer view from iOS 14. I'm trying to use the VideoPlayer introduced in iOS 14+ in a View routed from a NavigationLink. I have set up:

navigationBarBackButtonHidden -> true
navigationBarHidden -> true
navigationBarTitle("", displayMode: .inline)
statusBar -> hidden: true
edgesIgnoringSafeArea -> all

However, when the VideoPlayer's mp4 video starts playing, the NavigationBar is shown. To narrow down the issue, I tried putting a colored Rectangle in the VideoPlayer's area instead; then the NavigationBar is not shown. I suspect this is a bug in the VideoPlayer view, but I hope it is not. Does anyone know the correct way to avoid the navigation bar being displayed when using VideoPlayer? This issue happens on both the simulator and an iPad device. Best regards,
Posted by hiro1900. Last updated.
Post not yet marked as solved
1 Reply
963 Views
Trying to download an encrypted HLS stream, we faced the following issue: when we start a new download by calling the resume() function of AVAssetDownloadTask, the download process gets stuck (not every time), and neither the urlSession(_:assetDownloadTask:didFinishDownloadingTo:) nor the urlSession(_:task:didCompleteWithError:) delegate function (AVAssetDownloadDelegate) gets called. In some cases, not even the urlSession(_:assetDownloadTask:didLoad:totalTimeRangesLoaded:timeRangeExpectedToLoad:) delegate function is called. Any suggestions on how to troubleshoot?
Posted by gcharita. Last updated.
Post not yet marked as solved
0 Replies
332 Views
Hi, we have a video player app which seems to crash randomly when the player is closed. Crash info: crash_info_entry_0 : BUG IN CLIENT OF LIBDISPATCH: dispatch_sync called on queue already owned by current thread Stack trace: 0 libdispatch.dylib 0x12dd0 __DISPATCH_WAIT_FOR_QUEUE__ + 484 1 libdispatch.dylib 0x12900 _dispatch_sync_f_slow + 144 2 MediaToolbox 0x1f6374 FigCaptionRendererSessionSetPlayer + 68 3 MediaToolbox 0x2a7b30 setPlayerDo + 184 4 libdispatch.dylib 0x3950 _dispatch_client_callout + 20 5 libdispatch.dylib 0x12a70 _dispatch_lane_barrier_sync_invoke_and_complete + 56 6 MediaToolbox 0x2a7a6c -[FigSubtitleCALayer setPlayer:] + 64 7 AVFCore 0x6e2c0 -[AVPlayer _removeLayer:videoLayer:closedCaptionLayer:subtitleLayer:interstitialLayer:] + 572 8 AVFCore 0x30dc8 -[AVPlayerLayer dealloc] + 324 9 Foundation 0x2cf78 NSKVODeallocate + 216 10 QuartzCore 0x68c54 CA::Layer::free_transaction(CA::Transaction*) + 404 11 QuartzCore 0x4f284 CA::Transaction::commit() + 952 12 MediaToolbox 0x3375bc setBounds + 376 13 MediaToolbox 0x200160 UpdateLayoutContext + 892 14 MediaToolbox 0x1feda8 onCaptionInputDo + 212 15 libdispatch.dylib 0x3950 _dispatch_client_callout + 20 16 libdispatch.dylib 0xb0ac _dispatch_lane_serial_drain + 664 17 libdispatch.dylib 0xbc10 _dispatch_lane_invoke + 392 18 libdispatch.dylib 0x16318 _dispatch_workloop_worker_thread + 656 19 libsystem_pthread.dylib 0x11b0 _pthread_wqthread + 288 20 libsystem_pthread.dylib 0xf50 start_wqthread + 8
Posted by nanjunda. Last updated.
Post not yet marked as solved
2 Replies
163 Views
When I use the CapCut Pro APK on an iOS device, it does not work. I tried many times but failed. Could anyone tell me the reason, and why I should not use it? https://apkpure7.com/
Posted by AlexWaqas. Last updated.