Dive into the world of video on Apple platforms, exploring ways to integrate video functionality into your iOS, iPadOS, macOS, tvOS, visionOS, or watchOS app.

Video Documentation

Posts under the Video subtopic

Post

Replies

Boosts

Views

Created

FxPlug, Titles in FCPX where to start?
Hi, I'm trying to wrap my head around FxPlug in Xcode. I already sell Final Cut Pro titles for a company; these titles were built in Motion. However, they want me to move them to an app, and I'm looking for any help on how to accomplish this. What the app should do is: allow users with an active subscription to our website to access the titles within FCPX, and deny access if they are not an active subscriber.
2
0
2.3k
Oct ’22
AV1 Hardware Decoding
Recently I've been trying to play some AV1-encoded streams on my iPhone 15 Pro Max. First, I check for hardware support: VTIsHardwareDecodeSupported(kCMVideoCodecType_AV1); // YES Then I need to create a CMFormatDescription in order to pass it into a VTDecompressionSession. I've tried the following: { mediaType:'vide' mediaSubType:'av01' mediaSpecific: { codecType: 'av01' dimensions: 394 x 852 } extensions: {{ CVFieldCount = 1; CVImageBufferChromaLocationBottomField = Left; CVImageBufferChromaLocationTopField = Left; CVPixelAspectRatio = { HorizontalSpacing = 1; VerticalSpacing = 1; }; FullRangeVideo = 0; }} } but VTDecompressionSessionCreate gives me error -8971 (codecExtensionNotFoundErr, I assume). So it has something to do with the extensions dictionary? I can't find anywhere which set of extensions is necessary for it to work 😿. VideoToolbox has convenient functions for creating descriptions of AVC and HEVC streams (CMVideoFormatDescriptionCreateFromH264ParameterSets and CMVideoFormatDescriptionCreateFromHEVCParameterSets), but not for AV1. As of today I am using Xcode 15.0 with the iOS 17.0.0 SDK. (A hedged sketch follows below this post.)
8
3
8.9k
Oct ’23
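For the AV1 question above, a hedged sketch of one direction to try, assuming the missing piece is the sample description extension atom: the AVC/HEVC helpers insert avcC/hvcC for you, and by analogy an AV1 description would carry an "av1C" configuration record that you extract from the container or build from the sequence header yourself. The function name and the av1CBox parameter are placeholders, not a confirmed recipe.

import CoreMedia
import Foundation

// Build a CMFormatDescription for AV1, supplying the AV1CodecConfigurationRecord
// ("av1C") under the SampleDescriptionExtensionAtoms extension. Error -8971
// (codecExtensionNotFoundErr) suggests VideoToolbox is missing exactly this extension.
func makeAV1FormatDescription(width: Int32, height: Int32,
                              av1CBox: Data) throws -> CMVideoFormatDescription {
    let atoms: [String: Any] = ["av1C": av1CBox]
    let extensions: [String: Any] = [
        kCMFormatDescriptionExtension_SampleDescriptionExtensionAtoms as String: atoms
    ]
    var description: CMVideoFormatDescription?
    let status = CMVideoFormatDescriptionCreate(allocator: kCFAllocatorDefault,
                                                codecType: kCMVideoCodecType_AV1,
                                                width: width,
                                                height: height,
                                                extensions: extensions as CFDictionary,
                                                formatDescriptionOut: &description)
    guard status == 0 /* noErr */, let description else {
        throw NSError(domain: NSOSStatusErrorDomain, code: Int(status))
    }
    return description
}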
FCPXML Creation issue...
I have generated this FCPXML, but I can't figure out the issue:
<?xml version="1.0"?>
<fcpxml version="1.11">
  <resources>
    <format id="r1" name="FFVideoFormat3840x2160p2997" frameDuration="1001/30000s" width="3840" height="2160" colorSpace="1-1-1 (Rec. 709)"/>
    <asset id="video0" name="11a(1-5).mp4" start="0s" hasVideo="1" videoSources="1" duration="6.81s">
      <media-rep kind="original-media" src="file:///Volumes/Dropbox/RealMedia Dropbox/Real Media/Media/Test/Test AE videos, City, testOLOLO/video/11a(1-5).mp4"/>
    </asset>
    <asset id="video1" name="12(4)r8 mute.mp4" start="0s" hasVideo="1" videoSources="1" duration="9.94s">
      <media-rep kind="original-media" src="file:///Volumes/Dropbox/RealMedia Dropbox/Real Media/Media/Test/Test AE videos, City, testOLOLO/video/12(4)r8 mute.mp4"/>
    </asset>
    <asset id="video2" name="13 mute.mp4" start="0s" hasVideo="1" videoSources="1" duration="6.51s">
      <media-rep kind="original-media" src="file:///Volumes/Dropbox/RealMedia Dropbox/Real Media/Media/Test/Test AE videos, City, testOLOLO/video/13 mute.mp4"/>
    </asset>
    <asset id="video3" name="13x (8,14,24,29,38).mp4" start="0s" hasVideo="1" videoSources="1" duration="45.55s">
      <media-rep kind="original-media" src="file:///Volumes/Dropbox/RealMedia Dropbox/Real Media/Media/Test/Test AE videos, City, testOLOLO/video/13x (8,14,24,29,38).mp4"/>
    </asset>
  </resources>
  <library>
    <event name="Untitled">
      <project name="Untitled Project" uid="28B2D4F3-05C4-44E7-8D0B-70A326135EDD" modDate="2024-04-17 15:44:26 -0400">
        <sequence format="r1" duration="4802798/30000s" tcStart="0s" tcFormat="NDF" audioLayout="stereo" audioRate="48k">
          <spine>
            <asset-clip ref="video0" offset="0/10000s" name="11a(1-5).mp4" duration="0/10000s" format="r1" tcFormat="NDF"/>
            <asset-clip ref="video1" offset="12119/10000s" name="12(4)r8 mute.mp4" duration="0/10000s" format="r1" tcFormat="NDF"/>
            <asset-clip ref="video2" offset="22784/10000s" name="13 mute.mp4" duration="0/10000s" format="r1" tcFormat="NDF"/>
            <asset-clip ref="video3" offset="34544/10000s" name="13x (8,14,24,29,38).mp4" duration="0/10000s" format="r1" tcFormat="NDF"/>
          </spine>
        </sequence>
      </project>
    </event>
  </library>
</fcpxml>
Any ideas? (A hedged note and sketch follow below this post.)
2
0
819
May ’24
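Regarding the FCPXML above: one thing that stands out (an observation, not a confirmed diagnosis) is that every asset-clip has duration="0/10000s", i.e. zero-length clips, and FCPX also expects times aligned to the format's 1001/30000s frame duration. A small Swift helper sketch for producing frame-aligned duration strings; the function name is illustrative.

import Foundation

// Convert seconds to a rational FCPXML duration aligned to whole frames of a
// 29.97 fps format (frame duration 1001/30000s).
func frameAlignedDuration(seconds: Double,
                          frameDurationNumerator: Int = 1001,
                          frameDurationDenominator: Int = 30000) -> String {
    let frames = Int((seconds * Double(frameDurationDenominator)
                      / Double(frameDurationNumerator)).rounded())
    return "\(frames * frameDurationNumerator)/\(frameDurationDenominator)s"
}

// Example: frameAlignedDuration(seconds: 6.81) returns "204204/30000s" (204 frames).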
FxPlug4.3_FxRemoteWindowAPI_window.frame.origin?
1. The FxRemoteWindowAPI protocol in FxPlug 4.3 does not provide a way to set window.frame.origin. 2. If I create my own NSWindow, [Window setLevel:NSFloatingWindowLevel]; in FxPlug has no effect either. 3. How should I keep my window in front of Final Cut Pro without affecting the operation of Final Cut Pro?
1
0
491
Aug ’24
FxPlug4.3_NSPanel_setLevel
NSPanel *panel = [[NSPanel alloc] initWithContentRect:NSMakeRect(100, 100, 400, 300) styleMask:(NSWindowStyleMaskTitled | NSWindowStyleMaskClosable) backing:NSBackingStoreBuffered defer:NO]; [panel setLevel:NSFloatingWindowLevel]; // has no effect???? [panel makeKeyAndOrderFront:self]; Question: in FxPlug 4.3, setLevel does not keep the panel in front of Final Cut Pro and Motion? Help, I haven't found an answer anywhere!
1
0
502
Aug ’24
Sending '$0' risks causing data races
I had no luck compiling a sample project provided by Apple with Xcode 16.0 beta 5: the ScreenCaptureKit demo (https://developer.apple.com/documentation/screencapturekit/capturing_screen_content_in_macos). The part that is failing is: streamOutput.capturedFrameHandler = { continuation.yield($0) } And the error message is: Sending '$0' risks causing data races. Task-isolated '$0' is passed as a 'sending' parameter; uses in callee may race with later task-isolated uses. Please enlighten me as to why this is an issue and how to avoid it. Thanks in advance! (A hedged sketch of one workaround follows below this post.)
3
2
2.6k
Aug ’24
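For the data-race diagnostic above, a minimal sketch of one common workaround under Swift 6 strict concurrency, assuming a hypothetical CapturedFrame value type similar to (but not identical to) the one in Apple's sample: make the value that crosses the isolation boundary Sendable so the yield is legal.

import CoreGraphics
import Foundation

// Hypothetical stand-in for the sample's frame type; the real type and members may differ.
// Conforming to Sendable lets instances cross isolation boundaries safely.
struct CapturedFrame: Sendable {
    let contentRect: CGRect
    let displayTime: TimeInterval
}

final class StreamOutput {
    // The handler is @Sendable and passes only Sendable values,
    // so yielding from a task-isolated context no longer races.
    var capturedFrameHandler: (@Sendable (CapturedFrame) -> Void)?
}

func frames(from streamOutput: StreamOutput) -> AsyncStream<CapturedFrame> {
    AsyncStream { continuation in
        streamOutput.capturedFrameHandler = { frame in
            continuation.yield(frame)
        }
    }
}

If the real frame type holds non-Sendable references, @unchecked Sendable with your own reasoning about thread safety is the usual, riskier alternative.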
FxPlug4.3 & Window
1. In the FxRemoteWindowAPI protocol, there is no way to set window.frame.origin. 2. When using NSWindow, [Window setLevel:NSFloatingWindowLevel] has no effect. 3. How can I keep the window in front of Final Cut Pro without affecting the normal use of Final Cut Pro?
1
0
416
Aug ’24
Alternative for crashing API MPMediaItemArtwork
When setting the now playing info for playing media in MPNowPlayingInfoCenter we can set artwork. But it seems the Apple API for creating the artwork is crashing on iOS 18 (FB15145734). On iOS 17 this gave the warning that the completion handler was not run on the main thread. I've tried to seek help here: https://stackoverflow.com/questions/78989543/swift-data-race-with-appkit-mpmediaitemartwork-function/78990231?noredirect=1#comment139277425_78990231 but it seems that it's not possible to override the completion handler, and therefore it's up to Apple to fix this issue. .task { await MainActor.run { let nowPlayingInfoCenter = MPNowPlayingInfoCenter.default() var nowPlayingInfo = [String: Any]() let image = NSImage(named: "image")! // warning: data race detected: @MainActor function at MPMediaItemArtwork/ContentView.swift:22 was not called on the main thread nowPlayingInfo[MPMediaItemPropertyArtwork] = MPMediaItemArtwork(boundsSize: image.size, requestHandler: { _ in // Not on main thread here! return image }) nowPlayingInfoCenter.nowPlayingInfo = nowPlayingInfo } } I'm wondering if there is an alternative method to set the now playing artwork? (A hedged workaround sketch follows below this post.)
4
0
864
Sep ’24
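For the MPMediaItemArtwork question above, a hedged workaround sketch (not an Apple-recommended fix, and untested against FB15145734): let the request handler touch only Sendable data (the image's TIFF bytes) and rebuild the NSImage inside the handler, instead of capturing an NSImage created on the main actor.

import AppKit
import MediaPlayer

// Build artwork from raw image data so the request handler, which may run off the
// main thread, never shares a main-actor NSImage.
@MainActor
func makeArtwork(named name: String) -> MPMediaItemArtwork? {
    guard let image = NSImage(named: name),
          let tiffData = image.tiffRepresentation else { return nil }
    let size = image.size
    return MPMediaItemArtwork(boundsSize: size) { _ in
        // Called on an arbitrary queue; construct a fresh image from the captured bytes.
        NSImage(data: tiffData) ?? NSImage(size: size)
    }
}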
fcpxml asset-clip "tcFormat" attribute question
I'm trying to create code to generate an fcpxml file so I can automate Final Cut Pro timeline (project) creation. Here's an xml element that FCP successfully imports (and successfully creates a project/timeline). <project name="2013-08-09 19_23_07 (id).mov"> <sequence format="r1"> <spine> <asset-clip ref="r2" offset="0s" name="2013-08-09 19_23_07 (id).mov" start="146173027/60000s" duration="871871/60000s" tcFormat="DF" audioRole="dialogue"></asset-clip> </spine> </sequence> </project> The xml element example above was generated by exporting a simple timeline with a single clip. The problem I'm having is that the media asset has timecode, which gives a start time relative to that timecode. When I try to remove the timecode attributes and change the start time to "0s" <asset-clip ref="r2" offset="0s" name="2013-08-09 19_23_07 (id).mov" start="0s" duration="871871/60000s" audioRole="dialogue"></asset-clip> FCP complains with the import error: 2013-08-09 19_23_07 (id).fcpxml Invalid edit with no respective media. (/fcpxml[1]/project[1]/sequence[1]/spine[1]/asset-clip[1]) I guess the question is, does AVAsset provide a way to get the timecode information and the timecode-based start offset, or is there a way to tell FCP to use a default start time independent of timecode? (A sketch of reading the timecode track follows below this post.)
1
0
383
Sep ’24
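On the AVAsset question above: AVAsset does expose a timecode track when the media has one. A sketch of reading the first timecode sample as a frame number, which can then be combined with the track's frame rate to compute the start offset. The 32-bit big-endian sample layout is an assumption based on the common 'tmcd' format; verify against your own media.

import AVFoundation
import CoreMedia

// Read the first sample of the asset's timecode track, if present.
func startTimecodeFrame(of asset: AVAsset) async throws -> Int32? {
    guard let track = try await asset.loadTracks(withMediaType: .timecode).first else {
        return nil // No timecode track; a plain "0s" start should then be valid.
    }
    let reader = try AVAssetReader(asset: asset)
    let output = AVAssetReaderTrackOutput(track: track, outputSettings: nil)
    reader.add(output)
    reader.startReading()
    guard let sample = output.copyNextSampleBuffer(),
          let block = CMSampleBufferGetDataBuffer(sample) else { return nil }
    var bigEndianFrame: Int32 = 0
    let status = CMBlockBufferCopyDataBytes(block, atOffset: 0,
                                            dataLength: MemoryLayout<Int32>.size,
                                            destination: &bigEndianFrame)
    return status == kCMBlockBufferNoErr ? Int32(bigEndian: bigEndianFrame) : nil
}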
AVAssetExportSession is not working on iPhone 16 Pro Max
My app is live on the App Store. Users on iPhone 16 Pro Max are getting "Operation Stopped" while combining videos and audio, specifically on iPhone 16 Pro Max; on every other device it works fine. And when I use AVAssetExportPresetPassthrough, it is able to combine the videos and audio, but it does not respect the encoding and the result has no audio. NSArray *compatiblePresets = [AVAssetExportSession exportPresetsCompatibleWithAsset:composition]; if ([compatiblePresets containsObject:AVAssetExportPresetHighestQuality]) { presetName = AVAssetExportPresetHighestQuality; } else if ([compatiblePresets containsObject:AVAssetExportPreset1920x1080]) { presetName = AVAssetExportPreset1920x1080; } else if ([compatiblePresets containsObject:AVAssetExportPreset1280x720]) { presetName = AVAssetExportPreset1280x720; } else { presetName = AVAssetExportPresetPassthrough; } } else { presetName = AVAssetExportPreset1280x720; } (A hedged compatibility-check sketch follows below this post.)
5
2
823
Sep ’24
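For the export failure above, a hedged sketch of an extra check worth adding (an idea, not a confirmed fix for the iPhone 16 Pro Max issue): exportPresets(compatibleWith:) does not consider the output file type, so verify the preset/asset/file-type combination and inspect session.error on failure.

import AVFoundation

// Verify the preset can really produce the requested file type for this asset,
// then export and log the underlying error if the export fails.
func export(_ composition: AVAsset, to outputURL: URL) {
    let preset = AVAssetExportPresetHighestQuality
    AVAssetExportSession.determineCompatibility(ofExportPreset: preset,
                                                with: composition,
                                                outputFileType: .mp4) { isCompatible in
        guard isCompatible,
              let session = AVAssetExportSession(asset: composition, presetName: preset) else {
            return // Fall back to another preset here.
        }
        session.outputURL = outputURL
        session.outputFileType = .mp4
        session.exportAsynchronously {
            if session.status == .failed {
                // The underlying error usually says more than "Operation Stopped".
                print(session.error?.localizedDescription ?? "unknown export error")
            }
        }
    }
}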
Accessing Events from Video Device
I have an intra-oral video device that's supported by Apple's AVCaptureDevice. I can use the AV classes to connect to the device and get video. However, this device has a button that's used to acquire still images from the video stream. I can't use the IOUSBDeviceInterface to do an asynchronous read, because the Apple driver has the device opened exclusively. How do I go about receiving the button event in this scenario? I know which pipe to read, based on a bus analyzer when I run this on Windows, I just need to know how to access that pipe when the device is opened by another process.
1
0
632
Oct ’24
AVPlayerViewController not displaying playback controls in iOS 18
Hi everyone, I’ve encountered an issue with the showsPlaybackControls property in AVPlayerViewController after updating to iOS 18. Even though it’s set to true, the native playback controls (play, pause, etc.) are no longer appearing as they used to in previous iOS versions. This behavior was consistent and worked perfectly prior to iOS 18. Additionally, I’m seeing the same problem when using VideoPlayer in SwiftUI. The native controls that should appear by default seem to have vanished after the update. Has anyone else experienced this? Is there any workaround or additional configuration required to restore the native controls? Any help or insights would be appreciated. Thanks! struct CustomPlayerView: UIViewControllerRepresentable { let player: AVPlayer func updateUIViewController(_ playerController: AVPlayerViewController, context: Context) { playerController.player = player playerController.showsPlaybackControls = true player.play() } func makeUIViewController(context: Context) -> AVPlayerViewController { return AVPlayerViewController() } } (A hedged alternative-setup sketch follows below this post.)
8
1
1.6k
Oct ’24
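On the missing-controls question above, a hedged sketch of a variant worth trying (I can't confirm this is the iOS 18 root cause): configure the controller once in makeUIViewController and keep updateUIViewController free of side effects such as play(), which currently runs on every SwiftUI update.

import SwiftUI
import AVKit

// Configure the AVPlayerViewController once; leave updates side-effect free.
struct CustomPlayerView: UIViewControllerRepresentable {
    let player: AVPlayer

    func makeUIViewController(context: Context) -> AVPlayerViewController {
        let controller = AVPlayerViewController()
        controller.player = player
        controller.showsPlaybackControls = true
        return controller
    }

    func updateUIViewController(_ controller: AVPlayerViewController, context: Context) {
        // Intentionally empty; start playback from a user action or onAppear instead.
    }
}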
Writing video using AVAssetWriter, AVAssetReader, and AVSpeechSynthesizer
Hello, First, some version and software details: Software: iOS 18.1 Hardware: iPhone 14 Pro Max and later Xcode: 16.0
Summary: AVAssetReader is not concatenating a video at the beginning of the output video. The output video should contain a scene of me introducing the content, followed by a blue screen with AVSpeechSynthesizer reading out text that I pasted above the "Generate Video" button.
Details: Now, let's talk about the app. Basically, I’m developing an app that generates a video with the following features: My app will create an output video that is split into an opening scene followed by a fully blue screen. The opening scene will be taken from a video I choose from my gallery. I will read the opening video using AVAssetReader as usual. After the opening scene, I will use the content of a text read by AVSpeechSynthesizer.write(). After the opening scene, the synthesized audio will start playing while the blue screen is displayed. All of this is already defined in the attached project. Each project file has a comment at the beginning introducing its content.
How to test: Write something in the field above the "Generate Video" button. For example, type "Hello, world!" Then, press the "Library" button and select a video from the gallery, about 30 seconds long. That’s it. Press the "Generate Video" button. The result I’ve experienced is a crash or failure to generate the video.
Practical example of what I want to achieve: Suppose I record a 30-second video where I say, "I’m going to tell you the story of Snow White." Then, I paste the "Snow White" story into the field above the "Generate Video" button. The output video should contain me saying, "I’m going to tell you the story of Snow White." After that, the AVSpeechSynthesizer will read the story I pasted, while displaying a blue screen. I look forward to a solution. Thank you very much! (A hedged sketch of the AVSpeechSynthesizer.write step follows below this post.)
convertToCMSampleBuffer.swift convertToPixelBuffer.swift createInputs.swift createVideo.swift test.swift saveVideo.swift TestApp.swift editingVideo.swift sampleReaderProvider.swift misc.swift sampleProvider.swift
8
0
1.1k
Oct ’24
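For the project above, a small sketch of just the speech step, assuming the blue-screen audio comes from AVSpeechSynthesizer.write(_:toBufferCallback:): collect the synthesized PCM buffers so a later stage can append them to an AVAssetWriter audio input. The function name and the zero-length-buffer end check reflect commonly observed behavior, not a guaranteed contract.

import AVFoundation

// Collect synthesized audio buffers for later muxing. Keep a strong reference to the
// synthesizer for the duration of synthesis (it is passed in here for that reason).
func synthesizeBuffers(with synthesizer: AVSpeechSynthesizer,
                       text: String,
                       completion: @escaping ([AVAudioPCMBuffer]) -> Void) {
    let utterance = AVSpeechUtterance(string: text)
    var collected: [AVAudioPCMBuffer] = []
    synthesizer.write(utterance) { buffer in
        guard let pcm = buffer as? AVAudioPCMBuffer else { return }
        if pcm.frameLength == 0 {
            completion(collected) // An empty buffer is commonly the end-of-synthesis signal.
        } else {
            collected.append(pcm)
        }
    }
}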
Handling YOLOv8 Object Detection at 60 FPS with the UltraWide Camera on iOS: Frame Processing Query
I am developing an iOS app that uses YOLOv8 for object detection and aims to detect objects at 60 FPS using the UltraWide camera. My goal is to process every frame within captureOutput and utilize the detected data (such as coordinates) for each one. I have a question regarding how background thread processing behaves in this scenario. Does the size of the YOLO model (n, s, m, etc.) or the weight of the operations inside captureOutput affect the number of frames that can be successfully processed? Specifically, I would like to know whether all frames will be processed sequentially with a delay due to heavy processing in the background, or whether some frames will be dropped and not processed at all. Any insights on how to handle this would be greatly appreciated. Thank you! (A sketch of the frame-drop behavior follows below this post.)
2
0
1.1k
Oct ’24
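On the frame-delivery question above: by default AVCaptureVideoDataOutput discards frames that arrive while your delegate is still busy rather than queuing them all, and reports them through the didDrop callback, so heavy inference leads to dropped frames rather than an ever-growing delay. A small sketch (the class name, queue label, and model call are illustrative):

import AVFoundation

// Receives camera frames and observes drops caused by slow per-frame processing.
final class FrameReceiver: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let output = AVCaptureVideoDataOutput()
    private let queue = DispatchQueue(label: "camera.frames")

    override init() {
        super.init()
        // true (the default) drops late frames; false queues a limited number instead,
        // trading latency for completeness.
        output.alwaysDiscardsLateVideoFrames = true
        output.setSampleBufferDelegate(self, queue: queue)
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Run the (hypothetical) YOLOv8 model here; long work here delays or drops later frames.
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didDrop sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Count drops here to measure whether the model keeps up with 60 fps.
    }
}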
Using image instead of circles in particle system
Hi all, I'm working on a particle system. I got it to work using drawn circles. Now I want to replace the circles with an image. I'm trying to do so in the Draw section, but I'm not sure that's the right place. Any suggestions for code to connect the image from the bin to Xcode and replace the particles with the image(s)? Kindly, ty
1
0
335
Oct ’24
How to Load Stereoscopic Video Using AVFoundation?
I’m currently working on an iOS project that involves loading and playing stereoscopic/spatial videos. I’m using the AVFoundation framework, specifically AVURLAsset, but I’m having trouble determining how to correctly load and handle stereoscopic videos. I would like to know how to do this correctly; any guidance or code snippets would be greatly appreciated. I'm not understanding the Apple developer videos very well... Thank you in advance for your help! Best, Lau (A hedged sketch follows below this post.)
1
0
553
Oct ’24
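For the stereoscopic/spatial question above, a minimal sketch of one way to detect MV-HEVC stereo content with AVFoundation (the media characteristic is an iOS 17+/visionOS-era constant; the URL and function name are placeholders):

import AVFoundation

// Returns true if the asset has at least one video track carrying stereo multiview
// (MV-HEVC) content; AVPlayer can then present it, with platform-specific behavior.
func isStereoscopic(_ url: URL) async throws -> Bool {
    let asset = AVURLAsset(url: url)
    let stereoTracks = try await asset.loadTracks(withMediaCharacteristic: .containsStereoMultiviewVideo)
    return !stereoTracks.isEmpty
}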
Why is VTPixelTransferSession marked as available only on iOS 16 and later in Xcode 16?
Xcode 16:
VT_EXPORT OSStatus VTPixelTransferSessionCreate( CM_NULLABLE CFAllocatorRef allocator, CM_RETURNS_RETAINED_PARAMETER CM_NULLABLE VTPixelTransferSessionRef * CM_NONNULL pixelTransferSessionOut) API_AVAILABLE(macos(10.8), ios(16.0), tvos(16.0), visionos(1.0)) API_UNAVAILABLE(watchos);
Xcode 15:
VT_EXPORT OSStatus VTPixelTransferSessionCreate( CM_NULLABLE CFAllocatorRef allocator, CM_RETURNS_RETAINED_PARAMETER CM_NULLABLE VTPixelTransferSessionRef * CM_NONNULL pixelTransferSessionOut) VT_AVAILABLE_STARTING(10_8);
(A hedged availability-guard sketch follows below this post.)
1
0
600
Oct ’24
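On the availability change above: the Xcode 16 SDK headers annotate VTPixelTransferSessionCreate as ios(16.0), so if the deployment target is lower the call now needs an availability guard. A sketch, assuming the app still supports earlier iOS releases:

import VideoToolbox

// Create the session only where the SDK marks the API as available on iOS/tvOS.
func makePixelTransferSession() -> VTPixelTransferSession? {
    guard #available(iOS 16.0, tvOS 16.0, *) else { return nil }
    var session: VTPixelTransferSession?
    let status = VTPixelTransferSessionCreate(allocator: kCFAllocatorDefault,
                                              pixelTransferSessionOut: &session)
    return status == 0 /* noErr */ ? session : nil
}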
VideoPlayer crashes in Xcode Preview for macOS app
I think I have the simplest possible Mac app, trying to see if I can get VideoPlayer to work in an Xcode Preview. It works in an iOS app project. In a Mac app project it builds and runs, but if I preview it in Xcode it crashes. The diagnostic says: | [Remote] Unknown Error: The operation couldn’t be completed. XPC error received on message reply handler | | BSServiceConnectionErrorDomain (3): | ==NSLocalizedFailureReason: XPC error received on message reply handler | ==BSErrorCodeDescription: OperationFailed The code I'm using is the exact code from the VideoPlayer documentation page. See this link. Any ideas about this XPC error, and how to work around it? I'm using Xcode 16.0 on macOS 14.6.1.
2
0
650
Oct ’24