Dive into the world of video on Apple platforms, exploring ways to integrate video functionality within your iOS, iPadOS, macOS, tvOS, visionOS, or watchOS app.

Video Documentation

Posts under Video subtopic

Post

Replies

Boosts

Views

Activity

FxPlug 4.3 & Window
1. In the FxRemoteWindowAPI protocol, there is no way to set window.frame.origin.
2. When using NSWindow, you cannot set [window setLevel:NSFloatingWindowLevel].
3. How can I keep the window in front of Final Cut Pro without affecting the normal use of Final Cut Pro?
1
0
416
Oct ’24
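For the window-level question above, here is a minimal AppKit sketch in Swift of the calls being discussed; the window setup is hypothetical, and whether Final Cut Pro's out-of-process FxPlug host honors a floating level and an explicit frame origin is exactly what the post is asking, so this is illustration only, not a confirmed fix:

    import AppKit

    // Hypothetical plug-in UI window; the size and origin are placeholders.
    let panel = NSWindow(contentRect: NSRect(x: 0, y: 0, width: 400, height: 300),
                         styleMask: [.titled, .closable],
                         backing: .buffered,
                         defer: false)

    // Swift equivalent of the Objective-C call in the post: [window setLevel:NSFloatingWindowLevel].
    panel.level = .floating

    // Positioning the window directly, since FxRemoteWindowAPI exposes no way to set the origin.
    panel.setFrameOrigin(NSPoint(x: 100, y: 100))
    panel.orderFrontRegardless()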
fcpxml asset-clip "tcFormat" attribute question
I'm trying to write code that generates an fcpxml file so I can automate Final Cut Pro timeline (project) creation. Here's an XML element that FCP successfully imports (and from which it successfully creates a project/timeline):

    <project name="2013-08-09 19_23_07 (id).mov">
        <sequence format="r1">
            <spine>
                <asset-clip ref="r2" offset="0s" name="2013-08-09 19_23_07 (id).mov" start="146173027/60000s" duration="871871/60000s" tcFormat="DF" audioRole="dialogue"></asset-clip>
            </spine>
        </sequence>
    </project>

The example above was generated by exporting a simple timeline with a single clip. The problem I'm having is that the media asset has timecode, which gives a start time relative to that timecode. When I remove the timecode attributes and change the start time to "0s":

    <asset-clip ref="r2" offset="0s" name="2013-08-09 19_23_07 (id).mov" start="0s" duration="871871/60000s" audioRole="dialogue"></asset-clip>

FCP complains with the import error:

    2013-08-09 19_23_07 (id).fcpxml Invalid edit with no respective media. (/fcpxml[1]/project[1]/sequence[1]/spine[1]/asset-clip[1])

I guess the question is: does AVAsset provide a way to get the timecode information and the timecode-based start offset, or is there a way to tell FCP to use a default start time independent of timecode?
1
0
384
Oct ’24
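On the AVAsset part of the question above: QuickTime/MOV files can carry a timecode track, which AVFoundation exposes as a track with media type .timecode. A minimal sketch (assuming the async loading APIs from iOS 15/macOS 12 or later; the function name is illustrative, not from the post) of reading the first timecode sample's raw frame number, which is where the timecode-based start offset comes from:

    import AVFoundation
    import CoreMedia

    // Returns the frame number stored in the asset's first timecode sample, if a
    // timecode track exists. Converting that number to HH:MM:SS:FF also needs the
    // frame duration/quanta from the track's CMTimeCodeFormatDescription.
    func firstTimecodeFrameNumber(in asset: AVAsset) async throws -> UInt32? {
        guard let timecodeTrack = try await asset.loadTracks(withMediaType: .timecode).first else {
            return nil  // No timecode track in this asset.
        }

        let reader = try AVAssetReader(asset: asset)
        let output = AVAssetReaderTrackOutput(track: timecodeTrack, outputSettings: nil)
        reader.add(output)
        reader.startReading()

        guard let sampleBuffer = output.copyNextSampleBuffer(),
              let blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer) else {
            return nil
        }

        // 'tmcd' samples store a single big-endian 32-bit frame counter.
        var bigEndianValue: UInt32 = 0
        let status = CMBlockBufferCopyDataBytes(blockBuffer,
                                                atOffset: 0,
                                                dataLength: MemoryLayout<UInt32>.size,
                                                destination: &bigEndianValue)
        guard status == 0 else { return nil }  // 0 == kCMBlockBufferNoErr
        return UInt32(bigEndian: bigEndianValue)
    }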
Why is VTPixelTransferSession marked as available only on iOS 16+ in Xcode 16?
Xcode 16:

    VT_EXPORT OSStatus
    VTPixelTransferSessionCreate(
        CM_NULLABLE CFAllocatorRef allocator,
        CM_RETURNS_RETAINED_PARAMETER CM_NULLABLE VTPixelTransferSessionRef * CM_NONNULL pixelTransferSessionOut)
        API_AVAILABLE(macos(10.8), ios(16.0), tvos(16.0), visionos(1.0)) API_UNAVAILABLE(watchos);

Xcode 15:

    VT_EXPORT OSStatus
    VTPixelTransferSessionCreate(
        CM_NULLABLE CFAllocatorRef allocator,
        CM_RETURNS_RETAINED_PARAMETER CM_NULLABLE VTPixelTransferSessionRef * CM_NONNULL pixelTransferSessionOut)
        VT_AVAILABLE_STARTING(10_8);
1
0
600
Oct ’24
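Since the Xcode 16 SDK annotates the function with an explicit iOS 16.0 floor, code built with that SDK but deployed to earlier iOS versions needs an availability guard around the call. A minimal sketch (my own illustration, not from the post) of what that guard looks like:

    import VideoToolbox

    func makePixelTransferSession() -> VTPixelTransferSession? {
        // The Xcode 16 annotation makes the compiler enforce this check on iOS/tvOS;
        // the Xcode 15 header only carried VT_AVAILABLE_STARTING(10_8).
        guard #available(iOS 16.0, tvOS 16.0, *) else { return nil }

        var session: VTPixelTransferSession?
        let status = VTPixelTransferSessionCreate(allocator: kCFAllocatorDefault,
                                                  pixelTransferSessionOut: &session)
        return status == 0 ? session : nil  // 0 == noErr
    }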
How to Load Stereoscopic Video Using AVFoundation?
I’m currently working on an iOS project that involves loading and playing stereoscopic/spatial videos. I’m using the AVFoundation framework, specifically AVURLAsset, but I’m having trouble determining how to correctly load and handle stereoscopic videos. Any guidance or code snippets would be greatly appreciated; I’m not following the Apple developer videos very well... Thank you in advance for your help! Best, Lau
1
0
553
Oct ’24
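One concrete piece the question above needs is detecting whether an AVURLAsset actually contains MV-HEVC stereo content before deciding how to present it. A minimal sketch (assuming iOS 17+/visionOS, where AVMediaCharacteristic.containsStereoMultiviewVideo is available; the function name is mine):

    import AVFoundation

    // Checks whether any video track in the asset carries stereo (multiview) video,
    // which is how MV-HEVC spatial/stereoscopic movies are tagged.
    func containsStereoVideo(at url: URL) async throws -> Bool {
        let asset = AVURLAsset(url: url)
        let videoTracks = try await asset.loadTracks(withMediaType: .video)
        for track in videoTracks {
            let characteristics = try await track.load(.mediaCharacteristics)
            if characteristics.contains(.containsStereoMultiviewVideo) {
                return true
            }
        }
        return false
    }

Playback itself can still go through AVPlayer/AVPlayerItem as with any other asset; how the stereo views are actually presented depends on the platform and the view used to display them.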
AVAssetExportSession is not working on iPhone 16 Pro Max.
My app is live on the App Store. Users on iPhone 16 Pro Max get an "Operation Stopped" error while combining videos and audio; this happens only on iPhone 16 Pro Max, and it works fine on every other device. When I use AVAssetExportPresetPassthrough it is able to combine videos and audio, but it does not respect the encoding settings and the output has no audio.

    NSArray *compatiblePresets = [AVAssetExportSession exportPresetsCompatibleWithAsset:composition];
    if ([compatiblePresets containsObject:AVAssetExportPresetHighestQuality]) {
        presetName = AVAssetExportPresetHighestQuality;
    } else if ([compatiblePresets containsObject:AVAssetExportPreset1920x1080]) {
        presetName = AVAssetExportPreset1920x1080;
    } else if ([compatiblePresets containsObject:AVAssetExportPreset1280x720]) {
        presetName = AVAssetExportPreset1280x720;
    } else {
        presetName = AVAssetExportPresetPassthrough;
    }
    } else {
        presetName = AVAssetExportPreset1280x720;
    }
5
2
824
Oct ’24
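When an export ends with "Operation Stopped", the session's error property usually carries more detail than that localized message (domain, code, and often an underlying error), which is what narrows down a device-specific failure. A minimal Swift sketch (my own, not the poster's code; the preset and output settings are placeholders) of running the export and surfacing that error:

    import AVFoundation

    // Exports a composition and prints the underlying NSError when the export fails.
    func export(composition: AVComposition, to outputURL: URL) {
        guard let session = AVAssetExportSession(asset: composition,
                                                 presetName: AVAssetExportPresetHighestQuality) else {
            print("No export session available for the chosen preset")
            return
        }
        session.outputURL = outputURL
        session.outputFileType = .mp4

        session.exportAsynchronously {
            switch session.status {
            case .completed:
                print("Export finished:", outputURL)
            case .failed, .cancelled:
                // The full NSError is far more useful for a device-specific failure
                // than the "Operation Stopped" description alone.
                print("Export failed:", session.error ?? "unknown error")
            default:
                break
            }
        }
    }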
AVPlayerViewController not displaying playback controls in iOS 18
Hi everyone, I’ve encountered an issue with the showsPlaybackControls property in AVPlayerViewController after updating to iOS 18. Even though it’s set to true, the native playback controls (play, pause, etc.) are no longer appearing as they used to in previous iOS versions. This behavior was consistent and worked perfectly prior to iOS 18. Additionally, I’m seeing the same problem when using the VideoPlayer in SwiftUI. The native controls that should appear by default seem to have vanished after the update. Has anyone else experienced this? Is there any workaround or additional configuration required to restore the native controls? Any help or insights would be appreciated. Thanks!

    struct CustomPlayerView: UIViewControllerRepresentable {
        let player: AVPlayer

        func updateUIViewController(_ playerController: AVPlayerViewController, context: Context) {
            playerController.player = player
            playerController.showsPlaybackControls = true
            player.play()
        }

        func makeUIViewController(context: Context) -> AVPlayerViewController {
            return AVPlayerViewController()
        }
    }
8
1
1.6k
Oct ’24
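One restructuring that is commonly suggested for representables like the one above is to configure the controller once in makeUIViewController and keep updateUIViewController inert, so SwiftUI state updates do not reassign the player and restart playback on every pass. A hedged sketch of that variant (it is not confirmed here that this restores the controls on iOS 18):

    import SwiftUI
    import AVKit

    struct CustomPlayerView: UIViewControllerRepresentable {
        let player: AVPlayer

        func makeUIViewController(context: Context) -> AVPlayerViewController {
            // One-time setup: assign the player and enable the system controls here.
            let controller = AVPlayerViewController()
            controller.player = player
            controller.showsPlaybackControls = true
            return controller
        }

        func updateUIViewController(_ playerController: AVPlayerViewController, context: Context) {
            // Intentionally empty: starting playback is left to the caller,
            // e.g. player.play() in .onAppear, rather than on every SwiftUI update.
        }
    }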
Handling YOLOv8 Object Detection at 60 FPS with the Ultra-Wide Camera on iOS: Frame Processing Query
I am developing an iOS app that uses YOLOv8 for object detection and aims to detect objects at 60 FPS using the UltraWide camera. My goal is to process every frame within captureOutput and utilize the detected data (such as coordinates) for each one. I have a question regarding how background thread processing behaves in this scenario. Does the size of the YOLO model (n, s, m, etc.) or the weight of the operations inside captureOutput affect the number of frames that can be successfully processed? Specifically, I would like to know if all frames will be processed sequentially with a delay due to heavy processing in the background, or if some frames will be dropped and not processed at all. Any insights on how to handle this would be greatly appreciated. Thank you!
2
0
1.1k
Oct ’24
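On the dropped-frames question above: with AVCaptureVideoDataOutput, frames that arrive while the delegate callback is still busy are not queued up behind heavy processing; with alwaysDiscardsLateVideoFrames enabled (the default) they are discarded and reported through a separate delegate method. A minimal sketch (my own illustration; the class and queue names are placeholders) showing where processed and dropped frames surface:

    import AVFoundation

    final class FrameProcessor: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
        let output = AVCaptureVideoDataOutput()
        private let queue = DispatchQueue(label: "camera.frames")  // serial queue

        func attach(to session: AVCaptureSession) {
            output.alwaysDiscardsLateVideoFrames = true
            output.setSampleBufferDelegate(self, queue: queue)
            if session.canAddOutput(output) { session.addOutput(output) }
        }

        func captureOutput(_ output: AVCaptureOutput,
                           didOutput sampleBuffer: CMSampleBuffer,
                           from connection: AVCaptureConnection) {
            // Run model inference here. If it takes longer than 1/60 s,
            // later frames are dropped rather than processed late.
        }

        func captureOutput(_ output: AVCaptureOutput,
                           didDrop sampleBuffer: CMSampleBuffer,
                           from connection: AVCaptureConnection) {
            // Count drops here to measure how far the model falls behind 60 fps.
        }
    }

So the model size and the work done inside captureOutput directly determine how many of the 60 frames per second are actually delivered to the delegate versus dropped.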
Accessing Events from Video Device
I have an intra-oral video device that's supported by Apple's AVCaptureDevice. I can use the AV classes to connect to the device and get video. However, this device has a button that's used to acquire still images from the video stream. I can't use IOUSBDeviceInterface to do an asynchronous read, because the Apple driver has the device opened exclusively. How do I go about receiving the button event in this scenario? I know which pipe to read, based on a bus analyzer trace from running this on Windows; I just need to know how to access that pipe when the device is opened by another process.
1
0
632
Oct ’24