Dive into the world of video on Apple platforms, exploring ways to integrate video functionalities within your iOS, iPadOS, macOS, tvOS, visionOS, or watchOS app.


Post

Replies

Boosts

Views

Activity

iOS 18 iPhone XR transcode error
When I use the exportPresetsCompatibleWithAsset interface on iOS 17, everything works fine. However, after I upgraded my iPhone XR to iOS 18, calling exportPresetsCompatibleWithAsset to transcode 4K assets returns success, but trying to save the result to the photo album fails with error 3302. Is this because the interface was deprecated in iOS 16?
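For reference, a minimal sketch of the non-deprecated path, assuming a transcode-then-save flow like the one described (the preset, output file type, and outputURL below are placeholders). exportPresets(compatibleWith:) was indeed deprecated in iOS 16 in favor of a per-preset compatibility check, and since the poster reports the 3302 error at the save step, the export and the Photos save are worth verifying separately:

import AVFoundation
import Photos

func transcodeAndSave(asset: AVAsset, outputURL: URL) {
    // Replacement for the deprecated exportPresets(compatibleWith:):
    // check one preset/file-type combination against this asset.
    AVAssetExportSession.determineCompatibility(
        ofExportPreset: AVAssetExportPresetHighestQuality,
        with: asset,
        outputFileType: .mp4
    ) { isCompatible in
        guard isCompatible,
              let session = AVAssetExportSession(asset: asset,
                                                 presetName: AVAssetExportPresetHighestQuality)
        else { return }
        session.outputURL = outputURL
        session.outputFileType = .mp4
        session.exportAsynchronously {
            guard session.status == .completed else { return }
            // Error 3302 is raised by Photos, not the exporter, so confirm
            // this save step independently of the export status.
            PHPhotoLibrary.shared().performChanges({
                _ = PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: outputURL)
            }) { _, error in
                if let error { print("Save failed: \(error)") }
            }
        }
    }
}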
0
0
37
4d
What is the programmatic way to capture screen from a connected iOS device to the Mac?
Which framework can capture the screen of a device connected to the Mac the way OBS or QuickTime Player do when an iOS device is attached via USB? I tried listing devices with AVFoundation and ScreenCaptureKit, but only the Mac's camera, mic, and displays are listed. When you select New Movie Recording in QuickTime Player, you can choose a connected iPad or iPhone and record its screen; the same works in OBS. What is the way to do this in my own macOS app?
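A sketch of the commonly used approach (an assumption about the mechanism, not confirmed as what QuickTime does internally): attached iOS devices stay hidden from capture APIs until the app opts in to CoreMediaIO screen-capture devices, after which they surface as muxed AVCaptureDevice instances. The device can appear a moment after the opt-in, so observing AVCaptureDevice.wasConnectedNotification helps:

import AVFoundation
import CoreMediaIO

// Opt in to CoreMediaIO "screen capture" DAL devices; until this is set,
// attached iOS devices are not visible to AVFoundation discovery.
func enableIOSScreenCaptureDevices() {
    var address = CMIOObjectPropertyAddress(
        mSelector: CMIOObjectPropertySelector(kCMIOHardwarePropertyAllowScreenCaptureDevices),
        mScope: CMIOObjectPropertyScope(kCMIOObjectPropertyScopeGlobal),
        mElement: CMIOObjectPropertyElement(kCMIOObjectPropertyElementMain))
    var allow: UInt32 = 1
    CMIOObjectSetPropertyData(CMIOObjectID(kCMIOObjectSystemObject),
                              &address, 0, nil,
                              UInt32(MemoryLayout<UInt32>.size), &allow)
}

// Afterwards the iPhone shows up as a muxed external capture device
// (.external on macOS 14+; earlier systems used .externalUnknown).
let discovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.external],
    mediaType: .muxed,
    position: .unspecified)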
2
0
228
1w
newCommandQueue returns nil
I am using Metal for rendering, and when calling Metal's newCommandQueue interface there is some probability of getting a nil return value. However, MTLCreateSystemDefaultDevice returns a non-nil value, so my device supports Metal. What causes newCommandQueue to return nil, and is there any way to avoid this situation? Thank you.
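A minimal defensive sketch (the mitigation is an assumption, not a confirmed root cause): makeCommandQueue returns nil when the queue cannot be allocated, and a frequently reported trigger is creating queues repeatedly, or under memory pressure, instead of keeping one alive for the renderer's lifetime:

import Metal

// Create the device and a single long-lived command queue once, at startup,
// and fail visibly there instead of mid-frame.
final class Renderer {
    let device: MTLDevice
    let commandQueue: MTLCommandQueue

    init?() {
        guard let device = MTLCreateSystemDefaultDevice(),
              let queue = device.makeCommandQueue() else {
            return nil // surface the failure once, where it can be handled
        }
        self.device = device
        self.commandQueue = queue
    }
}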
1
0
187
2w
Location not visible in video recorded in third party app
I recently bought an Insta360 Flow gimbal. When recording video with the Insta360 app, I cannot see the location in the Apple Photos app or any other Apple app. However, I can see the location in the Windows Photos app once I download the videos to my Windows PC, and the location is also visible on an Android device once I share the video through my Google account. With an EXIF app I can see the location metadata in the EXIF table, but it is still not shown as a location; exiftool on my PC also sees the metadata, including location, as in the attached screenshot. Compared to video shot with the built-in Camera app, I cannot find any difference in the location metadata. What could be wrong? I contacted Insta360 app support, but they do not seem to understand what's going on; they just keep asking very basic questions like "did you enable GPS location access?" and "are you shooting video?". I also contacted Apple support, who say it's a third-party issue and refuse to help further. If it's really a third-party issue, how come the location data is actually embedded as metadata, and a Windows PC and an Android device can see it? By the way, I AirDropped this video to all my Apple devices (iPhone 15 Ultra, iPad Air, and a very old iPhone), and none of them can see the location.
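A hedged sketch of one plausible explanation: Apple's apps read a video's location from the QuickTime metadata item com.apple.quicktime.location.ISO6709 in the movie header, so a file whose GPS lives only in other containers can look geotagged to exiftool yet show no location in Photos. This is how a recording app would write that key (the formatting helper is illustrative):

import AVFoundation
import CoreLocation

// Build the QuickTime location metadata item Photos looks for.
func locationMetadataItem(for location: CLLocation) -> AVMetadataItem {
    let item = AVMutableMetadataItem()
    item.identifier = .quickTimeMetadataLocationISO6709
    item.keySpace = .quickTimeMetadata
    // ISO 6709 string, e.g. "+37.33182-122.03118/"
    item.value = String(format: "%+09.5f%+010.5f/",
                        location.coordinate.latitude,
                        location.coordinate.longitude) as NSString
    return item
}

// Attach before writing: assetWriter.metadata = [locationMetadataItem(for: loc)]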
1
0
248
2w
PiP not launching from a sandboxed WKWebView app
Hi, I am developing an app that has a WKWebView which can open sites like YouTube. The app is sandboxed, as it is meant to be uploaded to the Mac App Store. It has a PiP feature where we start native PiP by calling browser JavaScript that tells the WKWebView to fire PiP. This works well when running the code from Xcode in the Debug scheme. However, when we run a release build (archived, or launched directly from the build folder in Finder), the WKWebView is not able to launch the PiP agent: the site shows that PiP is open and we can hear the sound playing, but the native PiP window is never visible, and I cannot see PiPAgent in Activity Monitor. Why does this not work in a release build outside Xcode? Requesting technical help with this. Thanks!
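For reference, a sketch of the JS-driven trigger described above, under the assumption that the app uses WebKit's webkit-prefixed presentation-mode API (WebKit on macOS does not implement the standard requestPictureInPicture()). The Debug-vs-release difference itself is more likely an entitlement/sandbox or user-gesture issue than a scripting one:

import WebKit

// Toggle PiP on the first <video> element via WebKit's presentation-mode API.
func togglePiP(in webView: WKWebView) {
    let js = """
    (() => {
      const v = document.querySelector('video');
      if (v && v.webkitSupportsPresentationMode
            && typeof v.webkitSetPresentationMode === 'function') {
        v.webkitSetPresentationMode(
          v.webkitPresentationMode === 'picture-in-picture'
            ? 'inline' : 'picture-in-picture');
      }
    })();
    """
    webView.evaluateJavaScript(js, completionHandler: nil)
}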
0
0
154
2w
AVPlayer & Quicktime Player reporting incorrect total duration
I have an mp4 file (which is around 25min) which i need to play but if i open it in quick time player or AVplayer, it shows as around 55min in those two platfroms. I don't understand why this happens. It correctly shows the time when opening with a browser or VSCode built in video playback. Here is the link to the file: https://www.dropbox.com/scl/fi/hbg59uqx8xdpiqbnx5wz8/videoplayback-1.mp4?rlkey=7o8l8m7j8dhq0o6f3zssgv9bd&st=5lt6apug&dl=0
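A diagnostic sketch: AVFoundation reports the container (movie header) duration, while some players derive the length from the media tracks, so a file whose header duration disagrees with its track durations (e.g. after an imperfect remux) shows different lengths in different players. Comparing the two, assuming url points at the file, would confirm or rule this out:

import AVFoundation

func inspectDurations(url: URL) async throws {
    let asset = AVURLAsset(url: url)
    // Container-level duration, as read from the movie header.
    let assetDuration = try await asset.load(.duration)
    print("Container duration:", CMTimeGetSeconds(assetDuration))
    // Per-track durations, which players like VS Code may rely on instead.
    for track in try await asset.loadTracks(withMediaType: .video) {
        let range = try await track.load(.timeRange)
        print("Video track duration:", CMTimeGetSeconds(range.duration))
    }
}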
0
0
164
3w
Launch app by scanning NFC tags
My app needs a specific flow: play a video when my iPhone is held close to an NFC tag. My app can read the data from NFC tags, and that data tells us which video to play. I tried writing a URL scheme or a universal link to the tags, but both approaches only pop up a notification instead of launching my app and playing the video. How should I design this? Please give me some advice, thanks!
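Background tag reading always surfaces the system notification first; that behavior cannot be bypassed. While the app is already running, though, an in-app NFCNDEFReaderSession reads the tag directly and can start playback immediately. A sketch, assuming a hypothetical playVideo(id:) helper:

import CoreNFC

func playVideo(id: String) { /* route to your player */ } // hypothetical

final class TagVideoLauncher: NSObject, NFCNDEFReaderSessionDelegate {
    func startScanning() {
        let session = NFCNDEFReaderSession(delegate: self, queue: nil,
                                           invalidateAfterFirstRead: true)
        session.alertMessage = "Hold your iPhone near the tag."
        session.begin()
    }

    func readerSession(_ session: NFCNDEFReaderSession,
                       didDetectNDEFs messages: [NFCNDEFMessage]) {
        // Map the tag payload to a video identifier.
        for record in messages.flatMap(\.records) {
            if let id = String(data: record.payload, encoding: .utf8) {
                DispatchQueue.main.async { playVideo(id: id) }
            }
        }
    }

    func readerSession(_ session: NFCNDEFReaderSession,
                       didInvalidateWithError error: Error) { }
}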
1
0
220
3w
Ability to hide/show tvOS AVPlayerViewController's progress bar
I'm working on a streaming tvOS app, and as you know there are mostly two types of video streams: live and VOD. AVPlayerViewController handles these stream types by showing the respective playback controls. Recently I was asked to implement synchronous VOD playback ("syncVod"), where we need to simulate live playback during an actual VOD stream. To simulate live playback, the following needs to be handled:

1. Disabling scrubbing via the remote. (Done: playerVc.requiresLinearPlayback = true)
2. Disabling the info panel view with the "From beginning" play button. (Done: playerVc.playbackControlsIncludeInfoViews = false)
3. Disabling the play/pause button. (Done, though not ideally: in the rate-change observer, if player.rate == 0 && playbackMode == .syncVod { player.play(); return }. Why it's not ideal: tapping the remote causes a short hiccup in playback, but playback resumes and no actual pause happens.)
4. Hiding the progress bar and time labels. :(

Point 4 is the main problem: we can't hide the progress bar and its related UI elements (time labels) specifically; we can only hide all playback controls with playerVc.showsPlaybackControls = false. The thing is, I have custom buttons in transportBarCustomMenuItems, and hiding all playback controls is not the right option for me. Implementing a custom playback-controls panel is a heavy lift, but as of now it seems like the only proper way to implement syncVod playback. Did anyone face a similar issue and resolve it without implementing a custom playback-controls panel? Is there a way to hide only the progress bar in tvOS AVPlayerViewController?
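For reference, a minimal sketch consolidating points 1-3 as described in the post; playbackMode/.syncVod is the poster's own state, assumed here as a simple Bool:

import AVKit

final class SyncVodPlayerController {
    let playerVc = AVPlayerViewController()
    private var rateObservation: NSKeyValueObservation?
    var isSyncVod = true

    func configure(player: AVPlayer) {
        playerVc.player = player
        playerVc.requiresLinearPlayback = true            // 1: no scrubbing
        playerVc.playbackControlsIncludeInfoViews = false // 2: no info panel
        // 3: resume immediately if the user pauses (causes the brief hiccup
        // the post mentions, but no actual pause).
        rateObservation = player.observe(\.rate, options: [.new]) { [weak self] player, _ in
            guard self?.isSyncVod == true else { return }
            if player.rate == 0 { player.play() }
        }
        // 4 has no public per-element toggle; only showsPlaybackControls = false,
        // which also removes transportBarCustomMenuItems.
    }
}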
0
0
205
3w
Crash with AVMutableComposition insertTimeRange Method
A small number of crashes are being reported on Firebase. When attempting to use the insertTimeRange:ofAsset:atTime:error: method of AVMutableComposition, a crash occurs with the error message -[__NSArrayM insertObject:atIndex:]: object cannot be nil. Most of these appear on iOS 17.0 and above. Here's my code:

- (AVMutableComposition *)createtrimAsset:(AVAsset *)asset andStartTime:(CGFloat)startTime endTime:(CGFloat)endTime {
    NSError *error = nil;
    int32_t timescale = 1000000; // CMTimeMakeWithSeconds expects an int32_t timescale
    AVMutableComposition *mutableComposition = [AVMutableComposition composition];
    CMTime sStartTime = CMTimeMakeWithSeconds(CMTimeGetSeconds(asset.duration) * startTime, timescale);
    CMTime eEndTime = CMTimeMakeWithSeconds(CMTimeGetSeconds(asset.duration) * endTime, timescale);
    [mutableComposition insertTimeRange:CMTimeRangeMake(sStartTime, CMTimeSubtract(eEndTime, sStartTime))
                                ofAsset:asset
                                 atTime:kCMTimeZero
                                  error:&error];
    return mutableComposition;
}

I attempted to reproduce this crash by deliberately setting the timeRange or asset to unusual values, such as asset = nil, asset.duration = 0, or asset.duration = NAN, but all attempts failed. What could be causing this exception? Any advice would be of great help to me.
0
0
171
4w
When adding a VideoPlayerComponent to an Entity placed in an ImmersiveSpace and attempting to play an 8K video, the application crashes.
OS: visionOS 1.0
Xcode: 15.2

In the application under development, doing the following crashes the app:

1. Open an ImmersiveSpace
2. Add a VideoPlayerComponent to an Entity
3. Play an 8K video

The Apple logo appears and the app returns to the Home view. The problem does not occur in an application created by extracting only the 8K video playback part.

Error log:

apply fence tx failed (client=0x6fbf0fcc) [0xfffffecc (ipc/mig) server died]
Failed to commit transaction (client=0x58510d43) [0x10000003 (ipc/send) invalid destination port]
nw_read_request_report [C1] Receive failed with error "No message available on STREAM"
nw_protocol_socket_reset_linger [C1:2] setsockopt SO_LINGER failed [22: Invalid argument]
Failed to set override status for bind point component member.
Message from debugger: Terminated due to signal 9

I can't share the entire application, but is anyone else experiencing the same problem? Is this a memory issue?
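For reference, a minimal sketch of the setup described in the steps above, assuming videoURL points at the 8K asset. Termination with signal 9 and a return to Home is consistent with the system killing the process over resource limits, so the sketch keeps a single player/entity pair alive rather than re-creating them per scene:

import RealityKit
import AVFoundation

struct VideoEntityFactory {
    // Build one entity driving one AVPlayer; reuse it for the ImmersiveSpace.
    static func makeVideoEntity(videoURL: URL) -> (Entity, AVPlayer) {
        let player = AVPlayer(url: videoURL)
        let entity = Entity()
        entity.components.set(VideoPlayerComponent(avPlayer: player))
        return (entity, player)
    }
}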
0
0
236
4w
When VideoToolbox compresses JPEG and H.264, the output format (YUV420) differs from the input format
On M-series machines, VideoToolbox GPU compression takes a YUV422 input (kCVPixelFormatType_422YpCbCr8BiPlanarVideoRange), yet the compressed JPEG output remains YUV420. On Intel-series GPUs, compression requires a YUV420 input (kCVPixelFormatType_420YpCbCr8Planar), and the compressed JPEG output is YUV422. In both cases the output format after compression is not consistent with the input format. Does VideoToolbox GPU compression support outputting YUV422 or YUV444 JPEG images and H.264 streams?
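A sketch of the session setup in question, assuming hardware JPEG encoding with a 422 biplanar input; note that imageBufferAttributes only constrains the input buffers handed to the session, while the chroma subsampling of the output stream is decided by the encoder implementation, which is presumably why the two differ:

import VideoToolbox

var session: VTCompressionSession?
let inputAttributes: [CFString: Any] = [
    kCVPixelBufferPixelFormatTypeKey: kCVPixelFormatType_422YpCbCr8BiPlanarVideoRange,
    kCVPixelBufferWidthKey: 1920,
    kCVPixelBufferHeightKey: 1080,
]
let status = VTCompressionSessionCreate(
    allocator: kCFAllocatorDefault,
    width: 1920, height: 1080,
    codecType: kCMVideoCodecType_JPEG,
    encoderSpecification: nil,
    imageBufferAttributes: inputAttributes as CFDictionary, // input constraint only
    compressedDataAllocator: nil,
    outputCallback: nil,
    refcon: nil,
    compressionSessionOut: &session)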
0
1
287
Jun ’24
WWDC Lab feedback
I am writing to follow up on my lab at WWDC24. I had a 1:1 lab with Mr. Kavin; we had a good 30-minute session, and for follow-up questions Kavin asked me to post them via feedback. Here is my question:

We have screen share in our application and are trying to use CFMessagePort to pass a CVPixelBufferRef from a broadcast extension to the application. Questions:

1. How do we copy the planes of an IOSurface-backed CVPixelBufferRef onto another one without using memcpy; is there a zero-copy method?
2. How do we get notified when an IOSurface-backed CVPixelBufferRef's data is changed by another process?
3. How do we send an IOSurface-backed CVPixelBufferRef from the broadcast extension to the application?
4. How do we pass an unowned IOSurfaceRef from the broadcast extension to the application?
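For question 1, a hedged sketch of the usual zero-copy route: rather than copying plane bytes, wrap the same IOSurface in a new CVPixelBuffer, which shares the underlying storage. (Cross-process transfer, questions 3-4, is a separate problem; the surface itself has to travel over an IPC channel such as XPC.)

import CoreVideo
import IOSurface

// Wrap an IOSurface-backed pixel buffer without copying its planes.
func wrapWithoutCopy(_ source: CVPixelBuffer) -> CVPixelBuffer? {
    guard let surface = CVPixelBufferGetIOSurface(source)?.takeUnretainedValue() else {
        return nil // not IOSurface-backed
    }
    var wrapped: Unmanaged<CVPixelBuffer>?
    let status = CVPixelBufferCreateWithIOSurface(kCFAllocatorDefault, surface, nil, &wrapped)
    guard status == kCVReturnSuccess else { return nil }
    return wrapped?.takeRetainedValue() // shares storage with `source`
}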
0
1
200
Jun ’24
ProRes 4444 blocky compression artifacts
I'm creating an Objective-C command-line utility to encode RAW image sequences to ProRes 4444, but I'm encountering blocky compression artifacts in the ProRes 4444 video output. To test the integrity of the image data before encoding, I added a snippet to my encoding function that saves a 16-bit PNG before encoding to ProRes, and the PNG looks perfect; I can see all the detail in every part of the image's dynamic range. Here's a comparison between the 16-bit PNG (on the right) and the ProRes 4444 output (on the left). As a further test, I re-encoded the test PNG to ProRes 4444 using DaVinci Resolve, and the ProRes 4444 output from Resolve doesn't have any blocky compression artifacts; it looks identical.

In short, this is what the utility does:

1. Unpacks the 12-bit raw data into 16-bit values.
2. Debayers the raw data to convert it into a standard color image format (BGR) using OpenCV.
3. Scales the debayered pixel values from their original 12-bit depth to fit into a 16-bit range. (Up to this point everything is fine, confirmed by saving 16-bit PNGs.)
4. Encodes the images to ProRes 4444 using the AVFoundation framework. The pixel buffers are created and managed via the dictionary method with kCVPixelFormatType_64RGBALE.

I need help figuring this out; I'm a real novice when it comes to AVFoundation and encoding to ProRes. See the relevant parts of my encodeToProRes function:

void encodeToProRes(const std::string &outputPath, const std::vector<std::string> &rawPaths, const std::string &proResFlavor) {
    NSError *error = nil;
    NSURL *url = [NSURL fileURLWithPath:[NSString stringWithUTF8String:outputPath.c_str()]];
    AVAssetWriter *assetWriter = [AVAssetWriter assetWriterWithURL:url fileType:AVFileTypeQuickTimeMovie error:&error];
    if (error) {
        std::cerr << "Error creating AVAssetWriter: " << error.localizedDescription.UTF8String << std::endl;
        return;
    }

    // Load the first image to get the dimensions
    std::cout << "Debayering the first image to get dimensions..." << std::endl;
    Mat firstImage;
    int width = 5320;
    int height = 3900;
    if (!debayer_image(rawPaths[0], firstImage, width, height)) {
        std::cerr << "Error debayering the first image" << std::endl;
        return;
    }
    width = firstImage.cols;
    height = firstImage.rows;

    // Save the first frame as a 16-bit PNG for validation
    std::string pngFilePath = outputPath + "_frame1.png";
    if (!imwrite(pngFilePath, firstImage)) {
        std::cerr << "Error: Failed to save the first frame as a PNG image" << std::endl;
    } else {
        std::cout << "First frame saved as PNG: " << pngFilePath << std::endl;
    }

    NSString *codecKey = nil;
    if (proResFlavor == "4444") {
        codecKey = AVVideoCodecTypeAppleProRes4444;
    } else if (proResFlavor == "422HQ") {
        codecKey = AVVideoCodecTypeAppleProRes422HQ;
    } else if (proResFlavor == "422") {
        codecKey = AVVideoCodecTypeAppleProRes422;
    } else if (proResFlavor == "LT") {
        codecKey = AVVideoCodecTypeAppleProRes422LT;
    } else {
        std::cerr << "Error: Invalid ProRes flavor specified: " << proResFlavor << std::endl;
        return;
    }

    NSDictionary *outputSettings = @{
        AVVideoCodecKey: codecKey,
        AVVideoWidthKey: @(width),
        AVVideoHeightKey: @(height)
    };
    AVAssetWriterInput *videoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:outputSettings];
    // NB: YES asks the encoder to keep up with real-time input, which can trade
    // quality for speed; for an offline encode like this, NO may be the better fit.
    videoInput.expectsMediaDataInRealTime = YES;

    NSDictionary *pixelBufferAttributes = @{
        (id)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_64RGBALE),
        (id)kCVPixelBufferWidthKey: @(width),
        (id)kCVPixelBufferHeightKey: @(height)
    };
    AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:videoInput sourcePixelBufferAttributes:pixelBufferAttributes];
    ...
    [assetWriter startSessionAtSourceTime:kCMTimeZero];
    CMTime frameDuration = CMTimeMake(1, 24); // Frame rate of 24 fps
    int numFrames = static_cast<int>(rawPaths.size());
    ...
    // Encoding thread
    std::thread encoderThread([&]() {
        int frameIndex = 0;
        std::vector<CVPixelBufferRef> pixelBufferBuffer;
        while (frameIndex < numFrames) {
            std::unique_lock<std::mutex> lock(queueMutex);
            queueCondVar.wait(lock, [&]() { return !frameQueue.empty() || debayeringFinished; });
            if (!frameQueue.empty()) {
                auto [index, debayeredImage] = frameQueue.front();
                frameQueue.pop();
                lock.unlock();
                if (index == frameIndex) {
                    cv::Mat rgbaImage;
                    cv::cvtColor(debayeredImage, rgbaImage, cv::COLOR_BGR2RGBA);
                    CVPixelBufferRef pixelBuffer = NULL;
                    CVReturn result = CVPixelBufferPoolCreatePixelBuffer(NULL, adaptor.pixelBufferPool, &pixelBuffer);
                    if (result != kCVReturnSuccess) {
                        std::cerr << "Error: Could not create pixel buffer" << std::endl;
                        dispatch_group_leave(dispatchGroup);
                        return;
                    }
                    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
                    void *pxdata = CVPixelBufferGetBaseAddress(pixelBuffer);
                    // 64RGBALE = 8 bytes per pixel; copy row by row to respect
                    // the buffer's bytes-per-row padding.
                    for (int row = 0; row < height; ++row) {
                        memcpy(static_cast<uint8_t*>(pxdata) + row * CVPixelBufferGetBytesPerRow(pixelBuffer),
                               rgbaImage.ptr(row),
                               width * 8);
                    }
                    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
                    pixelBufferBuffer.push_back(pixelBuffer);
    ...

Thanks very much!
1
0
307
Jun ’24
AirDrop Requesting to Open Video via 3rd Party App Instead of Photo Library
I film and edit content using my iPhone and Premiere Pro. I've been doing this since 2017, so I'd like to say I know my way around an iPhone and the Premiere Pro software well enough to get a finished product to my client. Before I send content to clients, I AirDrop it to my phone to make sure the quality is up to par. On this most recent project, I've had issues AirDropping it to myself: each time I do so, my iPhone prompts for which third-party app I would like to open the video in, rather than automatically opening or saving the video to my photo library. The specs:

1. Filmed on iPhone 15 Plus
2. iOS version 17.5.1 (at the moment, the most up-to-date software)
3. Filmed in 4K at 60 FPS
4. Ample storage space on my phone; the video file size is 220 MB
5. Premiere Pro export settings: H.264, Field Order: Progressive, Bit Rate Encoding: CBR

I will say that I purchased a transitions-and-burns bundle and used it for the first time on this project. All materials used are in .mp4 format and the blend mode was set to overlay; nothing out of pocket. I figured it wouldn't be a problem since my client would just be downloading it via Dropbox, but there was an issue there as well: my client received an error message saying, "Sorry, this type of video cannot be saved to this device". The workaround was to take the saved video, drop it into a separate Premiere Pro project, and export it with the preset Match Source - Adaptive High Bitrate. After that I was able to AirDrop it to myself, it saved to my photo album, and my client was able to download it without receiving that error message. If there is an explanation for why I am having this issue and how I can avoid it, I would really appreciate it, as I have never had this problem in the past.
0
0
236
Jun ’24
Reducing storage of similar PNGs by compressing them into a video and retrieving them losslessly--possibility or dumb idea?
My app stores and transports lots of groups of similar PNGs. These aren't compressed well by official algorithms like .lzfse, .lz4, .lzbitmap, or even bz2, but I realized that they are well suited to compression by video codecs, since they're highly similar to one another. I ran an experiment where I compressed a dozen images into an HEVCWithAlpha .mov via AVAssetWriter, and the compression ratio was fantastic, but when I retrieved the PNGs via AVAssetImageGenerator there were lots of artifacts, which simply wasn't acceptable. Maybe I'm doing something wrong, or maybe I'm chasing something that doesn't exist. Is there a way to use video compression like a specialized archive to store and retrieve PNGs losslessly while retaining alpha? I have no intention of using the videos except as condensed storage. Any suggestions on how to reduce the storage size of many large PNGs are also welcome. I also tried using HEVC instead of PNG via the new UIImage.hevcData(), but the decompression/processing times were insane (a 5000%+ increase), on top of fatal errors when using it async.
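A hedged sketch addressing one common source of retrieval artifacts: by default AVAssetImageGenerator snaps requests to nearby sync frames, so it can return a neighboring (different) frame; zero tolerances force exact frames. Note that HEVC itself is lossy, so even exact retrieval won't be bit-identical to the source PNGs:

import AVFoundation

func exactFrames(from asset: AVAsset, at times: [CMTime]) async throws -> [CGImage] {
    let generator = AVAssetImageGenerator(asset: asset)
    // Without these, requests may be served by a nearby keyframe.
    generator.requestedTimeToleranceBefore = .zero
    generator.requestedTimeToleranceAfter = .zero
    generator.appliesPreferredTrackTransform = true
    var images: [CGImage] = []
    for time in times {
        let (image, _) = try await generator.image(at: time)
        images.append(image)
    }
    return images
}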
18
0
683
Jun ’24