Integrate photo, audio, and video content into your apps.

Posts under Media tag

55 Posts


PHLivePhotoEditingContext.saveLivePhoto results in AVFoundation error -11800 "The operation could not be completed" reason An unknown error occurred (-12815)
When trying to edit some Live Photos, calling PHLivePhotoEditingContext.saveLivePhoto results in the following error:

```
Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed"
UserInfo={NSLocalizedFailureReason=An unknown error occurred (-12815),
NSLocalizedDescription=The operation could not be completed,
NSUnderlyingError=0x300d05380 {Error Domain=NSOSStatusErrorDomain Code=-12815 "(null)"}}
```

I was able to replicate it on my device by taking a new Live Photo. I'm not sure what's wrong with that one specifically; not all Live Photos reproduce the issue. I've submitted FB15880825 with a sysdiagnose and a Photos Diagnostics as well. Any ideas what's going on here? It's impacting multiple customers. Thanks!
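For reference, a minimal sketch of the edit-and-save flow that hits this error; the asset fetch, the identity frame processor, and the adjustment-data identifier are placeholders, not the app's real editing code:

```swift
import Photos
import PhotosUI

// `livePhotoAsset` is assumed to be a PHAsset for a Live Photo fetched elsewhere.
livePhotoAsset.requestContentEditingInput(with: nil) { input, _ in
    guard let input, let context = PHLivePhotoEditingContext(livePhotoEditingInput: input) else { return }
    context.frameProcessor = { frame, _ in frame.image }   // identity "edit" for illustration

    let output = PHContentEditingOutput(contentEditingInput: input)
    output.adjustmentData = PHAdjustmentData(formatIdentifier: "com.example.livephoto-edit",
                                             formatVersion: "1.0",
                                             data: Data("demo".utf8))

    context.saveLivePhoto(to: output) { success, error in
        // The -11800 / -12815 error surfaces in this completion handler for some assets.
        print("saved: \(success), error: \(String(describing: error))")
        // A real implementation would then commit `output` via PHAssetChangeRequest.
    }
}
```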
Replies: 1 · Boosts: 0 · Views: 557 · Activity: Jun ’25
AirDropped Videos from Photos Save to Files Instead of Photos on Receiving Device
My app allows users to capture and save videos to the Photos app using the following Swift code:

```swift
PHPhotoLibrary.shared().performChanges {
    PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: fileURL)
} completionHandler: { success, error in
    // ...
}
```

Videos are successfully saved to Photos and play correctly. However, users report that when they AirDrop these videos from the Photos app to another device (e.g., iPad to iPhone), the videos are saved in the Files app on the receiving device instead of the Photos app. This issue is more common with higher-resolution videos, such as 2K, recorded in HEVC format at 30 fps. I wasn't able to reproduce the issue locally. I've found a thread on the public Apple forum: https://discussions.apple.com/thread/255276865?sortBy=rank but I wonder whether there are some special flags that I should clear or add to my videos (e.g. via PHAssetChangeRequest)? Thank you!
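Not a confirmed fix, but one experiment worth trying is saving through PHAssetCreationRequest so the resource type and original filename are explicit; the filename below is a placeholder:

```swift
import Photos

PHPhotoLibrary.shared().performChanges {
    let options = PHAssetResourceCreationOptions()
    options.originalFilename = "capture.mov"   // hypothetical name with an explicit extension
    let request = PHAssetCreationRequest.forAsset()
    request.addResource(with: .video, fileURL: fileURL, options: options)
} completionHandler: { success, error in
    print("saved: \(success), error: \(String(describing: error))")
}
```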
Replies: 0 · Boosts: 0 · Views: 82 · Activity: May ’25
Screen recording audio and video out of sync
I use startCaptureWithHandler to record the screen and AVAssetWriter appendSampleBuffer: to save the audio and video, but when the saved file is played back the audio and video are out of sync. I don't know if it's an AVAssetWriterInput setup problem. Here is my code:

```objc
NSDictionary *audioCompressionSettings = @{
    AVEncoderBitRatePerChannelKey : @(64000),
    AVFormatIDKey : @(kAudioFormatMPEG4AAC),
    AVNumberOfChannelsKey : @(2),
    AVSampleRateKey : @(44100)
};
AVAssetWriterInput *audioAssetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio outputSettings:audioCompressionSettings];
audioAssetWriterInput.expectsMediaDataInRealTime = YES;
[_assetWriter addInput:audioAssetWriterInput];

NSDictionary *videoCompressSetting = @{
    AVVideoAverageBitRateKey : @(screenWidth * screenHeight * 5),
    AVVideoMaxKeyFrameIntervalKey : @(30),
    AVVideoProfileLevelKey : AVVideoProfileLevelH264MainAutoLevel
};
NSDictionary *codecSetting = @{
    AVVideoCodecKey : AVVideoCodecTypeH264,
    AVVideoScalingModeKey : AVVideoScalingModeResize,
    AVVideoWidthKey : @(screenWidth * 2),
    AVVideoHeightKey : @(screenHeight * 2),
    AVVideoCompressionPropertiesKey : videoCompressSetting
};
AVAssetWriterInput *videoAssetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:codecSetting];
videoAssetWriterInput.expectsMediaDataInRealTime = YES;
[_assetWriter addInput:videoAssetWriterInput];
```
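The input settings above look reasonable; a more common cause of drift is the relationship between startSession(atSourceTime:) and the timestamps of the buffers actually appended. A sketch (in Swift for brevity) of the capture handler I would compare against, assuming assetWriter / videoInput / audioInput mirror the post's setup and startWriting() has already been called:

```swift
import AVFoundation
import ReplayKit

var sessionStarted = false

RPScreenRecorder.shared().startCapture(handler: { sampleBuffer, bufferType, error in
    guard error == nil, CMSampleBufferDataIsReady(sampleBuffer) else { return }
    let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)

    if !sessionStarted, bufferType == .video {
        // Start the writer session at the first video sample's timestamp so audio
        // appended later keeps its original offset relative to the video track.
        assetWriter.startSession(atSourceTime: pts)
        sessionStarted = true
    }
    guard sessionStarted else { return }   // drop audio that arrives before the session starts

    switch bufferType {
    case .video where videoInput.isReadyForMoreMediaData:
        videoInput.append(sampleBuffer)
    case .audioApp where audioInput.isReadyForMoreMediaData:
        audioInput.append(sampleBuffer)
    default:
        break
    }
}, completionHandler: { error in
    print(error as Any)
})
```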
Replies: 1 · Boosts: 0 · Views: 83 · Activity: Apr ’25
Creating an initial Now Playing state of paused - impossible?
I am working on an app which plays audio - https://youtu.be/VbAfUk_eYl0?si=nJg5ayy2faWE78-g - and one of the features is, on restart, if you had paused playback of a file at the time the app was previously shut down (or were playing one at the time of shutdown), the paused state and position in the file is restored exactly as it was, on restart. The functionality works. However, it seems impossible to get the "now playing" information in iOS into the right state to reflect that via the MediaPlayer API. On restart, handlers are attached to the play/pause/togglePlayPause actions on MPRemoteCommandCenter.shared(), and the map of media info is updated on MPNowPlayingInfoCenter.default().nowPlayingInfo. What happens is that iOS's media view shows the audio as playing and offers a pause button - even though the play action is enabled and the pause action is disabled - until playback has actually been initiated (my workaround is to have the pause action toggle the play state, since otherwise you wouldn't be able to initiate playback from controls in a car without initiating it once from a device first). I've created a simplified white-noise-player demo to illustrate the problem - simply build and deploy it, then start the app, lock your device and look at the playback controls on the lock screen. It will show a pause button - the same behavior I've described. https://github.com/timboudreau/ios-play-pause-demo I've tried a few things to narrow down the source of the issue - for example, thinking that not setting MPNowPlayingInfoPropertyPlaybackProgress and MPMediaItemPropertyPlaybackDuration on startup might do the trick (since the system interpolates elapsed time and it's recommended to update those properties infrequently), but the result is the same, just without a duration or progress shown. What governs this behavior, and is there some way to explicitly tell the media player API your current state is paused?
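For what it's worth, this is the shape of the nowPlayingInfo update I would expect to advertise a paused initial state - a rate of 0 plus the restored elapsed time - though as the demo shows this alone may not change the lock-screen button (title, duration, and resumePosition are placeholders):

```swift
import MediaPlayer

func publishPausedNowPlayingInfo(title: String, duration: TimeInterval, resumePosition: TimeInterval) {
    let info: [String: Any] = [
        MPMediaItemPropertyTitle: title,
        MPMediaItemPropertyPlaybackDuration: duration,
        MPNowPlayingInfoPropertyElapsedPlaybackTime: resumePosition,
        // A playback rate of 0 is the usual hint that the item is currently paused.
        MPNowPlayingInfoPropertyPlaybackRate: 0.0,
        MPNowPlayingInfoPropertyDefaultPlaybackRate: 1.0
    ]
    MPNowPlayingInfoCenter.default().nowPlayingInfo = info
}
```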
Replies: 0 · Boosts: 2 · Views: 97 · Activity: Apr ’25
PHPickerResult returns different data for the same media
On some devices, when I select the same media multiple times, the data returned by `loadFileRepresentation(forTypeIdentifier:completionHandler:)` is different (`data.count` is not equal).

Environment:
* Model: iPhone 12
* Model Number: MGGM3CH/A
* iOS Version: 18.3.2

```Swift
// import PhotosUI
func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
    picker.dismiss(animated: true, completion: nil)
    guard let provider = results.last?.itemProvider else { return }
    guard provider.hasItemConformingToTypeIdentifier(UTType.movie.identifier) else { return }
    Task {
        provider.loadFileRepresentation(forTypeIdentifier: UTType.movie.identifier) { url, error in
            guard let url = url else { return }
            if let data = try? Data(contentsOf: url) {
                print("data count is: \(data.count)")
            }
        }
    }
}
```

P.S. I also tried some other methods, e.g. `provider.loadItem(forTypeIdentifier:)`, but they didn't work either.
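A sketch of one way to check whether the variation comes from the export step rather than the underlying asset: when the picker is created with PHPickerConfiguration(photoLibrary: .shared()), assetIdentifier is populated and the original resource bytes can be read directly (this path does require photo library authorization; the function name is mine):

```swift
import Photos
import PhotosUI

func fetchOriginalVideoData(for result: PHPickerResult, completion: @escaping (Data) -> Void) {
    guard let identifier = result.assetIdentifier,
          let asset = PHAsset.fetchAssets(withLocalIdentifiers: [identifier], options: nil).firstObject,
          let resource = PHAssetResource.assetResources(for: asset).first(where: { $0.type == .video })
    else { return }

    var data = Data()
    _ = PHAssetResourceManager.default().requestData(for: resource, options: nil,
        dataReceivedHandler: { chunk in data.append(chunk) },
        completionHandler: { error in
            if error == nil { completion(data) }   // byte count should be stable across picks
        })
}
```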
Replies: 1 · Boosts: 0 · Views: 45 · Activity: Mar ’25
iOS Mobile Video Audio Playback Issues in React
I'm experiencing issues with audio playback in my React video player component specifically on iOS mobile devices (iPhone/iPad). Even after implementing several recommended solutions, including Apple's own guidelines, the audio still isn't working properly on iOS Safari. It works completely fine on Android. On iOS, I ensured the video doesn't autoplay (it requires user interaction). Here are all the details:

Environment
* iOS Safari (latest version)
* React 18
* TypeScript
* Video files: MP4 with AAC audio codec

Current Implementation

```tsx
const VideoPlayer: React.FC<VideoPlayerProps> = ({
  src,
  autoplay = true,
}) => {
  const videoRef = useRef<HTMLVideoElement>(null);
  const isIOSDevice = isIOS(); // Custom iOS detection
  const [touchStartY, setTouchStartY] = useState<number | null>(null);
  const [touchStartTime, setTouchStartTime] = useState<number | null>(null);

  // Handle touch start event for gesture detection
  const handleTouchStart = (e: React.TouchEvent) => {
    setTouchStartY(e.touches[0].clientY);
    setTouchStartTime(Date.now());
  };

  // Handle touch end event with gesture validation
  const handleTouchEnd = (e: React.TouchEvent) => {
    if (touchStartY === null || touchStartTime === null) return;
    const touchEndY = e.changedTouches[0].clientY;
    const touchEndTime = Date.now();
    // Validate if it's a legitimate tap (not a scroll)
    const verticalDistance = Math.abs(touchEndY - touchStartY);
    const touchDuration = touchEndTime - touchStartTime;
    // Only trigger for quick taps (< 200ms) with minimal vertical movement
    if (touchDuration < 200 && verticalDistance < 10) {
      handleVideoInteraction(e);
    }
    setTouchStartY(null);
    setTouchStartTime(null);
  };

  // Simplified video interaction handler following Apple's guidelines
  const handleVideoInteraction = (e: React.MouseEvent | React.TouchEvent) => {
    console.log('Video interaction detected:', {
      type: e.type,
      timestamp: new Date().toISOString()
    });
    // Ensure keyboard is dismissed (iOS requirement)
    if (document.activeElement instanceof HTMLElement) {
      document.activeElement.blur();
    }
    e.stopPropagation();
    const video = videoRef.current;
    if (!video || !video.paused) return;
    // Attempt playback in response to user gesture
    video.play().catch(err => console.error('Error playing video:', err));
  };

  // Effect to handle video source and initial state
  useEffect(() => {
    console.log('VideoPlayer props:', { src, loadingState });
    setError(null);
    setLoadingState('initial');
    setShowPlayButton(false); // Never show custom play button on iOS
    if (videoRef.current) {
      // Set crossOrigin attribute for CORS
      videoRef.current.crossOrigin = "anonymous";
      if (autoplay && !hasPlayed && !isIOSDevice) {
        // Only autoplay on non-iOS devices
        dismissKeyboard();
        setHasPlayed(true);
      }
    }
  }, [src, autoplay, hasPlayed, isIOSDevice]);

  return (
    <Paper
      shadow="sm"
      radius="md"
      withBorder
      onClick={handleVideoInteraction}
      onTouchStart={handleTouchStart}
      onTouchEnd={handleTouchEnd}
    >
      <video
        ref={videoRef}
        autoPlay={!isIOSDevice && autoplay}
        playsInline
        controls
        crossOrigin="anonymous"
        preload="auto"
        onLoadedData={handleLoadedData}
        onLoadedMetadata={handleMetadataLoaded}
        onEnded={handleVideoEnd}
        onError={handleError}
        onPlay={dismissKeyboard}
        onClick={handleVideoInteraction}
        onTouchStart={handleTouchStart}
        onTouchEnd={handleTouchEnd}
        {...(!isFirefoxBrowser && {
          "x-webkit-airplay": "allow",
          "x-webkit-playsinline": true,
          "webkit-playsinline": true
        })}
      >
        <source src={videoSrc} type="video/mp4" />
      </video>
    </Paper>
  );
};
```

Apple's Guidelines Implementation
* Removed custom play controls on iOS
* Using native video controls for user interaction
* Ensuring audio playback is triggered by user gesture
* Following Apple's audio session guidelines
* Properly handling the canplaythrough event

Current Behavior
* Video plays but without sound on iOS mobile
* Mute/unmute button in native video controls doesn't work
* Audio works fine on desktop browsers and Android devices
* Videos are confirmed to have AAC audio codec
* No console errors related to audio playback
* User interaction doesn't trigger audio as expected

Questions
* Are there any additional iOS-specific requirements I'm missing?
* Could this be related to iOS audio session handling?
* Are there known issues with React's handling of video elements on iOS?
* Should I be implementing additional audio context initialization?

Any insights or suggestions would be greatly appreciated!
Replies: 0 · Boosts: 0 · Views: 346 · Activity: Mar ’25
Photo permission dialog not shown when iOS app runs on Mac
According to the docs: The first time your app performs an operation that requires [photo library] authorization, the system automatically and asynchronously prompts the user for it. (https://developer.apple.com/documentation/photokit/delivering-an-enhanced-privacy-experience-in-your-photos-app) I.e. it's not necessary for the app to call PHPhotoLibrary.requestAuthorization. This does seem to be what happens when my app runs on an iPhone or iPad; the prompt is shown. But when it runs on a Mac in "designed for iPad" mode, the permission dialog is not presented. Instead the code continues to see status == .notDetermined. That's today, on macOS 15.3. It may have worked in the past. Is anyone else seeing issues with this? Should I call requestAuthorization explicitly? (Would that actually work?)
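In case it helps frame the question, this is the explicit request I'm considering as a workaround; whether it changes anything in the "designed for iPad" on Mac case is exactly what I'm unsure about:

```swift
import Photos

func ensurePhotoAccess(_ completion: @escaping (PHAuthorizationStatus) -> Void) {
    let current = PHPhotoLibrary.authorizationStatus(for: .readWrite)
    guard current == .notDetermined else { return completion(current) }
    PHPhotoLibrary.requestAuthorization(for: .readWrite) { status in
        DispatchQueue.main.async { completion(status) }
    }
}
```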
Replies: 1 · Boosts: 0 · Views: 417 · Activity: Mar ’25
Unable to Capture 24MP Photos
Hello, I'm wondering how to capture 24MP photos. I'm currently testing on an iPhone 16 Pro Max. By default, the device's activeFormat supports 24MP (photo dimensions: {4032x3024, 5712x4284}). For the photoOutput, I'm setting maxPhotoDimensions to videoDevice.activeFormat.supportedMaxPhotoDimensions.lastObject, and setting maxPhotoQualityPrioritization to quality. When capturing, I'm applying the same maxPhotoDimensions and photoQualityPrioritization settings from the photoOutput directly to the AVCapturePhotoSettings. What could be the issue?

```objc
// setup
[self.photoOutput setMaxPhotoQualityPrioritization:AVCapturePhotoQualityPrioritizationQuality];
CMVideoDimensions maxPhotoDimensions = [(NSValue *)videoDevice.activeFormat.supportedMaxPhotoDimensions.lastObject CMVideoDimensionsValue];
[self.photoOutput setMaxPhotoDimensions:maxPhotoDimensions];

// capturing
AVCapturePhotoSettings *photoSettings = [AVCapturePhotoSettings photoSettings];
photoSettings.maxPhotoDimensions = self.photoOutput.maxPhotoDimensions;
photoSettings.photoQualityPrioritization = self.photoOutput.maxPhotoQualityPrioritization;
[self.photoOutput capturePhotoWithSettings:photoSettings delegate:photoCaptureDelegate];
...
```
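One thing I would double-check (a sketch in Swift, with videoDevice, photoOutput, and photoCaptureDelegate assumed from the setup above) is that the dimensions being requested really are the 24MP entry rather than whatever lastObject happens to be:

```swift
import AVFoundation

let supported = videoDevice.activeFormat.supportedMaxPhotoDimensions
if let target = supported.first(where: { $0.width == 5712 && $0.height == 4284 }) {
    photoOutput.maxPhotoDimensions = target
    photoOutput.maxPhotoQualityPrioritization = .quality

    let settings = AVCapturePhotoSettings()
    settings.maxPhotoDimensions = target
    settings.photoQualityPrioritization = .quality
    photoOutput.capturePhoto(with: settings, delegate: photoCaptureDelegate)
} else {
    print("24MP (5712x4284) is not offered by the active format")
}
```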
Replies: 1 · Boosts: 0 · Views: 548 · Activity: Mar ’25
Setting the capture device color space to Apple Log does not work
I set the device format and color space to Apple Log and turn off HDR, so why is the movie output still in HDR format rather than ProRes Log? Full runnable demo here: https://github.com/SpaceGrey/ColorSpaceDemo

```swift
session.sessionPreset = .inputPriority

// get the back camera
let deviceDiscoverySession = AVCaptureDevice.DiscoverySession(deviceTypes: [.builtInWideAngleCamera], mediaType: .video, position: .back)
backCamera = deviceDiscoverySession.devices.first!
try! backCamera.lockForConfiguration()
backCamera.automaticallyAdjustsVideoHDREnabled = false
backCamera.isVideoHDREnabled = false
let formats = backCamera.formats
let appleLogFormat = formats.first { format in
    format.supportedColorSpaces.contains(.appleLog)
}
print(appleLogFormat!.supportedColorSpaces.contains(.appleLog))
backCamera.activeFormat = appleLogFormat!
backCamera.activeColorSpace = .appleLog
print("colorspace is Apple Log \(backCamera.activeColorSpace == .appleLog)")
backCamera.unlockForConfiguration()

do {
    let input = try AVCaptureDeviceInput(device: backCamera)
    session.addInput(input)
} catch {
    print(error.localizedDescription)
}

// add output
output = AVCaptureMovieFileOutput()
session.addOutput(output)
let connection = output.connection(with: .video)!
print(output.outputSettings(for: connection))
/*
["AVVideoWidthKey": 1920, "AVVideoHeightKey": 1080,
 "AVVideoCodecKey": apch, <----- ProRes is enabled.
 "AVVideoCompressionPropertiesKey": {
     AverageBitRate = 220029696;
     ExpectedFrameRate = 30;
     PrepareEncodedSampleBuffersForPaddedWrites = 1;
     PrioritizeEncodingSpeedOverQuality = 0;
     RealTime = 1;
 }]
*/
previewSource = DefaultPreviewSource(session: session)
queue.async {
    self.session.startRunning()
}
```
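One thing that may be worth ruling out (an assumption on my part, not a confirmed fix): by default the session is allowed to override the device's active color space when inputs and outputs are added, so locking that down before configuration and re-checking afterwards would tell us whether .appleLog actually survives:

```swift
import AVFoundation

session.automaticallyConfiguresCaptureDeviceForWideColor = false
session.beginConfiguration()
// ... configure the device format / color space and add the input and movie output as above ...
session.commitConfiguration()

// Confirm the device kept the Log color space after the session was configured.
print("active color space is appleLog:", backCamera.activeColorSpace == .appleLog)
```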
Replies: 1 · Boosts: 0 · Views: 396 · Activity: Mar ’25
Alternative for crashing API MPMediaItemArtwork
When setting the now playing info for playing media in MPNowPlayingInfoCenter we can set artwork. But it seems the Apple API for creating the artwork is crashing on iOS 18 (FB15145734). On iOS 17 this gave the warning that the completion handler was not run on the main thread. I've tried to seek help here: https://stackoverflow.com/questions/78989543/swift-data-race-with-appkit-mpmediaitemartwork-function/78990231?noredirect=1#comment139277425_78990231 but it seems that it's not possible to override the completion handler and therefore it's up to Apple to fix this issue.

```swift
.task {
    await MainActor.run {
        let nowPlayingInfoCenter = MPNowPlayingInfoCenter.default()
        var nowPlayingInfo = [String: Any]()
        let image = NSImage(named: "image")!

        // warning: data race detected: @MainActor function at MPMediaItemArtwork/ContentView.swift:22 was not called on the main thread
        nowPlayingInfo[MPMediaItemPropertyArtwork] = MPMediaItemArtwork(boundsSize: image.size, requestHandler: { _ in
            // Not on main thread here!
            return image
        })
        nowPlayingInfoCenter.nowPlayingInfo = nowPlayingInfo
    }
}
```

I'm wondering if there is an alternative method to set the now playing artwork?
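For discussion, this is the shape of the workaround I'm experimenting with: build the artwork outside the main-actor closure so the request handler only captures a local copy of the image rather than being inferred as a @MainActor closure. I don't know yet whether this avoids the iOS 18 crash:

```swift
import AppKit
import MediaPlayer

// Nonisolated helper: the request handler defined here is not @MainActor,
// so calling it from a background thread shouldn't trip the runtime check.
func makeArtwork(named name: String) -> MPMediaItemArtwork? {
    guard let image = NSImage(named: name) else { return nil }
    return MPMediaItemArtwork(boundsSize: image.size) { _ in image }
}

// Usage (still updating nowPlayingInfo on the main actor):
// nowPlayingInfo[MPMediaItemPropertyArtwork] = makeArtwork(named: "image")
```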
Replies: 4 · Boosts: 0 · Views: 881 · Activity: Feb ’25
Multiview HLS with HDR
I have an HDR10+ encoded video that, if loaded as a mov, plays back on the Apple Vision Pro, but when that video is encoded using the latest (1.23b) Apple HLS tools to generate an fMP4, the resulting m3u8 cannot be played back on the Apple Vision Pro and I only get back a "Cannot Open" error. To generate the m3u8, I'm just calling mediafilesegmenter (with -iso-fragmented) and then variantplaylistcreator. This completes with no errors, but the m3u8 will play back on the Mac using VLC but not on the Apple Vision Pro. The relevant part of the m3u8 is:

```
#EXT-X-STREAM-INF:AVERAGE-BANDWIDTH=40022507,BANDWIDTH=48883974,VIDEO-RANGE=PQ,CODECS="ec-3,hvc1.1.60000000.L180.B0",RESOLUTION=4096x4096,FRAME-RATE=24.000,CLOSED-CAPTIONS=NONE,AUDIO="audio1",REQ-VIDEO-LAYOUT="CH-STEREO"
{{url}}
```

Has anyone been able to use the HLS tools to generate fMP4s of MV-HEVC videos with HDR10?
Replies: 3 · Boosts: 2 · Views: 1k · Activity: Feb ’25
Playing fMP4 Raw Chunks in AVPlayer on iOS
Hello Apple Community,

We are working on a real-time streaming feature where we receive chunks of raw MP4 data through a custom protocol and store them in a buffer (array). Our goal is to use these data chunks to play a continuous video stream in AVPlayer.

What We've Tried:
* Custom URL scheme with AVAssetResourceLoaderDelegate: we implemented a custom URL scheme (customscheme://) to serve the buffered data using AVAssetResourceLoaderDelegate (a condensed sketch of this pattern follows below).
* The method shouldWaitForLoadingOfRequestedResource is called only during the initial allocation. It doesn't get triggered when new chunks are appended to the buffer.
* Despite appending new data to the buffer, AVPlayer doesn't request further chunks from the delegate.

What We Need: We are looking for a solution where:
* The player continuously fetches data from the buffer as new chunks are added.
* The playback remains smooth and uninterrupted, even with real-time data being appended.
* Ideally, this solution works with AVPlayer while adhering to HLS-like behavior without implementing an HLS server.

Questions:
* Is AVAssetResourceLoaderDelegate the right approach for this use case? If so, how can we ensure shouldWaitForLoadingOfRequestedResource is called whenever new data is available in the buffer?
* Are there alternative APIs or recommended patterns for playing real-time MP4 data chunks in AVPlayer?
* Would implementing a custom FFmpeg-based player be necessary, or can this be achieved using AVPlayer and its APIs?

We appreciate any guidance, suggestions, or examples that can help us achieve this. Thank you!
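To make the question concrete, here is a condensed sketch of the delegate pattern we are describing, with pending requests retried whenever a chunk arrives; the names are ours, content-information handling is omitted, and whether AVPlayer will keep issuing data requests against a growing plain MP4 is exactly what we are unsure about:

```swift
import AVFoundation

final class BufferResourceLoader: NSObject, AVAssetResourceLoaderDelegate {
    private var buffer = Data()
    private var pending = [AVAssetResourceLoadingRequest]()
    private let queue = DispatchQueue(label: "buffer.loader")

    /// Called by the networking layer whenever a new chunk arrives.
    func append(_ chunk: Data) {
        queue.async {
            self.buffer.append(chunk)
            self.pending.removeAll { self.service($0) }   // retry stalled requests
        }
    }

    func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                        shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        queue.async {
            // A real implementation must also fill in loadingRequest.contentInformationRequest
            // (content type and length) before finishing the first request.
            if !self.service(loadingRequest) { self.pending.append(loadingRequest) }
        }
        return true   // we'll finish (or fail) the request later
    }

    /// Returns true if the request could be completed from the data buffered so far.
    private func service(_ request: AVAssetResourceLoadingRequest) -> Bool {
        guard let dataRequest = request.dataRequest else { return false }
        let offset = Int(dataRequest.currentOffset)
        guard offset < buffer.count else { return false }
        let length = min(dataRequest.requestedLength, buffer.count - offset)
        dataRequest.respond(with: buffer.subdata(in: offset..<(offset + length)))
        request.finishLoading()
        return true
    }
}
```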
Replies: 0 · Boosts: 1 · Views: 540 · Activity: Jan ’25
AVPlayerItem step(byCount:) callback or notification
Hello there, I need to move through video loaded in an AVPlayer one frame at a time, back or forth. For that I tried to use AVPlayerItem's method step(byCount:) and it works just fine. However I need to know when the stepping happened, and as far as I observed it is not immediate when using the method. If I check currentTime() just after calling the method it's the same, and if I do it slightly later (depending on the video itself) it shows the correct "jumped" time. To achieve my goal I tried subclassing AVPlayerItem and implementing my own async method utilizing NotificationCenter and the timeJumpedNotification, assuming it would be delivered as the time actually jumps, but that's not the case. Here is my "stripped" and simplified version of the custom player item:

```swift
import AVFoundation

final class PlayerItem: AVPlayerItem {
    private var jumpCompletion: ((CMTime) -> Void)?

    override init(asset: AVAsset, automaticallyLoadedAssetKeys: [String]?) {
        super.init(asset: asset, automaticallyLoadedAssetKeys: automaticallyLoadedAssetKeys)
        NotificationCenter.default.addObserver(self, selector: #selector(timeDidChange(_:)), name: AVPlayerItem.timeJumpedNotification, object: self)
    }

    deinit {
        NotificationCenter.default.removeObserver(self, name: AVPlayerItem.timeJumpedNotification, object: self)
        jumpCompletion = nil
    }

    @discardableResult
    func step(by count: Int) async -> CMTime {
        await withCheckedContinuation { continuation in
            step(by: count) { time in
                continuation.resume(returning: time)
            }
        }
    }

    func step(by count: Int, completion: @escaping ((CMTime) -> Void)) {
        guard jumpCompletion == nil else {
            completion(currentTime())
            return
        }
        jumpCompletion = completion
        step(byCount: count)
    }

    @objc private func timeDidChange(_ notification: Notification) {
        switch notification.name {
        case AVPlayerItem.timeJumpedNotification where notification.object as? AVPlayerItem == self:
            jumpCompletion?(currentTime())
            jumpCompletion = nil
        default:
            return
        }
    }
}
```

In short, the notification never gets posted, thus the above is not working. I guess the key is what the docs say about timeJumpedNotification: "A notification the system posts when a player item's time changes discontinuously." So step(byCount:) is not considered a discontinuous operation and doesn't trigger it. It'd be really helpful if somebody could help, as I don't want to use seek(to:toleranceBefore:toleranceAfter:), mainly because it's not accurate in terms of the exact next/previous frame: the video might have VFR, and that causes repeating frames sometimes or even skipping one or another. Thanks a lot
Replies: 2 · Boosts: 0 · Views: 617 · Activity: Jan ’25
Is there a way to filter PHPickerViewController by the creation date of the assets?
Our app filters the photo library to a certain date range for ease of picking photos. However, to do this, we have to require full permissions to the photo library. We would like to use PHPickerViewController and have it filter the results by the assets' creation date, which would allow us to adopt it. I see other filter options, but not this one. And if it isn't there, is this something that is being thought about or on a roadmap?
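For context, a sketch of the full-authorization approach we use today, which is what we would like the picker to replace (the date range is a placeholder):

```swift
import Photos

func assets(createdBetween start: Date, and end: Date) -> PHFetchResult<PHAsset> {
    let options = PHFetchOptions()
    options.predicate = NSPredicate(format: "creationDate >= %@ AND creationDate <= %@",
                                    start as NSDate, end as NSDate)
    options.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
    return PHAsset.fetchAssets(with: .image, options: options)
}
```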
Replies: 1 · Boosts: 0 · Views: 546 · Activity: Jan ’25
Slow performance decoding large images with Core Image.
I'm building a camera app that does some post-processing after the photo has been taken. With 12MP images the processing is pretty fast, but larger 24MP images are very slow. I created a very simple example to demonstrate the issue, which loads an image and then renders it to JPEG data:

```swift
let context = CIContext()
let imageUrl = Bundle.main.url(forResource: "12mp", withExtension: "jpg")!
let imageData = try! Data(contentsOf: imageUrl)
let ciImage = CIImage(data: imageData)!

let start = CFAbsoluteTimeGetCurrent()
let jpegData = context.jpegRepresentation(of: ciImage, colorSpace: context.workingColorSpace!)
print(jpegData?.count)
print("Resize Completed: " + String(CFAbsoluteTimeGetCurrent() - start))
```

Running this code on an iPhone 16 Pro with different images produces these benchmarks:
* 12MP => 0.03s
* 24MP => 1.22s
* 48MP => 2.98s

I understand that processing time will increase with resolution, but it doesn't seem linear. I have tried setting different CIContext options such as .useSoftwareRenderer: false but it has made no difference. From profiling the process it looks like the JPEG decoding is the bottleneck (the profile was captured for a 48MP image). Is there any way this can be improved?
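In case it is useful to others hitting the same wall, this is the mitigation I'm experimenting with (assuming the post-processing doesn't need the full 48MP pixels): let Image I/O decode a downsampled bitmap and hand that to Core Image instead of the full JPEG:

```swift
import ImageIO
import CoreImage

func downsampledCIImage(from url: URL, maxPixelSize: Int) -> CIImage? {
    let sourceOptions = [kCGImageSourceShouldCache: false] as CFDictionary
    guard let source = CGImageSourceCreateWithURL(url as CFURL, sourceOptions) else { return nil }

    let thumbnailOptions = [
        kCGImageSourceCreateThumbnailFromImageAlways: true,
        kCGImageSourceCreateThumbnailWithTransform: true,
        kCGImageSourceThumbnailMaxPixelSize: maxPixelSize
    ] as CFDictionary
    guard let cgImage = CGImageSourceCreateThumbnailAtIndex(source, 0, thumbnailOptions) else { return nil }
    return CIImage(cgImage: cgImage)
}
```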
Replies: 0 · Boosts: 0 · Views: 597 · Activity: Dec ’24
Sharing Photos and Videos from the Photos app to SwiftUI app
I have a SwiftUI app that needs to be able to receive photos and videos from the Photos app. When the user shares an item from the Photos app they can choose a destination app. When doing so from the Files app, my app appears as a share destination and .onOpenURL successfully handles the incoming content. The CFBundleTypeName and CFBundleTypeRole have been configured accordingly. What I don't understand is why I can share from the Files app and select my app as the destination, while not being able to select my app when sharing from the Photos app. What does the Photos app require for a given app to be available as a share destination? I'd also like to be able to do this from other apps such as the YouTube app.
Replies: 0 · Boosts: 1 · Views: 488 · Activity: Dec ’24