Integrate photo, audio, and video content into your apps.

Posts under Media tag

42 Posts


iOS 26.0 (23A5276f) - Bluetooth Call Audio Broken (AirPods + Car)
iOS 26.0 (23A5276f) – Bluetooth Call Audio Issue

I’m experiencing a Bluetooth audio issue on iOS 26.0 (build 23A5276f). I cannot make or receive phone calls properly using Bluetooth devices — this affects both my car’s Bluetooth system and my AirPods Pro (2nd generation). Notably:

* Regular phone calls have no audio (either I can’t hear the other person, or they can’t hear me).
* WhatsApp and other VoIP apps work fine with the same Bluetooth devices.
* Media playback (music, video, etc.) works without issues over Bluetooth.

It seems this bug is limited to the native Phone app, or to the system audio routing for regular cellular calls. Please advise whether this is a known issue or whether a fix is expected in upcoming beta releases.
Replies: 1 · Boosts: 1 · Views: 296 · Jun ’25
WideFOV - APMP - Stereo
Does anyone have a template of an Apple Projected Media Profile format description, or a file of a stereo wideFOV video?

Use case: I have two compatible cameras that I stereo-sync, and I want to move the projection information from the compatible video to the spatial video that combines them. Every version I can come up with crashes the AVP, and when viewing as spatial in Tahoe I just get a black screen.
Replies: 4 · Boosts: 0 · Views: 193 · Jun ’25
Cannot extract imagePair from generated Spatial Photos
Hi, I am trying to implement something simple: letting people share their Spatial Photos with others (just like this post). I encountered the same issue he did, but his answer doesn't help me here. Briefly speaking, I am using CGImageSource to extract the paired leftImage and rightImage from one fetched spatial photo:

```swift
let photos = PHAsset.fetchAssets(with: .image, options: nil)
// enumerating photos ....
if asset.mediaSubtypes.contains(PHAssetMediaSubtype.spatialMedia) {
    spatialAsset = asset
}
// other code shown below
```

I can fetch left and right images from a native Spatial Photo (taken by Apple Vision Pro or iPhone 15+), but it didn't work on a generated spatial photo (the 2D -> 3D feature in Photos):

```swift
// imageCount is 1 when it comes to a generated spatial photo
let imageCount = CGImageSourceGetCount(source)
```

I searched over the net, and someone says the generated version has a depth image instead of a left/right pair. But I still cannot extract any depth image from the image source. Full code below; the image-pair extraction stops at "no groups found":

```swift
func extractPairedImage(phAsset: PHAsset, completion: @escaping (StereoImagePair?) -> Void) {
    let options = PHImageRequestOptions()
    options.isNetworkAccessAllowed = true
    options.deliveryMode = .highQualityFormat
    options.resizeMode = .none
    options.version = .original
    PHImageManager.default().requestImageDataAndOrientation(for: phAsset, options: options) { imageData, _, _, _ in
        guard let imageData,
              let imageSource = CGImageSourceCreateWithData(imageData as CFData, nil) else {
            completion(nil)
            return
        }
        let stereoImagePair = stereoImagePair(from: imageSource)
        completion(stereoImagePair)
    }
}

func stereoImagePair(from source: CGImageSource) -> StereoImagePair? {
    guard let properties = CGImageSourceCopyProperties(source, nil) as? [CFString: Any] else {
        return nil
    }
    let imageCount = CGImageSourceGetCount(source)
    print(String(format: "%d images found", imageCount))
    guard let groups = properties[kCGImagePropertyGroups] as? [[CFString: Any]] else {
        // function returns here
        print("no groups found")
        return nil
    }
    guard let stereoGroup = groups.first(where: {
        let groupType = $0[kCGImagePropertyGroupType] as! CFString
        return groupType == kCGImagePropertyGroupTypeStereoPair
    }) else {
        return nil
    }
    guard let leftIndex = stereoGroup[kCGImagePropertyGroupImageIndexLeft] as? Int,
          let rightIndex = stereoGroup[kCGImagePropertyGroupImageIndexRight] as? Int,
          let leftImage = CGImageSourceCreateImageAtIndex(source, leftIndex, nil),
          let rightImage = CGImageSourceCreateImageAtIndex(source, rightIndex, nil),
          let leftProperties = CGImageSourceCopyPropertiesAtIndex(source, leftIndex, nil),
          let rightProperties = CGImageSourceCopyPropertiesAtIndex(source, rightIndex, nil)
    else {
        return nil
    }
    return (leftImage, rightImage, self.identifier)
}
```

Any suggestion? Thanks.

visionOS 2.4
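For the depth-image theory, note that ImageIO exposes depth separately from the primary image list, via auxiliary data rather than extra image indices. A minimal sketch of what checking for it might look like — this assumes the generated photo stores a depth or disparity map as auxiliary data, which is unconfirmed:

```swift
import ImageIO

// Probe the primary image for an auxiliary depth or disparity map.
// These kCGImageAuxiliaryDataType* keys are the documented ImageIO ones;
// whether a generated spatial photo carries either is an assumption.
func auxiliaryDepthInfo(from source: CGImageSource) -> [AnyHashable: Any]? {
    if let depth = CGImageSourceCopyAuxiliaryDataInfoAtIndex(
        source, 0, kCGImageAuxiliaryDataTypeDepth) as? [AnyHashable: Any] {
        return depth
    }
    return CGImageSourceCopyAuxiliaryDataInfoAtIndex(
        source, 0, kCGImageAuxiliaryDataTypeDisparity) as? [AnyHashable: Any]
}
```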
Replies: 3 · Boosts: 0 · Views: 155 · Jun ’25
PHLivePhotoEditingContext.saveLivePhoto results in AVFoundation error -11800 "The operation could not be completed" reason An unknown error occurred (-12815)
When trying to edit some Live Photos, calling PHLivePhotoEditingContext.saveLivePhoto results in the following error:

```
Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo={NSLocalizedFailureReason=An unknown error occurred (-12815), NSLocalizedDescription=The operation could not be completed, NSUnderlyingError=0x300d05380 {Error Domain=NSOSStatusErrorDomain Code=-12815 "(null)"}}
```

I was able to replicate it on my device by taking a new Live Photo. Not sure what's wrong with that one specifically; not all Live Photos reproduce the issue. I've submitted FB15880825 with a sysdiagnose and a Photos diagnostics as well. Any ideas what's going on here? It's impacting multiple customers. Thanks!
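For context, a stripped-down version of the save path that hits this error might look like the following — a sketch, not the poster's actual code; `asset` is a placeholder PHAsset for a Live Photo, and the edit is an identity pass:

```swift
import Photos

// Minimal no-op edit-and-save flow; error handling trimmed for brevity.
// `asset` is assumed to be a PHAsset whose mediaSubtypes contains .photoLive.
asset.requestContentEditingInput(with: PHContentEditingInputRequestOptions()) { input, _ in
    guard let input,
          let context = PHLivePhotoEditingContext(livePhotoEditingInput: input) else { return }
    let output = PHContentEditingOutput(contentEditingInput: input)
    output.adjustmentData = PHAdjustmentData(formatIdentifier: "com.example.noop", // hypothetical identifier
                                             formatVersion: "1.0",
                                             data: Data("noop".utf8))
    context.frameProcessor = { frame, _ in frame.image } // pass frames through unchanged
    context.saveLivePhoto(to: output, options: nil) { success, error in
        print("saveLivePhoto:", success, error ?? "nil") // -11800 / -12815 would surface here
    }
}
```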
Replies: 1 · Boosts: 0 · Views: 587 · Jun ’25
AirDropped Videos from Photos Save to Files Instead of Photos on Receiving Device
My app allows users to capture and save videos to the Photos app using the following Swift code:

```swift
PHPhotoLibrary.shared().performChanges {
    PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: fileURL)
} completionHandler: { success, error in
    // handle success / error
}
```

Videos are successfully saved to Photos and play correctly. However, users report that when they AirDrop these videos from the Photos app to another device (e.g., iPad to iPhone), the videos are saved in the Files app on the receiving device instead of the Photos app. This issue is more common with higher-resolution videos, such as 2K, recorded in HEVC format at 30 fps. I wasn't able to reproduce the issue locally. I've found a thread in the public Apple forum: https://discussions.apple.com/thread/255276865?sortBy=rank but I wonder whether there are some special flags I should clear or add to my videos (e.g. on PHAssetChangeRequest)? Thank you!
Replies: 0 · Boosts: 0 · Views: 100 · May ’25
Screen recording audio and video out of sync
I use startCaptureWithHandler to record the screen and AVAssetWriter appendSampleBuffer: to save audio and video, but when the saved file is played back, the audio and video are out of sync. I don't know if it's an AVAssetWriterInput setup problem; here is my code:

```objc
NSDictionary *audioCompressionSettings = @{
    AVEncoderBitRatePerChannelKey : @(64000),
    AVFormatIDKey : @(kAudioFormatMPEG4AAC),
    AVNumberOfChannelsKey : @(2),
    AVSampleRateKey : @(44100)
};
AVAssetWriterInput *audioAssetWriterInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio
                                       outputSettings:audioCompressionSettings];
audioAssetWriterInput.expectsMediaDataInRealTime = YES;
[_assetWriter addInput:audioAssetWriterInput];

NSDictionary *videoCompressSetting = @{
    AVVideoAverageBitRateKey : @(screenWidth * screenHeight * 5),
    AVVideoMaxKeyFrameIntervalKey : @(30),
    AVVideoProfileLevelKey : AVVideoProfileLevelH264MainAutoLevel
};
NSDictionary *codecSetting = @{
    AVVideoCodecKey : AVVideoCodecTypeH264,
    AVVideoScalingModeKey : AVVideoScalingModeResize,
    AVVideoWidthKey : @(screenWidth * 2),
    AVVideoHeightKey : @(screenHeight * 2),
    AVVideoCompressionPropertiesKey : videoCompressSetting
};
AVAssetWriterInput *videoAssetWriterInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                       outputSettings:codecSetting];
videoAssetWriterInput.expectsMediaDataInRealTime = YES;
[_assetWriter addInput:videoAssetWriterInput];
```
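The writer-input setup above looks plausible, so one common culprit worth checking is where the writer session starts: if startSessionAtSourceTime: uses a time that doesn't match the first buffer's timestamp, the two tracks drift. A hedged sketch of the usual pattern (in Swift; assumes the writer and inputs are already configured and startWriting() has been called):

```swift
import AVFoundation
import ReplayKit

// Anchor the writer session to the first buffer's own timestamp so audio
// and video share one timeline. A common fix for this symptom, not a
// confirmed diagnosis of the code above.
var sessionStarted = false

func append(_ sampleBuffer: CMSampleBuffer, of type: RPSampleBufferType,
            to writer: AVAssetWriter,
            video: AVAssetWriterInput, audio: AVAssetWriterInput) {
    let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
    if !sessionStarted {
        writer.startSession(atSourceTime: pts) // both tracks measure from here
        sessionStarted = true
    }
    switch type {
    case .video:
        if video.isReadyForMoreMediaData { video.append(sampleBuffer) }
    case .audioApp, .audioMic:
        if audio.isReadyForMoreMediaData { audio.append(sampleBuffer) }
    @unknown default:
        break
    }
}
```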
Replies: 1 · Boosts: 0 · Views: 109 · Apr ’25
Creating an initial Now Playing state of paused - impossible?
I am working on an app which plays audio - https://youtu.be/VbAfUk_eYl0?si=nJg5ayy2faWE78-g - and one of its features is: on restart, if you had paused playback of a file when the app was previously shut down (or were playing one at the time of shutdown), the paused state and position in the file are restored exactly as they were. The functionality works. However, it seems impossible to get the "now playing" information in iOS into the right state to reflect that via the MediaPlayer API.

On restart, handlers are attached to the play/pause/togglePlayPause actions on MPRemoteCommandCenter.shared(), and the map of media info is updated on MPNowPlayingInfoCenter.default().nowPlayingInfo. What happens is that iOS's media view shows the audio as playing and offers a pause button - even though the play action is enabled and the pause action is disabled - until playback has actually been initiated once. (My workaround is to have the pause action toggle the play state, since otherwise you wouldn't be able to initiate playback from controls in a car without initiating it once from the device first.)

I've created a simplified white-noise-player demo to illustrate the problem - simply build and deploy it, then start the app, lock your device, and look at the playback controls on the lock screen. It will show a pause button - the same behavior I've described. https://github.com/timboudreau/ios-play-pause-demo

I've tried a few things to narrow down the source of the issue - for example, thinking that not setting MPNowPlayingInfoPropertyPlaybackProgress and MPMediaItemPropertyPlaybackDuration on startup might do the trick (since the system interpolates elapsed time and it's recommended to update those properties infrequently) - but the result is the same, just without a duration or progress shown.

What governs this behavior, and is there some way to explicitly tell the media player API your current state is paused?
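For what it's worth, the system generally infers play/pause state from MPNowPlayingInfoPropertyPlaybackRate rather than from which remote commands are enabled. A minimal sketch of publishing a paused state this way — untested against the demo above, so treat it as a starting point rather than a confirmed fix:

```swift
import MediaPlayer

// Publish a "paused" now-playing state: rate 0.0 plus the stored position.
func publishPausedState(title: String, duration: TimeInterval, position: TimeInterval) {
    let info: [String: Any] = [
        MPMediaItemPropertyTitle: title,
        MPMediaItemPropertyPlaybackDuration: duration,
        MPNowPlayingInfoPropertyElapsedPlaybackTime: position,
        MPNowPlayingInfoPropertyPlaybackRate: 0.0, // 0.0 = paused, 1.0 = playing
    ]
    MPNowPlayingInfoCenter.default().nowPlayingInfo = info
}
```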
Replies: 0 · Boosts: 2 · Views: 125 · Apr ’25
PHPickerResult returns different data for the same media
On some devices, when I select the same media multiple times, the data returned by `loadFileRepresentation(forTypeIdentifier:completionHandler:)` is different (`data.count` is not equal).

Environment:
* Model: iPhone 12
* Model Number: MGGM3CH/A
* iOS Version: 18.3.2

```swift
// import PhotosUI
func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
    picker.dismiss(animated: true, completion: nil)
    guard let provider = results.last?.itemProvider else { return }
    guard provider.hasItemConformingToTypeIdentifier(UTType.movie.identifier) else { return }
    Task {
        provider.loadFileRepresentation(forTypeIdentifier: UTType.movie.identifier) { url, error in
            guard let url = url else { return }
            if let data = try? Data(contentsOf: url) {
                print("data count is: \(data.count)")
            }
        }
    }
}
```

P.S. I also tried some other methods, e.g. `provider.loadItem(forTypeIdentifier:)`, but they didn't work either.
Replies: 1 · Boosts: 0 · Views: 59 · Mar ’25
iOS Mobile Video Audio Playback Issues in React
I'm experiencing issues with audio playback in my React video player component, specifically on iOS mobile devices (iPhone/iPad). Even after implementing several recommended solutions, including Apple's own guidelines, the audio still isn't working properly on iOS Safari. It works completely fine on Android. On iOS, I ensured the video doesn't autoplay (it requires user interaction). Here are all the details:

Environment
* iOS Safari (latest version)
* React 18
* TypeScript
* Video files: MP4 with AAC audio codec

Current Implementation

```tsx
const VideoPlayer: React.FC<VideoPlayerProps> = ({
  src,
  autoplay = true,
}) => {
  const videoRef = useRef<HTMLVideoElement>(null);
  const isIOSDevice = isIOS(); // Custom iOS detection
  const [touchStartY, setTouchStartY] = useState<number | null>(null);
  const [touchStartTime, setTouchStartTime] = useState<number | null>(null);

  // Handle touch start event for gesture detection
  const handleTouchStart = (e: React.TouchEvent) => {
    setTouchStartY(e.touches[0].clientY);
    setTouchStartTime(Date.now());
  };

  // Handle touch end event with gesture validation
  const handleTouchEnd = (e: React.TouchEvent) => {
    if (touchStartY === null || touchStartTime === null) return;

    const touchEndY = e.changedTouches[0].clientY;
    const touchEndTime = Date.now();

    // Validate if it's a legitimate tap (not a scroll)
    const verticalDistance = Math.abs(touchEndY - touchStartY);
    const touchDuration = touchEndTime - touchStartTime;

    // Only trigger for quick taps (< 200ms) with minimal vertical movement
    if (touchDuration < 200 && verticalDistance < 10) {
      handleVideoInteraction(e);
    }

    setTouchStartY(null);
    setTouchStartTime(null);
  };

  // Simplified video interaction handler following Apple's guidelines
  const handleVideoInteraction = (e: React.MouseEvent | React.TouchEvent) => {
    console.log('Video interaction detected:', {
      type: e.type,
      timestamp: new Date().toISOString()
    });

    // Ensure keyboard is dismissed (iOS requirement)
    if (document.activeElement instanceof HTMLElement) {
      document.activeElement.blur();
    }

    e.stopPropagation();
    const video = videoRef.current;
    if (!video || !video.paused) return;

    // Attempt playback in response to user gesture
    video.play().catch(err => console.error('Error playing video:', err));
  };

  // Effect to handle video source and initial state
  useEffect(() => {
    console.log('VideoPlayer props:', { src, loadingState });
    setError(null);
    setLoadingState('initial');
    setShowPlayButton(false); // Never show custom play button on iOS

    if (videoRef.current) {
      // Set crossOrigin attribute for CORS
      videoRef.current.crossOrigin = "anonymous";
      if (autoplay && !hasPlayed && !isIOSDevice) {
        // Only autoplay on non-iOS devices
        dismissKeyboard();
        setHasPlayed(true);
      }
    }
  }, [src, autoplay, hasPlayed, isIOSDevice]);

  return (
    <Paper
      shadow="sm"
      radius="md"
      withBorder
      onClick={handleVideoInteraction}
      onTouchStart={handleTouchStart}
      onTouchEnd={handleTouchEnd}
    >
      <video
        ref={videoRef}
        autoPlay={!isIOSDevice && autoplay}
        playsInline
        controls
        crossOrigin="anonymous"
        preload="auto"
        onLoadedData={handleLoadedData}
        onLoadedMetadata={handleMetadataLoaded}
        onEnded={handleVideoEnd}
        onError={handleError}
        onPlay={dismissKeyboard}
        onClick={handleVideoInteraction}
        onTouchStart={handleTouchStart}
        onTouchEnd={handleTouchEnd}
        {...(!isFirefoxBrowser && {
          "x-webkit-airplay": "allow",
          "x-webkit-playsinline": true,
          "webkit-playsinline": true
        })}
      >
        <source src={videoSrc} type="video/mp4" />
      </video>
    </Paper>
  );
};
```

Apple's Guidelines Implementation
* Removed custom play controls on iOS
* Using native video controls for user interaction
* Ensuring audio playback is triggered by user gesture
* Following Apple's audio session guidelines
* Properly handling the canplaythrough event

Current Behavior
* Video plays but without sound on iOS mobile
* Mute/unmute button in native video controls doesn't work
* Audio works fine on desktop browsers and Android devices
* Videos are confirmed to have AAC audio codec
* No console errors related to audio playback
* User interaction doesn't trigger audio as expected

Questions
* Are there any additional iOS-specific requirements I'm missing?
* Could this be related to iOS audio session handling?
* Are there known issues with React's handling of video elements on iOS?
* Should I be implementing additional audio context initialization?

Any insights or suggestions would be greatly appreciated!
Replies: 0 · Boosts: 0 · Views: 396 · Mar ’25
Photo permission dialog not shown when iOS app runs on Mac
According to the docs:

> The first time your app performs an operation that requires [photo library] authorization, the system automatically and asynchronously prompts the user for it.

(https://developer.apple.com/documentation/photokit/delivering-an-enhanced-privacy-experience-in-your-photos-app)

I.e., it's not necessary for the app to call PHPhotoLibrary.requestAuthorization. This does seem to be what happens when my app runs on an iPhone or iPad; the prompt is shown. But when it runs on a Mac in "Designed for iPad" mode, the permission dialog is not presented. Instead the code continues to see status == .notDetermined.

That's today, on macOS 15.3. It may have worked in the past. Is anyone else seeing issues with this? Should I call requestAuthorization explicitly? (Would that actually work?)
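For reference, the explicit call the post is considering would look like the sketch below; whether it actually triggers the prompt under "Designed for iPad" on macOS is exactly the open question:

```swift
import Photos

// Explicitly request read/write access instead of relying on the
// implicit prompt; requestAuthorization(for:handler:) is the iOS 14+ API.
PHPhotoLibrary.requestAuthorization(for: .readWrite) { status in
    switch status {
    case .authorized, .limited:
        print("photo library access granted")
    case .denied, .restricted, .notDetermined:
        print("no access: \(status)")
    @unknown default:
        break
    }
}
```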
Replies: 1 · Boosts: 0 · Views: 446 · Mar ’25
Unable to Capture 24MP Photos
Hello, I'm wondering how to capture 24MP photos. I'm currently testing on an iPhone 16 Pro Max. By default, the device's activeFormat supports 24MP (photo dimensions: {4032x3024, 5712x4284}). For the photoOutput, I'm setting the maxPhotoDimensions to videoDevice.activeFormat.supportedMaxPhotoDimensions.lastObject, and setting maxPhotoQualityPrioritization to quality. When capturing, I'm applying the same maxPhotoDimensions and photoQualityPrioritization settings from the photoOutput directly to the AVCapturePhotoSettings. What could be the issue?

```objc
// setup
[self.photoOutput setMaxPhotoQualityPrioritization:AVCapturePhotoQualityPrioritizationQuality];
CMVideoDimensions maxPhotoDimensions =
    [(NSValue *)videoDevice.activeFormat.supportedMaxPhotoDimensions.lastObject CMVideoDimensionsValue];
[self.photoOutput setMaxPhotoDimensions:maxPhotoDimensions];

// capturing
AVCapturePhotoSettings *photoSettings = [AVCapturePhotoSettings photoSettings];
photoSettings.maxPhotoDimensions = self.photoOutput.maxPhotoDimensions;
photoSettings.photoQualityPrioritization = self.photoOutput.maxPhotoQualityPrioritization;
[self.photoOutput capturePhotoWithSettings:photoSettings delegate:photoCaptureDelegate];
...
```
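One thing worth ruling out: lastObject only yields the 24MP entry if the array happens to be ordered that way. A sketch that selects the 24MP dimensions by value instead (Swift, assuming the active format reports {5712, 4284} as in the post):

```swift
import AVFoundation

// Pick the 24MP entry explicitly rather than by array position; fall back
// to the largest supported dimensions if 5712x4284 isn't listed.
func select24MPDimensions(for device: AVCaptureDevice) -> CMVideoDimensions? {
    let dims = device.activeFormat.supportedMaxPhotoDimensions
    return dims.first { $0.width == 5712 && $0.height == 4284 }
        ?? dims.max { $0.width * $0.height < $1.width * $1.height }
}
```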
Replies: 1 · Boosts: 0 · Views: 590 · Mar ’25
Setting the capture device color space to Apple Log does not work
I set the device format and color space to Apple Log and turn off HDR, so why is the movie output still in HDR format rather than ProRes Log? Full runnable demo here: https://github.com/SpaceGrey/ColorSpaceDemo

```swift
session.sessionPreset = .inputPriority

// get the back camera
let deviceDiscoverySession = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.builtInWideAngleCamera],
    mediaType: .video,
    position: .back)
backCamera = deviceDiscoverySession.devices.first!
try! backCamera.lockForConfiguration()
backCamera.automaticallyAdjustsVideoHDREnabled = false
backCamera.isVideoHDREnabled = false
let formats = backCamera.formats
let appleLogFormat = formats.first { format in
    format.supportedColorSpaces.contains(.appleLog)
}
print(appleLogFormat!.supportedColorSpaces.contains(.appleLog))
backCamera.activeFormat = appleLogFormat!
backCamera.activeColorSpace = .appleLog
print("colorspace is Apple Log \(backCamera.activeColorSpace == .appleLog)")
backCamera.unlockForConfiguration()

do {
    let input = try AVCaptureDeviceInput(device: backCamera)
    session.addInput(input)
} catch {
    print(error.localizedDescription)
}

// add output
output = AVCaptureMovieFileOutput()
session.addOutput(output)
let connection = output.connection(with: .video)!
print(output.outputSettings(for: connection))
/*
["AVVideoWidthKey": 1920, "AVVideoHeightKey": 1080,
 "AVVideoCodecKey": apch, <----- ProRes has been enabled.
 "AVVideoCompressionPropertiesKey": {
     AverageBitRate = 220029696;
     ExpectedFrameRate = 30;
     PrepareEncodedSampleBuffersForPaddedWrites = 1;
     PrioritizeEncodingSpeedOverQuality = 0;
     RealTime = 1;
 }]
*/
previewSource = DefaultPreviewSource(session: session)
queue.async {
    self.session.startRunning()
}
```
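One possibility worth checking — an assumption, not a confirmed diagnosis: by default, AVCaptureSession manages the device's color space itself, and adding inputs/outputs can reset activeColorSpace behind your back. Opting out and reapplying Apple Log after the session graph is built would look like this (continuing the variables from the snippet above):

```swift
// Stop the session from overriding a manually chosen color space; with the
// default (true), the session may reset activeColorSpace when inputs or
// outputs are added.
session.automaticallyConfiguresCaptureDeviceForWideColor = false

// Then (re)apply Apple Log *after* inputs and outputs are attached.
try! backCamera.lockForConfiguration()
backCamera.activeColorSpace = .appleLog
backCamera.unlockForConfiguration()
```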
Replies: 1 · Boosts: 0 · Views: 423 · Mar ’25
Alternative for crashing API MPMediaItemArtwork
When setting the now-playing info for playing media in MPNowPlayingInfoCenter, we can set artwork. But it seems the Apple API for creating the artwork crashes on iOS 18 (FB15145734). On iOS 17 this gave the warning that the completion handler was not run on the main thread. I've sought help here: https://stackoverflow.com/questions/78989543/swift-data-race-with-appkit-mpmediaitemartwork-function/78990231?noredirect=1#comment139277425_78990231 but it seems that it's not possible to override the completion handler, and therefore it's up to Apple to fix this issue.

```swift
.task {
    await MainActor.run {
        let nowPlayingInfoCenter = MPNowPlayingInfoCenter.default()
        var nowPlayingInfo = [String: Any]()
        let image = NSImage(named: "image")!

        // warning: data race detected: @MainActor function at
        // MPMediaItemArtwork/ContentView.swift:22 was not called on the main thread
        nowPlayingInfo[MPMediaItemPropertyArtwork] = MPMediaItemArtwork(boundsSize: image.size, requestHandler: { _ in
            // Not on main thread here!
            return image
        })
        nowPlayingInfoCenter.nowPlayingInfo = nowPlayingInfo
    }
}
```

I'm wondering if there is an alternative method to set the now-playing artwork?
Replies: 4 · Boosts: 0 · Views: 920 · Feb ’25
Multiview HLS with HDR
I have an HDR10+ encoded video that, if loaded as a .mov, plays back on the Apple Vision Pro. But when that video is encoded using the latest (1.23b) Apple HLS tools to generate an fMP4, the resulting m3u8 cannot be played back on the Apple Vision Pro; I only get a "Cannot Open" error. To generate the m3u8, I'm just calling mediafilesegmenter (with -iso-fragmented) and then variantplaylistcreator. This completes with no errors, but the m3u8 plays back on the Mac using VLC and not on the Apple Vision Pro. The relevant part of the m3u8 is:

```
#EXT-X-STREAM-INF:AVERAGE-BANDWIDTH=40022507,BANDWIDTH=48883974,VIDEO-RANGE=PQ,CODECS="ec-3,hvc1.1.60000000.L180.B0",RESOLUTION=4096x4096,FRAME-RATE=24.000,CLOSED-CAPTIONS=NONE,AUDIO="audio1",REQ-VIDEO-LAYOUT="CH-STEREO"
{{url}}
```

Has anyone been able to use the HLS tools to generate fMP4s of MV-HEVC videos with HDR10?
Replies: 3 · Boosts: 2 · Views: 1.1k · Feb ’25
Playing fMP4 Raw Chunks in AVPlayer on iOS
Hello Apple Community,

We are working on a real-time streaming feature where we receive chunks of raw MP4 data through a custom protocol and store them in a buffer (array). Our goal is to use these data chunks to play a continuous video stream in AVPlayer.

What we've tried: a custom URL scheme with AVAssetResourceLoaderDelegate. We implemented a custom URL scheme (customscheme://) to serve the buffered data using AVAssetResourceLoaderDelegate.
* The method shouldWaitForLoadingOfRequestedResource is called only during the initial allocation.
* It doesn't get triggered when new chunks are appended to the buffer.
* Despite appending new data to the buffer, AVPlayer doesn't request further chunks from the delegate.

What we need is a solution where:
* The player continuously fetches data from the buffer as new chunks are added.
* Playback remains smooth and uninterrupted, even with real-time data being appended.
* Ideally, the solution works with AVPlayer while adhering to HLS-like behavior without implementing an HLS server.

Questions:
* Is AVAssetResourceLoaderDelegate the right approach for this use case? If so, how can we ensure shouldWaitForLoadingOfRequestedResource is called whenever new data is available in the buffer?
* Are there alternative APIs or recommended patterns for playing real-time MP4 data chunks in AVPlayer? (One such alternative is sketched below.)
* Would implementing a custom FFmpeg-based player be necessary, or can this be achieved using AVPlayer and its APIs?

We appreciate any guidance, suggestions, or examples that can help us achieve this. Thank you!
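A commonly suggested alternative for this kind of push-style feed is AVSampleBufferDisplayLayer, which accepts CMSampleBuffers directly instead of pulling through a resource loader. A hedged sketch follows; `nextSampleBuffer()` is a hypothetical helper standing in for demuxing the buffered fMP4 chunks into CMSampleBuffers, which this sketch does not implement:

```swift
import AVFoundation

final class ChunkPlayer {
    let displayLayer = AVSampleBufferDisplayLayer()
    private let queue = DispatchQueue(label: "chunk.player")

    // Hypothetical: demux the next buffered fMP4 chunk into a CMSampleBuffer.
    // Real code needs an fMP4 parser (e.g. built on CMBlockBuffer and
    // CMSampleBufferCreate) — not shown here.
    private func nextSampleBuffer() -> CMSampleBuffer? { nil }

    func start() {
        // The layer pulls whenever it can accept more data, which maps
        // naturally onto an append-as-it-arrives buffer.
        displayLayer.requestMediaDataWhenReady(on: queue) { [weak self] in
            guard let self else { return }
            while self.displayLayer.isReadyForMoreMediaData {
                guard let sample = self.nextSampleBuffer() else { return }
                self.displayLayer.enqueue(sample)
            }
        }
    }
}
```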
Replies: 0 · Boosts: 1 · Views: 579 · Jan ’25