Dive into the world of video on Apple platforms, exploring ways to integrate video functionality into your iOS, iPadOS, macOS, tvOS, visionOS, or watchOS app.

Video Documentation

Posts under Video subtopic


UIViewController + AVPlayerLayer + AVPlayer cannot AirPlay to HomePod on tvOS 18.5, but AVPlayerViewController + AVPlayer works
1. Devices: Apple TV HD (Model A1625), tvOS 18.5 (22L572); HomePod (1st gen), OS 18.5 (22L572)
2. Test cases:
2.1 UIViewController + AVPlayerLayer + AVPlayer. Result: cannot AirPlay to HomePod (sample code for UIViewController + AVPlayer)
2.2 AVPlayerViewController + AVPlayer. Result: can AirPlay to HomePod (sample code for AVPlayerViewController + AVPlayer)
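A minimal sketch of the failing configuration described above, assuming a placeholder stream URL (hypothetical). Note that AVPlayer.allowsExternalPlayback defaults to true, so it is worth verifying it has not been disabled somewhere in the real code:

    import UIKit
    import AVFoundation

    final class PlayerLayerViewController: UIViewController {
        // Hypothetical URL; the post's actual asset is not given.
        private let player = AVPlayer(url: URL(string: "https://example.com/stream.m3u8")!)
        private let playerLayer = AVPlayerLayer()

        override func viewDidLoad() {
            super.viewDidLoad()
            // Defaults to true, but a bare AVPlayerLayer adds no route UI,
            // so external-playback state is easy to lose track of.
            player.allowsExternalPlayback = true
            playerLayer.player = player
            playerLayer.frame = view.bounds
            view.layer.addSublayer(playerLayer)
            player.play()
        }
    }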
Replies: 0 · Boosts: 0 · Views: 193 · Aug ’25
iOS 18 Picture-in-Picture dynamically changes UIScreen.main.bounds on iPhone 11 causing keyboard clipping and layout issues
When creating and activating an AVPictureInPictureController on an iPhone 11 running iOS 18.x, the main screen viewport unexpectedly changes from the native 375×812 logical points to 414×896 points (the size of a different device such as the iPhone XR). Additionally, a separate UIWindow with a zero CGRect is created. This causes UIKit layout calculations, especially those related to the keyboard and safeAreaInsets, to malfunction, resulting in keyboard clipping and general UI misalignment. The problem appears tied specifically to iOS 18+ and does not manifest consistently on earlier iOS versions or other devices.

Steps to Reproduce:
1. Run the app on an iPhone 11 with iOS 18.x (including the latest minor updates).
2. Instantiate and start an AVPictureInPictureController during app runtime.
3. Observe that UIScreen.main.bounds changes from 375×812 (native iPhone 11 size) to 414×896.
4. Notice a secondary UIWindow appears with frame CGRectZero.
5. Interact with text inputs that cause the keyboard to appear. The keyboard is clipped or partially obscured, causing UI usability issues. Layout elements dependent on UIScreen.main.bounds or safeAreaInsets behave incorrectly.

Expected Behavior: UIScreen.main.bounds should not change when the PiP controller is created; the coordinate space and viewport should remain accurate to the device's native dimensions. The keyboard and UI layout should adjust properly with respect to the actual on-screen size and safe areas.

Actual Behavior: UIScreen.main.bounds changes to 414×896 upon PiP controller creation on iPhone 11 with iOS 18+. A zero-rect secondary UIWindow is created, impacting layout queries. Keyboard clipping and UI layout bugs occur, and the app viewport and coordinate space are inconsistent with the device hardware.
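A small sketch for pinning down the reported measurement, assuming an already-configured AVPlayerLayer; it simply logs UIScreen.main.bounds before and after the controller is created:

    import UIKit
    import AVKit

    func makePiPController(for playerLayer: AVPlayerLayer) -> AVPictureInPictureController? {
        guard AVPictureInPictureController.isPictureInPictureSupported() else { return nil }
        // Expected 375x812 on the reporter's device before PiP is involved.
        print("bounds before:", UIScreen.main.bounds)
        let controller = AVPictureInPictureController(playerLayer: playerLayer)
        // The report says this becomes 414x896 on iOS 18.
        print("bounds after:", UIScreen.main.bounds)
        print("connected scenes:", UIApplication.shared.connectedScenes.count)
        return controller
    }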
Replies: 0 · Boosts: 0 · Views: 163 · Jul ’25
AVSampleBufferDisplayLayer drops frames when playing uncompressed video with more than 3 B-frames
When using AVSampleBufferDisplayLayer to play uncompressed H.264 and H.265 video with more than 7 B-frames, frame drops occur. The more B-frames there are, the more noticeable the drops become, for example with 15 B-frames. Use FFmpeg to transcode a video file with visible timestamps and frame numbers (x264 or x265):

    ffmpeg -i test.mp4 -vf "drawtext=fontsize=45:text=%{pts} %{n}:y=400" -c:v libx264 -x264-params "bframes=15:b-adapt=0" -crf 30 -y x264_bf15.mp4
    ffmpeg -i test.mp4 -vf "drawtext=fontsize=45:text=%{pts} %{n}:y=400" -c:v libx265 -x265-params "bframes=15:b-adapt=0" -crf 30 -y x265_bf15.mp4

Use the demo player from this repository to reproduce the issue: https://github.com/msfrms/CustomPlayer. Frame drops can be observed, and the following log appears in the device console:

    mediaserverd <<<< IQ-CA >>>> piqca_gmstats_dump: FIQCA(0x1266f4000) recent frames: enqueued: 184, displayed: 138, dropped: 42, flushed: 0, evicted: 3, >16ms late: 2

PS: I was using an iPhone 11 on iOS 14.6 to reproduce this issue. May I ask why frame drops occur in this case? Is there any configuration or API usage change that could help fix the frame drops? Many thanks!
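Not a confirmed fix, but one configuration worth checking in this setup: AVSampleBufferDisplayLayer schedules presentation against its controlTimebase, and samples must be enqueued in decode order; with deep B-frame pyramids the layer needs a running timebase and enough queued frames ahead of it to reorder. A sketch:

    import AVFoundation
    import CoreMedia

    func attachTimebase(to layer: AVSampleBufferDisplayLayer, startTime: CMTime) {
        // Drive presentation from an explicit timebase so the layer can
        // reorder frames; samples must still be enqueued in decode order.
        var timebase: CMTimebase?
        CMTimebaseCreateWithSourceClock(allocator: kCFAllocatorDefault,
                                        sourceClock: CMClockGetHostTimeClock(),
                                        timebaseOut: &timebase)
        if let timebase {
            CMTimebaseSetTime(timebase, time: startTime)
            CMTimebaseSetRate(timebase, rate: 1.0)
            layer.controlTimebase = timebase
        }
    }

    func enqueue(_ sampleBuffer: CMSampleBuffer, on layer: AVSampleBufferDisplayLayer) {
        // Only enqueue while the layer is asking for data; overrunning the
        // queue with high B-frame counts is one plausible source of drops.
        if layer.isReadyForMoreMediaData {
            layer.enqueue(sampleBuffer)
        }
    }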
Replies: 2 · Boosts: 1 · Views: 200 · Jul ’25
AVAssetReaderOutput.Provider Missing symbols
Recurring crash on launch of any app that uses the new sourceVideoTrackProvider.next():

    dyld[41966]: Symbol not found: _$sSo19AVAssetReaderOutputC12AVFoundationE8ProviderC4nextxSgyYaKFTjTu
    Referenced from: <79AA2BE0-A6B4-32F5-A804-E84BBE5D1AEA> /Users/<username>/Library/Developer/Xcode/DerivedData/TrackProviderCrash-bbbhjptcxnmfdcackxtpucnunxyc/Build/Products/Debug-maccatalyst/TrackProviderCrash.app/Contents/MacOS/TrackProviderCrash.debug.dylib
    Expected in: <1B847AF9-7973-3B28-95C2-09E73F6DD50B> /usr/lib/swift/libswiftAVFoundation.dylib

Can be reproduced with the current Xcode Beta 4 by running this sample on Mac Catalyst and macOS: https://developer.apple.com/documentation/AVFoundation/converting-projected-video-to-apple-projected-media-profile

The crash goes away if you comment out lines 154-158 and 164-170, which are:

    while let sampleBuffer = try await sourceVideoTrackProvider.next() { /* other code */ }

Can also be reproduced if you add the code below to a Mac Catalyst project:

    import AVKit

    let asset: AVURLAsset = .init(url: Bundle.main.url(forResource: "SomeVideo.mp4", withExtension: nil)!)
    let videoReader = try! AVAssetReader(asset: asset)
    let videoTracks = try! await asset.loadTracks(withMediaCharacteristic: .visual)
    // Get the side-by-side video track.
    let videoTrack = videoTracks.first!
    let videoInputTrack = AVAssetReaderTrackOutput(track: videoTrack, outputSettings: nil)
    let sourceVideoTrackProvider: AVAssetReaderOutput.Provider<CMReadySampleBuffer<CMSampleBuffer.DynamicContent>> =
        videoReader.outputProvider(for: videoInputTrack)

    // Comment out this loop and the crash goes away:
    while let sb = try! await sourceVideoTrackProvider.next() {
    }
Replies: 1 · Boosts: 0 · Views: 586 · Jul ’25
PHPickerViewController Not Offering public.hevc UTI for a Known HEVC Video
I'm working on an app where a user needs to select a video from their Photos library, and I need to get the original, unmodified HEVC (H.265) data stream to preserve its encoding.

The Problem

I have confirmed that my source videos are HEVC. I can record a new video with my iPhone 15 Pro Max camera set to "High Efficiency," export the "Unmodified Original" from Photos on my Mac, and verify that the codec is MPEG-H Part 2/HEVC (H.265). However, when I select that exact same video in my app using PHPickerViewController, the itemProvider does not list public.hevc as an available type identifier. This forces me to fall back to a generic movie type, which results in the system providing me with a transcoded H.264 version of the video. Here is the debug output from my app after selecting a known HEVC video:

    ⚠️ 'public.hevc' not found. Falling back to generic movie type (likely H.264).

What I've Tried

My code explicitly checks for the public.hevc identifier in the registeredTypeIdentifiers array. Since it's not found, my HEVC-specific logic is never triggered. Here is a minimal version of my PHPickerViewControllerDelegate implementation:

    import UniformTypeIdentifiers

    // ... inside the Coordinator class ...
    func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
        picker.dismiss(animated: true)
        guard let result = results.first else { return }
        let itemProvider = result.itemProvider
        let hevcIdentifier = "public.hevc"
        let identifiers = itemProvider.registeredTypeIdentifiers
        print("Available formats from itemProvider: \(identifiers)")

        if identifiers.contains(hevcIdentifier) {
            print("✅ HEVC format found, requesting raw data...")
            itemProvider.loadDataRepresentation(forTypeIdentifier: hevcIdentifier) { (data, error) in
                // ... process H.265 data ...
            }
        } else {
            print("⚠️ 'public.hevc' not found. Falling back to generic movie type (likely H.264).")
            itemProvider.loadFileRepresentation(forTypeIdentifier: UTType.movie.identifier) { url, error in
                // ... process H.264 fallback ...
            }
        }
    }

My Environment

Device: iPhone 15 Pro Max
iOS Version: iOS 18.5
Xcode Version: 16.2

My Questions

1. Are there specific conditions (e.g., the video being HDR/Dolby Vision, Cinematic, or stored in iCloud) under which PHPickerViewController's itemProvider would intentionally not offer the public.hevc type identifier, even for an HEVC video?
2. What is the definitive, recommended API sequence to guarantee that I receive the original, unmodified data stream for a video asset, ensuring that no transcoding to H.264 occurs during the process?

Any insight into why public.hevc might be missing from the registeredTypeIdentifiers for a known HEVC asset would be greatly appreciated. Thank you.
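A hedged sketch of one route to the untranscoded bytes (an assumption, not a confirmed answer to the post): initialize the picker with the shared photo library so results carry an assetIdentifier, then copy the original PHAssetResource, which sidesteps the item provider entirely. This path requires photo-library read authorization:

    import PhotosUI
    import Photos

    // Configure the picker so results include asset identifiers.
    var config = PHPickerConfiguration(photoLibrary: .shared())
    config.filter = .videos

    func copyOriginal(for result: PHPickerResult) {
        guard let id = result.assetIdentifier,
              let asset = PHAsset.fetchAssets(withLocalIdentifiers: [id], options: nil).firstObject,
              // .video is the original video data; other resource types exist for Live Photos.
              let resource = PHAssetResource.assetResources(for: asset).first(where: { $0.type == .video })
        else { return }

        let destination = FileManager.default.temporaryDirectory
            .appendingPathComponent(resource.originalFilename)
        let options = PHAssetResourceRequestOptions()
        options.isNetworkAccessAllowed = true // fetch from iCloud if needed

        PHAssetResourceManager.default().writeData(for: resource, toFile: destination,
                                                   options: options) { error in
            if let error { print("Copy failed:", error) }
            else { print("Original written to", destination.path) }
        }
    }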
Replies: 3 · Boosts: 0 · Views: 110 · Jul ’25
Play videos in WebM format on iOS devices
I would like to play videos in WebM format on my iPhone. I understand that it is basically impossible to play WebM videos natively on an iPhone, but is there any way to display them? I would like to know if there is an official Swift SDK or development kit released by Apple. Or if there are any third-party products, please let me know.
Replies: 1 · Boosts: 0 · Views: 266 · Jul ’25
Final Cut Pro Workflow Extension Drag and Drop to Timeline Not Working Despite Proper FCPXML Clip Structure
Our Final Cut Pro workflow extension, built with the ProExtensionHost framework, uses an NSPasteboardItemDataProvider with multi-version FCPXML support (1.9, 1.10, 1.13) and proper relative-path UIDs for Motion templates. We've implemented a clip-wrapper approach with placeholder assets and elements containing effects to enable direct timeline drag functionality. However, drag and drop from our workflow extension directly to the timeline is still not working despite a proper element structure in our FCPXML. Our implementation creates valid clip elements with effects applied, but the Final Cut Pro timeline doesn't accept them during drag operations.

Steps to Reproduce:
1. Create a Final Cut Pro workflow extension using the ProExtensionHost framework with an NSPasteboardItemDataProvider implementation.
2. Generate FCPXML with a proper element structure.

Expected Result: The clip should be accepted by the timeline and the effect applied from the workflow extension.
Actual Result: The timeline rejects the drag operation from the ProExtensionHost-based workflow extension.

Question: Are there additional requirements or ProExtensionHost API calls needed beyond a standard NSPasteboardItemDataProvider for timeline drag functionality from a Final Cut Pro workflow extension?
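For reference, a minimal sketch of the pasteboard-provider shape being described (the class name and the placeholder XML are illustrative; whether FCP's timeline requires more than this is exactly the open question):

    import AppKit

    final class FCPXMLDragProvider: NSObject, NSPasteboardItemDataProvider {
        let fcpxml: String // assumed to be generated elsewhere

        init(fcpxml: String) { self.fcpxml = fcpxml }

        func pasteboard(_ pasteboard: NSPasteboard?, item: NSPasteboardItem,
                        provideDataForType type: NSPasteboard.PasteboardType) {
            // Lazily hand FCP the XML when it asks for one of our registered types.
            item.setString(fcpxml, forType: type)
        }
    }

    // Registering the custom FCPXML types on a drag item:
    let item = NSPasteboardItem()
    let provider = FCPXMLDragProvider(fcpxml: "<fcpxml version=\"1.10\">...</fcpxml>")
    item.setDataProvider(provider, forTypes: [
        NSPasteboard.PasteboardType("com.apple.finalcutpro.xml.v1-10"),
        NSPasteboard.PasteboardType("com.apple.finalcutpro.xml"),
    ])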
Replies: 0 · Boosts: 0 · Views: 122 · Jul ’25
HDR video & screen brightness
When I play an HDR video in the iPhone Photos app, I can see the HDR effect clearly. But if the HDR video plays continuously for more than 30-40 minutes, the HDR effect disappears and the brightness is compressed to the SDR range. This happens on any iPhone; depending on the model it may take 20-30 minutes, 30-40 minutes, or even just a few minutes, as on the iPhone 12 mini. Similarly, if I use AVPlayer to play and preview an HDR video for more than 30-40 minutes, the HDR effect disappears and the screen brightness dims. The currentEDRHeadroom also gradually decreases to 1. Note: test with an HDR video longer than 1 hour; if the video is short, loop it. My question is how to avoid losing the HDR effect after 30-40 minutes when I use CAMetalLayer to render HDR video.
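A small sketch for confirming the reported behavior (an observation aid, not a fix): sample the screen's EDR headroom over time while a CAMetalLayer renders HDR content. The property names are the real UIKit/QuartzCore APIs; the once-a-minute cadence is arbitrary:

    import UIKit
    import QuartzCore

    func startEDRMonitor(metalLayer: CAMetalLayer) -> Timer {
        // Opt the layer in to EDR rendering.
        metalLayer.wantsExtendedDynamicRangeContent = true

        // Log headroom once a minute; the report says currentEDRHeadroom
        // decays toward 1.0 (pure SDR) after 20-40 minutes of playback.
        return Timer.scheduledTimer(withTimeInterval: 60, repeats: true) { _ in
            let screen = UIScreen.main
            print("potential headroom:", screen.potentialEDRHeadroom,
                  "current headroom:", screen.currentEDRHeadroom)
        }
    }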
Replies: 1 · Boosts: 0 · Views: 106 · Jul ’25
After playing an HDR video on iPhone for a while, the HDR effect disappears and the screen brightness decreases
When I use AVPlayer to obtain the video frame CVPixelBufferRefs of an HDR video and use AVSampleBufferDisplayLayer to display them on screen, after a period of time the HDR video content and the screen gradually darken, losing the HDR effect.

Steps to reproduce:
1. Create an AVPlayer to loop an HDR video, specifying the video frame format as kCVPixelFormatType_420YpCbCr10BiPlanarVideoRange.
2. Create a timer to get the video frame CVPixelBufferRef at 30 frames per second.
3. Use AVSampleBufferDisplayLayer to display the CVPixelBufferRef on the screen.
4. Don't operate the phone; wait for a period of time (such as 40 minutes). The HDR effect disappears and the screen darkens.

Note: You need to use an iPhone on iOS 18.5 or below, and you need to ensure that the HDR video plays in a loop, i.e. that the screen continues to display HDR content. The wait varies by device, from 20 to 40 minutes. In the iPhone Photos app, the same problem occurs after looping an HDR video for a long time.

Expected Results: When rendering HDR content for a long time, the HDR effect is always preserved, and neither the HDR content nor the screen darkens.
Current Results: After about 20-40 minutes, the HDR effect disappears and the screen darkens.
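A condensed sketch of the pipeline the post describes, assuming an AVPlayerItemVideoOutput created with the 10-bit pixel format and already added to the looping item; each 30 fps timer tick copies a pixel buffer, wraps it in a CMSampleBuffer, and enqueues it for immediate display:

    import AVFoundation
    import QuartzCore

    func displayLatestFrame(from output: AVPlayerItemVideoOutput,
                            on layer: AVSampleBufferDisplayLayer) {
        let itemTime = output.itemTime(forHostTime: CACurrentMediaTime())
        guard output.hasNewPixelBuffer(forItemTime: itemTime),
              let pixelBuffer = output.copyPixelBuffer(forItemTime: itemTime,
                                                       itemTimeForDisplay: nil) else { return }

        var formatDesc: CMVideoFormatDescription?
        CMVideoFormatDescriptionCreateForImageBuffer(allocator: kCFAllocatorDefault,
                                                     imageBuffer: pixelBuffer,
                                                     formatDescriptionOut: &formatDesc)
        guard let formatDesc else { return }

        var timing = CMSampleTimingInfo(duration: CMTime(value: 1, timescale: 30),
                                        presentationTimeStamp: itemTime,
                                        decodeTimeStamp: .invalid)
        var sampleBuffer: CMSampleBuffer?
        CMSampleBufferCreateReadyWithImageBuffer(allocator: kCFAllocatorDefault,
                                                 imageBuffer: pixelBuffer,
                                                 formatDescription: formatDesc,
                                                 sampleTiming: &timing,
                                                 sampleBufferOut: &sampleBuffer)
        guard let sampleBuffer else { return }

        // With no control timebase set on the layer, mark each frame
        // to display as soon as it is enqueued.
        if let attachments = CMSampleBufferGetSampleAttachmentsArray(sampleBuffer,
                                                                     createIfNecessary: true),
           CFArrayGetCount(attachments) > 0 {
            let dict = unsafeBitCast(CFArrayGetValueAtIndex(attachments, 0),
                                     to: CFMutableDictionary.self)
            CFDictionarySetValue(dict,
                Unmanaged.passUnretained(kCMSampleAttachmentKey_DisplayImmediately).toOpaque(),
                Unmanaged.passUnretained(kCFBooleanTrue).toOpaque())
        }
        if layer.isReadyForMoreMediaData { layer.enqueue(sampleBuffer) }
    }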
Replies: 4 · Boosts: 0 · Views: 557 · Jul ’25
Live HLS Stream Not Playing After Window Resize in Vision Pro
Hi everyone, I'm developing a visionOS app for Apple Vision Pro, and I've encountered an issue related to window resizing at runtime when using AVPlayer to play a live HLS stream.

✅ What I'm Trying to Do
- Play a live HLS stream (from Wowza) inside my app using AVPlayer.
- Support resizing the immersive window using Vision Pro's built-in runtime scaling gesture.
- The stream works fine at the default window size when the app launches.

❌ Problem
If I resize the app's window at runtime (using the Vision Pro pinch-drag gesture) and then try to start the stream, it does not play. Instead, it just shows the "Loading live stream..." state and never proceeds to playback. This issue only occurs after resizing the window; if I don't resize, the stream works perfectly every time.

🧪 What I've Tried
- Verified the HLS URL; it's working and plays fine in Safari and in the app before resizing.
- Set .automaticallyWaitsToMinimizeStalling = false on AVPlayer.
- Observed that .status on AVPlayerItem never reaches .readyToPlay after resizing.
- Tried to force the window size back using UIWindowScene.requestGeometryUpdate(...), but the behavior persists.
Replies: 2 · Boosts: 0 · Views: 188 · Jul ’25
FCPX Workflow Extension: Drag & Drop FCPXML with Titles to Timeline Not Working Despite Following File Promise Guidelines
I'm developing a Final Cut Pro X workflow extension that transcribes audio and creates a text output. I need to allow users to drag this text directly from my extension into FCPX's timeline as titles.

Current Implementation:
- Using NSFilePromiseProvider as per Apple's guidelines for drag and drop
- Generating valid FCPXML (v1.10) with proper structure: a complete resources section with format and asset references, event and project hierarchy, asset clip with connected title elements, and proper timing and duration calculations
- Supporting multiple pasteboard types: com.apple.finalcutpro.xml.v1-10, com.apple.finalcutpro.xml.v1-9, com.apple.finalcutpro.xml

What's Working:
- Drag operation initiates correctly
- File promise provider is set up properly
- FCPXML generation is successful (verified content)
- All required pasteboard types are registered
- Logging confirms data is being requested and provided

Current Pasteboard Types Offered:
- com.apple.NSFilePromiseItemMetaData
- com.apple.pasteboard.promised-file-name
- com.apple.pasteboard.promised-suggested-file-name
- com.apple.pasteboard.promised-file-content-type
- Apple files promise pasteboard type
- com.apple.pasteboard.NSFilePromiseID
- com.apple.pasteboard.promised-file-url
- com.apple.finalcutpro.xml.v1-10
- com.apple.finalcutpro.xml.v1-9
- com.apple.finalcutpro.xml

What additional requirements or considerations are needed to make FCPX accept the dragged FCPXML content? Are there specific requirements for workflow extensions regarding drag-and-drop operations with titles that aren't documented? Any insights, especially from those who have implemented similar functionality in FCPX workflow extensions, would be greatly appreciated.

Technical Details:
- macOS Version: 15.5 (24F74)
- FCPX Version: 11.1.1
- Extension built with SwiftUI and AppKit integration
- Using NSFilePromiseProvider and NSPasteboardItemDataProvider
- Full pasteboard type support for FCPXML versions
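A minimal sketch of the file-promise setup being described (the class name, file name, and placeholder XML are illustrative; the open question of what else FCPX needs is unaffected):

    import AppKit

    final class TitlePromiseDelegate: NSObject, NSFilePromiseProviderDelegate {
        let fcpxml: String // assumed to be generated by the transcription step

        init(fcpxml: String) { self.fcpxml = fcpxml }

        func filePromiseProvider(_ provider: NSFilePromiseProvider,
                                 fileNameForType fileType: String) -> String {
            "Transcript.fcpxml"
        }

        func filePromiseProvider(_ provider: NSFilePromiseProvider,
                                 writePromiseTo url: URL,
                                 completionHandler: @escaping (Error?) -> Void) {
            do {
                try fcpxml.write(to: url, atomically: true, encoding: .utf8)
                completionHandler(nil)
            } catch {
                completionHandler(error)
            }
        }
    }

    // Building the drag item; "com.apple.finalcutpro.xml" is the generic
    // FCPXML type the post lists among its offered pasteboard types.
    let delegate = TitlePromiseDelegate(fcpxml: "<fcpxml version=\"1.10\">...</fcpxml>")
    let promise = NSFilePromiseProvider(fileType: "com.apple.finalcutpro.xml",
                                        delegate: delegate)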
Replies: 0 · Boosts: 0 · Views: 129 · Jul ’25
AVFoundation — MJPEG Custom-Resolution UVC Stream Not Working on macOS
Hello, I'm Soonwon. We're currently developing a UVC camera device and trying to stream MJPEG video via AVFoundation on macOS. However, we're running into a problem with custom resolutions. When we try to use AVFoundation on macOS to capture MJPEG video at 1000x6000, the stream is not accepted or simply doesn't work. Lower resolutions work fine. (Interestingly, using the same device on iPadOS, we can capture the 1000x6000 MJPEG stream successfully by using AVCaptureSessionPresetInputPriority.)

- Is there any way to receive custom-resolution MJPEG streams (like 1000x6000) from a UVC device using AVFoundation on macOS?
- Are there specific session presets, entitlements, or known limitations that affect MJPEG handling at custom resolutions on macOS?
- Does macOS handle MJPEG differently from iPadOS in AVFoundation?

Any insight or guidance would be greatly appreciated. Thank you!

    NSError *error = nil;
    if ([selectedDevice lockForConfiguration:&error]) {
        [session beginConfiguration];
        session.sessionPreset = AVCaptureSessionPresetHigh;

        BOOL foundFormat = NO;
        for (AVCaptureDeviceFormat *format in selectedDevice.formats) {
            CMVideoDimensions dims = CMVideoFormatDescriptionGetDimensions(format.formatDescription);
            FourCharCode pixelFormat = CMFormatDescriptionGetMediaSubType(format.formatDescription);
            if (dims.width == 1000 && dims.height == 6000) {
                selectedDevice.activeFormat = format;
                foundFormat = YES;
                break;
            }
        }
        if (!foundFormat) {
            NSLog(@"Failed to find 1000x6000 format");
            [session commitConfiguration];
            return NO;
        }

        AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:selectedDevice error:&error];
        if (error || ![session canAddInput:input]) {
            NSLog(@"Failed to add video input: %@", error.localizedDescription);
            [session commitConfiguration];
            return NO;
        }
        [session addInput:input];

        AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
        output.alwaysDiscardsLateVideoFrames = YES;
        output.videoSettings = @{ (NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange) };
        [output setSampleBufferDelegate:delegate queue:queue];
        if ([session canAddOutput:output]) {
            [session addOutput:output];
        }

        [session commitConfiguration];
        [selectedDevice unlockForConfiguration];
    } else {
        NSLog(@"Failed to lock device for configuration: %@", error.localizedDescription);
    }
    // start~
Replies: 1 · Boosts: 0 · Views: 361 · Jul ’25
Is it possible to query the TV's current refresh rate?
I'm working on a media app that would like to be able to tell whether the TV connected to tvOS is running at 59.94 Hz or 60.00 Hz, so it can optimize a video stream. It looks like the best I can currently do is to check whether the user has Match Content Rate enabled and, based on that, guess which rate their TV might be in when calling displayManager.preferredDisplayCriteria to change video modes. That's not ideal, because not all TVs support both of these rates, and my request for 59.94 might end up as 60 and vice versa. I dug around and can't find any available method in UIScreen to get this info. The odd thing is, the data is right there in currentMode when I look in the debugger, but it seems to be in a private or undocumented class. Is there any way to get at it?
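For what it's worth, a sketch of the public signals that are available on tvOS; neither distinguishes 59.94 from 60.00, which is the poster's core problem, since maximumFramesPerSecond returns an integer and the mode actually chosen by the TV is not exposed:

    import UIKit
    import AVKit
    import AVFoundation

    func logDisplayInfo(for window: UIWindow, asset: AVAsset) {
        // Integer frame rate only: 59.94 and 60.00 both report as 60.
        print("maximumFramesPerSecond:", window.screen.maximumFramesPerSecond)

        // tvOS: you can request a matching mode for an asset, but you
        // cannot read back which refresh rate the display settled on.
        if let displayManager = window.avDisplayManager {
            print("matching enabled:", displayManager.isDisplayCriteriaMatchingEnabled)
            displayManager.preferredDisplayCriteria = asset.preferredDisplayCriteria
        }
    }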
Replies: 0 · Boosts: 0 · Views: 171 · Jun ’25
AsyncPublisher for AVPlayerItem not working
Hello, I'm trying to subscribe to AVPlayerItem status updates using Combine and its bridge to Swift Concurrency, .values. This is my sample code:

    struct ContentView: View {
        @State var player: AVPlayer?
        @State var loaded = false

        var body: some View {
            VStack {
                if let player {
                    Text("loading status: \(loaded)")
                    Spacer()
                    VideoPlayer(player: player)
                    Button("Load") {
                        Task {
                            let item = AVPlayerItem(
                                url: URL(string: "https://sample-videos.com/video321/mp4/360/big_buck_bunny_360p_5mb.mp4")!
                            )
                            player.replaceCurrentItem(with: item)
                            let publisher = player.publisher(for: \.status)
                            for await status in publisher.values {
                                print(status.rawValue)
                                if status == .readyToPlay {
                                    loaded = true
                                    break
                                }
                            }
                            print("we are out")
                        }
                    }
                } else {
                    Text("No video selected")
                }
            }
            .task {
                player = AVPlayer()
            }
        }
    }

After I click on the "Load" button it prints out 0 (the initial status of .unknown) and nothing after, even when the video is fully loaded. At the same time, this works as expected (loading status is set to true):

    struct ContentView: View {
        @State var player: AVPlayer?
        @State var loaded = false
        @State var cancellable: AnyCancellable?

        var body: some View {
            VStack {
                if let player {
                    Text("loading status: \(loaded)")
                    Spacer()
                    VideoPlayer(player: player)
                    Button("Load") {
                        Task {
                            let item = AVPlayerItem(
                                url: URL(string: "https://sample-videos.com/video321/mp4/360/big_buck_bunny_360p_5mb.mp4")!
                            )
                            player.replaceCurrentItem(with: item)
                            let stream = AsyncStream { continuation in
                                cancellable = item.publisher(for: \.status)
                                    .sink {
                                        if $0 == .readyToPlay {
                                            continuation.yield($0)
                                            continuation.finish()
                                        }
                                    }
                            }
                            for await _ in stream {
                                loaded = true
                                cancellable?.cancel()
                                cancellable = nil
                                break
                            }
                        }
                    }
                } else {
                    Text("No video selected")
                }
            }
            .task {
                player = AVPlayer()
            }
        }
    }

Is this a bug or something?
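One detail worth noting between the two snippets (an observation, not a confirmed diagnosis): the first observes AVPlayer.status via player.publisher(for: \.status), while the working one observes AVPlayerItem.status on the item. It is the item's status that tracks readiness of the media, so observing it with the same .values bridge is worth trying first. A minimal sketch:

    import AVFoundation
    import Combine

    func waitUntilReady(_ item: AVPlayerItem) async {
        // Observing the item's status, not the player's.
        for await status in item.publisher(for: \.status).values {
            print("item status:", status.rawValue)
            if status == .readyToPlay { break }
        }
    }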
Replies: 0 · Boosts: 0 · Views: 91 · Jun ’25
VTDecompressionSession consistently drops frames after sync frame
I'm seeking to a specific sync frame in a video file (HEVC, recorded on iPad). When I feed the buffers from that sync frame to VTDecompressionSession, it consistently drops the 2nd, 3rd, and 4th buffer with a kVTVideoDecoderReferenceMissingErr (or no error, but no buffer, on the simulator). If I feed all the buffers from the penultimate sync frame prior to the desired frame, the buffers come out fine, but doing that routinely would create massive overhead. I've tried multiple OS versions, devices, etc.; it seems to be a consistent problem. Here's a sample project with the offending video (disregard memory handling etc.): https://github.com/marcuseckert/vtSample I've filed a radar, FB18228296, but would appreciate any feedback on circumventing or at least detecting this behavior prior to decoding.
Replies: 0 · Boosts: 0 · Views: 67 · Jun ’25
How to request the Video Subscriber SSO entitlement from Apple
Hi all. I'm working on a Single Sign-On feature in my application to let customers sign in to their TV provider. I need to add the Video Subscriber SSO entitlement (com.apple.developer.video-subscriber-single-sign-on) to the app, but I found out that it's a special entitlement; you need to contact Apple to enable it for your account. On https://developer.apple.com/account I navigated to Support -> Contact Us -> Development and Technical -> Entitlements and asked in the email about the missing entitlement (ticket ID 102478794279). The support team couldn't help me; they redirected me to the operations team. I've been waiting for a few months now, but they just tell me to keep waiting. Is there a better way to contact Apple and get the Video Subscriber SSO entitlement efficiently?
Replies: 1 · Boosts: 0 · Views: 76 · Jun ’25
WideCamera consumes more CPU than telephotoCamera
I have been taking images from the iOS video camera feed and have encountered an issue. When you take images from the wideCamera, this consumes about half the phone's CPU. The same is not the case when you take images from the telephotoCamera video stream. Is there a way of disabling the extra processing that is being done?
Replies: 1 · Boosts: 0 · Views: 45 · Jun ’25
Obtain the screen rotation direction in the background
I use ReplayKit for system-level screen recording. I want to determine whether the screen is in landscape mode from the CMSampleBuffer callback, but the CMSampleBuffer does not come with this information. The other APIs related to obtaining the screen orientation are also restricted in the background. I want to know whether the screen rotation direction can be obtained in real time while in the background.
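For broadcast extensions at least, ReplayKit attaches an orientation hint to each video sample buffer; a hedged sketch, assuming that is the system-recording path in question:

    import ReplayKit
    import CoreMedia
    import ImageIO

    func orientation(of sampleBuffer: CMSampleBuffer) -> CGImagePropertyOrientation? {
        // RPVideoSampleOrientationKey is attached by ReplayKit to broadcast
        // sample buffers; its value is a CGImagePropertyOrientation raw value.
        guard let value = CMGetAttachment(sampleBuffer,
                                          key: RPVideoSampleOrientationKey as CFString,
                                          attachmentModeOut: nil) as? NSNumber else {
            return nil
        }
        return CGImagePropertyOrientation(rawValue: value.uint32Value)
    }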
Replies: 1 · Boosts: 0 · Views: 58 · Jun ’25
WideFOV - APMP - Stereo
Does anyone have a template of an Apple Projected Media Profile format description, or a file of a stereo wideFOV video? Use case: I have 2 compatible cameras that I stereo-sync, and I want to move the projection information from the compatible video to the spatial video that combines them. Every version I can come up with crashes the AVP, and when viewing as spatial in Tahoe I just get a black screen.
Replies: 4 · Boosts: 0 · Views: 141 · Jun ’25
How to disable automatic updates to MPNowPlayingInfoCenter from AVPlayer
I’m building a SwiftUI app whose primary job is to play audio. I manage all of the Now Playing metadata and command center manually via the available shared instances:

    MPRemoteCommandCenter.shared()
    MPNowPlayingInfoCenter.default().nowPlayingInfo

In certain parts of the app I also need to display videos, but as soon as I attach another AVPlayer, it automatically pushes its own metadata into the Control Center and overwrites my audio info. What I need: a way to show video inline without ever having that video player update the system’s Now Playing info (or Control Center). In my app, I start by configuring the shared audio session:

    do {
        try AVAudioSession.sharedInstance().setCategory(.playback, mode: .default, options: [
            .allowAirPlay, .allowBluetoothA2DP
        ])
        try AVAudioSession.sharedInstance().setActive(true)
    } catch {
        NSLog("%@", "**** Failed to set up AVAudioSession \(error.localizedDescription)")
    }

and then set the MPRemoteCommandCenter commands and MPNowPlayingInfoCenter nowPlayingInfo as mentioned above. All this works without any issues as long as I only have one AVPlayer in my app. But when I add other AVPlayers to display some videos (and keep the main AVPlayer for the sound), they push undesired updates to MPNowPlayingInfoCenter:

    struct VideoCardView: View {
        @State private var player: AVPlayer
        let videoName: String

        init(player: AVPlayer = AVPlayer(), videoName: String) {
            self.player = player
            self.videoName = videoName
            guard let path = Bundle.main.path(forResource: videoName, ofType: nil) else { return }
            let url = URL(fileURLWithPath: path)
            let item = AVPlayerItem(url: url)
            self.player.replaceCurrentItem(with: item)
        }

        var body: some View {
            VideoPlayer(player: player)
                .aspectRatio(contentMode: .fill)
                .onAppear {
                    player.isMuted = true
                    player.allowsExternalPlayback = false
                    player.actionAtItemEnd = .none
                    player.play()

                    MPNowPlayingInfoCenter.default().nowPlayingInfo = nil
                    MPNowPlayingInfoCenter.default().playbackState = .stopped

                    NotificationCenter.default.addObserver(forName: .AVPlayerItemDidPlayToEndTime,
                                                           object: player.currentItem,
                                                           queue: .main) { notification in
                        guard let finishedItem = notification.object as? AVPlayerItem,
                              finishedItem === player.currentItem else { return }
                        player.seek(to: .zero)
                        player.play()
                    }
                }
                .onDisappear {
                    player.pause()
                }
        }
    }

Which is why I tried adding:

    MPNowPlayingInfoCenter.default().nowPlayingInfo = nil
    MPNowPlayingInfoCenter.default().playbackState = .stopped // or .interrupted, .unknown

But that didn’t work. I also tried making a wrapper around AVPlayerViewController in order to set updatesNowPlayingInfoCenter to false, but that didn’t work either:

    struct CustomAVPlayerView: UIViewControllerRepresentable {
        let player: AVPlayer

        func makeUIViewController(context: Context) -> AVPlayerViewController {
            let vc = AVPlayerViewController()
            vc.player = player
            vc.updatesNowPlayingInfoCenter = false
            vc.showsPlaybackControls = false
            return vc
        }

        func updateUIViewController(_ controller: AVPlayerViewController, context: Context) {
            controller.player = player
        }
    }

Hence any help on how to embed video in SwiftUI without its AVPlayer touching MPNowPlayingInfoCenter would be greatly appreciated. All this was tested on an actual device with iOS 18.4.1 and built with Xcode 16.2 on macOS 15.5.
Replies: 2 · Boosts: 0 · Views: 147 · Jun ’25