Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.

Posts under Media Technologies topic

Each post below lists its replies, boosts, views, and most recent activity.

CarPlay music issues iOS 18.1 (22B5054e)
When CarPlay connects and tries to play music, it plays through the phone's speaker instead of the vehicle's stereo. If I make a phone call and then hang up, the sound is pushed back to the vehicle's stereo speakers. Does anyone have a fix for this? It is very annoying. Sometimes it just switches back to the phone speaker and I have to go through the whole process again.
Replies: 2 · Boosts: 1 · Views: 533 · Activity: Sep ’24
ProRAW to CIRAWFilter to HEIF producing borked HDR results
Following WWDC 2023 "Support HDR images in your app", I'm trying to save 48-megapixel ProRAWs (taken on an iPhone 14 Pro Max) as HDR HEICs to the Photo Library. After processing the ProRAW file using CIRAWFilter, whether I use CIContext.heif10Representation() or convert to a CGImage, then UIImage, and use UIImage.heicData(), I get photos that behave oddly in the Photo Library. They appear too dark, and visibly brighten when first viewed, but more problematic is that the photos brighten a great deal more when you edit them with the Photos editor. This is the behavior when using the itur_2100_PQ color space, but itur_2100_HLG behaves similarly, except that it gets dramatically darker when edited. This behavior occurs whether CIRAWFilter.extendedDynamicRangeAmount is set to 0.0, or 2.0, or not set at all. So what am I doing wrong?

Here is a minimal iOS app -- well, just the ContentView -- that demonstrates the issue. You also need a .dng ProRAW file included in the project directory named test.dng. I'd love to include such a file, but I can't. Be prepared for a multi-second wait when you save the photo.

import SwiftUI
import Photos

struct ContentView: View {
    let context = CIContext()
    let hdrColorSpace = CGColorSpace(name: CGColorSpace.itur_2100_PQ)!

    var body: some View {
        VStack(spacing: 100) {
            Button("Save Photo From CGImage/UIImage") {
                savePhotoFromUIImage()
            }
            Button("Save Photo From CIImage") {
                savePhotoDirectFromCIImage()
            }
        }.padding(60)
    }

    //convert RAW with CIRAWFilter to CIImage, then convert to CGImage, then UIImage, then HEIF
    private func savePhotoFromUIImage() {
        if let ciImage = processRAW(url: Bundle.main.url(forResource: "test", withExtension: "dng")!) {
            guard let outputCGImage = context.createCGImage(ciImage, from: ciImage.extent, format: .RGB10, colorSpace: hdrColorSpace) else { return }
            let uiImage = UIImage(cgImage: outputCGImage)
            if let heicData = uiImage.heicData() {
                saveHEIFPhotoToLibrary(imageData: heicData)
            } else {
                print("Failed to convert UIImage to HEIC")
            }
        }
    }

    //convert RAW with CIRAWFilter to CIImage, then to HEIF
    private func savePhotoDirectFromCIImage() {
        if let ciImage = processRAW(url: Bundle.main.url(forResource: "test", withExtension: "dng")!) {
            do {
                let heif = try context.heif10Representation(of: ciImage, colorSpace: hdrColorSpace)
                saveHEIFPhotoToLibrary(imageData: heif)
            } catch {
                print("Failed to get HEIF representation from CIContext")
            }
        }
    }

    private func processRAW(url: URL) -> CIImage? {
        guard let coreRawFilter = CIRAWFilter(imageURL: url) else { return nil }
        coreRawFilter.extendedDynamicRangeAmount = 2.0 //the issue persists whether this is not set, or set to 0, or set to, say, 2.0
        guard let ciImage = coreRawFilter.outputImage else { return nil }
        return ciImage
    }

    private func saveHEIFPhotoToLibrary(imageData: Data) {
        PHPhotoLibrary.shared().performChanges({
            let creationRequest = PHAssetCreationRequest.forAsset()
            let options = PHAssetResourceCreationOptions()
            creationRequest.addResource(with: .photo, data: imageData, options: options)
        }) { success, error in
            if let error = error {
                print("Error saving photo: \(error.localizedDescription)")
            } else {
                print("Photo saved.")
            }
        }
    }
}
Replies: 0 · Boosts: 1 · Views: 522 · Activity: Jan ’25
[VisionOS Audio] AVAudioPlayerNode occasionally produces loud popping/distortion when playing PCM data
I'm experiencing audio issues while developing for visionOS when playing PCM data through AVAudioPlayerNode.

Issue description:
- Occasionally, the speaker produces loud popping sounds or distorted noise
- This occurs during PCM audio playback using AVAudioPlayerNode
- The issue is intermittent and doesn't happen every time

Technical details:
- Platform: visionOS
- Device: Vision Pro / Simulator
- Audio framework: AVFoundation
- Audio node: AVAudioPlayerNode
- Audio format: PCM

I would appreciate any insights on:
- Common causes of audio distortion with AVAudioPlayerNode
- Recommended best practices for handling PCM playback on visionOS
- Potential configuration issues that might cause this behavior

Has anyone encountered similar issues or found solutions? Any guidance would be greatly appreciated. Thank you in advance!
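Two frequent causes of pops with AVAudioPlayerNode are scheduling buffers whose format doesn't match the node's connection format, and scheduling partially filled buffers. Below is a minimal, hedged sketch of keeping both in lockstep; the PCMPlayer class, the 48 kHz stereo float format, and the sample source are assumptions of mine, not taken from the post.

import AVFoundation

// Minimal sketch: connect the player node with the same format used for every scheduled
// buffer, and always fill buffers completely before scheduling them.
final class PCMPlayer {
    private let engine = AVAudioEngine()
    private let player = AVAudioPlayerNode()
    // Assumed format; replace with the format of your actual PCM stream.
    private let format = AVAudioFormat(standardFormatWithSampleRate: 48_000, channels: 2)!

    func start() throws {
        engine.attach(player)
        // Connecting with an explicit format lets the mixer handle any conversion;
        // scheduling buffers in a different format than the connection is a common
        // source of clicks and distortion.
        engine.connect(player, to: engine.mainMixerNode, format: format)
        try engine.start()
        player.play()
    }

    func schedule(interleavedSamples samples: [Float]) {
        let channels = Int(format.channelCount)
        let frameCount = AVAudioFrameCount(samples.count / channels)
        guard frameCount > 0,
              let buffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: frameCount) else { return }
        buffer.frameLength = frameCount // a partially filled buffer can pop at its boundary
        // The standard format is non-interleaved, so de-interleave into each channel pointer.
        for channel in 0..<channels {
            let destination = buffer.floatChannelData![channel]
            for frame in 0..<Int(frameCount) {
                destination[frame] = samples[frame * channels + channel]
            }
        }
        player.scheduleBuffer(buffer, completionHandler: nil)
    }
}

If the data arrives as 16-bit integer PCM, converting it with AVAudioConverter into the connection format before scheduling serves the same purpose.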
Replies: 2 · Boosts: 1 · Views: 604 · Activity: Jan ’25
macOS Sonoma 'Cannot Decode' HLS Video
I use AVPlayer to play HLS video successfully on macOS Sonoma, but I encountered this error on macOS Sequoia. Please help me:

Error Domain=AVFoundationErrorDomain Code=-11833 'Cannot Decode' UserInfo={NSUnderlyingError=0x600001e57330 {Error Domain=CoreMediaErrorDomain Code=-12906 '(null)'}, NSLocalizedFailureReason=The decoder required for this media cannot be found., AVErrorMediaTypeKey=vide, NSLocalizedDescription=Cannot Decode}

Thanks!
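A small diagnostic sketch that may help narrow this down (my own code, not from the post; the URL is a placeholder): observe the item's status and print the underlying error plus the error log, which usually identifies the variant or codec the decoder is missing.

import AVFoundation

final class PlayerDiagnostics {
    let player: AVPlayer
    private var statusObservation: NSKeyValueObservation?

    init(url: URL) {
        let item = AVPlayerItem(url: url)
        player = AVPlayer(playerItem: item)
        statusObservation = item.observe(\.status, options: [.new]) { item, _ in
            guard item.status == .failed else { return }
            // The nested CoreMediaErrorDomain code (-12906 in the post) and the error log
            // events typically name the stream that cannot be decoded.
            print("Item failed:", item.error ?? "unknown error")
            for event in item.errorLog()?.events ?? [] {
                print("errorLog:", event.errorStatusCode, event.errorComment ?? "")
            }
        }
        player.play()
    }
}

Comparing the codecs of the failing variant between the Sonoma and Sequoia machines (for example with Apple's HLS validation tooling) would then show whether a specific codec or profile is the trigger.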
Replies: 2 · Boosts: 1 · Views: 601 · Activity: Mar ’25
SpeechAnalyzer speech to text wwdc sample app
I am using the sample app from: https://developer.apple.com/videos/play/wwdc2025/277/?time=763

I installed this on an iPhone 15 Pro with iOS 26 beta 1 and was able to get good transcription with it. The app did crash sometimes when transcribing, and I was going to post here with the details. I then installed iOS 26 beta 2 and uninstalled the sample app. Now every time I try to run the sample app on the 15 Pro I get this message:

SpeechAnalyzer: Input loop ending with error: Error Domain=SFSpeechErrorDomain Code=10 "Cannot use modules with unallocated locales [en_US (fixed en_US)]" UserInfo={NSLocalizedDescription=Cannot use modules with unallocated locales [en_US (fixed en_US)]}

I can't continue our work towards using SpeechAnalyzer with this error. I have set breakpoints on all the catch handlers and it doesn't catch this error. My phone region is "United States".
Replies: 19 · Boosts: 7 · Views: 1.1k · Activity: 10h
slow decoding of animated AVIF images
Safari is supposed to support animated AVIF images since version 16, but the ones I've tested perform very poorly, even on an M4 Mac mini running Sequoia 15.1.1. I believe Safari delegates decoding to the operating system itself, so this issue also happens in Live Preview in the Finder when I try to preview a file. Sample file here: https://s3.us-west-2.amazonaws.com/cdn.paintera.org/test/sample.avif (322 KB, 5 seconds long, 12 fps). This plays perfectly in Chrome on macOS, but is slow and laggy in Safari and Live Preview (it takes about 6.5 seconds to finish the 5-second animation). Does anyone know how to fix or work around this issue?
Replies: 2 · Boosts: 1 · Views: 509 · Activity: Jan ’25
App crashes when opening camera from file input in WKWebView (iOS)
Hello everyone,

I have a SwiftUI app using WKWebView to load a website that includes a form with a file input element. The issue is:

📌 When a user taps "Browse" and selects "Take Photo" (camera option), the app crashes before the camera opens.

Setup Details:
• App uses SwiftUI with WKWebView
• The crash occurs only when selecting "Take Photo"; selecting an image from the library works fine.

📌 Full Code (WKWebView in SwiftUI)

import SwiftUI
import WebKit

struct WebViewRepresentable: UIViewRepresentable {
    var urlString: String

    func makeUIView(context: Context) -> WKWebView {
        let webView = WKWebView()
        webView.configuration.allowsInlineMediaPlayback = true
        webView.configuration.mediaTypesRequiringUserActionForPlayback = []
        loadURL(in: webView)
        return webView
    }

    func updateUIView(_ uiView: WKWebView, context: Context) {
        loadURL(in: uiView)
    }

    private func loadURL(in webView: WKWebView) {
        if let url = URL(string: urlString) {
            webView.load(URLRequest(url: url))
        }
    }
}

struct ContentView: View {
    @State private var currentURL: String = "https://fv-wohlensee.ch"

    var body: some View {
        VStack(spacing: 0) {
            // Top area in green
            Color(red: 0, green: 0.4, blue: 0)
                .frame(height: 50)

            // WebView with white background
            WebViewRepresentable(urlString: currentURL)
                .background(Color.white)

            Divider()

            // Navigation buttons
            HStack(spacing: 10) {
                Button {
                    currentURL = "https://fv-wohlensee.ch/vereinshaus-eymatt/"
                } label: {
                    VStack {
                        Image(systemName: "house").font(.system(size: 18))
                        Text("Klubhaus").font(.system(size: 12)).minimumScaleFactor(0.7).lineLimit(1)
                    }
                    .padding(8)
                }
                .foregroundColor(.white)
                .frame(maxWidth: .infinity)

                Button {
                    currentURL = "https://fv-wohlensee.ch/vereinsboot/"
                } label: {
                    VStack {
                        Image(systemName: "ferry.fill").font(.system(size: 18))
                        Text("Boot").font(.system(size: 12)).minimumScaleFactor(0.7).lineLimit(1)
                    }
                    .padding(8)
                }
                .foregroundColor(.white)
                .frame(maxWidth: .infinity)

                Button {
                    currentURL = "https://fv-wohlensee.ch/aktivitaeten/"
                } label: {
                    VStack {
                        Image(systemName: "calendar").font(.system(size: 18))
                        Text("Aktivitäten").font(.system(size: 12)).minimumScaleFactor(0.7).lineLimit(1)
                    }
                    .padding(8)
                }
                .foregroundColor(.white)
                .frame(maxWidth: .infinity)

                Button {
                    currentURL = "https://fv-wohlensee.ch/mitglied-werden/"
                } label: {
                    VStack {
                        Image(systemName: "person.badge.plus").font(.system(size: 18))
                        Text("Mitglied").font(.system(size: 12)).minimumScaleFactor(0.7).lineLimit(1)
                    }
                    .padding(8)
                }
                .foregroundColor(.white)
                .frame(maxWidth: .infinity)
            }
            .padding(.horizontal, 15)
            .padding(.vertical, 10)
            .background(Color(red: 0, green: 0.4, blue: 0))
        }
        .frame(maxWidth: .infinity, maxHeight: .infinity)
        .background(Color(red: 0, green: 0.4, blue: 0))
        .ignoresSafeArea()
    }
}

struct ContentView_Previews: PreviewProvider {
    static var previews: some View {
        ContentView()
    }
}

What I've Tried:

1️⃣ Checked Info.plist: Added permissions for camera and photo library:

<key>NSCameraUsageDescription</key>
<string>This app requires access to the camera to upload photos.</string>
<key>NSPhotoLibraryUsageDescription</key>
<string>This app requires access to your photo library.</string>

2️⃣ Enabled media capture in WKWebView:

webView.configuration.allowsInlineMediaPlayback = true
webView.configuration.mediaTypesRequiringUserActionForPlayback = []

3️⃣ Tested in Safari: The same form works fine when opened in Safari.

Questions:
❓ Does WKWebView need additional permissions to open the camera?
❓ Do I need to implement a delegate to handle file uploads in SwiftUI?
❓ Has anyone faced this issue and found a fix? Any guidance would be greatly appreciated! 🚀

Thanks in advance! 😊
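One detail worth checking, offered as an assumption rather than a confirmed fix: WKWebView copies its configuration when it is initialized, so setting allowsInlineMediaPlayback and mediaTypesRequiringUserActionForPlayback on webView.configuration after creation has no effect. A minimal sketch that builds the configuration first and avoids reloading the page on every SwiftUI update (which could tear down a presented picker):

import SwiftUI
import WebKit

struct ConfiguredWebView: UIViewRepresentable {
    var urlString: String

    func makeUIView(context: Context) -> WKWebView {
        // Configure before init; changes made to webView.configuration afterwards are ignored.
        let configuration = WKWebViewConfiguration()
        configuration.allowsInlineMediaPlayback = true
        configuration.mediaTypesRequiringUserActionForPlayback = []

        let webView = WKWebView(frame: .zero, configuration: configuration)
        if let url = URL(string: urlString) {
            webView.load(URLRequest(url: url))
        }
        return webView
    }

    func updateUIView(_ uiView: WKWebView, context: Context) {
        // Only reload when the URL actually changes, so an unrelated state update can't
        // dismiss the camera or file picker mid-presentation.
        if uiView.url?.absoluteString != urlString, let url = URL(string: urlString) {
            uiView.load(URLRequest(url: url))
        }
    }
}

Whether this removes the crash I can't say from the post alone; the exception reason in the crash log would confirm whether a missing usage description or the reload during presentation is the actual cause.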
Replies: 1 · Boosts: 1 · Views: 423 · Activity: Jan ’25
Live Photos created with PHLivePhoto API show "Motion not available" when setting as wallpaper
I'm creating Live Photos programmatically in my app using the Photos and AVFoundation frameworks. While the Live Photos work perfectly in the Photos app (long press shows motion), users cannot set them as motion wallpapers. The system shows a "Motion not available" message.

Here's my approach for creating Live Photos:

// 1. Create video with required metadata
let writer = try AVAssetWriter(outputURL: videoURL, fileType: .mov)
let contentIdentifier = AVMutableMetadataItem()
contentIdentifier.identifier = .quickTimeMetadataContentIdentifier
contentIdentifier.value = assetIdentifier as NSString
writer.metadata = [contentIdentifier]
// Video settings: 882x1920, H.264, 30fps, 2 seconds
// Added still-image-time metadata at middle frame

// 2. Create HEIC image with asset identifier
var makerAppleDict: [String: Any] = [:]
makerAppleDict["17"] = assetIdentifier // Required key for Live Photo
metadata[kCGImagePropertyMakerAppleDictionary as String] = makerAppleDict

// 3. Generate Live Photo
PHLivePhoto.request(
    withResourceFileURLs: [photoURL, videoURL],
    placeholderImage: nil,
    targetSize: .zero,
    contentMode: .aspectFit
) { livePhoto, info in
    // Success - Live Photo created
}

// 4. Save to Photos library
PHAssetCreationRequest.forAsset().addResource(with: .photo, fileURL: photoURL, options: nil)
PHAssetCreationRequest.forAsset().addResource(with: .pairedVideo, fileURL: videoURL, options: nil)

What I've Tried
- Matching exact video specifications from the Camera app (882x1920, H.264, 30fps)
- Adding all documented metadata (content identifier, still-image-time)
- Testing various video durations (1.5s, 2s, 3s)
- Different image formats (HEIC, JPEG)
- Comparing with exiftool against working Live Photos

Expected Behavior
Live Photos created programmatically should be eligible for motion wallpapers, just like those from the Camera app.

Actual Behavior
System shows "Motion not available" and only allows setting as static wallpaper.

Any insights or workarounds would be greatly appreciated. This is affecting our users who want to use their created content as wallpapers.

Questions
- Are there additional undocumented requirements for Live Photos to be wallpaper-eligible?
- Is this a deliberate restriction for third-party apps, or a bug?
- Has anyone successfully created Live Photos that work as motion wallpapers?

Environment
- iOS 17.0 - 18.1
- Xcode 16.0
- Tested on iPhone 16 Pro
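One thing that stands out in step 4, noted as an observation rather than a confirmed cause of the wallpaper issue: PHAssetCreationRequest.forAsset() is called twice, which creates two separate assets instead of one paired Live Photo. A minimal sketch of adding both resources to a single request inside a change block, with photoURL and videoURL assumed to be the files built in steps 1-3:

import Photos

func saveLivePhoto(photoURL: URL, videoURL: URL) {
    PHPhotoLibrary.shared().performChanges({
        // Both resources go on the same creation request so the library pairs them.
        let request = PHAssetCreationRequest.forAsset()
        let options = PHAssetResourceCreationOptions()
        request.addResource(with: .photo, fileURL: photoURL, options: options)
        request.addResource(with: .pairedVideo, fileURL: videoURL, options: options)
    }) { success, error in
        if let error {
            print("Error saving Live Photo: \(error.localizedDescription)")
        } else {
            print("Live Photo saved: \(success)")
        }
    }
}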
Replies: 1 · Boosts: 1 · Views: 223 · Activity: 3w
New FairPlay Keys
Hello,

My company has an in-store app with FPS SDK 4.x (1024) keys. We've handed those keys over to a trusted third party and we do not have them. We've been in-store for several years. The person who created the keys in our organization mistakenly stored them encrypted to our third party's PGP keys, so we cannot decrypt them, and the third party also has no mechanism to provide us with the keys even though they are in its runtime environment. It only has secure mechanisms for us to upload keys onto its servers.

We are trying to migrate to a different third-party DRM provider and would like to obtain new keys. Unfortunately, the developer portal won't let me create new keys, saying that we have exceeded the number of keys allowed, which I assume is one. Additionally, the new DRM provider can only support SDK 4.x keys, and it appears that we can only request SDK 5.x keys on the Apple Developer portal, as the SDK 4.0 option is grayed out. Regardless, it seems that we are not able to request any keys.

We've submitted a request to the support e-mail address and received an automated e-mail saying that the response should take a few days, but may occasionally take longer. It's now been a month. The e-mail says that the reply address is not monitored. Is there any way we can accelerate this?

Thank you,
Carlos
Replies: 0 · Boosts: 1 · Views: 189 · Activity: 4w
AVDevice is ignoring 60fps
Hello, I try to get the video from an HDMI USB capture card and show it in a preview layer at 60 fps. The device I am using (ShadowCast 2) supports 1080p at 60 fps in "yuvs" and "420v". This is my code to build the preview layer, with uninteresting stuff stripped away and error handling removed. I am using the AVFrameRateRange because the capture device is not directly supporting 60.00 but <AVFrameRateRange: 0x600000875680 60.00 - 60.00 (1000000 / 60000240 - 1000000 / 60000240)> fps.

@Observable
final class AVFoundationService: AVService {
    // Live View
    private let session: AVCaptureSession = .init()

    var previewLayer: AVCaptureVideoPreviewLayer {
        let layer = AVCaptureVideoPreviewLayer(session: session)
        layer.videoGravity = .resizeAspect
        return layer
    }

    var activeVideoDevice: AVCaptureDevice? {
        // TODO: implement correct logic
        if let device = videoDevices.first(where: { $0.localizedName.contains("Shadow") }) {
            return device
        }
        return AVCaptureDevice.default(for: .video)
    }

    func setupStreamDemo(completion: @escaping (Error?) -> Void) {
        session.beginConfiguration()
        if let device = activeVideoDevice {
            do {
                let input = try AVCaptureDeviceInput(device: device)
                if session.canAddInput(input) {
                    session.addInput(input)
                } else {
                    print("explode")
                }

                for format in device.formats {
                    let dimensions = CMVideoFormatDescriptionGetDimensions(format.formatDescription)
                    if dimensions.width == 1920 && dimensions.height == 1080 && format.formatDescription.mediaSubType.description == "'yuvs'" {
                        let foundFPS = format.videoSupportedFrameRateRanges.first {
                            Int($0.minFrameRate) == 60 && Int($0.minFrameRate) == 60
                        }
                        try device.lockForConfiguration()
                        device.activeFormat = format
                        device.activeVideoMinFrameDuration = foundFPS!.minFrameDuration
                        device.activeVideoMaxFrameDuration = foundFPS!.minFrameDuration
                        device.unlockForConfiguration()
                    }
                }
            } catch {
                return completion(error)
            }
        }
        session.commitConfiguration()
        session.startRunning()
        completion(nil)
    }
}

I am using the following code in SwiftUI to show the AVCaptureVideoPreviewLayer.

struct VideoPreviewView: NSViewRepresentable {
    private let previewLayer: AVCaptureVideoPreviewLayer

    func makeNSView(context: Context) -> NSView {
        let view = NSView()
        view.layer = self.previewLayer
        view.layer?.frame = view.bounds
        return view
    }

    func updateNSView(_ nsView: NSView, context: Context) {
        if let layer = nsView.layer as? AVCaptureVideoPreviewLayer {
            layer.session = self.previewLayer.session
        }
    }
}

When I now run my app, it ignores whatever I set on device.activeVideoMinFrameDuration and/or device.activeVideoMaxFrameDuration. If I set it to 10 fps it runs at 30; if I set 60 it runs at 30. If I start QuickTime in parallel to my app and start a recording from my USB capture card, it switches to 60 fps mode. I am on macOS Sequoia 15.0 with Xcode 16.0. What am I doing wrong?
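A hedged sketch of an alternative frame-rate selection (my own code, not a confirmed fix): the predicate in the post compares minFrameRate twice, so checking both ends of each range and applying the range's own minFrameDuration keeps the request inside what the device reports as supported.

import AVFoundation
import CoreMedia

func configure60FPS(on device: AVCaptureDevice) throws {
    let target = 60.0
    // Find a 1080p format whose supported frame-rate ranges actually include 60 fps.
    guard let format = device.formats.first(where: { format in
        let dimensions = CMVideoFormatDescriptionGetDimensions(format.formatDescription)
        let supports60 = format.videoSupportedFrameRateRanges.contains {
            $0.minFrameRate <= target && target <= $0.maxFrameRate
        }
        return dimensions.width == 1920 && dimensions.height == 1080 && supports60
    }),
    let range = format.videoSupportedFrameRateRanges.first(where: {
        $0.minFrameRate <= target && target <= $0.maxFrameRate
    }) else { return }

    try device.lockForConfiguration()
    device.activeFormat = format
    // minFrameDuration is the duration of the range's highest frame rate; reusing it avoids
    // requesting a duration the device rejects and silently replaces with its default.
    device.activeVideoMinFrameDuration = range.minFrameDuration
    device.activeVideoMaxFrameDuration = range.minFrameDuration
    device.unlockForConfiguration()
}

Calling this after the input has been added to the session, and making sure nothing overrides the active format afterwards, is part of the same assumption.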
Replies: 1 · Boosts: 1 · Views: 629 · Activity: Oct ’24
CMFormatDescription.audioStreamBasicDescription has wrong or unexpected sample rate for audio channels with different sample rates
In my app I use AVAssetReaderTrackOutput to extract PCM audio from a user-provided video or audio file and display it as a waveform. Recently a user reported that the waveform is not in sync with his video, and after receiving the video I noticed that the waveform is in fact twice as long as the video duration, i.e. it shows the audio in slow motion, so to speak.

Until now I was using CMFormatDescription.audioStreamBasicDescription.mSampleRate, which for this particular user video returns 22'050. But in this case it seems that this value is wrong... because the audio file has two audio channels with different sample rates, as returned by CMFormatDescription.audioFormatList.map({ $0.mASBD.mSampleRate }). The first channel has a sample rate of 44'100, the second one 22'050. If I use the first sample rate, the waveform is perfectly in sync with the video.

The problem is given by the fact that the ratio between the audio data length and the sample rate multiplied by the audio duration is 8, double the ratio for the first audio file (4). In the code below this ratio is given by

Double(length) / (sampleRate * asset.duration.seconds)

When commenting out the line with the sampleRate variable definition in the code below and uncommenting the following line, the ratios for both audio files are 4, which is the expected result.

I would expect audioStreamBasicDescription to return the correct sample rate, i.e. the one used by AVAssetReaderTrackOutput, which (I think) somehow merges the stereo tracks. The documentation is sparse, and in particular it's not documented whether the lower or higher sample rate is used; in this case, it seems like the higher one is used, but audioStreamBasicDescription for some reason returns the lower one. Does anybody know why this is the case or how I should extract the sample rate of the produced PCM audio data? Should I always take the higher one? I created FB19620455.

let openPanel = NSOpenPanel()
openPanel.allowedContentTypes = [.audiovisualContent]
openPanel.runModal()
let url = openPanel.urls[0]
let asset = AVURLAsset(url: url)
let assetTrack = asset.tracks(withMediaType: .audio)[0]
let assetReader = try! AVAssetReader(asset: asset)
let readerOutput = AVAssetReaderTrackOutput(track: assetTrack, outputSettings: [AVFormatIDKey: Int(kAudioFormatLinearPCM), AVLinearPCMBitDepthKey: 16, AVLinearPCMIsBigEndianKey: false, AVLinearPCMIsFloatKey: false, AVLinearPCMIsNonInterleaved: false])
readerOutput.alwaysCopiesSampleData = false
assetReader.add(readerOutput)

let formatDescriptions = assetTrack.formatDescriptions as! [CMFormatDescription]
let sampleRate = formatDescriptions[0].audioStreamBasicDescription!.mSampleRate
//let sampleRate = formatDescriptions[0].audioFormatList.map({ $0.mASBD.mSampleRate }).max()!
print(formatDescriptions[0].audioStreamBasicDescription!.mSampleRate)
print(formatDescriptions[0].audioFormatList.map({ $0.mASBD.mSampleRate }))

if !assetReader.startReading() {
    preconditionFailure()
}
var length = 0
while assetReader.status == .reading {
    guard let sampleBuffer = readerOutput.copyNextSampleBuffer(), let blockBuffer = sampleBuffer.dataBuffer else {
        break
    }
    length += blockBuffer.dataLength
}
print(Double(length) / (sampleRate * asset.duration.seconds))
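A hedged alternative (my suggestion, not a documented guarantee): read the sample rate from the format description of the buffers the reader output actually delivers rather than from the track's ASBD, so the value always matches the decoded PCM regardless of which audioFormatList entry the track header reports first.

import AVFoundation
import CoreMedia

// Returns the sample rate of the PCM data produced by the reader output.
// Note: this consumes the first sample buffer, so in practice you would fold it
// into the existing read loop and keep accumulating that buffer's dataLength.
func outputSampleRate(reader: AVAssetReader, output: AVAssetReaderTrackOutput) -> Double? {
    guard reader.startReading(),
          let firstBuffer = output.copyNextSampleBuffer(),
          let description = CMSampleBufferGetFormatDescription(firstBuffer),
          let asbd = CMAudioFormatDescriptionGetStreamBasicDescription(description) else {
        return nil
    }
    return asbd.pointee.mSampleRate
}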
Replies: 0 · Boosts: 1 · Views: 86 · Activity: Aug ’25
MusicKit Queue broke in iOS18
It's simple to reproduce. The bug: when you queue a bunch of songs to play, it always queues fewer than what you gave it. Here I'm attempting to play an Apple-curated playlist, and it will only queue a subset, usually fewer than 15, but as low as 1 out of 100. Use the system's forward and back controls to test it out. Here is the code; just paste it into the ContentView file and make sure you have the capability to run it.

import SwiftUI
import MusicKit

struct ContentView: View {
    var body: some View {
        VStack {
            Button("Play Music") {
                Task {
                    await playMusic()
                }
            }
        }
    }
}

func getOnlySongsFromTracks(tracks: MusicItemCollection<Track>?) async throws -> MusicItemCollection<Song>? {
    var songs: [Song]?
    if let t = tracks {
        songs = [Song]()
        for track in t {
            if case let .song(song) = track {
                songs?.append(song)
                print("track is song \(track.debugDescription)")
            } else {
                print("track not song \(track.debugDescription)")
            }
        }
    }
    if let songs = songs {
        let topSongs = MusicItemCollection(songs)
        return topSongs
    }
    return nil
}

func playMusic() async {
    // Request authorization
    let status = await MusicAuthorization.request()
    guard status == .authorized else {
        print("Music authorization denied.")
        return
    }
    do {
        // Perform a hardcoded search for a playlist
        let searchTerm = "2000"
        let request = MusicCatalogSearchRequest(term: searchTerm, types: [Playlist.self])
        let response = try await request.response()
        guard let playlist = response.playlists.first else {
            print("No playlists found for the search term '\(searchTerm)'.")
            return
        }

        // Fetch the songs in the playlist
        let detailedPlaylist = try await playlist.with([.tracks])
        guard let songCollection = try await getOnlySongsFromTracks(tracks: detailedPlaylist.tracks) else {
            print("no songs found")
            return
        }
        guard let t = detailedPlaylist.tracks else {
            print("no tracks")
            return
        }

        // Create a queue and play
        let musicPlayer = ApplicationMusicPlayer.shared
        let q = ApplicationMusicPlayer.Queue(for: t)
        musicPlayer.queue = q
        try await musicPlayer.play()
        print("Now playing playlist: \(playlist.name)")
    } catch {
        print("An error occurred: \(error.localizedDescription)")
    }
}
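One workaround that may be worth a try, offered as an assumption rather than a confirmed fix: ApplicationMusicPlayer.Queue can be built from playable items such as the playlist itself, letting MusicKit resolve the tracks instead of queueing the fetched track collection. Whether that sidesteps the iOS 18 truncation I can't confirm.

import MusicKit

func playPlaylistDirectly(_ playlist: Playlist) async throws {
    let player = ApplicationMusicPlayer.shared
    // Queue the playlist as a single playable item; MusicKit expands it to its tracks.
    player.queue = ApplicationMusicPlayer.Queue(for: [playlist])
    try await player.play()
}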
Replies: 3 · Boosts: 1 · Views: 712 · Activity: Sep ’24
How To Play Audio Through Headphones on WatchOS 11?
I have an app that plays audio and its behaviour has changed in watchOS 11. I can no longer figure out how to play the audio through the headphones. To play audio I do:

let session = AVAudioSession.sharedInstance()
try session.setCategory(.playback, mode: .default, policy: .longFormAudio, options: [])
let activated = try await session.activate()
if activated {
    // play audio
}

In previous versions, try await session.activate() would bring up a route picker where the user could select their headphones. Now on watchOS 11 it just plays the audio out of the speaker. Maybe that's what some people want, but if they do want it to play out of the headphones I can't see how I can give them that option now. There's no AVRoutePickerView available on watchOS for selecting it.

I've tried setting the category to .multiRoute instead of .playback, and that does bring up the picker, but selecting the speaker results in an error code and selecting the headphones results in it saying it cannot find my headphones (which shouldn't be the case since Apple Music on watchOS finds them).

I also tried overriding the output with try session.overrideOutputAudioPort(.speaker), but the compiler complains that speaker isn't available on watchOS, which is strange, as if I understand correctly it's possible to play through the speaker now, at least on some Apple Watches.

So is this a bug, or is there some way I've not found of playing audio through the headphones?
Replies: 1 · Boosts: 1 · Views: 553 · Activity: Oct ’24
[Request] Support for Spotify-like Audio Analysis API for Apple Music.
Hi, I have been working on a project that enables users to listen to their favorite music using a streaming service, which so far was Spotify. The app had a programmable 3D/2D interface with the ability to connect to devices in your home and have them react to music. As of September 2024, Spotify decommissioned their Audio Analysis API. I have seen other posts mention playing Apple Music through AVFoundation, which would break DRM and so is not supported. However, the Spotify Audio Analysis API does not allow for a full frequency reconstruction. It is entirely temporal data on beats, kicks, loudness, and timbre changes, which themselves are operators on the spectral data from the FFT. It would be very useful for the developer community if we got the ability to do this, and it would probably make Apple Music a lot more popular among developers and those who use their apps. Would love to hear your thoughts about this, and Happy New Year!
Replies: 0 · Boosts: 1 · Views: 586 · Activity: Dec ’24
Crash iOS 26.0: [__NSSingleObjectArrayI selectedMediaOptionInMediaSelectionGroup:]: unrecognized selector sent to instance
I'm having a crash in an app that plays videos when the user activates closed captions. I was able to replicate the issue in an empty project. The crash happens when the AVPlayerLayer is used to instantiate an AVPictureInPictureController.

This is the example project where I tested the crash:

struct ContentView: View {
    var body: some View {
        VStack {
            VideoPlaylistView()
        }
        .frame(maxWidth: .infinity, maxHeight: .infinity)
        .background(Color.black.ignoresSafeArea())
    }
}

class VideoPlaylistViewModel: ObservableObject {
    // Test with other videos
    var player: AVPlayer? = AVPlayer(url: URL(string: "https://d2ufudlfb4rsg4.cloudfront.net/newsnation/WIpkLz23h/adaptive/WIpkLz23h_master.m3u8")!)
}

struct VideoPlaylistView: View {
    @StateObject var viewModel = VideoPlaylistViewModel()

    var body: some View {
        ScrollView {
            VideoCellView(player: viewModel.player)
                .onAppear {
                    viewModel.player?.play()
                }
        }
        .scrollTargetBehavior(.paging)
        .ignoresSafeArea()
    }
}

struct VideoCellView: View {
    let player: AVPlayer?
    @State var isCCEnabled: Bool = false

    var body: some View {
        ZStack {
            PlayerView(player: player)
                .accessibilityIdentifier("Player View")
        }
        .containerRelativeFrame([.horizontal, .vertical])
        .overlay(alignment: .bottom) {
            Button {
                player?.currentItem?.asset.loadMediaSelectionGroup(for: .legible) { group, error in
                    if let group {
                        let option = !isCCEnabled ? group.options.first : nil
                        player?.currentItem?.select(option, in: group)
                        isCCEnabled.toggle()
                    }
                }
            } label: {
                Text("Close Captions")
                    .font(.subheadline)
                    .foregroundStyle(isCCEnabled ? .red : .primary)
                    .buttonStyle(.bordered)
                    .padding(8)
                    .background(Color.blue.opacity(0.75))
            }
            .padding(.bottom, 48)
            .accessibilityIdentifier("Button Close Captions")
        }
    }
}

import Foundation
import UIKit
import SwiftUI
import AVFoundation
import AVKit

struct PlayerView: UIViewRepresentable {
    let player: AVPlayer?

    func updateUIView(_ uiView: UIView, context: UIViewRepresentableContext<PlayerView>) {
    }

    func makeUIView(context: Context) -> UIView {
        let view = PlayerUIView()
        view.playerLayer.player = player
        view.layer.addSublayer(view.playerLayer)
        view.layer.backgroundColor = UIColor.red.cgColor
        view.pipController = AVPictureInPictureController(playerLayer: view.playerLayer)
        view.pipController?.requiresLinearPlayback = true
        view.pipController?.canStartPictureInPictureAutomaticallyFromInline = true
        view.pipController?.delegate = view
        return view
    }
}

class PlayerUIView: UIView, AVPictureInPictureControllerDelegate {
    let playerLayer = AVPlayerLayer()
    var pipController: AVPictureInPictureController?

    override init(frame: CGRect) {
        super.init(frame: frame)
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    override func layoutSubviews() {
        super.layoutSubviews()
        playerLayer.frame = bounds
        playerLayer.backgroundColor = UIColor.green.cgColor
    }

    func pictureInPictureController(_ pictureInPictureController: AVPictureInPictureController, failedToStartPictureInPictureWithError error: any Error) {
        print("Error starting Picture in Picture: \(error.localizedDescription)")
    }
}

class AppDelegate: NSObject, UIApplicationDelegate {
    func application(_ application: UIApplication, didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]? = nil) -> Bool {
        let audioSession = AVAudioSession.sharedInstance()
        do {
            try audioSession.setCategory(.playback, mode: .moviePlayback)
            try audioSession.setActive(true)
        } catch {
            print("ERR: \(error.localizedDescription)")
        }
        return true
    }
}

UITest to make the app crash:

final class VideoPlaylistSampleUITests: XCTestCase {
    func testCrashiOS26ToggleCloseCaptions() throws {
        let app = XCUIApplication()
        app.launch()

        let videoPlayer = app.otherElements["Player View"]
        XCTAssertTrue(videoPlayer.waitForExistence(timeout: 30))

        let closeCaptionButton = app.buttons["Button Close Captions"]
        for _ in 0..<2000 {
            closeCaptionButton.tap()
        }
    }
}
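A hedged variation that may help isolate the crash (my own code, not a confirmed fix): a drop-in replacement for the button in VideoCellView that uses the async loadMediaSelectionGroup(for:) API and performs the selection and state change on the main actor, ruling out the completion handler firing on a background queue while the item or layer is being torn down. The label is simplified here.

Button {
    Task { @MainActor in
        guard let item = player?.currentItem,
              let group = try? await item.asset.loadMediaSelectionGroup(for: .legible) else { return }
        let option = isCCEnabled ? nil : group.options.first
        item.select(option, in: group)
        isCCEnabled.toggle()
    }
} label: {
    Text("Close Captions")
}

If the unrecognized-selector crash still reproduces with this change, that would point more toward the AVPlayerLayer/AVPictureInPictureController setup in makeUIView rather than the media selection call itself.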
Replies: 0 · Boosts: 1 · Views: 50 · Activity: 1d