Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.

Posts under the Media Technologies topic. Each post below is followed by its reply count, boost count, view count, and the month of its last activity.

Moving Photos
How can it be that you still don't have the option to move photos into an album instead of just copying them? This is a bad joke, right? The entire Photos app is absolutely untidy and a nightmare for people who like order. I want car photos in the car folder and vacation photos in the vacation folder, without them also being visible in the Recents folder. It can't be that difficult, can it?
3
0
336
Sep ’24
tvOS 18.0 update broke passthrough of multichannel audio on previously compatible hardware
tvOS 18 doesn't provide passthrough of multichannel audio for streaming apps offering content where it is promoted as available. This is true for devices on which the functionality existed before the 18.0 update. What's more, the 18.1 Public Beta does not resolve the issue. All streaming apps appear to be affected. Notably, Home Sharing does not appear to be affected and continues to provide multichannel audio as it did before the 18.0 update.
1
2
844
Sep ’24
Silent output MP4 when using AVAssetReader
Hello, I am trying to create an MP4 by obtaining the content from another source MP4. The source MP4 would be read with AVAssetReader and the output written with AVAssetWriter. I wanted to do partial tests: first, I placed only the video in the output MP4. Now, I am trying to place only the audio in the output MP4. I even managed to get the output MP4 to have the same length (in seconds) as the source MP4. But the problem is simple: the output MP4 is simply silent. Naturally, I want it to have audio. Below are two excerpts from the source code, reading and writing. Note: the variable videoURL belongs to the class where the function writeVideo() is located. Its assignment happens in another scope, already debugged.

Snippet:

let semaphore = DispatchSemaphore(value: 0)

func writeVideo() {
    var audioReaderBuffers = [CMSampleBuffer]()

    // File directory
    url = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0].appendingPathComponent("teste/output.mp4")
    guard let url = url else { return }
    try FileManager.default.createDirectory(at: url.deletingLastPathComponent(), withIntermediateDirectories: true)
    if FileManager.default.fileExists(atPath: url.path()) {
        try FileManager.default.removeItem(at: url)
    }

    if let videoURL = videoURL {
        let videoAsset = AVAsset(url: videoURL)
        Task {
            let audioTrack = try await videoAsset.loadTracks(withMediaType: .audio).first!
            let reader = try AVAssetReader(asset: videoAsset)
            let audioSettings = [
                AVFormatIDKey: kAudioFormatLinearPCM,
                AVSampleRateKey: 44100,
                AVNumberOfChannelsKey: 2
            ] as [String: Any]
            let audioOutputTest = try await audioTrack.getAudioSettings()
            let readerAudioOutput = AVAssetReaderTrackOutput(track: audioTrack, outputSettings: audioSettings)
            reader.add(readerAudioOutput)
            reader.startReading()

            while let sampleBuffer = readerAudioOutput.copyNextSampleBuffer() {
                audioReaderBuffers.append(sampleBuffer)
            }
            semaphore.signal()
        }
    }
    semaphore.wait()

    let audioInput = createAudioInput(sampleBuffer: audioReaderBuffers[0])
    let assetWriter = try AVAssetWriter(outputURL: url, fileType: .mp4)
    assetWriter.add(audioInput)
    assetWriter.startWriting()
    assetWriter.startSession(atSourceTime: .zero)

    for buffer in audioReaderBuffers {
        while !audioInput.isReadyForMoreMediaData {
            usleep(1000)
        }
        audioInput.append(buffer)
    }

    assetWriter.finishWriting {
        switch assetWriter.status {
        case .completed:
            print("Operation completed successfully: \(url.absoluteString)")
        case .failed:
            if let error = assetWriter.error {
                print("Error description: \(error.localizedDescription)")
            } else {
                print("Error not found.")
            }
        default:
            print("Error not found.")
        }
    }
}

Below is the createAudioInput method:

func createAudioInput(sampleBuffer: CMSampleBuffer) -> AVAssetWriterInput {
    let audioSettings = [
        AVFormatIDKey: kAudioFormatMPEG4AAC,
        AVSampleRateKey: 48000,
        AVEncoderBitRatePerChannelKey: 64000,
        AVNumberOfChannelsKey: 1
    ] as [String: Any]
    let audioInput = AVAssetWriterInput(mediaType: .audio,
                                        outputSettings: audioSettings,
                                        sourceFormatHint: sampleBuffer.formatDescription)
    audioInput.expectsMediaDataInRealTime = false
    return audioInput
}

I await your help, please.
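A minimal sketch of the common pattern for this task, for comparison: if the goal is to copy the source audio into a new MP4 unchanged, both the reader output and the writer input can use nil outputSettings (pass-through), which sidesteps the mismatches in the code above (PCM reader settings vs. AAC writer settings, a stereo 44.1 kHz source format hint vs. mono 48 kHz output, and original timestamps written into a session started at .zero). Names here are illustrative, not a confirmed fix:

import AVFoundation

func copyAudio(from sourceURL: URL, to outputURL: URL) async throws {
    let asset = AVURLAsset(url: sourceURL)
    guard let track = try await asset.loadTracks(withMediaType: .audio).first else { return }

    let reader = try AVAssetReader(asset: asset)
    // nil outputSettings = hand over the compressed samples untouched
    let output = AVAssetReaderTrackOutput(track: track, outputSettings: nil)
    reader.add(output)

    let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mp4)
    // nil outputSettings = write the samples through without re-encoding
    let input = AVAssetWriterInput(mediaType: .audio, outputSettings: nil)
    input.expectsMediaDataInRealTime = false
    writer.add(input)

    reader.startReading()
    writer.startWriting()
    writer.startSession(atSourceTime: .zero)

    while let sample = output.copyNextSampleBuffer() {
        while !input.isReadyForMoreMediaData {
            try await Task.sleep(nanoseconds: 1_000_000)
        }
        input.append(sample)
    }
    input.markAsFinished()
    await writer.finishWriting()
}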
0
0
564
Sep ’24
Bug: Duplicate audio playback in QuickLook with .reality files in iOS 18
I'm experiencing an issue with QuickLook in iOS 18 where .reality files with audio playback are affected. When I open a .reality file that includes audio, the audio track plays twice: once from the moment the file is opened, and again from the start of the animation. This results in duplicate audio playback. I've tested this issue on multiple devices running iOS 16, 17, and 18, and the problem only occurs on iOS 18. I've tried restarting the devices and checking for software updates, but the issue persists.

Steps to reproduce:
1. Open a .reality file with audio playback in QuickLook on an iOS 18 device.
2. Observe the audio playback.

Expected result: the audio track plays only once, from the start of the animation.
Actual result: the audio track plays twice, once from the moment the file is opened and again from the start of the animation.

Devices and iOS versions tested: iPhone 12 Pro and iPhone 13 Pro running iOS 18, iPhone 13 running iOS 16, and iPhone 11 Pro running iOS 17.
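For anyone trying to reproduce this, a minimal QuickLook presentation sketch (the file name "scene.reality" is a placeholder for any .reality file with an audio track):

import QuickLook
import UIKit

final class RealityPreview: NSObject, QLPreviewControllerDataSource {
    // Placeholder asset; any .reality file with audio should do.
    private let url = Bundle.main.url(forResource: "scene", withExtension: "reality")!

    func present(from host: UIViewController) {
        let controller = QLPreviewController()
        controller.dataSource = self
        host.present(controller, animated: true)
    }

    func numberOfPreviewItems(in controller: QLPreviewController) -> Int { 1 }

    func previewController(_ controller: QLPreviewController,
                           previewItemAt index: Int) -> QLPreviewItem {
        url as NSURL // NSURL conforms to QLPreviewItem
    }
}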
5
5
605
Sep ’24
RPScreenRecorder startCapture issues on generated file
Hello all, this is my first post on the developer forums. I am developing an app that records the screen of my app, using AVAssetWriter and RPScreenRecorder startCapture. Everything works as it should in most cases. There are some seemingly random times when the generated file is only a few KB and is corrupted. There seems to be no pattern regarding the device or the iOS version; it can happen on various phones and iOS versions. The steps I have followed to create the file are:

Configuring the AVAssetWriter:

videoAssetWriter = try? AVAssetWriter(outputURL: url!, fileType: AVFileType.mp4)
let size = UIScreen.main.bounds.size
let width = (Int(size.width / 4)) * 4
let height = (Int(size.height / 4)) * 4
let videoOutputSettings: Dictionary<String, Any> = [
    AVVideoCodecKey: AVVideoCodecType.h264,
    AVVideoWidthKey: width,
    AVVideoHeightKey: height
]
videoInput = AVAssetWriterInput(mediaType: AVMediaType.video, outputSettings: videoOutputSettings)
videoInput?.expectsMediaDataInRealTime = true
guard let videoInput = videoInput else { return }
if videoAssetWriter?.canAdd(videoInput) ?? false {
    videoAssetWriter?.add(videoInput)
}
let audioInputSettings = [
    AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
    AVSampleRateKey: 12000,
    AVNumberOfChannelsKey: 1,
    AVEncoderAudioQualityKey: AVAudioQuality.high.rawValue
]
audioInput = AVAssetWriterInput(mediaType: .audio, outputSettings: audioInputSettings)
audioInput?.expectsMediaDataInRealTime = true
guard let audioInput = audioInput else { return }
if videoAssetWriter?.canAdd(audioInput) ?? false {
    videoAssetWriter?.add(audioInput)
}

The urlForVideo() returns the URL to the documentDirectory, after appending and creating the folders needed. This part seems to work as it should, as the directories are created and the video file exists in them.

Starting the recording:

if RPScreenRecorder.shared().isRecording { return }
RPScreenRecorder.shared().startCapture(handler: { [weak self] sample, bufferType, error in
    if let error = error {
        onError?(error.localizedDescription)
    } else {
        if !RPScreenRecorder.shared().isMicrophoneEnabled {
            RPScreenRecorder.shared().stopCapture { error in
                if let error = error { return }
            }
            onError?("Microphone was not enabled")
        } else {
            successCompletion?()
            successCompletion = nil
            self?.processSampleBuffer(sample, with: bufferType)
        }
    }
}) { error in
    if let error = error {
        onError?(error.localizedDescription)
    }
}

Processing the sample buffers:

guard CMSampleBufferDataIsReady(sampleBuffer) else { return }
DispatchQueue.main.async { [weak self] in
    switch sampleBufferType {
    case .video:
        self?.handleVideoBuffer(sampleBuffer)
    case .audioMic:
        self?.add(sample: sampleBuffer, to: self?.audioInput)
    default:
        break
    }
}

// The add function from above
fileprivate func add(sample: CMSampleBuffer, to writerInput: AVAssetWriterInput?) {
    if writerInput?.isReadyForMoreMediaData ?? false {
        writerInput?.append(sample)
    }
}

// The handleVideoBuffer function from above
fileprivate func handleVideoBuffer(_ sampleBuffer: CMSampleBuffer) {
    if self.videoAssetWriter?.status == AVAssetWriter.Status.unknown {
        self.videoAssetWriter?.startWriting()
        self.videoAssetWriter?.startSession(atSourceTime: CMSampleBufferGetPresentationTimeStamp(sampleBuffer))
    } else {
        if (self.videoInput?.isReadyForMoreMediaData) ?? false {
            if self.videoAssetWriter?.status == AVAssetWriter.Status.writing {
                self.videoInput?.append(sampleBuffer)
            }
        }
    }
}

Finally, stopping the recording:

func stopRecording(completion: @escaping (URL?, URL?, Error?) -> Void) {
    RPScreenRecorder.shared().stopCapture { error in
        if let error = error {
            completion(nil, nil, error)
            return
        }
        self.finish { videoURL, _ in
            completion(videoURL, nil, nil)
        }
    }
}

// The finish function mentioned above
fileprivate func finish(completion: @escaping (URL?, URL?) -> Void) {
    let dispatchGroup = DispatchGroup()
    dispatchGroup.enter()
    finishRecordVideo {
        dispatchGroup.leave()
    }
    dispatchGroup.notify(queue: .main) {
        print("Finish with url: \(String(describing: self.urlForVideo()))")
        completion(self.urlForVideo(), nil)
    }
}

// The finishRecordVideo mentioned above
fileprivate func finishRecordVideo(completion: @escaping () -> Void) {
    videoInput?.markAsFinished()
    audioInput?.markAsFinished()
    videoAssetWriter?.finishWriting {
        if let writer = self.videoAssetWriter {
            if writer.status == .completed {
                completion()
            } else if writer.status == .failed {
                // Print the error to find out what went wrong
                if let error = writer.error {
                    print("Video asset writing failed with error: \(error.localizedDescription). Url: \(writer.outputURL.path)")
                } else {
                    print("Video asset writing failed, but no error description available.")
                }
                completion()
            } else {
                completion()
            }
        }
    }
}

What could be the reason for the corrupted files? This issue has never happened on my own devices, so there is no way to debug it using Xcode. Also, there are no errors popping up in the logs. Can you spot any issues in the code that could cause this kind of problem? Do you have any suggestions on the problem at hand? Thanks
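One pattern worth checking here (an assumption, not a confirmed diagnosis): the buffers above are appended via DispatchQueue.main.async, so appends can be delayed or can land after the stop path has already marked the inputs finished, which could produce tiny, corrupted files. A sketch of routing samples through a dedicated serial queue that can be drained before finishing:

import ReplayKit
import CoreMedia
import Foundation

// Hypothetical helper: keeps appends ordered on one serial queue and lets the
// stop path wait for in-flight buffers before finishWriting is called.
final class SampleFeed {
    private let writerQueue = DispatchQueue(label: "screen.recorder.writer")
    var handleVideo: ((CMSampleBuffer) -> Void)?
    var handleMicAudio: ((CMSampleBuffer) -> Void)?

    func process(_ sampleBuffer: CMSampleBuffer, type: RPSampleBufferType) {
        guard CMSampleBufferDataIsReady(sampleBuffer) else { return }
        writerQueue.async { [weak self] in
            switch type {
            case .video:    self?.handleVideo?(sampleBuffer)
            case .audioMic: self?.handleMicAudio?(sampleBuffer)
            default:        break
            }
        }
    }

    // Call before markAsFinished()/finishWriting so queued appends complete first.
    func drain() {
        writerQueue.sync {}
    }
}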
0
0
578
Sep ’24
Recording A/V .mov file with SMPTE timecode
Hello, I used the following technical note to develop an app that records a .mov file with SMPTE timecode: https://developer.apple.com/library/archive/technotes/tn2310/_index.html

As a result, a timecode track is present within the .mov file (the other tracks are audio and video). Unfortunately, QuickTime Player doesn't display the timecode information. Analyzer tools like mediainfo, or an online service such as https://media-analyzer.pro/app, show that the timecode track has a null duration (and so no "time code of last frame").

Example 1, analyzer output for the TC track:
Other ID : 3
Type : Time code
Format : QuickTime TC
Frame rate : 60.000 FPS
Time code of first frame : 17:39:59:00
Time code, stripped : Yes
Title : Core Media Time Code
Encoded date : 2024-09-10 15:39:46 UTC
Tagged date : 2024-09-10 15:39:59 UTC

Example 2, atom dump of the timecode track:
0000569562 Quicktime Timecode #0
00007f6b8a 'trak' Track atom #1
00007f6b92 'tkhd' Track header atom #2
size 92 (0x5C)
type 'tkhd' (hex 74 6B 68 64)
version 0
flags 15 (0xF)
creation_time 0xE30618C2, '2024-09-10 15:39:46'
modification_time 0xE30618CF, '2024-09-10 15:39:59'
track_ID 3
reserved 0
duration 0
reserved [0, 0]

In each case, the duration is considered null even though the recording's duration is more than 20 s.

STEPS TO REPRODUCE
1. Use AVAssetWriter for video and audio.
2. Create an AVAssetWriterInput for timecode and associate it with the video track.
3. Just before stopping the record, generate and append a sample buffer containing the SMPTE timecode.
4. Mark all tracks as finished before stopping the record with finishWritingWithCompletionHandler.
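For reference, a sketch of the TN2310 approach in Swift. A possible culprit for the null duration (an assumption, not verified against the poster's code) is the timing of the single timecode sample: its CMSampleTimingInfo duration must span the footage it describes, otherwise the 'tmcd' track can end up with duration 0:

import AVFoundation
import CoreMedia

// Build a timecode input (60 fps non-drop here) and associate it with the video input.
func makeTimecodeInput(associatedWith videoInput: AVAssetWriterInput) -> (AVAssetWriterInput, CMTimeCodeFormatDescription)? {
    var format: CMTimeCodeFormatDescription?
    let status = CMTimeCodeFormatDescriptionCreate(
        allocator: kCFAllocatorDefault,
        timeCodeFormatType: kCMTimeCodeFormatType_TimeCode32,
        frameDuration: CMTime(value: 1, timescale: 60),
        frameQuanta: 60,
        flags: kCMTimeCodeFlag_24HourMax,
        extensions: nil,
        formatDescriptionOut: &format)
    guard status == noErr, let format else { return nil }
    let input = AVAssetWriterInput(mediaType: .timecode, outputSettings: nil, sourceFormatHint: format)
    input.expectsMediaDataInRealTime = true
    // The video track must declare the association for players to pick up the timecode.
    videoInput.addTrackAssociation(withTrackOf: input, type: AVAssetTrack.AssociationType.timecode.rawValue)
    return (input, format)
}

// One 'tmcd' sample: a big-endian frame number. The key detail is `covering`:
// the sample's duration should cover the whole recording.
func appendTimecode(frameNumber: UInt32, at start: CMTime, covering duration: CMTime,
                    format: CMTimeCodeFormatDescription, to input: AVAssetWriterInput) {
    var bigEndianFrame = frameNumber.bigEndian
    let length = MemoryLayout<UInt32>.size
    var block: CMBlockBuffer?
    CMBlockBufferCreateWithMemoryBlock(allocator: kCFAllocatorDefault, memoryBlock: nil,
                                       blockLength: length, blockAllocator: kCFAllocatorDefault,
                                       customBlockSource: nil, offsetToData: 0,
                                       dataLength: length, flags: 0, blockBufferOut: &block)
    guard let block else { return }
    CMBlockBufferReplaceDataBytes(with: &bigEndianFrame, blockBuffer: block,
                                  offsetIntoDestination: 0, dataLength: length)
    var timing = CMSampleTimingInfo(duration: duration, presentationTimeStamp: start,
                                    decodeTimeStamp: .invalid)
    var size = length
    var sample: CMSampleBuffer?
    CMSampleBufferCreate(allocator: kCFAllocatorDefault, dataBuffer: block, dataReady: true,
                         makeDataReadyCallback: nil, refcon: nil, formatDescription: format,
                         sampleCount: 1, sampleTimingEntryCount: 1, sampleTimingArray: &timing,
                         sampleSizeEntryCount: 1, sampleSizeArray: &size, sampleBufferOut: &sample)
    if let sample, input.isReadyForMoreMediaData {
        input.append(sample)
    }
}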
1
0
545
Sep ’24
AVFoundation error when making a window full screen
I am working on a macOS app that uses AVFoundation to record the screen. During a recording, if I make a window full screen, AVFoundation stops capturing screen frames (or captures them at a very slow rate). In my logs I get the following error:

Error Domain=AVFoundationErrorDomain Code=-11844

Note that I have had instances where I could not reproduce the error, but they were rare. The screen recording sometimes resumes normally if I switch desktops or minimize the full-screen window. Has anyone run across a similar issue or know how to fix it?
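In case it helps debugging, a small sketch (assuming a session-based capture setup) that listens for runtime errors and attempts a restart, which is a common recovery path when capture stalls:

import AVFoundation

final class CaptureErrorWatcher: NSObject {
    private let session: AVCaptureSession

    init(session: AVCaptureSession) {
        self.session = session
        super.init()
        NotificationCenter.default.addObserver(self, selector: #selector(runtimeError(_:)),
                                               name: .AVCaptureSessionRuntimeError,
                                               object: session)
    }

    @objc private func runtimeError(_ note: Notification) {
        // The userInfo carries the underlying error, e.g. code -11844.
        let error = note.userInfo?[AVCaptureSessionErrorKey] as? NSError
        print("Capture runtime error: \(String(describing: error))")
        // Hypothetical recovery: bounce the session off the main thread.
        DispatchQueue.global(qos: .userInitiated).async { [session] in
            session.stopRunning()
            session.startRunning()
        }
    }
}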
1
0
371
Sep ’24
CarPlay issue after iOS 18 update
After upgrading to iOS 18, CarPlay with a 2023 Lexus and an iPhone 15 Pro Max shows multiple issues:
• speakers reduced to mono sound (going back to normal after some minutes and then reducing again)
• no speaker sound at all
• touching/moving the phone while driving results in the sound going on and off

Neither a reboot/shutdown nor a cable connection helps.

@Apple: do you test your software professionally, or is this outsourced to the community? It doesn't look at all like a professional approach. Please solve this dangerous (traffic!) and annoying issue ASAP! Thanks - Torsten
1
0
727
Sep ’24
"Terminated due to signal 9" when launching app from camera button (iPhone 16 Pro)
Hello, I have followed the Creating a camera experience for the Lock Screen guide, and can now launch my app using the iPhone 16's new camera button. That said, after about 10 seconds the app is force-closed by the OS, with the only message appearing in the console being "Terminated due to signal 9".

This error does not happen when:
• launching the app via the physical camera button when the device is locked
• launching the app by tapping the icon on the Home Screen

It only happens when:
• launching the app via the physical camera button from the Home Screen when the device is unlocked

Any ideas? Thank you!
2
1
614
Sep ’24
Issue with Low-Latency HLS Playback Using AVAssetResourceLoaderDelegate
Hi, I am writing to seek any help or workaround regarding an issue I have encountered while implementing Low-Latency HLS playback using the AVAssetResourceLoaderDelegate.

I have been successfully loading playlists during HLS live playback using the AVAssetResourceLoaderDelegate. However, after introducing Low-Latency HLS, I have run into a problem. When AVPlayer loads low-latency content natively, everything works fine. But when I use the delegate for loading, I encounter the following error from AVPlayer's status observer:

CoreMediaErrorDomain -15410 Low Latency: Server must support http2 ECN and SACK

Playback does not stop, but a very critical part is missing: it does not achieve the expected low latency and behaves like standard HLS. Additionally, this behavior only occurs on iOS 16 devices and simulators. On iOS 17 simulators and devices, the error message does not appear, and the latency remains low as expected. Therefore, I suspect there might be some misjudgment in the verification process within the internal implementation of AVPlayer.

Since our app needs to support iOS 16, I would appreciate any solutions, methods to try, or workarounds that you could share regarding this issue. Thank you.
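For context, a minimal sketch of the delegate-based playlist loading described above (the scheme and class names are illustrative); the low-latency regression appears once requests flow through this path instead of AVPlayer's native loader:

import AVFoundation

final class PlaylistLoader: NSObject, AVAssetResourceLoaderDelegate {
    func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                        shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        guard let customURL = loadingRequest.request.url,
              var components = URLComponents(url: customURL, resolvingAgainstBaseURL: false)
        else { return false }
        components.scheme = "https" // map the custom scheme back to the real one
        guard let realURL = components.url else { return false }
        URLSession.shared.dataTask(with: realURL) { data, response, error in
            if let error { loadingRequest.finishLoading(with: error); return }
            if let response { loadingRequest.response = response }
            if let data { loadingRequest.dataRequest?.respond(with: data) }
            loadingRequest.finishLoading()
        }.resume()
        return true
    }
}

// Usage: a custom scheme forces AVFoundation to consult the delegate.
let loader = PlaylistLoader()
let asset = AVURLAsset(url: URL(string: "custom-hls://example.com/live/index.m3u8")!)
asset.resourceLoader.setDelegate(loader, queue: DispatchQueue(label: "playlist.loader"))
let playerItem = AVPlayerItem(asset: asset)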
2
0
806
Sep ’24
AVUnknown error using Camera Extensions in AVCaptureSession
I have a Mac Catalyst video conferencing app that streams video using AVCaptureMultiCamSession. Everything has been working well for me in a variety of scenarios and hardware, but recently I got a report that virtual cameras / camera extensions do not seem to work, which I can reproduce 100% of the time by using something like OBS's virtual camera. FaceTime and Photo Booth work okay with these virtual cameras.

Although my app can see and add the external AVCaptureDevice, I get an AVCaptureSessionRuntimeError posted when I start the session with a connection between the virtual camera and an AVCaptureVideoDataOutput (I don't get the error if I don't connect or add an output). The posted error is AVUnknown:

AVCaptureSessionRuntimeErrorNotification with Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo={NSLocalizedFailureReason=An unknown error occurred (-12780), NSLocalizedDescription=The operation could not be completed, NSUnderlyingError=0x600001dcd680 {Error Domain=NSOSStatusErrorDomain Code=-12780 "(null)"}}

Which doesn't tell me too much. I do see some fig assertions just above in Console, though:

<<<< BWMultiStreamCameraSourceNode >>>> Fig assert: "err == 0 " at bail (BWMultiStreamCameraSourceNode.m:3964) - (err=-12780)
<<<< BWMultiStreamCameraSourceNode >>>> Fig assert: "err == 0 " at bail (BWMultiStreamCameraSourceNode.m:1591) - (err=-12780)
<<<< BWMultiStreamCameraSourceNode >>>> Fig assert: "err == 0 " at bail (BWMultiStreamCameraSourceNode.m:1418) - (err=-12780)
<<<< FigCaptureCameraSourcePipeline >>>> Fig assert: "err == 0 " at bail (FigCaptureCameraSourcePipeline.m:3572) - (err=-12780)
<<<< FigCaptureCameraSourcePipeline >>>> Fig assert: "err == 0 " at bail (FigCaptureCameraSourcePipeline.m:4518) - (err=-12780)
<<<< FigCaptureCameraSourcePipeline >>>> Fig assert: "err == 0 " at bail (FigCaptureCameraSourcePipeline.m:483) - (err=-12780)

I've verified that the formats are sane (the usual 420v 1080p 30fps I have everywhere else) and that the data output functions, but I'm a bit stuck as to where to go from here. One thing that did stand out: in the AVCamBarcode example I can see the virtual camera in that app's preview layer, but if I create an AVCaptureVideoDataOutput and add it to the session in that example, it fails in what looks like exactly the same way my app does, with the same assertions.

Does anyone have any advice? Thanks!
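To narrow this down, a sketch of a minimal single-camera session with just an AVCaptureVideoDataOutput (no multi-cam): if this also fails with a virtual camera, the problem is the device-plus-data-output combination rather than AVCaptureMultiCamSession itself:

import AVFoundation

func makeTestSession(for device: AVCaptureDevice) throws -> AVCaptureSession {
    let session = AVCaptureSession()
    session.beginConfiguration()

    let input = try AVCaptureDeviceInput(device: device)
    guard session.canAddInput(input) else { throw NSError(domain: "CaptureTest", code: 1) }
    session.addInput(input)

    let output = AVCaptureVideoDataOutput()
    // Same "420v" format the post mentions using elsewhere.
    output.videoSettings = [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange
    ]
    guard session.canAddOutput(output) else { throw NSError(domain: "CaptureTest", code: 2) }
    session.addOutput(output)

    session.commitConfiguration()
    return session
}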
4
0
817
Sep ’24
Flag to avoid "shared is unavailable in application extensions" error?
Hello, I am trying to get my camera app to launch from the Lock Screen, and see that calls to UIApplication.shared are not allowed. In my app, I have:

UIApplication.shared.isIdleTimerDisabled = true

Which is causing this compile-time error:

'shared' is unavailable in application extensions for iOS: Use view controller based solutions where appropriate instead

I do not believe there is a view controller based solution for this. Is there a flag I can wrap around the call so that the compiler knows it won't be used in an application extension? Thank you!
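There is no built-in compiler flag for this, but a common workaround (assuming you add your own condition, e.g. APP_EXTENSION, under the extension target's Active Compilation Conditions build setting; the name is arbitrary) looks like:

#if !APP_EXTENSION
import UIKit

// Compiled only into the app target; the extension target, which defines
// APP_EXTENSION, never sees the UIApplication.shared call.
func disableIdleTimer() {
    UIApplication.shared.isIdleTimerDisabled = true
}
#endif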
3
0
677
Sep ’24
AVAssetExportSession is not working on iPhone 16 Pro Max
My app is live on the App Store. Users on iPhone 16 Pro Max are getting "Operation Stopped" while combining videos and audio. This happens only on iPhone 16 Pro Max; on every other device it works fine. And when I use AVAssetExportPresetPassthrough, it is able to combine videos and audio, but it does not respect the encoding and the output has no audio.

NSArray *compatiblePresets = [AVAssetExportSession exportPresetsCompatibleWithAsset:composition];
if ([compatiblePresets containsObject:AVAssetExportPresetHighestQuality]) {
    presetName = AVAssetExportPresetHighestQuality;
} else if ([compatiblePresets containsObject:AVAssetExportPreset1920x1080]) {
    presetName = AVAssetExportPreset1920x1080;
} else if ([compatiblePresets containsObject:AVAssetExportPreset1280x720]) {
    presetName = AVAssetExportPreset1280x720;
} else {
    presetName = AVAssetExportPresetPassthrough;
}
// (the excerpt ends with an outer `else { presetName = AVAssetExportPreset1280x720; }`
// whose opening condition was not included in the post)
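Since the failure is device-specific, it may be worth verifying the preset asynchronously against the actual composition and output type before exporting. A sketch (in Swift, names assumed) using the documented compatibility check:

import AVFoundation

func pickWorkingPreset(for asset: AVAsset, completion: @escaping (String?) -> Void) {
    let candidates = [AVAssetExportPresetHighestQuality,
                      AVAssetExportPreset1920x1080,
                      AVAssetExportPreset1280x720]

    // Walk the candidates in order and hand back the first compatible one.
    func tryNext(_ remaining: ArraySlice<String>) {
        guard let preset = remaining.first else { completion(nil); return }
        AVAssetExportSession.determineCompatibility(ofExportPreset: preset,
                                                    with: asset,
                                                    outputFileType: .mp4) { compatible in
            compatible ? completion(preset) : tryNext(remaining.dropFirst())
        }
    }
    tryNext(candidates[...])
}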
5
2
812
Sep ’24
MusicKit & React Native app, overlay part of a song to a video?
I have an app on which users learn choreography. To avoid copyright infringement we currently only have audio instructions and no music in the app. Could we enable users who are subscribed to Apple Music to listen to the part of a song that corresponds to the choreography? Usually these are 60 seconds long. The app is in React Native. Would it be possible to implement it so that opening a dance video automatically triggers the playback of that song from, e.g., second 32 to 95? Since the video is looping, could it then start playing from second 32 again? Also looking for devs with experience integrating MusicKit for this use case, if it turns out to be possible.
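On the native side this looks feasible in principle. A sketch (assuming an active Apple Music subscription, a resolved MusicKit Song, and a React Native bridge to native Swift) of looping a 32-95 s window; MusicKit exposes no start/stop-offset API, so the window has to be enforced by watching playbackTime:

import MusicKit
import Foundation

@MainActor
func loopSegment(of song: Song,
                 from start: TimeInterval = 32,
                 to end: TimeInterval = 95) async throws {
    let player = ApplicationMusicPlayer.shared
    player.queue = ApplicationMusicPlayer.Queue(for: [song])
    try await player.play()
    player.playbackTime = start

    // Poll a few times per second; jump back when the window ends.
    // (Timer invalidation when the video closes is omitted in this sketch.)
    Timer.scheduledTimer(withTimeInterval: 0.25, repeats: true) { _ in
        if player.playbackTime >= end {
            player.playbackTime = start
        }
    }
}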
0
0
489
Sep ’24
Time Limit
Hi Team, when we select the Screen Time option and a daily screen limit exists, we have the option to tap Ignore Limit -> Ignore Limit for Today. After selecting this option, if we watch videos in a third-party app (YouTube), the app hangs; after tapping the play button multiple times the video starts to play, but the volume does not work. Only after tapping the previous or forward button does the sound come back. Kindly look into this issue and address it. Thanks, Pemkumar S
0
0
324
Sep ’24