Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.

Posts under Media Technologies topic


Can't play queues with mixed library and non-library items correctly
If a queue (ApplicationMusicPlayer.Queue) is set with both library and non-library (catalog) items, the queue will play only one kind of item (library or non-library), or will simply stop playing when the next item is of the other kind. Reproduced with both Xcode 16 beta 4 and Xcode 15.4. The issue was present in iOS 17 and is not resolved as of iOS 18 beta 4. FB14491999
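For reference, a minimal repro sketch of the scenario described above, assuming MusicKit authorization has already been granted; the search term and request shapes are illustrative, not from the original post:

```swift
import MusicKit

// Hedged repro sketch: queue one library item and one catalog item together.
func playMixedQueue() async throws {
    // Fetch any song from the user's library.
    var libraryRequest = MusicLibraryRequest<Song>()
    libraryRequest.limit = 1
    let librarySong = try await libraryRequest.response().items.first

    // Fetch a song from the Apple Music catalog (search term is arbitrary).
    var catalogRequest = MusicCatalogSearchRequest(term: "example", types: [Song.self])
    catalogRequest.limit = 1
    let catalogSong = try await catalogRequest.response().songs.first

    guard let librarySong, let catalogSong else { return }

    // Mixing the two kinds in one queue is what reportedly breaks playback.
    let player = ApplicationMusicPlayer.shared
    player.queue = ApplicationMusicPlayer.Queue(for: [librarySong, catalogSong])
    try await player.play()
}
```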
Replies: 1 · Boosts: 1 · Views: 634 · Activity: Sep ’24
Is it worth using Accelerate to convert 16 bpc RGB to 8 bpc RGB
I am working on an image processing app that requires 8 bit per channel (bpc) images. Sometimes, input images are 16 bpc (e.g., sRGB IEC61966-2.1; extended range). The app already draws the input image into a CGContext producing an 8 bpc CGImage; this works fine when the input is 16 bpc, and I get an 8 bpc image out. I am wondering whether it would be better for image quality to convert the 16 bpc images to 8 bpc using Accelerate before that CGContext draw, or does the draw essentially do the equivalent?
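If the Accelerate route is worth testing, a minimal sketch might look like the following; vImageConvert_16UToPlanar8 does a straight numeric 16-bit to 8-bit remap, and treating interleaved RGBA rows as one wide planar buffer is an assumption about the pixel layout (note that, unlike a CGContext draw, this performs no color matching):

```swift
import Accelerate

// Hedged sketch: remap 16-bit-per-channel pixels to 8-bit with vImage.
// Assumes 4 interleaved channels; treats each row as planar 16-bit data.
func convert16To8(src: UnsafeMutableRawPointer, dst: UnsafeMutableRawPointer,
                  width: Int, height: Int,
                  srcRowBytes: Int, dstRowBytes: Int) -> vImage_Error {
    var srcBuffer = vImage_Buffer(data: src,
                                  height: vImagePixelCount(height),
                                  width: vImagePixelCount(width * 4), // 4 channels
                                  rowBytes: srcRowBytes)
    var dstBuffer = vImage_Buffer(data: dst,
                                  height: vImagePixelCount(height),
                                  width: vImagePixelCount(width * 4),
                                  rowBytes: dstRowBytes)
    return vImageConvert_16UToPlanar8(&srcBuffer, &dstBuffer, vImage_Flags(kvImageNoFlags))
}
```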
Replies: 0 · Boosts: 0 · Views: 308 · Activity: Sep ’24
AVCaptureSessionControlsDelegate Not Being Called From Capture App
I am looking to learn more about the new Capture Button controls for iPhone 16, and am working to adapt the AVCam Sample Code to support the Capture Button. While I believe I've followed the guidance in the "Enhancing your app experience with the Camera Control" documentation, I'm finding that while my AVCaptureControl items seem to be added to the capture session, the Capture Button never does anything, nor are any of the delegate methods called.

After I configure my capture session per the setupSession() method, I call a method I added, configureCameraControls(device:):

```swift
func configureCameraControls(device: AVCaptureDevice) {
    guard captureSession.supportsControls else {
        assertionFailure("App does not support camera control.")
        return
    }

    // Set the controls delegate.
    captureSession.setControlsDelegate(controlsDelegate, queue: sessionQueue)

    // Begin configuring the capture session.
    captureSession.beginConfiguration()

    // Remove previously configured controls, if any.
    for control in captureSession.controls {
        captureSession.removeControl(control)
    }

    // Add a zoom control.
    let systemZoomSlider = AVCaptureSystemZoomSlider(device: device) { zoomFactor in
        // TODO
    }

    // Create a control to adjust the device's exposure bias.
    let systemBiasSlider = AVCaptureSystemExposureBiasSlider(device: device)

    // Add a custom slider.
    let focusSlider = AVCaptureSlider("Focus", symbolName: "scope", in: 0...1)
    focusSlider.setActionQueue(sessionQueue) { focusValue in
        // TODO
    }

    // Iterate over the controls.
    for control in [systemZoomSlider, systemBiasSlider, focusSlider] {
        // Add the control to the capture session if possible.
        if captureSession.canAddControl(control) {
            captureSession.addControl(control)
        } else {
            print("Unable to add control \(control).")
        }
    }

    // Commit the capture session configuration.
    captureSession.commitConfiguration()
}
```

I define the controls delegate like so:

```swift
final class CaptureControlsDelegate: NSObject, AVCaptureSessionControlsDelegate {
    func sessionControlsDidBecomeActive(_ session: AVCaptureSession) { }
    func sessionControlsWillEnterFullscreenAppearance(_ session: AVCaptureSession) { }
    func sessionControlsWillExitFullscreenAppearance(_ session: AVCaptureSession) { }
    func sessionControlsDidBecomeInactive(_ session: AVCaptureSession) { }
}
```

I instantiate this earlier in my app's lifecycle and make it available to the CaptureService actor. I'm not sure if this snippet provides enough detail to gather some help, but I can't quite fathom why the camera/capture pipeline works while the Capture Button does nothing and the AVCaptureSessionControlsDelegate methods are never called.
Replies: 3 · Boosts: 0 · Views: 663 · Activity: Sep ’24
CarPlay music issues iOS 18.1 (22B5054e)
When my CarPlay connects and tries to play music, it plays through the vehicle's phone speakers. If I make a phone call and then hang up, the sound is pushed back to the vehicle's stereo speakers. Sometimes it just switches back to the phone speakers and I have to go through the process again. Does anyone have a fix for this? It is very annoying.
Replies: 2 · Boosts: 1 · Views: 533 · Activity: Sep ’24
Content items not updating when using MediaPlayer API for CarPlay on iOS18
We are using the MediaPlayer API to provide CarPlay support. Starting in iOS 18, we are having issues updating the content list. The initial list of items populates on a fresh instance, but soon thereafter an error appears saying we are not entitled to "com.apple.mediaremote.external-artwork-validation". From that point onwards, no changes we make to our MPPlayableContentDataSource are reflected in CarPlay, even after restarting the device. While the MediaPlayer API is marked as deprecated, we still use it to provide CarPlay support going back to iOS 10. Has anyone else run into this, or have suggestions for workarounds?
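For context, this is roughly the update pattern involved with the (deprecated) MPPlayableContentManager API; the data-source variable is hypothetical, not from the original post:

```swift
import MediaPlayer

// Sketch of pushing content updates to CarPlay with the legacy API.
// 'myDataSource' stands in for the app's MPPlayableContentDataSource.
func refreshCarPlayContent(with myDataSource: MPPlayableContentDataSource) {
    let manager = MPPlayableContentManager.shared()
    manager.dataSource = myDataSource

    manager.beginUpdates()
    // ...mutate the model backing myDataSource here...
    manager.endUpdates()

    // Ask CarPlay to re-query the data source; on iOS 18 this reportedly
    // stops having any effect once the entitlement error appears.
    manager.reloadData()
}
```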
Replies: 3 · Boosts: 1 · Views: 697 · Activity: Oct ’24
AVExternalStorageDevice permissions behavior completely broken on iOS 18?
I'm attempting to use AVExternalStorageDevice.requestAccess on iOS 18 using Xcode 16. When calling requestAccess, a dialog does appear, but the completionHandler closure is never called to indicate whether access was granted. If using the async version, the function just never returns. Calling requestAccess also results in a mediaServicesWereReset (-11819) error without fail. Supposedly, "the system only presents the dialog to a person the first time your app calls the method." That also doesn't appear to be the case. The dialog appears every time requestAccess is called, regardless of previous invocations and whether "Allow" or "Don't Allow" was selected. The dialog itself says "You can change this in Privacy settings." I cannot find this permission anywhere in the Settings app, neither under Privacy & Security nor under the app-specific settings page. Has anyone else experienced these issues? Am I missing something here? I did suspect permissions issues and tried adding a NSRemovableVolumesUsageDescription entry to the app. This did not appear to change anything.
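For reference, a minimal sketch of the failing call path, assuming the async form of the API:

```swift
import AVFoundation

// Hedged sketch: request access to external storage devices.
// On iOS 18 this reportedly shows the dialog but never returns.
func requestExternalStorageAccess() async {
    let granted = await AVExternalStorageDevice.requestAccess()
    print("External storage access granted: \(granted)")
}
```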
Replies: 1 · Boosts: 2 · Views: 654 · Activity: Oct ’24
Recording A/V .mov file with SMPTE timecode
Hello, I used the following technical note to develop an app that records a .mov file with SMPTE timecode: https://developer.apple.com/library/archive/technotes/tn2310/_index.html

As a result, a timecode track is present within the .mov file (the other tracks are audio and video). Unfortunately, QuickTime Player doesn't display the timecode information. Analyzer tools like MediaInfo, or online services such as https://media-analyzer.pro/app, show that the timecode track has a null duration (and so no "time code of last frame").

Example 1, the TC track as reported by an analyzer:

```
Other
ID                       : 3
Type                     : Time code
Format                   : QuickTime TC
Frame rate               : 60.000 FPS
Time code of first frame : 17:39:59:00
Time code, stripped      : Yes
Title                    : Core Media Time Code
Encoded date             : 2024-09-10 15:39:46 UTC
Tagged date              : 2024-09-10 15:39:59 UTC
```

Example 2, an atom dump of the timecode track:

```
0000569562  QuickTime Timecode
#0 00007f6b8a 'trak' Track atom
#1 00007f6b92 'tkhd' Track header atom
#2 size 92 (0x5C) type 'tkhd' (hex 74 6B 68 64)
   version           0
   flags             15 (0xF)
   creation_time     0xE30618C2, '2024-09-10 15:39:46'
   modification_time 0xE30618CF, '2024-09-10 15:39:59'
   track_ID          3
   reserved          0
   duration          0
   reserved          [0, 0]
```

In each case, the duration is considered null even though the recording lasts more than 20 seconds.

STEPS TO REPRODUCE
1. Use AVAssetWriter for video and audio.
2. Create an AVAssetWriterInput for timecode and associate it with the video track.
3. Just before stopping the recording, generate and append a sample buffer containing the SMPTE timecode.
4. Mark all tracks as finished before stopping the recording with finishWritingWithCompletionHandler.
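For reference, a hedged sketch of the timecode-track setup that TN2310 describes, as I understand it; the helper name and the 60 fps figure are illustrative:

```swift
import AVFoundation
import CoreMedia

// Hypothetical helper: create a timecode writer input and associate it
// with the video input, per TN2310. Assumes 60 fps 'tc32' samples.
func makeTimecodeInput(associatedWith videoInput: AVAssetWriterInput,
                       fps: Int32 = 60) throws -> AVAssetWriterInput {
    var formatDescription: CMTimeCodeFormatDescription?
    let status = CMTimeCodeFormatDescriptionCreate(
        allocator: kCFAllocatorDefault,
        timeCodeFormatType: kCMTimeCodeFormatType_TimeCode32,
        frameDuration: CMTime(value: 1, timescale: fps),
        frameQuanta: UInt32(fps),
        flags: 0,
        extensions: nil,
        formatDescriptionOut: &formatDescription)
    guard status == noErr, let formatDescription else {
        throw NSError(domain: NSOSStatusErrorDomain, code: Int(status))
    }
    let timecodeInput = AVAssetWriterInput(mediaType: .timecode,
                                           outputSettings: nil,
                                           sourceFormatHint: formatDescription)
    videoInput.addTrackAssociation(withTrackOf: timecodeInput,
                                   type: AVAssetTrack.AssociationType.timecode.rawValue)
    return timecodeInput
}
```

One possibly relevant detail (an assumption, not from the post): the appended timecode sample buffer's timing should span the movie's full duration; a sample appended just before stopping with a zero or default duration could explain the null track duration above.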
Replies: 1 · Boosts: 0 · Views: 545 · Activity: Oct ’24
Apple Music Bug
Since the iOS 18 update, there's been a bug that always occurs when listening to music through Apple Music: when music is playing and the iPhone enters or exits standby mode, the music pauses by itself for one second. The initial conditions are otherwise the same as before the problem appeared.
Replies: 2 · Boosts: 0 · Views: 631 · Activity: Oct ’24
AVPlayer Error in iOS 18.0
When attempting to play a video using AVPlayer on iOS 18.0, I encounter an error that does not occur on versions earlier than 18.0. Could you please advise what might be causing this issue? These are the error codes:

```
Error Domain=AVFoundationErrorDomain Code=-11828
Error Domain=NSOSStatusErrorDomain Code=-12847 "(null)"
```

This is the response information for the URL, retrieved using the curl command:

```
HTTP/1.1 200
Content-Disposition: inline;filename="sample.mp4"
Accept-Ranges: bytes
ETag: sample.mp4
Last-Modified: Tue, 20 Jan 1970 23:52:10 GMT
Expires: Mon, 07 Oct 2024 09:15:49 GMT
Content-Range: bytes 0-987561/987561
Content-Type: application/octet-stream
Content-Length: 987561
Date: Mon, 30 Sep 2024 09:15:49 GMT
```
Replies: 2 · Boosts: 0 · Views: 767 · Activity: Oct ’24
iOS 18 Voicemail feature
I am facing an issue with the voicemail feature. When someone calls me, the call goes to voicemail only if I tap the voicemail icon. If I do not respond at all to the incoming call, the caller is not given the option to record a voicemail. I have tried switching voicemail off and on; it works on other devices (iPhone 14 and 15 belonging to family members), just not for me. How can I resolve this?
Replies: 1 · Boosts: 0 · Views: 630 · Activity: Oct ’24
AVCaptureDevice is ignoring 60 fps
Hello, I am trying to get video from an HDMI USB capture card and show it in a preview layer at 60 fps. The device I am using (ShadowCast 2) supports 1080p at 60 fps in "yuvs" and "420v". This is my code to build the preview layer, with uninteresting parts stripped away and error handling removed. I use AVFrameRateRange because the capture device does not directly report 60.00 fps, but <AVFrameRateRange: 0x600000875680 60.00 - 60.00 (1000000 / 60000240 - 1000000 / 60000240)>.

```swift
import AVFoundation

@Observable
final class AVFoundationService: AVService {
    // Live view
    private let session: AVCaptureSession = .init()

    var previewLayer: AVCaptureVideoPreviewLayer {
        let layer = AVCaptureVideoPreviewLayer(session: session)
        layer.videoGravity = .resizeAspect
        return layer
    }

    var activeVideoDevice: AVCaptureDevice? {
        // TODO: implement correct logic
        if let device = videoDevices.first(where: { $0.localizedName.contains("Shadow") }) {
            return device
        }
        return AVCaptureDevice.default(for: .video)
    }

    func setupStreamDemo(completion: @escaping (Error?) -> Void) {
        session.beginConfiguration()
        if let device = activeVideoDevice {
            do {
                let input = try AVCaptureDeviceInput(device: device)
                if session.canAddInput(input) {
                    session.addInput(input)
                } else {
                    print("explode")
                }

                for format in device.formats {
                    let dimensions = CMVideoFormatDescriptionGetDimensions(format.formatDescription)
                    if dimensions.width == 1920 && dimensions.height == 1080
                        && format.formatDescription.mediaSubType.description == "'yuvs'" {
                        // Find a frame-rate range that pins both min and max at 60 fps.
                        let foundFPS = format.videoSupportedFrameRateRanges.first {
                            Int($0.minFrameRate) == 60 && Int($0.maxFrameRate) == 60
                        }
                        try device.lockForConfiguration()
                        device.activeFormat = format
                        device.activeVideoMinFrameDuration = foundFPS!.minFrameDuration
                        device.activeVideoMaxFrameDuration = foundFPS!.minFrameDuration
                        device.unlockForConfiguration()
                    }
                }
            } catch {
                return completion(error)
            }
        }
        session.commitConfiguration()
        session.startRunning()
        completion(nil)
    }
}
```

I use the following SwiftUI code to show the AVCaptureVideoPreviewLayer:

```swift
import SwiftUI
import AVFoundation

struct VideoPreviewView: NSViewRepresentable {
    private let previewLayer: AVCaptureVideoPreviewLayer

    func makeNSView(context: Context) -> NSView {
        let view = NSView()
        view.layer = self.previewLayer
        view.layer?.frame = view.bounds
        return view
    }

    func updateNSView(_ nsView: NSView, context: Context) {
        if let layer = nsView.layer as? AVCaptureVideoPreviewLayer {
            layer.session = self.previewLayer.session
        }
    }
}
```

When I run my app, it ignores whatever I set on device.activeVideoMinFrameDuration and/or device.activeVideoMaxFrameDuration. If I set 10 fps, it runs at 30; if I set 60, it runs at 30. If I start QuickTime in parallel to my app and begin a recording from my USB capture card, it switches to 60 fps mode. I am on macOS Sequoia 15.0 with Xcode 16.0. What am I doing wrong?
Replies: 1 · Boosts: 1 · Views: 629 · Activity: Oct ’24
AVAudioEngine: change playback rate in real time (AVAudioUnitVarispeed is not real-time)
I am using AVAudioEngine to play back samples in an iOS game, and I would like to change the playback rate of a sample in real time. Using AVAudioUnitVarispeed to change the playback rate creates stutters in the game, since it isn't processed in real time (as stated here: AVAudioUnitTimeEffect). The other option I found is to use an AVAudioEnvironmentNode and change the rate of the AVAudioPlayerNode. That works without stutters but limits the valid rate values to 0.5 through 2.0, and I need rates higher than 2.0 (see AVAudio3DMixing). Are there any other ways to play back a sample with real-time rate control?
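For reference, a minimal sketch of the environment-node approach described above; the mono connection format is an assumption based on AVAudio3DMixing's requirements as I understand them:

```swift
import AVFoundation

// Sketch of rate control via AVAudio3DMixing (AVAudioPlayerNode.rate),
// assuming a mono source; the rate is limited to 0.5...2.0.
let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
let environment = AVAudioEnvironmentNode()

engine.attach(player)
engine.attach(environment)

// 3D mixing (and therefore .rate) requires a mono connection format.
let monoFormat = AVAudioFormat(standardFormatWithSampleRate: 44_100, channels: 1)
engine.connect(player, to: environment, format: monoFormat)
engine.connect(environment, to: engine.mainMixerNode, format: nil)

player.rate = 1.5  // valid range is 0.5...2.0 per the AVAudio3DMixing docs
```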
Replies: 0 · Boosts: 0 · Views: 317 · Activity: Oct ’24
SFSpeechRecognizer is broken on iOS 18
Hello, I noticed that SFSpeechRecognizer is broken on iOS 18. During a recognition task, it keeps dropping the recognized text on every pause. For example, if you say "how are you fine", it will drop the "how are you" part and only give you "fine" as the result.

Say "how are you <pause> fine":

```
// iOS 17 ✅ (perfect final result)
How
How are
How are you
How are you.
How are you. Fine.

// iOS 18 ❌
How
How are
How are you
How are you
Fine
(the text before the pause is dropped, and it fails to recognize the punctuation)
```

Reproducing the issue:
1. Download the official sample project.
2. Run it on an iOS 18 device or simulator.
3. Say "how are you fine".
4. Only "fine" will be displayed.
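For context, a hedged sketch of the partial-results handling involved, modeled on the pattern the official sample uses (the function name is illustrative):

```swift
import Speech

// Sketch: observe partial results from a live recognition task. On iOS 18
// the reported behavior is that bestTranscription resets after a pause
// instead of accumulating.
func startRecognition(recognizer: SFSpeechRecognizer,
                      request: SFSpeechAudioBufferRecognitionRequest) -> SFSpeechRecognitionTask {
    request.shouldReportPartialResults = true
    return recognizer.recognitionTask(with: request) { result, error in
        if let result {
            print(result.bestTranscription.formattedString)
        }
        if let error {
            print("Recognition error: \(error)")
        }
    }
}
```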
Replies: 4 · Boosts: 4 · Views: 1.3k · Activity: Oct ’24
How To Play Audio Through Headphones on watchOS 11?
I have an app that plays audio, and its behaviour has changed in watchOS 11: I can no longer figure out how to play the audio through headphones. To play audio I do:

```swift
let session = AVAudioSession.sharedInstance()
try session.setCategory(.playback, mode: .default, policy: .longFormAudio, options: [])
let activated = try await session.activate()
if activated {
    // play audio
}
```

In previous versions, try await session.activate() would bring up a route picker where the user could select their headphones. Now on watchOS 11 it just plays the audio out of the speaker. Maybe that's what some people want, but if they want it to play through the headphones, I can't see how to give them that option now; there's no AVRoutePickerView available on watchOS for selecting it.

I've tried setting the category to .multiRoute instead of .playback, and that does bring up the picker, but selecting the speaker results in an error code, and selecting the headphones results in it saying it cannot find my headphones (which shouldn't be the case, since Apple Music on watchOS finds them).

I also tried overriding the output with try session.overrideOutputAudioPort(.speaker), but the compiler complains that .speaker isn't available on watchOS, which is strange, as I understand it's now possible to play through the speaker on at least some Apple Watches.

So is this a bug, or is there some way I've not found of playing audio through the headphones?
Replies: 1 · Boosts: 1 · Views: 553 · Activity: Oct ’24
Call cannot be disconnected due to delay observed in AudioOutputUnitStop API
We have a VoIP calling application that releases resources at the end of a call. When the AudioOutputUnitStop API is invoked, it sometimes takes up to 700 ms to return. If we comment out that API call as a test, then the AudioUnitUninitialize API takes up to 700 ms instead. Once the cleanup is done, as part of the call flow, the application sends a BYE SIP message. Hence, in cases where the API takes more than 200 ms, the BYE message is delayed by that much and gets blocked by the server's DDoS settings (the DDoS timer starts as soon as the UDP socket is disconnected from the client and times out after 200 ms; a BYE request arriving after that is blocked). We need to understand why a delay of more than 200 ms is sometimes observed, while in other cases it takes less than 50 ms.
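For anyone trying to reproduce the measurement, this is roughly how the teardown can be timed; the helper is ours, not from any SDK:

```swift
import Foundation
import AudioToolbox

// Hedged sketch: measure how long AudioOutputUnitStop blocks, so slow
// teardowns can be logged (or the SIP BYE sent before cleanup begins).
func stopOutputUnitTimed(_ unit: AudioUnit) {
    let start = DispatchTime.now()
    let status = AudioOutputUnitStop(unit)
    let elapsedMs = Double(DispatchTime.now().uptimeNanoseconds - start.uptimeNanoseconds) / 1_000_000
    print("AudioOutputUnitStop returned \(status) after \(elapsedMs) ms")
}
```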
Replies: 2 · Boosts: 0 · Views: 353 · Activity: Oct ’24