Dive into the technical aspects of audio on your device, including codecs, format support, and customization options.

Audio

Posts under Audio subtopic


MusicKit + AirPlay
Hello, I'm working on a MusicKit-based SwiftUI app. I've integrated AirPlay using AVRoutePickerView like so:

struct UIKitAirPlayPickerView: UIViewRepresentable {
    func makeUIView(context: Context) -> AVRoutePickerView {
        let routePickerView = AVRoutePickerView()
        routePickerView.prioritizesVideoDevices = false
        return routePickerView
    }
    func updateUIView(_ uiView: AVRoutePickerView, context: Context) {}
}

The AirPlay menu appears as expected, and selecting an AirPlay device works as expected; I'm currently sending audio from my app to a HomePod. However, the state of the AVRoutePickerView does not reflect the playback state: there is no cover art and it says "Not Playing". When my device is locked, the lock screen shows the album art, metadata, and AirPlay routing as expected. My app uses ApplicationMusicPlayer, but I encounter the same behavior using SystemMusicPlayer. Any guidance on how to troubleshoot this? Is there another way to integrate the system AirPlay picker into my app, or is this my only option? Thank you for reading.
Replies: 1 · Boosts: 0 · Views: 187 · Activity: 18h
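For the "Not Playing" issue above, one diagnostic worth trying is publishing Now Playing metadata manually and seeing whether the in-app picker picks it up. ApplicationMusicPlayer is supposed to populate this itself, so the following is a hedged fallback sketch rather than the documented MusicKit path; the function name and parameters are illustrative.

import MediaPlayer
import UIKit

// Hypothetical fallback: publish Now Playing metadata ourselves so system
// pickers have something to display. ApplicationMusicPlayer normally does
// this automatically; use this only to narrow down where the state is lost.
func publishNowPlaying(title: String, artist: String, artwork: UIImage?) {
    var info: [String: Any] = [
        MPMediaItemPropertyTitle: title,
        MPMediaItemPropertyArtist: artist,
        MPNowPlayingInfoPropertyPlaybackRate: 1.0
    ]
    if let artwork {
        info[MPMediaItemPropertyArtwork] =
            MPMediaItemArtwork(boundsSize: artwork.size) { _ in artwork }
    }
    MPNowPlayingInfoCenter.default().nowPlayingInfo = info
}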
AVAudioSession.outputVolume does not reflect system volume changes made while app is in background
I have a question regarding the behavior of AVAudioSession.sharedInstance().outputVolume. Observed behavior: when the app is in the foreground, I read audioSession.outputVolume (for example, 0.1). The app is then moved to the background, and while it is in the background the user changes the system volume using the hardware buttons (for example, to 0.5). When the app returns to the foreground, audioSession.outputVolume still reports the previous value (0.1). From my testing, outputVolume only updates when the system volume is changed while the app is in the foreground; changes made while the app is in the background are not reflected when it returns to the foreground. Apple's documentation for AVAudioSession.outputVolume describes it as "The systemwide output volume set by the user" (https://developer.apple.com/documentation/avfaudio/avaudiosession/outputvolume), but based on our testing on iOS 18.6.2 and iOS 18.1, the observed behavior differs from this description. Questions: Is this the expected behavior of AVAudioSession.outputVolume? Is there any other recommended way in Swift to retrieve the current system volume that reflects user changes made both while the app is in the foreground and while it is in the background? Any clarification on the intended behavior or recommended handling would be greatly appreciated.
Replies: 1 · Boosts: 1 · Views: 161 · Activity: 4d
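Both outputVolume reports above can be cross-checked against the standard observation pattern: outputVolume is key-value observable while a session is active. A minimal sketch follows (the class name is illustrative); whether the observer fires for changes made while backgrounded or across deactivation is exactly what these posts dispute, so treat this as a baseline, not a fix.

import AVFoundation

final class VolumeWatcher {
    private let session = AVAudioSession.sharedInstance()
    private var observation: NSKeyValueObservation?

    func start() throws {
        try session.setActive(true)
        // KVO fires when the user changes volume while the session is active.
        observation = session.observe(\.outputVolume, options: [.initial, .new]) { _, change in
            print("outputVolume:", change.newValue ?? -1)
        }
    }
}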
AVAudioSession.outputVolume not reporting correctly in iOS 18+ devices
I’m using the shared instance of AVAudioSession. After activating it with .setActive(true), I observe the outputVolume, and it correctly reports the device’s volume. However, after deactivating the session using .setActive(false), changing the volume, and then reactivating it again, the outputVolume returns the previous volume (before deactivation), not the current device volume. The correct volume is only reported after the user manually changes it again using physical buttons or Control Center, which triggers the observer. What I need is a way to retrieve the actual current device volume immediately after reactivating the audio session, even on the second and subsequent activations. Disabling and re-enabling the audio session is essential to how my application functions. I’ve tested this behavior with my colleagues, and the issue is consistently reproducible on iOS 18.0.1, iOS 18.1, iOS 18.3, iOS 18.5 and iOS 18.6.2. On devices running iOS 17.6.1 and iOS 16.0.3, outputVolume correctly reflects the current volume immediately after calling .setActive(true) multiple times.
Replies: 3 · Boosts: 1 · Views: 253 · Activity: 4d
Unable to trigger AudioRecordingIntent from background
I am building an app where I use AudioRecordingIntent to start audio recording from Shortcuts, the Action button, etc. Whenever I set that up, I get an error:

Unknown NSError Live Activity start failed: The operation couldn't be completed. Target is not foreground

I explicitly try to start the Live Activity and then start the audio recording, and that's when I see this error. How can I make this work? I am unable to find any examples.
Replies: 0 · Boosts: 0 · Views: 25 · Activity: 4d
ioreg AVBControllerState - AVB/EAV Mode
Hello. To determine whether the "AVB/EAV Mode" of an AV-capable network interface is turned on or off, I query the IO registry and evaluate the property "AVBControllerState". I was wondering if this is the "correct" approach and if anything is known about the values of this property. Network interfaces without AV capability may also carry this property (e.g., my WiFi adapter has a value of 1), whereas the value for interfaces with AV capability can be 0 or 3, at least as far as I could observe with the limited number of test devices at hand. Is it safe to assume that a value of 3 means the feature is turned on, that 0 means it is turned off, and to ignore values of 1? Is there another approach to determine the status of "AVB/EAV Mode"? Thanks for any insight. Best regards, Ingo
Replies: 0 · Boosts: 0 · Views: 34 · Activity: 6d
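A minimal Swift sketch of the IORegistry query described above. Matching on "IOEthernetInterface" and reading "BSD Name" are assumptions about where the property lives, and the 0/1/3 value semantics remain undocumented, so the code only reports what it finds.

import Foundation
import IOKit

// Walk Ethernet interfaces and print the undocumented AVBControllerState
// property where present; interpreting the values is left to the caller.
func dumpAVBControllerStates() {
    var iterator: io_iterator_t = 0
    let matching = IOServiceMatching("IOEthernetInterface")
    guard IOServiceGetMatchingServices(kIOMainPortDefault, matching, &iterator) == KERN_SUCCESS else { return }
    defer { IOObjectRelease(iterator) }
    var service = IOIteratorNext(iterator)
    while service != 0 {
        let name = IORegistryEntryCreateCFProperty(service, "BSD Name" as CFString,
                                                   kCFAllocatorDefault, 0)?
            .takeRetainedValue() as? String ?? "?"
        if let state = IORegistryEntryCreateCFProperty(service, "AVBControllerState" as CFString,
                                                       kCFAllocatorDefault, 0)?
            .takeRetainedValue() as? Int {
            print("\(name): AVBControllerState = \(state)")
        }
        IOObjectRelease(service)
        service = IOIteratorNext(iterator)
    }
}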
coreaudio-api mailing list search broken
Hello, the search functionality of the coreaudio-api mailing list archive has been broken for a very long time. Several of the lower-level audio APIs have only been discussed on this mailing list, making it critical for those of us maintaining old audio code. Steps to reproduce: (1) open https://lists.apple.com/archives/list/coreaudio-api@lists.apple.com/ in your web browser; (2) enter a search term in the "Search this list" field in the top-right corner of the page; (3) the search eventually times out with "502 Bad Gateway". Can somebody please forward this information to the current maintainer? I've tried to contact developer support, but they weren't sure what to do. Thanks!
Replies: 1 · Boosts: 0 · Views: 174 · Activity: 1w
Keeping PiP alive during third-party video recording (camera capture)
I’m building a teleprompter-style app that relies on Picture in Picture. PiP starts correctly on device. Everything works — until another app (e.g. TikTok / Instagram) starts active video recording. When camera capture begins in the foreground app, iOS terminates my PiP session. Some teleprompter apps appear to keep PiP active while recording in other apps, so I’m trying to understand the recommended architectural pattern for this scenario. Is there a documented approach or best practice to keep PiP stable during third-party camera capture? Looking specifically for guidance on the correct AVKit / AVAudioSession configuration for this use case.
Replies: 0 · Boosts: 0 · Views: 80 · Activity: 1w
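For the PiP question above, a starting-point sketch of a mixable audio-session configuration, on the theory that a non-mixable session is one reason another app's capture can tear down playback. This is untested against TikTok/Instagram recording; treat it as a baseline to experiment with, not a confirmed fix.

import AVFoundation

// A playback session that mixes with other apps' audio, so starting
// capture elsewhere has less reason to interrupt this app's session.
func configureMixablePlayback() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playback, mode: .moviePlayback, options: [.mixWithOthers])
    try session.setActive(true)
}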
🎧 Determine if headphones are the only playback device for the current session
I need to apply a headphone-specific scenario only when headphones are the sole active playback device in my iOS audio app. The problem is that there is no definitive way to establish that headphones are the sole active playback device; the port types in AVAudioSession.currentRoute.outputs don't guarantee headphones:

let session = AVAudioSession.sharedInstance()
let outputs = session.currentRoute.outputs
let headphonesOnly = outputs.count == 1 &&
    (outputs.first?.portType == .headphones ||
     outputs.first?.portType == .bluetoothA2DP ||
     outputs.first?.portType == .bluetoothHFP ||
     outputs.first?.portType == .bluetoothLE)

The issue with the code above is that the listed Bluetooth profiles (A2DP, HFP, LE) can be used by any audio device, not only headphones. Is there any public API on iOS that can: distinguish Bluetooth headphones from Bluetooth speakers when both use A2DP/LE; expose the user's "Device Type" classification (headphones / speaker / car stereo, etc.) shown in Settings → Bluetooth → Device Type; or provide a more reliable way to know "this route is definitely headphones" for A2DP devices, beyond portType and portName string heuristics?
Replies: 0 · Boosts: 0 · Views: 64 · Activity: 1w
Spatial Audio on Mac - When and how to render using Audio Units?
I'm working on adding Spatial Audio support to a game on the Mac. I'm looking at the SpatialAudioRenderer sample but having some issues. It's unclear to me when a device is compatible with Spatial Audio and when I should attempt to render it; I can't find any property on the Mac that advertises a device's Spatial Audio capability. The sample crashes when the output device is a USB device, including the Apple Studio Display, which is supposed to be capable of rendering Spatial Audio. Since that device doesn't work with the sample, do I still need to render down the 7.1.4 source for it? The sample always renders down to stereo, but the Apple Studio Display is not a stereo device. I'm a bit confused by the sample and when/how I should configure the mixing unit.
Replies: 0 · Boosts: 0 · Views: 88 · Activity: 1w
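While the AUSpatialMixer questions above stand, one higher-level route is AVAudioEngine's environment node, which spatializes mono sources and, with the .auto rendering algorithm, lets the system pick a strategy for the current output device. A minimal sketch under that assumption; it sidesteps rather than answers the device-capability question.

import AVFoundation

// Spatialize a mono source through an environment node and let the
// system choose the rendering algorithm for the active output.
let engine = AVAudioEngine()
let environment = AVAudioEnvironmentNode()
let player = AVAudioPlayerNode()

func setUpSpatialChain(monoFormat: AVAudioFormat) throws {
    engine.attach(environment)
    engine.attach(player)
    engine.connect(player, to: environment, format: monoFormat) // 3D mixing needs mono input
    engine.connect(environment, to: engine.mainMixerNode, format: nil)
    player.renderingAlgorithm = .auto
    player.position = AVAudio3DPoint(x: 0, y: 0, z: -1) // one meter in front
    try engine.start()
}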
failed to set category, reason: The operation couldn't be completed. (OSStatus error 4097.)
When using the [AVAudioSession setCategory:withOptions:error:] API, the call hangs for a long time and eventually returns an error. This issue occurs on iOS 16 and did not appear in earlier versions.

Thread 135:
0 libsystem_kernel.dylib 0x00000002478e3cd4 _mach_msg2_trap :8 (in libsystem_kernel.dylib)
1 libsystem_kernel.dylib 0x00000002478e7214 _mach_msg_overwrite :428 (in libsystem_kernel.dylib)
2 libsystem_kernel.dylib 0x00000002478e705c _mach_msg :24 (in libsystem_kernel.dylib)
3 libdispatch.dylib 0x00000001d63ffe84 __dispatch_mach_send_and_wait_for_reply :548 (in libdispatch.dylib)
4 libdispatch.dylib 0x00000001d6400224 _dispatch_mach_send_with_result_and_wait_for_reply :60 (in libdispatch.dylib)
5 libxpc.dylib 0x00000001b2114e04 _xpc_connection_send_message_with_reply_sync :256 (in libxpc.dylib)
6 Foundation 0x000000019b6249f0 ___NSXPCCONNECTION_IS_WAITING_FOR_A_SYNCHRONOUS_REPLY__ :16 (in Foundation)
7 Foundation 0x000000019c06d1b4 -[NSXPCConnection _sendInvocation:orArguments:count:methodSignature:selector:withProxy:] :2100 (in Foundation)
8 CoreFoundation 0x000000019dfcb1cc ____forwarding___ :1072 (in CoreFoundation)
9 CoreFoundation 0x000000019dfd3200 ___forwarding_prep_0___ :96 (in CoreFoundation)
10 AudioSession 0x00000001c77498b0 __ZN4avas6client11SessionCore10HandlePingEv :192 (in AudioSession)
11 AudioSession 0x00000001c77497b0 ____ZN4avas6client11SessionCore12DispatchPingEv_block_invoke :52 (in AudioSession)
12 libdispatch.dylib 0x00000001d63e4adc __dispatch_call_block_and_release :32 (in libdispatch.dylib)
13 libdispatch.dylib 0x00000001d63fe7ec __dispatch_client_callout :16 (in libdispatch.dylib)
14 libdispatch.dylib 0x00000001d63ed468 __dispatch_lane_serial_drain :740 (in libdispatch.dylib)
15 libdispatch.dylib 0x00000001d63edf78 __dispatch_lane_invoke :440 (in libdispatch.dylib)
16 libdispatch.dylib 0x00000001d63f6f48 __dispatch_root_queue_drain :364 (in libdispatch.dylib)
17 libdispatch.dylib 0x00000001d63f6d08 __dispatch_worker_thread :268 (in libdispatch.dylib)
18 libsystem_pthread.dylib 0x00000001f9ff144c __pthread_start :136 (in libsystem_pthread.dylib)
19 libsystem_pthread.dylib 0x00000001f9fed8cc _thread_start :8 (in libsystem_pthread.dylib)

Thread 132:
0 libsystem_kernel.dylib 0x00000002478e3cd4 _mach_msg2_trap :8 (in libsystem_kernel.dylib)
1 libsystem_kernel.dylib 0x00000002478e7214 _mach_msg_overwrite :428 (in libsystem_kernel.dylib)
2 libsystem_kernel.dylib 0x00000002478e705c _mach_msg :24 (in libsystem_kernel.dylib)
3 libdispatch.dylib 0x00000001d63ffe84 __dispatch_mach_send_and_wait_for_reply :548 (in libdispatch.dylib)
4 libdispatch.dylib 0x00000001d6400224 _dispatch_mach_send_with_result_and_wait_for_reply :60 (in libdispatch.dylib)
5 libxpc.dylib 0x00000001b2114e04 _xpc_connection_send_message_with_reply_sync :256 (in libxpc.dylib)
6 Foundation 0x000000019b6249f0 ___NSXPCCONNECTION_IS_WAITING_FOR_A_SYNCHRONOUS_REPLY__ :16 (in Foundation)
7 Foundation 0x000000019c06d1b4 -[NSXPCConnection _sendInvocation:orArguments:count:methodSignature:selector:withProxy:] :2100 (in Foundation)
8 CoreFoundation 0x000000019dfcb1cc ____forwarding___ :1072 (in CoreFoundation)
9 CoreFoundation 0x000000019dfd3200 ___forwarding_prep_0___ :96 (in CoreFoundation)
10 AudioSession 0x00000001c7754198 __ZNK4avas6client11SessionCore18SetBatchPropertiesEP12NSDictionaryIP8NSStringPU25objcproto14NSSecureCoding11objc_objectEPU15__autoreleasingP7NSArrayIPS2_IS4_P8NSNumberEENS_30AVAudioSessionBatchSetStrategyEbb :548 (in AudioSession)
11 AudioSession 0x00000001c7753e58 __ZNK4avas6client11SessionCore20SetBatchPropertiesMXEP12NSDictionaryIP8NSStringPU25objcproto14NSSecureCoding11objc_objectE :92 (in AudioSession)
12 AudioSession 0x00000001c775179c __ZN4avas6client11SessionCore11setCategoryEP8NSStringS3_32AVAudioSessionRouteSharingPolicym :472 (in AudioSession)
13 AudioSession 0x00000001c7768f88 -[AVAudioSession setCategory:withOptions:error:] :68 (in AudioSession)
14 AlipayWallet 0x000000010140580c -[AVAudioSession(APMHook) apmhook_setCategory:withOptions:error:] APMHookAudioSession.m:35 (in AlipayWallet)
15 AlipayWallet 0x00000001014001a4 -[APMAudioSessionManager resume] APMAudioSessionManager.m:718 (in AlipayWallet)
16 libdispatch.dylib 0x00000001d63e4adc __dispatch_call_block_and_release :32 (in libdispatch.dylib)
17 libdispatch.dylib 0x00000001d63fe7ec __dispatch_client_callout :16 (in libdispatch.dylib)
18 libdispatch.dylib 0x00000001d63ed468 __dispatch_lane_serial_drain :740 (in libdispatch.dylib)
19 libdispatch.dylib 0x00000001d63edf44 __dispatch_lane_invoke :388 (in libdispatch.dylib)
20 libdispatch.dylib 0x00000001d63f83ec __dispatch_root_queue_drain_deferred_wlh :292 (in libdispatch.dylib)
21 libdispatch.dylib 0x00000001d63f7ce4 __dispatch_workloop_worker_thread :692 (in libdispatch.dylib)
22 libsystem_pthread.dylib 0x00000001f9fee3b8 __pthread_wqthread :292 (in libsystem_pthread.dylib)
23 libsystem_pthread.dylib 0x00000001f9fed8c0 _start_wqthread :8 (in libsystem_pthread.dylib)
Replies: 2 · Boosts: 0 · Views: 717 · Activity: 2w
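Code 4097 in the report above is NSXPCConnectionInterrupted, and the backtrace shows setCategory: waiting synchronously on an XPC reply from the audio server. A common mitigation, sketched below, is to keep every AVAudioSession call on one background serial queue so a stalled reply cannot freeze the main thread; this works around the hang's impact rather than its cause.

import AVFoundation

// Serialize audio-session configuration off the main thread; a slow XPC
// round-trip then blocks only this queue, not the UI.
let sessionQueue = DispatchQueue(label: "audio.session.config")

func setPlaybackCategoryAsync(completion: @escaping (Error?) -> Void) {
    sessionQueue.async {
        do {
            try AVAudioSession.sharedInstance().setCategory(.playback)
            completion(nil)
        } catch {
            completion(error) // e.g. OSStatus 4097 when the server connection drops
        }
    }
}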
_MPRemoteCommandEventDispatch crashes on iOS 26.x devices.
I'm seeing crashes in _MPRemoteCommandEventDispatch on iOS 26.x devices in 3 apps. According to Bugsnag logs they are:

NSInternalInconsistencyException: event dispatch <_MPRemoteCommandEventDispatch: <MPRemoteCommandEvent: 0x11c049500 commandID=THV0 command=<MPRemoteCommand: 0x109ad1ea0 type=Play (0) enabled=YES handlers=[0x109b6a310]> sourceID=(null) ([HostedRoutingSessionDataSource] handleControlSendingCommand<2W5E>)> state:201> deallocated without calling continuation

I attached a log from the Xcode organizer matching the Bugsnag crash (mpr_remote_command_event.crash). When I set a breakpoint on -[_MPRemoteCommandEventDispatch dealloc], I can see it's hit every time I tap the play or pause button on the lock screen.

Thread 0 Crashed:
0 libsystem_kernel.dylib 0x00000002370420cc __pthread_kill + 8 (:-1)
1 libsystem_pthread.dylib 0x00000001e975c810 pthread_kill + 268 (pthread.c:1721)
2 libsystem_c.dylib 0x0000000198f8ff64 abort + 124 (abort.c:122)
3 libc++abi.dylib 0x000000018a7cf808 __abort_message + 132 (abort_message.cpp:66)
4 libc++abi.dylib 0x000000018a7be484 demangling_terminate_handler() + 304 (cxa_default_handlers.cpp:76)
5 libobjc.A.dylib 0x000000018a6cff78 _objc_terminate() + 156 (objc-exception.mm:496)
6 xxxxxxxxxxxxxx 0x00000001003a7db8 CPPExceptionTerminate() + 416 (BSG_KSCrashSentry_CPPException.mm:156)
7 libc++abi.dylib 0x000000018a7cebdc std::__terminate(void (*)()) + 16 (cxa_handlers.cpp:59)
8 libc++abi.dylib 0x000000018a7ceb80 std::terminate() + 108 (cxa_handlers.cpp:88)
9 CoreFoundation 0x000000018d7341c4 __CFRunLoopPerCalloutARPEnd + 256 (CFRunLoop.c:769)
10 CoreFoundation 0x000000018d70bb5c __CFRunLoopRun + 1976 (CFRunLoop.c:3179)
11 CoreFoundation 0x000000018d70aa6c _CFRunLoopRunSpecificWithOptions + 532 (CFRunLoop.c:3462)
12 GraphicsServices 0x000000022e31c498 GSEventRunModal + 120 (GSEvent.c:2049)
13 UIKitCore 0x00000001930ceba4 -[UIApplication _run] + 792 (UIApplication.m:3902)
14 UIKitCore 0x0000000193077a78 UIApplicationMain + 336 (UIApplication.m:5577)
15 xxxxxxxxxxxxxx 0x00000001000c0134 main + 308 (main.swift:15)
16 dyld 0x000000018a722e28 start + 7116 (dyldMain.cpp:1477)

Is the crash happening when the app is being terminated? Thank you!
Replies: 2 · Boosts: 2 · Views: 572 · Activity: 2w
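The exception text above ("deallocated without calling continuation") reads like a remote-command event whose handler never resolved. One thing worth auditing, sketched here, is that every MPRemoteCommandCenter target returns an MPRemoteCommandHandlerStatus on every code path; the player parameter is a stand-in for your own playback object.

import AVFoundation
import MediaPlayer

// Every handler must resolve the event with a status, including
// early-exit paths such as "nothing to play".
func registerRemoteCommands(player: AVPlayer) {
    let center = MPRemoteCommandCenter.shared()
    center.playCommand.addTarget { _ in
        guard player.currentItem != nil else { return .noActionableNowPlayingItem }
        player.play()
        return .success
    }
    center.pauseCommand.addTarget { _ in
        player.pause()
        return .success
    }
}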
AudioOutputUnitStart takes ~500 ms when using Push-to-Talk framework after beginTransmission
I'm working with the Push-to-Talk (PTT) framework and observing a consistent delay when starting audio capture.

Scenario: a PTT call is already active, the AVAudioSession is fully configured, I request beginTransmission on the PTT channel, and then I start my Audio Unit for recording (AudioOutputUnitStart).

Observed behavior: AudioOutputUnitStart takes ~500 ms. This happens whether I start the Audio Unit after didBeginTransmission or after AVAudioSession didActivate.

Comparison: using the same Audio Unit, same format, and same configuration without the PTT framework, AudioOutputUnitStart takes ~200 ms.

Additional notes: I am not modifying or reconfiguring AVAudioSession when requesting beginTransmission; the audio session is already set up when the PTT call starts; there are no interruptions or route changes at the time of starting the Audio Unit.

Impact: this extra latency is significant for Push-to-Talk use cases where fast transmit start is critical.
Replies: 1 · Boosts: 0 · Views: 269 · Activity: 2w
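A latency-shaving pattern relevant to the report above: pay AudioUnitInitialize once when the channel is joined, leaving only AudioOutputUnitStart on the transmit path. Whether this recovers the PTT-specific ~300 ms is untested; the sketch (class and method names illustrative) only moves the fixed setup cost out of the hot path.

import AudioToolbox

// Split setup (slow, once per channel join) from start (per transmission).
final class CaptureUnit {
    private var unit: AudioUnit?

    func prepare() { // call when joining the PTT channel
        var desc = AudioComponentDescription(
            componentType: kAudioUnitType_Output,
            componentSubType: kAudioUnitSubType_RemoteIO,
            componentManufacturer: kAudioUnitManufacturer_Apple,
            componentFlags: 0, componentFlagsMask: 0)
        guard let comp = AudioComponentFindNext(nil, &desc),
              AudioComponentInstanceNew(comp, &unit) == noErr,
              let u = unit else { return }
        var enable: UInt32 = 1 // enable input on bus 1
        AudioUnitSetProperty(u, kAudioOutputUnitProperty_EnableIO,
                             kAudioUnitScope_Input, 1, &enable,
                             UInt32(MemoryLayout<UInt32>.size))
        AudioUnitInitialize(u) // pay the heavy cost up front
    }

    func startTransmitting() { // call from didBeginTransmission
        if let u = unit { AudioOutputUnitStart(u) }
    }
}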
Audio DSP Processing Issue / Metallic Ringing Artifacts when recording acoustic instruments on iPhone 17 Pro Max
Description: I have identified a specific issue when recording acoustic guitar and other instruments on the iPhone 17 Pro Max using native applications (Voice Memos, Camera). The recordings contain an unnatural metallic resonance (ringing artifacts) that should not be present.

Testing and methodology: I initially suspected a hardware defect in the audio chip or microphone, but extensive testing with third-party software suggests this is a software-level issue. Using the AudioShare app in "Measurement Mode" (which bypasses standard iOS system-wide audio processing), the audio remains perfectly clean and the metallic ringing disappears entirely.

Conclusion: the issue is rooted in the DSP (digital signal processing) algorithms that iOS applies for noise suppression or voice enhancement. These algorithms appear to misinterpret the high-frequency overtones of acoustic instruments as background noise and attempt to filter them, resulting in audible digital artifacts.

Comparison results: this issue has not been observed on devices from other brands or on older iPhone models (preliminary tests suggest older models handle this better). Notably, the problem persists even in GarageBand, as the app still uses certain system-level processing layers.

Proposed solution: add a "Raw Audio" or "Instrument Mode" toggle within the microphone/audio settings for native apps that disables aggressive DSP processing, similar to how AVAudioSession.Mode.measurement works in specialized apps.

Attachments: I am attaching 4 archives, including a final "Measurement Mode" folder with comparative samples (Measurement Mode vs. Standard Mode). The artifacts are most prominent when monitored through headphones.
Replies: 0 · Boosts: 0 · Views: 109 · Activity: 2w
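For comparison with the report above, the measurement-mode configuration that bypasses most system voice processing in apps you control; this is standard AVAudioSession API, but it cannot change what Voice Memos or Camera do internally.

import AVFoundation

// Request minimal system processing on the input chain; the same knob
// "measurement mode" recording apps rely on.
func enableRawCapture() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord, mode: .measurement)
    try session.setActive(true)
}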
USB microphone input: Mac "Designed for iPad"
My app, natively iOS but built with the "Designed for iPad" option to run on Mac, does not recognise an attached USB microphone when running on a Mac. This line:

int32_t items = (int32_t)[[[AVAudioSession sharedInstance] availableInputs] count];

returns 1, which is the Mac internal mic. On iPad and iPhone it sees both the internal mic and the USB mic. Is this an inherent "Designed for iPad" restriction, and is there some trick I can pull to get the USB microphone recognised by the system?
Replies: 1 · Boosts: 0 · Views: 250 · Activity: 2w
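A cross-check for the question above: enumerating audio inputs through AVCaptureDevice rather than AVAudioSession, since the two can disagree for iPad apps running on a Mac. The .microphone and .external device types are iOS 17+; that a "Designed for iPad" build sees external devices through this path is an assumption to verify, not a documented guarantee.

import AVFoundation

// List the capture-capable audio inputs the system exposes to this process.
func listAudioCaptureDevices() {
    let discovery = AVCaptureDevice.DiscoverySession(
        deviceTypes: [.microphone, .external],
        mediaType: .audio,
        position: .unspecified)
    for device in discovery.devices {
        print(device.localizedName, device.deviceType.rawValue)
    }
}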
SpeechAnalyzer > AnalysisContext lack of documentation
I'm using the new SpeechAnalyzer framework to detect certain commands and want to improve accuracy by giving context. It seems like AnalysisContext is the solution for this, but I couldn't find any usage example, so I want to make sure whether I'm doing it right:

let context = AnalysisContext()
context.contextualStrings = [
    AnalysisContext.ContextualStringsTag("commands"): [
        "set speed level", "set jump level", "increase speed", "decrease speed", ...
    ],
    AnalysisContext.ContextualStringsTag("vocabulary"): [
        "speed", "jump", ...
    ]
]
try await analyzer.setContext(context)

With this implementation, it still gives outputs like "Set some speed level", "It's speed level", etc. Also, is it possible to make it expect a number after those commands, in order to eliminate results like "set some speed level to" (instead of "two")?
Replies: 1 · Boosts: 0 · Views: 452 · Activity: 3w
ApplicationMusicPlayer fails to play on macCatalyst 26.3 due to RemotePlayerService crash
I've filed this as FB21446798 but figured I'd post here too. In the first build of macOS 26.3, playback via ApplicationMusicPlayer is completely broken. When starting playback of anything at all, the console shows the following errors:

applicationController: xpc service connection interrupted
Failed to obtain remoteObject: Error Domain=NSCocoaErrorDomain Code=4099 "The connection to service created from an endpoint was invalidated from this process." UserInfo={NSDebugDescription=The connection to service created from an endpoint was invalidated from this process.}
Failed to prepareToPlay with error: Error Domain=MPMusicPlayerControllerErrorDomain Code=10 "(null)" UserInfo={NSUnderlyingError=0xc92910ff0 {Error Domain=NSCocoaErrorDomain Code=4099 "The connection to service created from an endpoint was invalidated from this process." UserInfo={NSDebugDescription=The connection to service created from an endpoint was invalidated from this process.}}}

In addition, several crash logs for RemotePlayerService are generated, showing my app as the parent process. This issue is 100% repeatable. No matter how I load the queue, whether it's catalog or library content, every variation I can think of fails like this. I really hope this can be fixed before 26.3 comes out, otherwise my app will be totally unusable. 😅
Replies: 2 · Boosts: 0 · Views: 600 · Activity: 3w
How to mark Audio Unit as dirty (needing to be saved)
I'm working on a v2 Audio Unit that has some complicated internal state (audio, MIDI, other settings). When the internal state changes, I want to inform the host (e.g., Logic Pro) that my plugin state has changed and that the main window should show the 'project changed' status through the window close button. This was easy to achieve for the VST version of the plugin, but I can't figure out a way to do it for the Audio Unit. I've tried:

Notifying a change of the kAudioUnitProperty_ClassInfo property that stores the plugin state:
unit->PropertyChanged(kAudioUnitProperty_ClassInfo, kAudioUnitScope_Global, 0);
Setting the kAudioUnitProperty_ClassInfo property value each time the plugin state changes.
Adding a new parameter called 'dirtystate', then toggling it and notifying the change each time the plugin state changes.

But nothing really makes Logic take notice. This should be an easy task, but I can't put my finger on it. How do I flag my AUv2 as needing its state saved (i.e., the host project needs saving)?
Replies: 0 · Boosts: 0 · Views: 143 · Activity: 3w
AppleAVBAudio assertion information
Hi, I'm currently developing an AVB hardware device, and I'm stuck because the Apple AVB stack is throwing errors without much information. Is there any way to get more information about these assertions and why they are happening? Furthermore, is there any documentation on the AppleAVBAudio module? It would be very handy. Here are the logs shown in the Console (filtering the log data using "process == "coreaudiod""):

Timestamp Thread Type Activity PID TTL
2025-12-05 15:44:27.087043+0100 0x15ae74 Default 0x0 12965 0 coreaudiod: (AppleAVBAudio) Assert: <private> (value 0x0 0), <private> file: <private>, line: 1533
2025-12-05 15:44:27.087545+0100 0x15ae74 Default 0x0 12965 0 coreaudiod: (AppleAVBAudio) Assert: <private> (value 0x0 0), <private> file: <private>, line: 1533
2025-12-05 15:44:27.088043+0100 0x15ae74 Default 0x0 12965 0 coreaudiod: (AppleAVBAudio) Assert: <private> (value 0x0 0), <private> file: <private>, line: 1533
(the identical assert repeats roughly every 0.5 ms through 15:44:27.094543+0100)
Replies: 3 · Boosts: 0 · Views: 264 · Activity: 3w
Dell monitor volume control issue on iMac via USB-C
I have a new Dell 2725QC monitor that connects to my iMac (2019, 27-inch) over USB-C through the back port, but the volume can currently only be controlled from the monitor's hardware buttons, not in software using the Apple keyboard. What should I do in terms of writing code to handle this (Swift or Obj-C)? Is there a third-party solution for Intel iMacs and ARM Macs?
Replies: 2 · Boosts: 0 · Views: 215 · Activity: 3w