Core Audio


Interact with the audio hardware of a device using Core Audio.

Core Audio Documentation

Posts under Core Audio tag

54 Posts
Post not yet marked as solved
0 Replies
669 Views
Hi! I am working on an audio application on iOS. This is how I retrieve the workgroup from the RemoteIO audio unit (ioUnit). The unit is initialized and working fine (meaning that it is regularly called by the system).

os_workgroup_t os_workgroup{nullptr};
uint32_t os_workgroup_index_size;
if (OSStatus status = AudioUnitGetProperty(ioUnit, kAudioOutputUnitProperty_OSWorkgroup,
                                           kAudioUnitScope_Global, 0,
                                           &os_workgroup, &os_workgroup_index_size);
    status != noErr) {
    throw runtime_error("AudioUnitGetProperty kAudioOutputUnitProperty_OSWorkgroup - Failed with OSStatus: " + to_string(status));
}

However, the resulting os_workgroup's value is 0x40, which does not seem correct. Unsurprisingly, I also cannot join any other real-time threads to the workgroup. The returned status, however, is a solid 0. Can anyone help?
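For comparison, here is a minimal Swift sketch of the same query, assuming the property can be read into an optional os_workgroup_t from Swift. One detail worth noting: the ioDataSize argument is an in/out parameter and must be set to the size of the destination buffer before the call. The names ioUnit and fetchWorkgroup are illustrative, not from the post above.

import AudioToolbox

// Hypothetical helper; "ioUnit" stands in for an already-initialized RemoteIO unit.
func fetchWorkgroup(from ioUnit: AudioUnit) -> os_workgroup_t? {
    var workgroup: os_workgroup_t?
    // In/out size: initialize it to the size of the destination before the call.
    var size = UInt32(MemoryLayout<os_workgroup_t?>.size)
    let status = AudioUnitGetProperty(ioUnit,
                                      kAudioOutputUnitProperty_OSWorkgroup,
                                      kAudioUnitScope_Global,
                                      0,
                                      &workgroup,
                                      &size)
    return status == noErr ? workgroup : nil
}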
Posted Last updated
.
Post not yet marked as solved
30 Replies
26k Views
I'm very excited about the new AirTag product and am wondering if there will be any new APIs introduced in iOS 14.5+ to allow developers to build apps around them outside the context of the Find My network? The contexts in which I am most excited about using AirTags are:
Gaming
Health / fitness-focused apps
Accessibility features
Musical and other creative interactions within apps
I haven't been able to find any mention of APIs. Thanks in advance for any information that is shared here. Alexander
Posted
by alexander.
Last updated
.
Post not yet marked as solved
4 Replies
2.1k Views
I'm using a VoiceProcessingIO audio unit in my VoIP application on Mac. The problem is, at least since Mojave, AudioComponentInstanceNew blocks for at least 2 seconds. Profiling shows that internally it's waiting on some mutex and then on some message queue. My code to initialize the audio unit is as follows:

OSStatus status;
AudioComponentDescription desc;
AudioComponent inputComponent;
desc.componentType = kAudioUnitType_Output;
desc.componentSubType = kAudioUnitSubType_VoiceProcessingIO;
desc.componentFlags = 0;
desc.componentFlagsMask = 0;
desc.componentManufacturer = kAudioUnitManufacturer_Apple;
inputComponent = AudioComponentFindNext(NULL, &desc);
status = AudioComponentInstanceNew(inputComponent, &unit);

Here's a profiler screenshot showing the two system calls in question. So, is this a bug or intended behavior?
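One way to keep that delay off the calling thread is the asynchronous instantiation API. The Swift sketch below is only a workaround idea, not a fix for the underlying wait: the completion handler runs once the system has finished creating the instance.

import AudioToolbox

var desc = AudioComponentDescription(componentType: kAudioUnitType_Output,
                                     componentSubType: kAudioUnitSubType_VoiceProcessingIO,
                                     componentManufacturer: kAudioUnitManufacturer_Apple,
                                     componentFlags: 0,
                                     componentFlagsMask: 0)

if let component = AudioComponentFindNext(nil, &desc) {
    // Instantiates asynchronously, so the multi-second wait does not block the caller.
    AudioComponentInstantiate(component, []) { instance, status in
        guard status == noErr, let unit = instance else { return }
        // Configure stream formats / callbacks on `unit` here, then:
        AudioUnitInitialize(unit)
    }
}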
Posted
by Grishka.
Last updated
.
Post not yet marked as solved
0 Replies
650 Views
I'd like to know whether NullAudio.c is an official SDK sample or not, and why its enums and UIDs are defined in NullAudio.c rather than in SDK header files. I am trying to use kObjectID_Mute_Output_Master, but it is defined with a different value in each third-party plugin:
kObjectID_Mute_Output_Master = 10 // NullAudio.c
kObjectID_Mute_Output_Master = 9 // https://github.com/ExistentialAudio/BlackHole
kObjectID_Mute_Output_Master = 6 // https://github.com/q-p/SoundPusher
I can build BlackHole and SoundPusher, and these plugins work. In my view this enum should be defined in an SDK header and keep the same value across SDK versions, so I'd like to know why each third party defines a different value. If you know the history of NullAudio.c, please let me know.
Posted
by Himadeus.
Last updated
.
Post not yet marked as solved
0 Replies
871 Views
We are developing an app that uses external hardware to measure analogue hearing-loop performance. It uses the audio jack on the phone/iPad. With the new iPad hardware using USB-C, we have noticed that the same input produces very different levels depending on the adapter: the USB-C path is ~23 dB lower than the Lightning path, with the same code and settings. That's more than a 10x difference in amplitude. Is there any way to control the USB-C adapter? Am I missing something? The code simply uses AVAudioInputNode with a block attached to it via self.inputNode.installTap. We do set the gain to 1.0:

let gain: Float = 1.0
try session.setInputGain(gain)

But that still does not help. I wish there were an Apple lab I could go to, to speak to engineers about it.
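For what it's worth, input gain is only adjustable on some routes, so it may be worth checking isInputGainSettable before calling setInputGain and reading back what the session actually reports for the current route. A minimal sketch, not a confirmed fix for the level difference:

import AVFoundation

let session = AVAudioSession.sharedInstance()
do {
    try session.setCategory(.playAndRecord)
    try session.setActive(true)
    // Gain is not settable on every input route; many external (USB) inputs ignore it.
    if session.isInputGainSettable {
        try session.setInputGain(1.0)
    }
    print("route:", session.currentRoute.inputs.map { $0.portType.rawValue },
          "gain:", session.inputGain)
} catch {
    print("session configuration failed:", error)
}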
Posted
by greggj.
Last updated
.
Post not yet marked as solved
0 Replies
591 Views
A: iPhone SE (2nd generation, iOS 16.5), using a Bluetooth headset (Shokz OpenRun S803). B: any mobile device. A uses the Bluetooth microphone/speaker and makes a call to B using our iPhone app. A mutes the headset (the Bluetooth device supports muting in hardware). While A is muted, B speaks. A then unmutes the headset. From then on, every time B speaks, B can hear an echo. Since there is no audio data while the hardware is muted, VPIO receives no reference audio data to remove the echo signal. Is there any alternative to resolve this echo in VoIP software using VPIO?
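One approach sometimes used is to mute in software inside the voice-processing unit rather than in the headset hardware, so the unit keeps receiving microphone data and its echo canceller keeps adapting. Below is a Swift sketch using kAUVoiceIOProperty_MuteOutput, on the assumption (mine, not the post's) that it mutes the processed uplink signal; the element to pass may differ in your setup.

import AudioToolbox

// "vpioUnit" stands in for an initialized kAudioUnitSubType_VoiceProcessingIO instance.
func setMicrophoneMuted(_ muted: Bool, on vpioUnit: AudioUnit) -> OSStatus {
    var mute: UInt32 = muted ? 1 : 0
    return AudioUnitSetProperty(vpioUnit,
                                kAUVoiceIOProperty_MuteOutput,
                                kAudioUnitScope_Global,
                                1, // input element, per my reading of the VoiceProcessingIO examples
                                &mute,
                                UInt32(MemoryLayout<UInt32>.size))
}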
Posted
by ened.
Last updated
.
Post not yet marked as solved
2 Replies
5.0k Views
Hi! I am trying to develop an application that uses Core MIDI to establish a connection to my MacBook via Audio MIDI Setup. I have created a client within my application, and it shows up in the directory in Audio MIDI Setup on my MacBook. Now I am stuck trying to figure out how to send MIDI from my app to my computer. I have tried MIDISend, and using CreateOutputPort. Both are successful in the sense that I don't get zeros when printing to the console, but nothing changes in the DAW when I set the controller and number values to the exact numbers I created in my app. I have a feeling that I am missing a network connection within my app somehow so that it recognizes my computer as a source, but I have not yet found an effective method to do this. Any information as to how I get MIDI to send from my app to my DAW on my computer would be greatly appreciated! I am trying to make this for my final project in one of my coding classes. Thanks! -GH
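If the app publishes a virtual source (rather than connecting an output port to a specific destination endpoint), the usual pattern is to transmit through that source with MIDIReceived; MIDISend is for output ports aimed at a concrete destination. A hedged Swift sketch of the virtual-source route (names are illustrative):

import CoreMIDI

var client = MIDIClientRef()
MIDIClientCreate("MyApp" as CFString, nil, nil, &client)

// A virtual source that other apps (e.g. the DAW) can choose as an input.
var virtualSource = MIDIEndpointRef()
MIDISourceCreate(client, "MyApp Output" as CFString, &virtualSource)

// Build a single control-change message (controller 7, value 100, channel 1).
var packetList = MIDIPacketList()
let packet = MIDIPacketListInit(&packetList)
let cc: [UInt8] = [0xB0, 7, 100]
_ = MIDIPacketListAdd(&packetList, MemoryLayout<MIDIPacketList>.size, packet, 0, cc.count, cc)

// A virtual source transmits with MIDIReceived; MIDISend is for an output port
// that is connected to a specific destination endpoint.
let status = MIDIReceived(virtualSource, &packetList)
print("MIDIReceived status:", status)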
Posted Last updated
.
Post not yet marked as solved
0 Replies
550 Views
I am currently working on a project that involves real-time audio processing in my iOS/macOS application. I have been exploring the Audio Unit Hosting API and specifically the AUHAL units for handling audio input and output. My goal is to establish a direct connection between an input AUHAL unit and an output AUHAL unit to achieve seamless real-time audio processing. I've been researching and experimenting with the API, but I haven't been able to find a clear solution or documentation regarding this specific scenario. Has anyone attempted such a configuration or encountered similar requirements? I would greatly appreciate any insights, suggestions, or pointers to relevant documentation that could help me achieve this direct connection between the input and output AUHAL units. Thank you in advance for your time and assistance. Best regards, Yosemite
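In case a starting point helps: my understanding is that when input and output are on the same device, a single AUHAL instance can do both, with input enabled on element 1 and output on element 0, and the captured samples pulled via AudioUnitRender from the output render callback; separate devices generally need two AUHAL instances (or an aggregate device) bridged by a ring buffer. A minimal Swift sketch of the property setup, under those assumptions:

import AudioToolbox

var desc = AudioComponentDescription(componentType: kAudioUnitType_Output,
                                     componentSubType: kAudioUnitSubType_HALOutput,
                                     componentManufacturer: kAudioUnitManufacturer_Apple,
                                     componentFlags: 0,
                                     componentFlagsMask: 0)

guard let component = AudioComponentFindNext(nil, &desc) else { fatalError("AUHAL not found") }
var maybeUnit: AudioUnit?
AudioComponentInstanceNew(component, &maybeUnit)
guard let auhal = maybeUnit else { fatalError("could not create AUHAL") }

var enable: UInt32 = 1
let size = UInt32(MemoryLayout<UInt32>.size)
// Element 1 = input side, element 0 = output side of the AUHAL.
AudioUnitSetProperty(auhal, kAudioOutputUnitProperty_EnableIO,
                     kAudioUnitScope_Input, 1, &enable, size)
AudioUnitSetProperty(auhal, kAudioOutputUnitProperty_EnableIO,
                     kAudioUnitScope_Output, 0, &enable, size)
// From here: set the current device, install a render callback on element 0,
// and inside it call AudioUnitRender(auhal, ..., 1, ...) to fetch the captured input.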
Posted
by yosemite.
Last updated
.
Post not yet marked as solved
7 Replies
12k Views
Hi, we recently started using AVAudioEngine in our app, and are receiving reports of crashes in the wild, specifically:

com.apple.coreaudio.avfaudio required condition is false: hwFormat

These crashes are occurring on all iOS versions, including the latest (10.0.2 14A456). The crashes are always on a background thread. Here is one example stack trace:

(CoreFoundation + 0x0012f1c0) __exceptionPreprocess
(libobjc.A.dylib + 0x00008558) objc_exception_throw
(CoreFoundation + 0x0012f090) +[NSException raise:format:arguments:]
(AVFAudio + 0x00016788) AVAE_RaiseException(NSString*, ...)
(AVFAudio + 0x0008f168) AVAudioIOUnit::_GetHWFormat(unsigned int, unsigned int*)
(AVFAudio + 0x0008ee64) ___ZN13AVAudioIOUnit22IOUnitPropertyListenerEPvP28OpaqueAudioComponentInstancejjj_block_invoke_2
(libdispatch.dylib + 0x000011fc) _dispatch_call_block_and_release
(libdispatch.dylib + 0x000011bc) _dispatch_client_callout
(libdispatch.dylib + 0x0000f440) _dispatch_queue_serial_drain
(libdispatch.dylib + 0x000049a4) _dispatch_queue_invoke
(libdispatch.dylib + 0x00011388) _dispatch_root_queue_drain
(libdispatch.dylib + 0x000110e8) _dispatch_worker_thread3
(libsystem_pthread.dylib + 0x000012c4) _pthread_wqthread
(libsystem_pthread.dylib + 0x00000db0) start_wqthread

When this crash occurs, the main thread is generally responding to an audio route change. Here is one example stack trace:

(libsystem_kernel.dylib + 0x0000116c) mach_msg_trap
(libsystem_kernel.dylib + 0x00000fd8) mach_msg
(libdispatch.dylib + 0x0001cac0) _dispatch_mach_msg_send
(libdispatch.dylib + 0x0001c214) _dispatch_mach_send_drain
(libdispatch.dylib + 0x0001d414) _dispatch_mach_send_push_and_trydrain
(libdispatch.dylib + 0x000174a8) _dispatch_mach_send_msg
(libdispatch.dylib + 0x000175cc) dispatch_mach_send_with_result
(libxpc.dylib + 0x00002c80) _xpc_connection_enqueue
(libxpc.dylib + 0x00003cf0) xpc_connection_send_message_with_reply
(MediaRemote + 0x00011edc) MRMediaRemoteServiceGetPickedRouteHasVolumeControl
(MediaPlayer + 0x0009d57c) -[MPAVRoutingController _pickableRoutesDidChangeNotification:]
(CoreFoundation + 0x000c9228) __CFNOTIFICATIONCENTER_IS_CALLING_OUT_TO_AN_OBSERVER__
(CoreFoundation + 0x000c892c) _CFXRegistrationPost
(CoreFoundation + 0x000c86a8) ___CFXNotificationPost_block_invoke
(CoreFoundation + 0x00137b98) -[_CFXNotificationRegistrar find:object:observer:enumerator:]
(CoreFoundation + 0x0000abf0) _CFXNotificationPost
(Foundation + 0x000066b8) -[NSNotificationCenter postNotificationName:object:userInfo:]
(MediaServices + 0x00003100) -[MSVDistributedNotificationObserver _handleDistributedNotificationWithNotifyToken:]
(MediaServices + 0x00002f10) __78-[MSVDistributedNotificationObserver initWithDistributedName:localName:queue:]_block_invoke
(libsystem_notify.dylib + 0x00009ea4) ___notify_dispatch_local_notification_block_invoke
(libdispatch.dylib + 0x000011fc) _dispatch_call_block_and_release
(libdispatch.dylib + 0x000011bc) _dispatch_client_callout
(libdispatch.dylib + 0x00005d68) _dispatch_main_queue_callback_4CF
(CoreFoundation + 0x000dcf28) __CFRUNLOOP_IS_SERVICING_THE_MAIN_DISPATCH_QUEUE__
(CoreFoundation + 0x000dab14) __CFRunLoopRun
(CoreFoundation + 0x00009044) CFRunLoopRunSpecific
(GraphicsServices + 0x0000c194) GSEventRunModal
(UIKit + 0x0007b624) -[UIApplication _run]
(UIKit + 0x0007635c) UIApplicationMain

This looks like the same issue as https://forums.developer.apple.com/message/36184#36184

One potential complication is that we are calling AVAudioEngine methods off of the main thread. My belief is that this is safe - but I can't find any official reference to confirm that it is. We find that AVAudioEngine method calls can block, which is why we moved the work off the main thread. We are listening for audio engine configuration change notifications and handling them similarly to the AVAudioEngine sample code. I have attempted to reproduce this issue locally by performing various actions in combination (receiving phone calls, suspending the app, plugging in or unplugging headphones, etc.) with no luck. Any thoughts on what conditions might be triggering this exception? Hopefully, I can at least narrow down a set of conditions to allow me to reproduce the crash in a controlled environment. Thanks, Rob
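A sketch of the configuration-change handling mentioned above: observe AVAudioEngineConfigurationChange, and funnel all engine calls (including the restart) through one serial queue. This is illustrative only, not a confirmed fix for the hwFormat exception.

import AVFoundation

final class EngineController {
    private let engine = AVAudioEngine()
    private let engineQueue = DispatchQueue(label: "audio.engine") // all engine calls go through here
    private var configObserver: NSObjectProtocol?

    init() {
        configObserver = NotificationCenter.default.addObserver(
            forName: .AVAudioEngineConfigurationChange,
            object: engine,
            queue: nil) { [weak self] _ in
                // The engine stops when the I/O configuration changes (e.g. a route change);
                // rebuild connections and restart on the serial queue that owns all engine calls.
                self?.engineQueue.async { self?.restart() }
        }
    }

    private func restart() {
        do {
            try engine.start()
        } catch {
            print("engine restart failed:", error)
        }
    }
}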
Posted
by robgaunt.
Last updated
.
Post not yet marked as solved
1 Reply
2k Views
Hi, wondering if anyone has found a solution to the automatic volume reduction on the host computer when using the native macOS Screen Sharing application. The volume reduction makes it nearly impossible to comfortably continue working on the host computer when there is any audio involved. Is there a way to bypass this function? It seems to be the same native function that FaceTime uses to reduce the system audio volume to give priority to the application. Please help save my speakers! Thanks.
Posted Last updated
.
Post not yet marked as solved
4 Replies
2.0k Views
Hello, I am having problems getting my AUv3 instrument with an inputBus to work. As a standalone app (with the SimplePlayEngine of the sample code integrated) it seems to work fine, and the plugin also passes the auval test without errors. But when I try to use the plugin in a host application (like GarageBand / Logic / the host of the sample code) I can't get any output; the internalRenderBlock is not even being called. I narrowed it down to the inputBusses property, so it seems that I am doing something wrong with setting up the input bus.

To reproduce, take the InstrumentDemo of the Apple sample code, and in the init method initialize an inputBusBuffer and create an inputBusArray with the bus of the inputBusBuffer. Set the inputBusArray as the return value for the inputBusses property, and allocateRenderResources of the inputBusBuffer in the allocateRenderResourcesAndReturnError call (and deallocateRenderResources in the deallocateRenderResources call). All of this is done analogously to the inputBus setup in the FilterDemo example. I also explicitly set the channelCapabilities to Stereo In, Stereo Out.

Omitting the further processing in the internalRenderBlock, shouldn't this work to the point that internalRenderBlock is getting called? It is getting called in the app, and auval validation succeeds, but it is not being called in any host. Am I missing something here? Any help will be much appreciated!
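For reference, a stripped-down Swift sketch of the bus setup described above (the class name and format are illustrative, and the sample code's render-resource handling is omitted):

import AVFoundation
import AudioToolbox

class InstrumentWithInput: AUAudioUnit {
    private var inputBusArray: AUAudioUnitBusArray!
    private var outputBusArray: AUAudioUnitBusArray!

    override init(componentDescription: AudioComponentDescription,
                  options: AudioComponentInstantiationOptions = []) throws {
        try super.init(componentDescription: componentDescription, options: options)
        let format = AVAudioFormat(standardFormatWithSampleRate: 44_100, channels: 2)!
        // One input bus and one output bus, analogous to the FilterDemo-style setup.
        inputBusArray = AUAudioUnitBusArray(audioUnit: self, busType: .input,
                                            busses: [try AUAudioUnitBus(format: format)])
        outputBusArray = AUAudioUnitBusArray(audioUnit: self, busType: .output,
                                             busses: [try AUAudioUnitBus(format: format)])
    }

    override var inputBusses: AUAudioUnitBusArray { inputBusArray }
    override var outputBusses: AUAudioUnitBusArray { outputBusArray }
}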
Posted
by jfjs.
Last updated
.
Post not yet marked as solved
1 Reply
1.3k Views
In the AudioBufferList extension, there is a comment above the allocate function:

/// The memory should be freed with `free()`.
public static func allocate(maximumBuffers: Int) -> UnsafeMutableAudioBufferListPointer

But when I try to call free on the returned pointer,

free(buffer)

Xcode complains: Cannot convert value of type 'UnsafeMutableAudioBufferListPointer' to expected argument type 'UnsafeMutableRawPointer?'. How should the pointer be freed? I tried

free(&buffer)

Xcode didn't complain, but when I ran the code, I got an error in the console:

malloc: *** error for object 0x16fdfee70: pointer being freed was not allocated

I know the call to allocate was successful. Thanks, Mark
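If it helps, UnsafeMutableAudioBufferListPointer is a wrapper struct; my understanding is that free() wants the underlying AudioBufferList allocation, which the wrapper exposes as unsafeMutablePointer. A small sketch:

import AudioToolbox
import Foundation

let buffer = AudioBufferList.allocate(maximumBuffers: 2)
// ... fill and use buffer[0], buffer[1] ...

// Pass the raw AudioBufferList pointer, not the wrapper struct (and not &buffer,
// which is the address of the local wrapper on the stack).
free(buffer.unsafeMutablePointer)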
Posted Last updated
.