Core Audio


Interact with the audio hardware of a device using Core Audio.

Core Audio Documentation

Posts under Core Audio tag

71 Posts
Post not yet marked as solved
0 Replies
779 Views
I'm trying to change the device of the inputNode of AVAudioEngine. To do so, I'm calling setDeviceID on its auAudioUnit. Although this call doesn't fail, something goes wrong with the output busses: when I ask for their format, it shows 0 Hz and 0 channels, and the app crashes when I try to connect the node to the mainMixerNode. Can anyone explain what's wrong with this code?

avEngine = AVAudioEngine()
print(avEngine.inputNode.auAudioUnit.inputBusses[0].format)
// <AVAudioFormat 0x1404b06e0: 2 ch, 44100 Hz, Float32, non-inter>
print(avEngine.inputNode.auAudioUnit.outputBusses[0].format)
// <AVAudioFormat 0x1404b0a60: 2 ch, 44100 Hz, Float32, inter>

// Now, let's change the device from the headphone's mic to the built-in mic.
try! avEngine.inputNode.auAudioUnit.setDeviceID(inputDevice.deviceID)

print(avEngine.inputNode.auAudioUnit.inputBusses[0].format)
// <AVAudioFormat 0x1404add50: 2 ch, 44100 Hz, Float32, non-inter>
print(avEngine.inputNode.auAudioUnit.outputBusses[0].format)
// <AVAudioFormat 0x1404adff0: 0 ch, 0 Hz, 'lpcm' (0x00000029) 32-bit little-endian float, deinterleaved> // !!!

// Interestingly, inputNode shows a different format than auAudioUnit
print(avEngine.inputNode.inputFormat(forBus: 0))
// <AVAudioFormat 0x1404af480: 1 ch, 44100 Hz, Float32>
print(avEngine.inputNode.outputFormat(forBus: 0))
// <AVAudioFormat 0x1404ade30: 1 ch, 44100 Hz, Float32>

Edit: Further debugging reveals another puzzling thing:

avEngine.inputNode.auAudioUnit == avEngine.outputNode.auAudioUnit // this is true ?!

inputNode and outputNode share the same AUAudioUnit, and its deviceID is by default set to the speakers. It's so confusing to me... why would inputNode's device be a speaker?
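For reference, a common workaround on macOS sidesteps setDeviceID entirely and changes the system default input device at the HAL level, which AVAudioEngine's inputNode follows. This is only a hedged sketch, not the poster's approach: `newDeviceID` is assumed to come from device enumeration elsewhere, and kAudioObjectPropertyElementMain requires macOS 12 (earlier systems use kAudioObjectPropertyElementMaster).

```swift
import CoreAudio

// Sketch: switch the system default input device instead of calling
// setDeviceID on the engine's shared I/O unit. `newDeviceID` is assumed
// to be a valid input AudioDeviceID obtained elsewhere.
func setDefaultInputDevice(_ newDeviceID: AudioDeviceID) -> OSStatus {
    var deviceID = newDeviceID
    var address = AudioObjectPropertyAddress(
        mSelector: kAudioHardwarePropertyDefaultInputDevice,
        mScope: kAudioObjectPropertyScopeGlobal,
        mElement: kAudioObjectPropertyElementMain)
    return AudioObjectSetPropertyData(
        AudioObjectID(kAudioObjectSystemObject), &address, 0, nil,
        UInt32(MemoryLayout<AudioDeviceID>.size), &deviceID)
}
```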
Post not yet marked as solved
0 Replies
469 Views
Hi, I'm attempting to call AudioComponentFindNext() from an iOS application (built with JUCE) to get a list of all available plugins. I've got an issue whereby the function is only returning the generic system plugins and missing any third-party installed plugins. This issue currently occurs when calling from within another AUv3 plugin, though I have also seen it from within a normal iOS app (run on an iPad Air 4); at the moment it is working fine from an iOS app. I've tried setting microphone access and inter-app audio capabilities, as I saw suggested on similar forum posts, though it has not solved my problem. Any advice would be very appreciated. Thanks
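For comparison, a minimal enumeration loop over AudioComponentFindNext looks roughly like the sketch below; the wildcard description and the printing are illustrative only.

```swift
import AudioToolbox

// Sketch: enumerate installed effect Audio Units. Zeroed description
// fields act as wildcards, so this walks every matching component.
var desc = AudioComponentDescription(
    componentType: kAudioUnitType_Effect,  // or 0 to match every type
    componentSubType: 0,
    componentManufacturer: 0,
    componentFlags: 0,
    componentFlagsMask: 0)

var component = AudioComponentFindNext(nil, &desc)
while let found = component {
    var name: Unmanaged<CFString>?
    if AudioComponentCopyName(found, &name) == noErr,
       let cfName = name?.takeRetainedValue() {
        print(cfName)
    }
    component = AudioComponentFindNext(found, &desc)
}
```

On iOS, third-party AUv3 plugins are app extensions, and AVAudioUnitComponentManager.shared().components(matching:) is the higher-level way to query the same registry; their visibility is sometimes reported to depend on the containing apps having been launched at least once.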
Post marked as solved
1 Replies
591 Views
We have an audio app that utilises a custom internal audio unit attached to AVAudioEngine to do DSP processing. Currently: MIDI arrives at the input port for the app (created with MIDIDestinationCreateWithProtocol). For MIDI 1 we use AUScheduleMIDIEventBlock to pass the events from the MIDI input to the audio unit. All works well for MIDI 1. So while we ponder on it ourselves, we have some questions to throw into the ether...

a) For MIDI 2 there appears to be no equivalent method to AUScheduleMIDIEventBlock to send UMP to an audio unit?
b) We initially chose the audio unit approach because MIDI and audio processing is all handled neatly, but is this approach essentially redundant? Would it be better to put a tap somewhere on the AVAudioEngine and pass MIDI 2 events directly from the input to the tap? I fear in that case synchronising MIDI to audio nicely would be a pain?
c) Perhaps we should wait until Apple implements a UMP version of AUScheduleMIDIEventBlock?
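For context, the MIDI 1.0 path described above can be sketched like this (names are illustrative; `audioUnit` is assumed to be the custom unit's AUAudioUnit):

```swift
import AVFoundation

// Sketch: forward MIDI 1.0 bytes from a CoreMIDI input callback to a
// custom AUAudioUnit via its scheduleMIDIEventBlock.
func forward(midiBytes: [UInt8], to audioUnit: AUAudioUnit) {
    guard let schedule = audioUnit.scheduleMIDIEventBlock else { return }
    midiBytes.withUnsafeBufferPointer { ptr in
        guard let base = ptr.baseAddress else { return }
        // AUEventSampleTimeImmediate = deliver as soon as possible; cable 0.
        schedule(AUEventSampleTimeImmediate, 0, ptr.count, base)
    }
}
```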
Post marked as solved
1 Replies
530 Views
I have just begun building plugins for Logic using the AUv3 format. After a few teething problems, to say the least, I have a basic plugin working (integrated with SwiftUI, which is handy), but the install and validation process is still buggy and bugging me! Does the standalone app which Xcode generates to create the plugin always have to be running separately from Logic for the AUv3 to be available? Is there no way to have it as a permanently available plugin without running that? If anyone has any links to a decent tutorial, please share; there are very few I can find on YouTube or anywhere else, and the Apple tutorials and examples aren't great.
Post marked as solved
1 Replies
604 Views
Hey folks, I've been able to build and run the 'StarterAudioUnitExample' project provided by Apple in Xcode 12.5.1, and run and load it in Logic 10.6.3 on macOS 11.5.2. However, when trying to recreate the same plugin from a blank project, I'm having trouble with auval or Logic actually instantiating and loading the component. See the auval output below:

validating Audio Unit Tremelo AUv2 by DAVE:
AU Validation Tool
Version: 1.8.0
Copyright 2003-2019, Apple Inc. All Rights Reserved.
Specify -h (-help) for command options

VALIDATING AUDIO UNIT: 'aufx' - 'trem' - 'DAVE'
Manufacturer String: DAVE
AudioUnit Name: Tremelo AUv2
Component Version: 1.0.0 (0x10000)
PASS

TESTING OPEN TIMES:
COLD:
FATAL ERROR: OpenAComponent: result: -1,0xFFFFFFFF
validation result: couldn’t be opened

Does anyone, hopefully someone from Apple, know what the error code <FATAL ERROR: OpenAComponent: result: -1,0xFFFFFFFF> actually refers to? I've been Googling for hours, and nothing I have found has worked for me so far. Also, here is the info.plist file too :

Anyone that could help steer me in the right direction? Thanks, Dave
Post not yet marked as solved
0 Replies
535 Views
My iOS app using CoreMIDI is able to receive MIDI messages from various keyboards, but I have just received a note from a customer notifying me that my app does not appear to receive MIDI messages from his Casio CT-S1 keyboard. I am stymied on how to diagnose and fix this issue. Clearly there must be something amiss in my CoreMIDI integration. I thought perhaps someone else might have encountered a similar odd situation such as this. If it helps, here is my CoreMIDI integration code. Thanks! Regards, Brad
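For reference, a minimal CoreMIDI receive setup (assuming the MIDI 1.0 protocol) looks roughly like the sketch below. One frequent cause of a "silent" keyboard is connecting only to the sources that existed at launch and never handling setup-changed notifications, so a device plugged in later is never connected.

```swift
import CoreMIDI

// Sketch: create a client and an input port, then connect every current source.
var client = MIDIClientRef()
MIDIClientCreateWithBlock("Example" as CFString, &client) { _ in
    // A real app should watch for kMIDIMsgObjectAdded / setup changes here
    // and reconnect sources so newly attached keyboards are picked up.
}

var port = MIDIPortRef()
MIDIInputPortCreateWithProtocol(client, "Input" as CFString, ._1_0, &port) { eventList, _ in
    // Called on a CoreMIDI thread for each incoming event list.
    print("received \(eventList.pointee.numPackets) packet(s)")
}

for i in 0..<MIDIGetNumberOfSources() {
    MIDIPortConnectSource(port, MIDIGetSource(i), nil)
}
```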
Post not yet marked as solved
1 Replies
441 Views
I have an AVComposition playing back via AVPlayer, where the AVComposition has multiple audio tracks with an audioMix applied. My question is: how can I compute audio meter values for the audio playing back through AVPlayer? Using MTAudioProcessingTap it seems you can only get a callback for one track at a time. But if that route has to be used, it's not clear how to get sample values of all the audio tracks at a given time in a single callback.
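Since a tap is per track, one hedged sketch is to attach a separate MTAudioProcessingTap to each track's AVMutableAudioMixInputParameters and merge the per-track meter values outside the taps; this is illustrative, not a confirmed answer to single-callback metering.

```swift
import AVFoundation
import MediaToolbox

// Sketch: one pass-through tap per audio track. A real implementation would
// compute peak/RMS from bufferListInOut and publish it per track.
func audioMix(for tracks: [AVAssetTrack]) -> AVMutableAudioMix {
    let mix = AVMutableAudioMix()
    mix.inputParameters = tracks.map { track in
        let params = AVMutableAudioMixInputParameters(track: track)
        var callbacks = MTAudioProcessingTapCallbacks(
            version: kMTAudioProcessingTapCallbacksVersion_0,
            clientInfo: nil,
            init: nil, finalize: nil, prepare: nil, unprepare: nil,
            process: { tap, numberFrames, _, bufferListInOut, numberFramesOut, flagsOut in
                // Pull this track's source audio; meter values would be
                // computed from bufferListInOut here.
                MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut,
                                                   flagsOut, nil, numberFramesOut)
            })
        var tap: Unmanaged<MTAudioProcessingTap>?
        if MTAudioProcessingTapCreate(kCFAllocatorDefault, &callbacks,
                                      kMTAudioProcessingTapCreationFlag_PostEffects,
                                      &tap) == noErr {
            params.audioTapProcessor = tap?.takeRetainedValue()
        }
        return params
    }
    return mix
}
```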
Post not yet marked as solved
0 Replies
356 Views
Hi! How would you synchronize bpm, pitch and playhead position on 10-20 different devices*, all on the same closed Ethernet network? (*Mac, iPad and iPhone.) A single device is the master. Required latency tolerance is in the sub-millisecond range, ideally sample-accurate sync. All the devices will play up to 64 channels of audio each, 48 kHz 24-bit WAV. I have considered two strategies:

1. Broadcast all user events from the master and replicate them on the slaves.
2. Broadcast a continuous stream from the master, comparing it on the slaves and slightly increasing/decreasing the corresponding parameter.

I have the feeling there are better options out there, as these are neither fail-safe nor very accurate. I have looked into the Ableton Link SDK, but it does not support position sync (only beat sync). All the best.
Post not yet marked as solved
0 Replies
399 Views
I’m using AVAudioEngine to get a stream of AVAudioPCMBuffers from the device’s microphone using the usual installTap(onBus:) setup. To distribute the audio stream to other parts of the program, I’m sending the buffers to a Combine publisher similar to the following: private let publisher = PassthroughSubject<AVAudioPCMBuffer, Never>() I’m starting to suspect I have some kind of concurrency or memory management issue with the buffers, because when consuming the buffers elsewhere I’m getting a range of crashes that suggest some internal pointer in a buffer is NULL (specifically, I’m seeing crashes in vDSP.convertElements(of:to:) when I try to read samples from the buffer). These crashes are in production and fairly rare — I can’t reproduce them locally. I never modify the audio buffers, only read them for analysis. My question is: should it be possible to put AVAudioPCMBuffers into a Combine pipeline? Does the AVAudioPCMBuffer class not retain/release the underlying AudioBufferList’s memory the way I’m assuming? Is this a fundamentally flawed approach?
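One hedged mitigation, assuming the engine recycles tap buffers after the tap block returns, is to deep-copy each buffer before publishing it:

```swift
import AVFoundation

// Sketch: clone an AVAudioPCMBuffer so downstream Combine subscribers never
// touch memory the engine may reuse once the tap block has returned.
func deepCopy(_ buffer: AVAudioPCMBuffer) -> AVAudioPCMBuffer? {
    guard let copy = AVAudioPCMBuffer(pcmFormat: buffer.format,
                                      frameCapacity: buffer.frameCapacity) else { return nil }
    copy.frameLength = buffer.frameLength
    // Copy the raw bytes of every channel buffer.
    let src = UnsafeMutableAudioBufferListPointer(buffer.mutableAudioBufferList)
    let dst = UnsafeMutableAudioBufferListPointer(copy.mutableAudioBufferList)
    for (s, d) in zip(src, dst) {
        memcpy(d.mData, s.mData, Int(min(s.mDataByteSize, d.mDataByteSize)))
    }
    return copy
}
```

The copy, rather than the original tap buffer, would then be sent through the PassthroughSubject.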
Post not yet marked as solved
0 Replies
432 Views
I'm trying to play audio content built from NSData inside a library (.a). It works properly when my code is inside an app, but it is not working when in a library: I get no error and no sound plays.

NSError *errorAudio = nil;
NSError *errorFile;

// Clear all cache
NSArray *tmpDirectory = [[NSFileManager defaultManager] contentsOfDirectoryAtPath:NSTemporaryDirectory() error:NULL];
for (NSString *file in tmpDirectory) {
    [[NSFileManager defaultManager] removeItemAtPath:[NSString stringWithFormat:@"%@%@", NSTemporaryDirectory(), file] error:NULL];
}

// Set temporary directory and temporary file
NSURL *tmpDirURL = [NSURL fileURLWithPath:NSTemporaryDirectory() isDirectory:YES];
NSURL *soundFileURL = [[tmpDirURL URLByAppendingPathComponent:@"temp"] URLByAppendingPathExtension:@"wav"];
[[NSFileManager defaultManager] createDirectoryAtURL:tmpDirURL withIntermediateDirectories:NO attributes:nil error:&errorFile];

// Write NSData to temporary file
NSString *path = [soundFileURL path];
[audioToPlay writeToFile:path options:NSDataWritingAtomic error:&errorFile];

if (errorFile) {
    // Error while writing NSData
} else {
    // Init audio player
    self.audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:soundFileURL error:&errorAudio];
    if (errorAudio) {
        // Audio player could not be initialized
    } else {
        // Audio player was initialized correctly
        [audioPlayer prepareToPlay];
        [audioPlayer stop];
        [audioPlayer setCurrentTime:0];
        [audioPlayer play];
    }
}

I don't check errorFile in this piece of code, but when debugging I can see its value is nil.

My header file:

#import <AudioToolbox/AudioToolbox.h>
#import <AVFoundation/AVFoundation.h>
@property(nonatomic, strong) AVAudioPlayer *audioPlayer;

My .m file:

#import <AudioToolbox/AudioToolbox.h>
#import <AVFoundation/AVFoundation.h>
@synthesize audioPlayer;

I've been checking dozens of posts but cannot find any solution; it always works properly in an app, but not in a library. Any help would be greatly appreciated.
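As a side note, AVAudioPlayer can also be initialized from the data directly, skipping the temporary file; and a player with no strong reference held is silently deallocated, a classic cause of "no error, no sound". A hedged Swift sketch of both points (names are illustrative):

```swift
import AVFoundation

final class Player {
    // Strong reference keeps the player alive for the duration of playback;
    // without it the player is deallocated and nothing is heard.
    private var audioPlayer: AVAudioPlayer?

    func play(_ audioData: Data) throws {
        // Initialize straight from the bytes instead of a temp .wav file.
        let player = try AVAudioPlayer(data: audioData)
        player.prepareToPlay()
        audioPlayer = player
        player.play()
    }
}
```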
Post not yet marked as solved
0 Replies
340 Views
I’m using AVFoundation for image capture using the camera on iPad, but I’m not using any video- or audio-related functionality. It looks like with AVFoundation, CoreMedia, CoreVideo and CoreAudio are also imported into any project. Is there any way I can remove these libraries (CoreMedia, CoreVideo and CoreAudio) from my app? I have used otool to list all the frameworks and libraries used by my framework.
Post not yet marked as solved
0 Replies
346 Views
I’m using AVFoundation to access the camera on iPad. But with AVFoundation, CoreMedia is also imported, which in turn imports CoreAudio and CoreVideo. Keeping privacy concerns in mind, is there any way I can ensure that the app is never able to access the microphone or video recording?
Post not yet marked as solved
0 Replies
403 Views
Hi, wondering if anyone has found a solution to the automatic volume reduction on the host computer when using the macOS native screen-share application. The volume reduction makes it nearly impossible to comfortably continue working on the host computer when there is any audio involved. Is there a way to bypass this function? It seems to be the same native function that FaceTime uses to reduce the system audio volume to give priority to the application. Please help save my speakers! Thanks.
Post not yet marked as solved
1 Replies
452 Views
Curious if there is a sound way for an AUv3 component to identify how many other instances of it are running on a device. For instance, if GarageBand has 4 tracks and all of the tracks use the same AUv3 component, is there a reliable way for each one to obtain a unique index value? Thanks!
Post not yet marked as solved
1 Replies
388 Views
Hi, I've released an open-source AUv3 MIDI processor plugin for iOS and macOS that records and plays MIDI messages in a sample-accurate fashion and never applies any quantization. I've tested this plugin with 120 beta testers and everything seemed to work fine. However, now that I've released it, there seems to be a problem in Logic Pro X on some Mac computers with MIDI FX processor plugins that are built with Catalyst. You can find my plugin here: http://uwyn.com/mtr/ ... and the source code here: https://github.com/gbevin/MIDITapeRecorder When I trace the AUv3 instantiation, I see Logic Pro X obtaining the internalRenderBlock several times, but then never calling it. This means there's no render callback and there are never any MIDI parameter events received. I've talked to the developer of ZOA, which is also a MIDI processor plugin using Catalyst, and he's running into exactly the same problem: https://www.audiosymmetric.com/zoa.html Another developer who's working on a MIDI processor plugin has also been trying to track this down for weeks. When I test this on my M1 Max MacBook Pro, internalRenderBlock is always called; however, on my M1 MacBook Air and Intel 2019 MacBook Pro, it is never called. Any thoughts or ideas to work around this would be really helpful. Thanks!
Post not yet marked as solved
1 Replies
343 Views
We're trying to join our audio worker threads to a CoreAudio HAL audio workgroup, but haven't managed to get this working yet. Here's what we do:

Fetch the audio workgroup handle from the CoreAudio device:

UInt32 Count = sizeof(os_workgroup_t);
os_workgroup_t pWorkgroup = NULL;
::AudioDeviceGetProperty(SomeCoreAudioDeviceHandle, kAudioUnitScope_Global, 0,
                         kAudioDevicePropertyIOThreadOSWorkgroup, &Count, &pWorkgroup);

This succeeds on an M1 Mini for the "Apple Inc.: Mac mini Speakers" on OSX 11.1. The returned handle looks fine as well: [(NSObject*)pWorkgroup debugDescription] returns "{xref = 2, ref = 1, name = AudioHALC Workgroup}".

Join some freshly created worker threads to the workgroup via:

os_workgroup_join_token_s JoinToken;
int Result = ::os_workgroup_join(pWorkgroup, &JoinToken);

The problem: Result from os_workgroup_join is always EINVAL (Invalid argument), whatever we do. Both arguments, the workgroup handle and the join token, are definitely valid. And the device hasn't been stopped or reinitialized here, so the workgroup should not be cancelled? Has anyone else managed to get this working? All examples out there seem to successfully use the AUHAL workgroup instead of the audio device HAL API.
Post not yet marked as solved
3 Replies
854 Views
Hello, my app crashed on the new macOS 12.x system; it works well on macOS 11 Big Sur. I'm developing an audio app on macOS using AudioUnit, and it sometimes crashes when I switch devices. The relevant API is:

AudioUnitSetProperty(audio_unit, kAudioOutputUnitProperty_CurrentDevice,
                     kAudioUnitScope_Global, kAudioUnitOutputBus,
                     &rnd_id, sizeof(rnd_id));

This has troubled me for a month; I can't find the reason or any useful info. Any help will be appreciated. The crash log is:

OS Version:         macOS 12.1 (21C51)
Report Version:     12
Bridge OS Version:  6.1 (19P647)
Crashed Thread:     43 schedule-thread
Exception Type:     EXC_BAD_ACCESS (SIGSEGV)
Exception Codes:    KERN_INVALID_ADDRESS at 0x00000a14f8969188
Exception Codes:    0x0000000000000001, 0x00000a14f8969188
Exception Note:     EXC_CORPSE_NOTIFY

Application Specific Information:
objc_msgSend() selector name: copy

Thread 43 Crashed:: schedule-thread
0  libobjc.A.dylib  0x7ff815ef405d objc_msgSend + 29
1  CoreAudio        0x7ff817a237b9 HALC_ShellDevice::_GetPropertyData(unsigned int, AudioObjectPropertyAddress const&, unsigned int, void const*, unsigned int, unsigned int&, void*, unsigned int&, AudioObjectPropertyAddress&, bool&) const + 1133
2  CoreAudio        0x7ff817c57b81 invocation function for block in HALC_ShellObject::GetPropertyData(unsigned int, AudioObjectPropertyAddress const&, unsigned int, void const*, unsigned int, unsigned int&, void*) const + 107
3  CoreAudio        0x7ff817e8a606 HALB_CommandGate::ExecuteCommand(void () block_pointer) const + 98
4  CoreAudio        0x7ff817c56a98 HALC_ShellObject::GetPropertyData(unsigned int, AudioObjectPropertyAddress const&, unsigned int, void const*, unsigned int, unsigned int&, void*) const + 376
5  CoreAudio        0x7ff817b04235 HAL_HardwarePlugIn_ObjectGetPropertyData(AudioHardwarePlugInInterface**, unsigned int, AudioObjectPropertyAddress const*, unsigned int, void const*, unsigned int*, void*) + 349
6  CoreAudio        0x7ff817c16109 HALPlugIn::ObjectGetPropertyData(HALObject const&, AudioObjectPropertyAddress const&, unsigned int, void const*, unsigned int&, void*) const + 59
7  CoreAudio        0x7ff817bd2f5d HALObject::GetPropertyData(AudioObjectPropertyAddress const&, unsigned int, void const*, unsigned int&, void*) const + 461
8  CoreAudio        0x7ff817f2ffca HALDevice::GetPropertyData(AudioObjectPropertyAddress const&, unsigned int, void const*, unsigned int&, void*) const + 644
9  CoreAudio        0x7ff8179809ab AudioObjectGetPropertyData + 275
10 AudioDSP         0x134b6df19 0x13490b000 + 2502425
11 AudioDSP         0x134b69776 0x13490b000 + 2484086
12 AudioDSP         0x134cc4eb4 0x13490b000 + 3907252
13 AudioDSP         0x134cc5f56 0x13490b000 + 3911510
14 AudioDSP         0x134e17b2c 0x13490b000 + 5294892
15 AudioDSP         0x134e0f4d1 0x13490b000 + 5260497
Post not yet marked as solved
0 Replies
367 Views
Hi, I have a problem with an AU host (based on Audio Toolbox/Core Audio, not AVFoundation) when running on macOS 11 or later on Apple Silicon: it crashes after some operations in the GUI. The weird thing is, it crashes in the IOThread. Could this be caused by some inappropriate operation in the GUI (e.g. outside the main thread) that affects the IOThread? Sounds quite improbable to me, and I did not find anything suspicious in the code. There are two logs in the debugger: [AUHostingService Client] connection interrupted. rt_sender::signal_wait failed: 89 ... And here is the crash log: Crash log: ... Thanks, Tomas
Post not yet marked as solved
0 Replies
268 Views
let volumePropertyAddress = AudioObjectPropertyAddress(
    mSelector: kAudioHardwareServiceDeviceProperty_VirtualMainVolume,
    mScope: kAudioDevicePropertyScopeOutput,
    mElement: kAudioObjectPropertyElementMaster
)

let status = AudioObjectSetPropertyData(deviceId, &theAddress, 0, nil, size, &theValue)

Then the app freezes. Is it not possible to call the AudioObjectSetPropertyData method on the main thread?
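For comparison, here is a self-contained hedged sketch of setting the default output device's virtual main volume. Note it passes the same address variable that was declared (the snippet above declares volumePropertyAddress but passes theAddress), and kAudioObjectPropertyElementMain replaces ...ElementMaster on macOS 12 and later.

```swift
import CoreAudio
import AudioToolbox

// Fetch the default output device.
var defaultOutAddress = AudioObjectPropertyAddress(
    mSelector: kAudioHardwarePropertyDefaultOutputDevice,
    mScope: kAudioObjectPropertyScopeGlobal,
    mElement: kAudioObjectPropertyElementMain)
var deviceId = AudioObjectID(0)
var size = UInt32(MemoryLayout<AudioObjectID>.size)
AudioObjectGetPropertyData(AudioObjectID(kAudioObjectSystemObject),
                           &defaultOutAddress, 0, nil, &size, &deviceId)

// Set its virtual main output volume to 50%.
var volume: Float32 = 0.5
var volumeAddress = AudioObjectPropertyAddress(
    mSelector: kAudioHardwareServiceDeviceProperty_VirtualMainVolume,
    mScope: kAudioDevicePropertyScopeOutput,
    mElement: kAudioObjectPropertyElementMain)
let status = AudioObjectSetPropertyData(
    deviceId, &volumeAddress, 0, nil,
    UInt32(MemoryLayout<Float32>.size), &volume)
```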
Post not yet marked as solved
2 Replies
345 Views
The very little and outdated 'documentation' shared by Apple about CoreAudio and CoreMIDI server plugins suggested using syslog for logging. At least since Big Sur, syslog output doesn't end up anywhere. (So, while you seem to think it's OK not to document your APIs, you could at least remove APIs that no longer work! Not doing so causes unnecessary and frustrating bug hunting.) Should we replace syslog with unified logging? For debugging purposes only, our plugins write to our own log files. Where can I find suitable locations? Where is this documented? Thanks, hagen.
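If unified logging is the replacement, a minimal sketch (the subsystem and category names are placeholders) would be:

```swift
import os

// Unified-logging sketch for a CoreAudio/CoreMIDI server plugin.
let log = Logger(subsystem: "com.example.plugin", category: "driver")

func report(_ status: Int32) {
    // .public makes the value visible instead of the redacted default.
    log.error("device init failed: \(status, privacy: .public)")
}
```

Messages can then be followed live with `log stream --predicate 'subsystem == "com.example.plugin"'` or inspected in Console.app.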