Core Audio Kit


Add user interfaces to Cocoa audio units using Core Audio Kit.


Posts under Core Audio Kit tag

14 results found
Post not yet marked as solved
196 Views

Issue finding available AUv3 plugins on iOS

Hi, I'm attempting to call AudioComponentFindNext() from an iOS application (built with JUCE) to get a list of all available plugins. I've got an issue whereby the function returns only the generic system plugins and misses any third-party installed plugins. The issue currently occurs when calling from within another AUv3 plugin, though I have also seen it from within a normal iOS app (run on an iPad Air 4); at the moment it is working fine from an iOS app. I've tried setting the microphone access and inter-app audio capabilities, as suggested in similar forum posts, but that has not solved my problem. Any advice would be much appreciated. Thanks
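A minimal sketch of the enumeration described above, written in plain Swift rather than JUCE and assuming an ordinary iOS app target; the wildcard AudioComponentDescription and the name-printing loop are illustrative only. (AVAudioUnitComponentManager is a higher-level alternative for listing installed audio unit components.)

    import AudioToolbox

    // Walk the component registry with AudioComponentFindNext, using a wildcard
    // description (zero in every field matches any component type).
    var wildcard = AudioComponentDescription(componentType: 0,
                                             componentSubType: 0,
                                             componentManufacturer: 0,
                                             componentFlags: 0,
                                             componentFlagsMask: 0)

    var component = AudioComponentFindNext(nil, &wildcard)
    while let found = component {
        var name: Unmanaged<CFString>?
        if AudioComponentCopyName(found, &name) == noErr,
           let componentName = name?.takeRetainedValue() {
            print(componentName)            // one line per visible component
        }
        component = AudioComponentFindNext(found, &wildcard)
    }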
Asked by Wlpjam. Last updated.
Post not yet marked as solved
240 Views

How to send MIDI data over Bluetooth?

Hello, I created a connection between my phone and my Mac over Bluetooth with CABTMIDILocalPeripheralViewController. According to the documentation it is possible to send MIDI commands through this connection, but I did not find anything about how to do it.
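A minimal sketch of the sending side, assuming the Bluetooth pairing made with CABTMIDILocalPeripheralViewController is already established; once paired, the Bluetooth link appears as an ordinary CoreMIDI destination, so sending uses the regular CoreMIDI calls (the client and port names below are placeholders):

    import CoreMIDI

    // Create a MIDI client and an output port, then send a note-on to every
    // destination endpoint; the paired Bluetooth device shows up as one of them.
    var client = MIDIClientRef()
    MIDIClientCreate("ExampleClient" as CFString, nil, nil, &client)

    var outPort = MIDIPortRef()
    MIDIOutputPortCreate(client, "ExampleOutput" as CFString, &outPort)

    // A single three-byte note-on message: channel 1, middle C, velocity 100.
    var packet = MIDIPacket()
    packet.timeStamp = 0
    packet.length = 3
    packet.data.0 = 0x90
    packet.data.1 = 60
    packet.data.2 = 100
    var packetList = MIDIPacketList(numPackets: 1, packet: packet)

    for index in 0..<MIDIGetNumberOfDestinations() {
        MIDISend(outPort, MIDIGetDestination(index), &packetList)
    }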
Asked by lleeooGG. Last updated.
Post not yet marked as solved
406 Views

Why isn't the Podfile working?!

I have an Xcode project and I'm integrating AdMob into it; as part of that I've used Firebase and CocoaPods. After navigating to the Xcode project in the terminal and making the changes to the Podfile, when I type 'pod install' as the tutorial tells me to, I get this error: [!] Invalid Podfile file: undefined method `Pod' for #<Pod::Podfile:0x00007fcd953068c0>. Has anyone had this before, and what can I do to resolve it?
Asked by Georgeb19. Last updated.
Post not yet marked as solved
329 Views

How can I collect dB levels?

Is there a simple means of collecting decibel levels when my app's user starts a timer? I simply want to register the moment (time) the decibel level around the wearer reaches 140+ dB.
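A minimal sketch of one way to watch the level, assuming microphone permission has already been granted and a record-capable audio session is active. AVAudioRecorder's meters report relative dBFS (0 is full scale, values below it are negative), not absolute SPL, so the -1 dBFS threshold here is a placeholder that would need calibration against a reference level before it could stand in for 140 dB:

    import AVFoundation
    import AudioToolbox

    // Start a metered recording and poll it, noting the first moment the peak
    // level crosses the chosen threshold.
    func startLevelWatch() throws {
        let url = FileManager.default.temporaryDirectory.appendingPathComponent("meter.caf")
        let settings: [String: Any] = [
            AVFormatIDKey: kAudioFormatAppleLossless,
            AVSampleRateKey: 44_100.0,
            AVNumberOfChannelsKey: 1
        ]
        let recorder = try AVAudioRecorder(url: url, settings: settings)
        recorder.isMeteringEnabled = true
        recorder.record()

        Timer.scheduledTimer(withTimeInterval: 0.1, repeats: true) { timer in
            recorder.updateMeters()
            if recorder.peakPower(forChannel: 0) > -1.0 {   // placeholder threshold in dBFS
                print("Threshold crossed at \(Date())")
                timer.invalidate()
                recorder.stop()
            }
        }
    }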
Asked by chollman. Last updated.
Post not yet marked as solved
276 Views

How can a MediaStream (microphone) be used in several functions?

Hello developers, in my web app I use a MediaStream (microphone) to convert the spoken word into text (SpeechToText API). For this, the MediaStream is passed to an NPM package function. Before that, however, the stream is supposed to run through another function in which the MediaStream is used to drive a speech visualisation. This works without any problems under Windows 10 and Android: I get the visualisation drawn and the speech converted to text. Unfortunately, it does not work on iOS (not yet tested on macOS); only the speech is converted to text, and the visualisation does not work. I was able to find out through other forums that this is probably because the functions duplicate the stream, so the first stream that goes into the visualisation is muted and only the second stream can use the audio for the text output. If I comment out the speech-to-text function, the visualisation works perfectly. Is there a trick or solution to use one stream in both functions? By the way, MediaStream.clone() does not work either. Attached again is a sketch of my code:

navigator.mediaDevices.getUserMedia({
  audio: true,
  video: false
}).then((stream) => {
  this.voiceVisualization(stream); // not working
  recognizeMic({MediaStream: stream}); // working
});

Any idea? Many thanks in advance for your support. Cheers, Saintenr
Asked by Saintenr. Last updated.
Post not yet marked as solved
329 Views

Core Audio for Apple silicon

I just looked at Activity Monitor and noticed a preloaded process named Core Audio running under the Intel architecture on my Apple silicon Mac.

Analysis of sampling com.apple.audio.Core-Audio-Driver-Service (pid 11196) every 1 millisecond
Process: com.apple.audio.Core-Audio-Driver-Service [11196]
Path: /System/Library/Frameworks/CoreAudio.framework/Versions/A/XPCServices/com.apple.audio.Core-Audio-Driver-Service.xpc/Contents/MacOS/com.apple.audio.Core-Audio-Driver-Service
Load Address: 0x10023d000
Identifier: com.apple.audio.Core-Audio-Driver-Service
Version: 1.0 (1)
Code Type: X86-64 (translated)
Platform: macOS
Parent Process: launchd [1]
Date/Time: 2021-03-31 01:30:39.233 +0530
Launch Time: 2021-03-31 01:20:44.850 +0530
OS Version: macOS 11.2.3 (20D91)
Report Version: 7
Analysis Tool: /usr/bin/sample

I would like to know why this service is still built for the Intel architecture and has not yet been ported to Apple silicon.
Asked by Rishiar. Last updated.
Post not yet marked as solved
658 Views

Virtual Audio Device with DriverKit

We have a product with remote desktop functionality, where users can connect to their remote machines and do their work seamlessly. Along with that, we provide additional support for hearing system sounds; for this we had an existing KEXT solution using IOKit, where we use IOAudioStream and its supporting API classes for input and output audio. Since system extensions were introduced in macOS 10.15, Apple has been recommending the transition from KEXT to DEXT, and in recent releases our sound KEXT can no longer load into the kernel because of OS updates, so we are now planning to migrate to the system extension approach. As part of this we went through the DriverKit APIs and found that the IOKit audio classes are not part of the DriverKit framework; these are the classes that carry most of the APIs we need to transfer system sounds with the help of IOAudioStream. The thread below says we can implement audio drivers if the device is USB, but in our case it is a virtual network device: https://developer.apple.com/forums/thread/133318. We are stuck at this point: how do we create a network audio device in the new approach? Kindly help us achieve this in the current environment, or are there any plans to include the IOKit APIs in the DriverKit framework soon? Thanks in advance, Regards, Venkata.
Asked. Last updated.
Post not yet marked as solved
340 Views

What kind of Apple Developer Certificate is required to create a custom virtual audio device using Core Audio?

I am working on an application that will require functionality similar to that of SoundFlower or LoopBack, and I know that this means creating my own Aggregate Audio device or Virtual Audio Device using Apple's Core Audio. Is there a specific certification that I need so that I can develop in this area? I know the process could potentially take a few months.
Asked by nturczan. Last updated.
Post not yet marked as solved
275 Views

Identify whether a recorded sound is pleasant or unpleasant

How can I identify whether recorded audio is pleasant or unpleasant in iOS? I am recording the sound of a motor vehicle and want to measure whether the sound is pleasant, unpleasant, or noisy; basically, I want to measure a vehicle's motor sound for maintenance purposes in iOS.
Asked. Last updated.
Post not yet marked as solved
384 Views

Critical Breaking Bug with Audio Unit Version 3 MIDI Processors in Logic Pro X 10.5.1

Logic Pro X has a critical bug when using Audio Unit version 3 MIDI FX. The issue is very easy to reproduce.

What happens: when an Audio Unit v3 MIDI FX is inserted on an instrument, the instrument plays and produces audio output reliably only while it is on the currently selected track. If another track is selected, the instrument will not play even though the MIDI FX plugin is still receiving MIDI; the instrument loses audio and will not play, or the MIDI is not passed through correctly. In addition, Audio Unit v3 MIDI FX plugins appear duplicated eight times in the Logic Pro X plugin menu. This issue happens with any Audio Unit version 3 MIDI FX processor.
Asked by jendakub. Last updated.
Post not yet marked as solved
302 Views

Audio Delay - Clean macOS Install

Hi all, I have directed this question to Google and to Logic users but keep getting caught up in a strictly 'delay effects' trap; you'll see what I mean when you read the problem. I have Logic Pro installed on Mojave 10.14.6, and whenever I use headphones and just the metronome there is a strange audio delay. It's very short, but it is also present when using loops or any other macOS application with audio output. Using an alternative audio interface resolves the issue, but I would like to use the Mac's headphone output, especially while travelling. Here's the interesting part for me: as a test, I reinstalled Mojave on a separate, clean drive, followed by Logic, to see if the delay returns, and it is not there. You may say just do a clean install on my Mac, but I'm very curious to find out what this could be. I'm not a coder but a long-time Mac/audio educator, and it would be good to share this with students. Any help much appreciated. Jasonix
Asked by Jasonix. Last updated.
Post not yet marked as solved
311 Views

AudioOutputUnitStart() is taking over 700ms

Our VoIP application spends around 700 ms in the call to AudioOutputUnitStart(AudioUnit), which is causing call setup delays. This delay is seen on an iPhone 11. Any suggestions to minimize this delay would be highly appreciated.
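A minimal sketch for narrowing this down, assuming outputUnit is an output AudioUnit (for example a Remote I/O or Voice-Processing I/O instance) that has already been created and initialized elsewhere; it times only the start call itself, so the 700 ms can be attributed to the start rather than to session activation or AudioUnitInitialize:

    import AudioToolbox
    import Foundation

    // Time just the start call on an already-initialized output unit.
    func timedStart(of outputUnit: AudioUnit) {
        let begin = CFAbsoluteTimeGetCurrent()
        let status = AudioOutputUnitStart(outputUnit)
        let elapsedMs = (CFAbsoluteTimeGetCurrent() - begin) * 1000.0
        print("AudioOutputUnitStart returned \(status) after \(elapsedMs) ms")
    }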
Asked by phani. Last updated.