My company develops an audio plugin that downloads a dylib file from our server in order to ensure that the plugin is always up-to-date. The dylib file is signed and hardened, and this process has worked correctly for years.
We've installed the macOS 12 beta, and the software continues to work properly, except in GarageBand. GarageBand refuses to load the dylib, saying the file "cannot be opened because Apple cannot check it for malicious software."
This dylib file cannot be installed via a stapled package. Is there a mechanism by which we can staple a ticket to an individual dylib file? We've tried adding the dylib to a .zip file and submitting it to the notarization service. The file is approved, but the stapler tool apparently won't staple a single dylib, expecting it instead to be part of a package.
Is there any way around this? Will GarageBand on macOS 12 be fixed? The plugin works correctly in Logic on macOS 12.
Thanks!
Hey folks,
I've been able to build the 'StarterAudioUnitExample' project provided by Apple in Xcode 12.5.1, and to run and load it in Logic 10.6.3 on macOS 11.5.2.
However, when trying to recreate the same plugin from a blank project, I'm having trouble getting auval or Logic to actually instantiate and load the component.
See auval output below:
validating Audio Unit Tremelo AUv2 by DAVE:
AU Validation Tool
Version: 1.8.0
Copyright 2003-2019, Apple Inc. All Rights Reserved.
Specify -h (-help) for command options
VALIDATING AUDIO UNIT: 'aufx' - 'trem' - 'DAVE'
Manufacturer String: DAVE
AudioUnit Name: Tremelo AUv2
Component Version: 1.0.0 (0x10000)
PASS
TESTING OPEN TIMES:
COLD:
FATAL ERROR: OpenAComponent: result: -1,0xFFFFFFFF
validation result: couldn’t be opened
Does anyone, hopefully someone from Apple, know what the error code "FATAL ERROR: OpenAComponent: result: -1,0xFFFFFFFF" actually refers to?
I've been Googling for hours, and nothing I have found has worked for me so far.
Also, here is the Info.plist file:
Anyone that could help steer me in the right direction?
Thanks,
Dave
My app is using RemoteIO to record audio. It doesn’t do any playback. RemoteIO seems to be broadly compatible with the new Sound Recognition feature in iOS14, but I’m seeing a glitch when sound recognition is first enabled.
If my app is started and I initialise RemoteIO, and then turn on Sound Recognition (say via control centre), the RemoteIO input callback is not called thereafter, until I tear down the audio unit and set it back up. So something like the following:
Launch app
RemoteIO is initialised and working, can record
Turn on Sound Recognition via Settings or control centre widget
Start recording with already-set up RemoteIO
Recording callback is never again called
Though no input callbacks are seen, kAudioOutputUnitProperty_IsRunning is reported as true, so the audio unit thinks it is active
Tear down audio unit
Set up audio unit again
Recording works
Buffer size is changed, reflecting some effect on the audio session of the Sound Recognition feature
I also noticed that when Sound Recognition is enabled, I see several (usually 3) AVAudioSession.routeChangeNotifications in quick succession. When Sound Recognition is disabled while RemoteIO is set up, I don’t see this problem. I’m allocating my own buffers so it’s not a problem with their size.
What could be going on here? Am I not handling a route change properly? There doesn’t seem to be a reliable sequence of events I can catch to know when to reset the audio unit.
The only fix I’ve found here is to hack in a timer that checks for callback activity shortly after starting recording, and resets the audio unit if no callback activity is seen. Better than nothing, but not super reliable.
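The watchdog workaround in the last paragraph can be sketched independently of the audio APIs: the input callback stamps a counter, and a timer fired shortly after recording starts checks whether any stamps have arrived. The names here are hypothetical, not part of RemoteIO:

```c
#include <stdbool.h>
#include <stdint.h>

/* Hypothetical watchdog state for the workaround described above. */
typedef struct {
    uint64_t callbacks_seen;  /* incremented by the input callback */
    uint64_t seen_at_start;   /* snapshot taken when recording starts */
} CallbackWatchdog;

/* Call this from the RemoteIO input callback. */
static void watchdog_note_callback(CallbackWatchdog *w) {
    w->callbacks_seen++;
}

/* Call this when recording starts, just before arming the timer. */
static void watchdog_arm(CallbackWatchdog *w) {
    w->seen_at_start = w->callbacks_seen;
}

/* Call this from the timer: true means no callbacks arrived since
   arming, so the audio unit should be torn down and set up again. */
static bool watchdog_needs_reset(const CallbackWatchdog *w) {
    return w->callbacks_seen == w->seen_at_start;
}
```

In a real app the counter should be atomic, since the render callback runs on the audio I/O thread while the timer fires elsewhere.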
I have the bare bones of an AUv3 plug-in. I have used UIHostingController to let me add a SwiftUI view, and from the SwiftUI view I have added a SpriteKit scene, which I want to use to start building an interface for the AU.
It's a very basic GameScene for now. If I run one instance of it in Logic then it's OK. If I add another instance it crashes with a Thread 1: EXC_BAD_INSTRUCTION (code=EXC_I386_INVOP, subcode=0x0) error.
This seems to be down to how I declare a SpriteNode variable.
If I declare it within GameScene as a "let", then it's fine. If I declare it outside the class (see the commented-out line in the code below), it will crash with two instances. It would be useful to me later on for this to be a public variable, as I want to call functions on it from outside the class.
Is there a better way to declare the SpriteNode variable to make it stable?
import SpriteKit

// Declaring the node at file scope means it is shared by every
// instance of the plug-in, which is what crashes when a second
// instance is loaded:
//public var ball = SKShapeNode(circleOfRadius: 30)
//var ball = SKShapeNode(circleOfRadius: 30)

class GameScene: SKScene {
    // Declaring it as an instance property is stable:
    let ball = SKShapeNode(circleOfRadius: 30)

    override func didMove(to view: SKView) {
        //physicsBody = SKPhysicsBody(edgeLoopFrom: frame)
        ball.position = CGPoint(x: 200, y: 200)
        ball.fillColor = .lightGray
        self.addChild(ball)
    }
}
The AudioUnit is kAudioUnitSubType_VoiceProcessingIO; the crash backtrace is below:
ExceptionHandling 0x7fff3bde6f31 -[NSExceptionHandler _handleException:mask:] + 364
ExceptionHandling 0x7fff3bde6cac NSExceptionHandlerUncaughtSignalHandler + 35
libsystem_platform.dylib 0x7fff203bad7d _sigtramp + 29
0x0000000000000000 0x7000043de580 0x0 + 123145373476224
CoreAudio 0x7fff220a1ee9 _ZN9HALDevice4DuckEfPK14AudioTimeStampf + 921
CoreAudio 0x7fff21c08598 AudioDeviceDuck + 843
AudioDSP 0x13ed3161f _Z14DuckOtherAudiojff + 51
AudioDSP 0x13ee64a3b _ZN16AUVoiceProcessor22DestroyAggregateDeviceEv + 829
AudioDSP 0x13ee65d59 _ZN16AUVoiceProcessorD2Ev + 415
AudioDSP 0x13ef43342 _ZN13ComponentBase8AP_CloseEPv + 30
AudioToolboxCore 0x7fff217b5c8c _ZN19APComponentInstance15disposeInstanceEv + 40
AudioToolboxCore 0x7fff218b92ef AudioComponentInstanceDispose + 40
When I try to initialize an audio unit, I get an error with status code -66635. I can't find any description for this code. What does it mean?
I have similar problems with both of the following devices
Device Type: iPad7,11, Software Version: iOS 14.1
Device Type: iPad7,5, Software Version: iOS 13.4.1
Please help! Thank you!
Hi,
I'm attempting to call AudioComponentFindNext() from an iOS application (built with JUCE) to get a list of all available plugins.
I've got an issue whereby the function is only returning the generic system plugins and missing any third-party installed plugins.
The issue currently occurs when the function is called from within another AUv3 plugin, though I have also seen it from within a normal iOS app (run on an iPad Air 4); at the moment it is working fine from an iOS app.
I've tried setting microphone access and the Inter-App Audio capability, as I saw suggested on similar forum posts, but it has not solved my problem.
Any advice would be very appreciated
Thanks
In the documentation for AUAudioUnitBusArray, there is this passage:
Some audio units (e.g. mixers) support variable numbers of busses, via subclassing.
I tried to implement this by subclassing AUAudioUnitBusArray, creating my own internal array to store the buses, overriding isCountChangeable to return true, and overriding setBusCount to grow the array when the requested count is greater than the current one. However, I don't think this will work, because AUAudioUnitBus has several properties I can't set, such as ownerAudioUnit and index. I would also have to reimplement all the observer methods like addObserver(toAllBusses:forKeyPath:options:context:), which seems like overkill for a class that is supposedly designed for subclassing.
I know about replaceBusses(busArray:), but wouldn't that overwrite the current buses in the bus array, since it copies them?