I've got volume in my implementation. Too much, hurricane-force volume. The consistent problem is that the audio blasts whenever I create an ambient or channel mixer (not a point or volumetric source; e.g. a calm-breeze sound). I set the level on the mixer and nothing seems to happen. I'd like to set the volume lower.
On the spatial mixer, by contrast, if I set the gain, rolloff, and direct-path level on the source node (a point or volumetric source), then the spatial-mixer case appears to work and there is no blasting audio.
I've been following the WWDC examples (I've watched the session about four times now). It appears I should not use the source node with the ambient and channel mixers? That seems to be an option only when adding the parameter to the spatial mixer. The ambient mixer seems to want only the listener and an orientation quaternion (which I normalized to unit length).
Setting the calibration mode to relative SPL on the sampler node also always seems to cause blasting audio.
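Roughly, here is a simplified sketch of what I understood the WWDC session to suggest for an ambient bed: leave the mixer alone and pull the level down on the sampler node itself via its calibration mode. This is untested; the asset identifier "breeze" and the level value are placeholders of mine.

```swift
import AVFoundation
import PHASE
import simd

// Sketch only: an ambient "calm breeze" bed. "breeze" is an assumed asset
// identifier that would have been registered with the asset registry earlier.
let engine = PHASEEngine(updateMode: .automatic)

// An ambient mixer takes a channel layout plus a listener-relative orientation.
let stereoLayout = AVAudioChannelLayout(layoutTag: kAudioChannelLayoutTag_Stereo)!
let ambientMixer = PHASEAmbientMixerDefinition(
    channelLayout: stereoLayout,
    orientation: simd_quatf(angle: 0, axis: simd_float3(0, 1, 0)) // identity rotation
)

let sampler = PHASESamplerNodeDefinition(
    soundAssetIdentifier: "breeze",
    mixerDefinition: ambientMixer
)
sampler.playbackMode = .looping
// The level lives on the generator node, not the mixer: a negative
// relative-SPL value lowers the perceived loudness below the reference level.
sampler.setCalibrationMode(calibrationMode: .relativeSpl, level: -20)

try engine.assetRegistry.registerSoundEventAsset(
    rootNode: sampler, identifier: "breezeEvent"
)
```

If the level argument here behaves as I expect, this would be the place to tame the blasting rather than the mixer definition.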
I added the sound assets with dynamic normalization, using WAV format at 32-bit / 44.1 kHz.
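For reference, the registration looks roughly like this (simplified sketch; the file name and identifier are placeholders):

```swift
import Foundation
import PHASE

// Sketch: registering a 32-bit / 44.1 kHz WAV with dynamic normalization.
let engine = PHASEEngine(updateMode: .automatic)
let url = Bundle.main.url(forResource: "breeze", withExtension: "wav")!
try engine.assetRegistry.registerSoundAsset(
    url: url,
    identifier: "breeze",
    assetType: .resident,        // loaded fully into memory; .streamed for long files
    channelLayout: nil,          // let PHASE read the layout from the file itself
    normalizationMode: .dynamic  // scales the asset to a calibrated loudness
)
```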
Also, are there any examples of the meta parameters? Is that how I could dynamically adjust the level? I think there was a passing reference to them in the WWDC video.
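From the headers, my unverified understanding is that a gain meta parameter attached to the sampler node definition is the mechanism for runtime level changes, something like the following (all identifiers are placeholders, and I have not confirmed the lookup API):

```swift
import AVFoundation
import PHASE
import simd

// Unverified sketch: expose a gain meta parameter on a sampler node, then
// adjust it while the sound event plays.
let engine = PHASEEngine(updateMode: .automatic)
let layout = AVAudioChannelLayout(layoutTag: kAudioChannelLayoutTag_Stereo)!
let mixer = PHASEAmbientMixerDefinition(
    channelLayout: layout,
    orientation: simd_quatf(angle: 0, axis: simd_float3(0, 1, 0))
)
let sampler = PHASESamplerNodeDefinition(
    soundAssetIdentifier: "breeze", mixerDefinition: mixer
)
sampler.gainMetaParameterDefinition = PHASENumberMetaParameterDefinition(
    value: 1.0, minimum: 0.0, maximum: 1.0, identifier: "breezeGain"
)
try engine.assetRegistry.registerSoundEventAsset(
    rootNode: sampler, identifier: "breezeEvent"
)

// At runtime, look the parameter up on the sound event by identifier:
let event = try PHASESoundEvent(engine: engine, assetIdentifier: "breezeEvent")
event.start()
if let gain = event.metaParameters["breezeGain"] as? PHASENumberMetaParameter {
    gain.value = 0.25  // could be updated every frame if needed
}
```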
Any pointers would be appreciated. I wonder if I'm making incorrect assumptions about how PHASE works. I try to set up as much as possible before I start the engine (especially adding child nodes).
When I use PHASE on my iPad, the test app runs in landscape mode (left or right) only, but PHASE seems to behave as if the device were in portrait: that is, 90 degrees off, with the normal pointed at me. Is this a case where I need to use the world transform and listen for view-orientation notifications, or does PHASE handle it automatically? I notice that in the simulator, when I rotate the app, the speakers on the Mac Pro stay fixed, which is what I was expecting. Or maybe it's my imagination... it sounds like portrait on my device while I'm in landscape. I do have supported interface orientations set in my plist. It's actually kind of annoying having the speakers on one side of the iPad.
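In case it matters, this is the kind of compensation I've been considering (guesswork; the helper and parameter names are mine, not from the framework):

```swift
import simd

// Guesswork sketch: PHASE only knows the listener transform it is handed, so
// if the world transform is authored for portrait, roll it 90 degrees about
// the view (Z) axis when the interface is in landscape.
func listenerTransform(world: simd_float4x4, isLandscapeLeft: Bool) -> simd_float4x4 {
    let angle: Float = isLandscapeLeft ? -.pi / 2 : .pi / 2
    let roll = simd_quatf(angle: angle, axis: simd_float3(0, 0, 1))
    return world * simd_float4x4(roll)
}

// listener.transform = listenerTransform(world: cameraTransform, isLandscapeLeft: true)
```

If PHASE is supposed to pick the orientation up automatically, I'd rather not do this by hand, hence the question.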
I was wondering if there are any downloadable example projects for the PHASE audio framework. I was watching the WWDC 2021 video, but there was no sample code to download, and the examples within the video were pretty verbose; I don't want to freeze a frame and type all that by hand.
I am attempting to replace some old OpenAL code from a few years ago with an alternate solution; all of the OpenAL calls show deprecation warnings when I build in Xcode.
The generated PHASE header documentation is fairly sparse and somewhat boilerplate, with no examples.
Thanks in advance.
I run my Metal app on a capable device (iPhone X) with iOS 12. When I capture a GPU frame during debugging, select geometry from the captured metadata, and select a polygon within the Metal debugger, the shader debug button is blue and enabled. But when I click the debug button (in this case for a vertex shader), I get an Xcode dialog complaining that it can't find source code for the shader and that the Metal compiler in Build Settings should be in debug mode. The Metal compiler is in debug mode.

My default Metal library for the device lives in a framework bundled within the main application's bundle. I set it up like the following:

let shaders = try device.makeDefaultLibrary(bundle: myFrameworkBundle) // works for default.metallib

There are no Metal shaders in the main bundle itself; they reside only within the framework bundled inside the app. So I suspect the Metal shader debugger is just defaulting to the main app bundle and not finding anything? It's not looking at the right location on the device? I tried adding a build phase to copy the shaders into the framework too, but that did not seem to make a difference. It would really be useful to get the debugger working.
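One thing I have not ruled out: the shader source may need to be embedded into the framework's metallib at compile time. If the library is produced by a script or custom build phase rather than Xcode's normal Metal build rules, the debugger would have no source to show regardless of the Build Settings. My understanding (unverified, and the exact flags may differ by toolchain) is roughly:

```shell
# Sketch: embed shader source/debug info when building a .metallib by hand.
# Older toolchains (around the Xcode 10/11 era) used -gline-tables-only -MO:
xcrun -sdk iphoneos metal -c -gline-tables-only -MO Shaders.metal -o Shaders.air
xcrun -sdk iphoneos metallib Shaders.air -o default.metallib

# Newer toolchains (Xcode 14+) use a single flag instead:
#   xcrun -sdk iphoneos metal -frecord-sources Shaders.metal -o default.metallib
```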