I haven't seen that session, and I probably should, to help understand the direction the API is taking, because right now it's unclear to me.
When looking for AUv3 documentation I couldn't help but observe that the AUv3 host/plugin sample code (which basically IS the bulk of the documentation) is only available in Swift.
I can only guess that the principle is that all the 'offline' setup is performed in a realtime-unsafe language, and the rest is done with something fit for purpose (C/C++/ASM/intrinsics). If there are still cases where Obj-C/Swift can pollute the audio context, then this hasn't been thought through.
I appreciate that the C/C++ interface has never been particularly pretty (or type-safe), but it does work, and compared to the effort required to build a production-quality Core Audio/DSP implementation, some 'ugly' Get/SetProperty() calls are neither here nor there. We're DSP programmers - we can take it!
What we can't take is days lost simply because a significant proportion of the API is documented as 'No overview available.'