Hi, I have just implemented an Audio Unit v3 host. This is how I scan the available effect and instrument components:

```objc
AgsAudioUnitPlugin *audio_unit_plugin;
AVAudioUnitComponentManager *audio_unit_component_manager;
NSArray *av_component_arr;
AudioComponentDescription description;
guint i, i_stop;

if(!AGS_AUDIO_UNIT_MANAGER(audio_unit_manager)){
  return;
}

audio_unit_component_manager = [AVAudioUnitComponentManager sharedAudioUnitComponentManager];

/* effects */
description = (AudioComponentDescription) {0,};
description.componentType = kAudioUnitType_Effect;

av_component_arr = [audio_unit_component_manager componentsMatchingDescription:description];

i_stop = [av_component_arr count];

for(i = 0; i < i_stop; i++){
  ags_audio_unit_manager_load_component(audio_unit_manager,
                                        (gpointer) av_component_arr[i]);
}

/* instruments */
description = (AudioComponentDescription) {0,};
description.componentType = kAudioUnitType_MusicDevice;

av_component_arr = [audio_unit_component_manager componentsMatchingDescription:description];

i_stop = [av_component_arr count];

for(i = 0; i < i_stop; i++){
  ags_audio_unit_manager_load_component(audio_unit_manager,
                                        (gpointer) av_component_arr[i]);
}
```
Search results for "Popping Sound": 19,349 results found
Hi, just want to share an update. I figured out that you can't run signed Audio Units without the proper entitlements set. https://developer.apple.com/library/archive/technotes/tn2312/_index.html
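For orientation, an `.entitlements` file for an audio-unit host might contain keys like the ones below. This is a hedged sketch, not a definitive list; which entitlements a given host actually needs depends on sandboxing, hardened runtime, and how the plug-ins are signed, so check the technote above.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <!-- App Sandbox; many audio-unit entitlement questions start here -->
    <key>com.apple.security.app-sandbox</key>
    <true/>
    <!-- lets a hardened-runtime host load plug-ins signed by other teams -->
    <key>com.apple.security.cs.disable-library-validation</key>
    <true/>
</dict>
</plist>
```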
Topic: Media Technologies > SubTopic: Audio
auval -a shows me the following:

```
auval -a
AU Validation Tool
Version: 1.10.0
Copyright 2003-2019, Apple Inc. All Rights Reserved.
Specify -h (-help) for command options

aufx bpas appl  -  Apple: AUBandpass
aufx dcmp appl  -  Apple: AUDynamicsProcessor
aufx dely appl  -  Apple: AUDelay
aufx dist appl  -  Apple: AUDistortion
aufx filt appl  -  Apple: AUFilter
aufx greq appl  -  Apple: AUGraphicEQ
aufx hpas appl  -  Apple: AUHipass
aufx hshf appl  -  Apple: AUHighShelfFilter
aufx lmtr appl  -  Apple: AUPeakLimiter
aufx lpas appl  -  Apple: AULowpass
aufx lshf appl  -  Apple: AULowShelfFilter
aufx mcmp appl  -  Apple: AUMultibandCompressor
aufx mrev appl  -  Apple: AUMatrixReverb
aufx nbeq appl  -  Apple: AUNBandEQ
aufx nsnd appl  -  Apple: AUNetSend
aufx nutp appl  -  Apple: AUNewPitch
aufx pmeq appl  -  Apple: AUParametricEQ
aufx raac appl  -  Apple: AURoundTripAAC
aufx rogr appl  -  Apple: AURogerBeep
aufx rvb2 appl  -  Apple: AUReverb2
aufx sdly appl  -  Apple: AUSampleDelay
aufx tmpt appl  -  Apple: AUPitch
aufx vois appl  -  Apple: AUSoundIsolation
aumf Al
```
Topic: Media Technologies > SubTopic: Audio
Hi, I just started to develop audio unit hosting support in my application. Offline rendering seems to work, except that I hear no output. Why? I suspect something goes wrong with the player. I connect to Core Audio at a different location in the code. Here are some error messages I have faced so far:

```
2025-08-14 19:42:04.132930+0200 com.gsequencer.GSequencer[34358:18611871] [avae] AVAudioEngineGraph.mm:4668 Can't retrieve source node to play sequence because there is no output node!
2025-08-14 19:42:04.151171+0200 com.gsequencer.GSequencer[34358:18611871] [avae] AVAudioEngineGraph.mm:4668 Can't retrieve source node to play sequence because there is no output node!
2025-08-14 19:43:08.344530+0200 com.gsequencer.GSequencer[34358:18614927] AUAudioUnit.mm:1417 Cannot set maximumFramesToRender while render resources allocated.
2025-08-14 19:43:08.346583+0200 com.gsequencer.GSequencer[34358:18614927] [avae] AVAEInternal.h:104 [AVAudioSequencer.mm:121:-[AVAudioSequencer(AVAudioSequencer_Player) startAndReturnE
```
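Regarding the `Cannot set maximumFramesToRender while render resources allocated` message: that property can only be changed while the unit's render resources are deallocated, i.e. before the engine starts or manual rendering is enabled. A hedged sketch of the ordering, using the variable names from the surrounding posts (illustrative, not the poster's actual code):

```objc
/* configure the AUAudioUnit before render resources exist */
AUAudioUnit *au_audio_unit = [av_audio_unit AUAudioUnit];

if(au_audio_unit.renderResourcesAllocated){
  [au_audio_unit deallocateRenderResources];
}

au_audio_unit.maximumFramesToRender = 4096;

/* only then enable manual (offline) rendering on the engine */
NSError *error = NULL;

[audio_engine enableManualRenderingMode:AVAudioEngineManualRenderingModeOffline
                                 format:av_format
                      maximumFrameCount:4096
                                  error:&error];
```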
Hi, now I have a working Audio Unit v3 host, using these objects:

```objc
AVAudioEngine *audio_engine;
AVAudioOutputNode *av_output_node;
AVAudioInputNode *av_input_node;
AVAudioMixerNode *av_audio_mixer_node;
AVAudioUnit *av_audio_unit;
AVAudioSequencer *av_audio_sequencer;
AVAudioFormat *av_format;
```

You can make use of the output and input node of AVAudioEngine while in offline rendering mode.

```objc
/* output node */
av_output_node = [audio_engine outputNode];

/* input node */
av_input_node = [audio_engine inputNode];

/* mixer node */
av_audio_mixer_node = [audio_engine mainMixerNode];

/* audio player and audio unit */
[audio_engine attachNode:av_audio_unit];

[audio_engine connect:av_input_node to:av_audio_unit format:av_format];
[audio_engine connect:av_audio_unit to:av_audio_mixer_node format:av_format];
[audio_engine connect:av_audio_mixer_node to:av_output_node format:av_format];
```

The thing with the input node is that you have to provide a block before starting the AVAudioEngine:

```objc
input_success = [av_input_node setManualRenderingInputPCMFormat:av
```
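The post is cut off mid-call; a hedged sketch of what the complete `setManualRenderingInputPCMFormat:inputBlock:` call might look like, reusing the names declared above (the buffer handling inside the block is illustrative only, not the poster's code):

```objc
BOOL input_success;

/* the block is pulled by the engine each time it needs input frames;
 * it must return an AudioBufferList with inNumberOfFrames frames,
 * or NULL when no input data is available */
input_success = [av_input_node setManualRenderingInputPCMFormat:av_format
    inputBlock:^const AudioBufferList *(AVAudioFrameCount inNumberOfFrames) {
      /* fill and return your AudioBufferList here */
      return(NULL);
    }];

if(!input_success){
  NSLog(@"failed to set manual rendering input format");
}
```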
Topic: Media Technologies > SubTopic: Audio
Hi, I have limited knowledge here, but I've been working on Core Audio recently, so: from my understanding, offline rendering outputs to a file, i.e. you process your audio offline, it goes super fast, and then you play the file. Now, if you _really_ want to hear the audio, disable manual rendering.
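Switching the engine back to realtime output might look like the following. This is a minimal sketch reusing the `audio_engine` name from the earlier posts; error handling is illustrative:

```objc
NSError *error = NULL;

/* leave offline mode: stop the engine, drop manual rendering,
 * then restart it against the live output device */
[audio_engine stop];
[audio_engine disableManualRenderingMode];

if(![audio_engine startAndReturnError:&error]){
  NSLog(@"engine start failed: %@", error);
}
```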
Topic: Media Technologies > SubTopic: Audio
"web react developers have no problem creating mapkit js token": It sounds like they can create a token, but have you tried to deploy that token generated through the Apple website into your app, to make sure that initialization succeeds without the 401 response? I'm looking to rule out entire classes of issues by ensuring that path works for you before proceeding with the other details of how you are custom-generating your tokens. — Ed Ford, DTS Engineer
Topic: App & System Services > SubTopic: Maps & Location
Hello, Quartz Debug is available as part of the Additional Tools for Xcode package. For example, Additional Tools for Xcode 26 beta 6 contains the following: "This package includes audio, graphics, hardware I/O, and other auxiliary tools. These tools include AU Lab, OpenGL Driver Monitor, OpenGL Profiler, *****, Quartz Debug, CarPlay Simulator, HomeKit Accessory Simulator, IO Registry Explorer, Network Link Conditioner, PacketLogger, Printer Simulator, 64BitConversion, Clipboard Viewer, Crash Reporter Prefs, Dictionary Development Kit, Help Indexer, and Modem Scripts."
Topic: Developer Tools & Services > SubTopic: General
This sounds like the issue being discussed in this other forums thread, so I recommend you hop over there. — Ed Ford, DTS Engineer
Topic: App Store Distribution & Marketing > SubTopic: App Store Connect
That sounds pretty strange and ... interesting. Your code snippet is quite straightforward, and there is nothing wrong with it, so I can't comment based on that alone. If you provide more details about how you trigger and observe the issue, or even better, a minimal project with detailed steps to reproduce it, I'd be interested in taking a look. Best, —— Ziqiao Chen, Worldwide Developer Relations.
Topic: App & System Services > SubTopic: iCloud & Data
Has anyone else experienced this bug? When playing a video, the Bluetooth headphones lose audio. My workaround is to select one of the other audio outputs, then go back and select the affected headphones again.
Topic: Media Technologies > SubTopic: Video
What's a pop alert? Can you show us your code? If you've tried everything, raise a feedback report here: https://feedbackassistant.apple.com/ and provide more useful information than you've given in this post.
Topic: UI Frameworks > SubTopic: SwiftUI
An iOS app has a UINavigationController with a UINavigationBar that is non-translucent (e.g. black). When performing a push (or pop) to navigate to or from another UIViewController, the UIBarButtonItems on the navigation bar flash a white background. With a dark navigation bar this is very noticeable and not desirable. This only occurs when run on iOS 26 and is related to Liquid Glass. I've created FB19660024 with a minimal Xcode workspace to reproduce, along with a video showing the behavior. This is a cosmetic bug, not affecting functionality, but it is a very undesirable effect in apps with dark, non-translucent navigation bars. Has anyone else seen this and found a workaround?
We have an app in Swift that uses push notifications. It has a deployment target of iOS 15.0. I originally audited our app for iOS 26 by building it with Xcode 26 beta 3. At that point, all was well: our implementation of application:didRegisterForRemoteNotificationsWithDeviceToken: was called. But when rebuilding the app with beta 4, 5 and now 6, that function is no longer being called. I created a simple test case by creating a default iOS app project, then performing these additional steps:

- Set the bundle ID to our app's ID
- Add the Push Notifications capability
- Add application:didRegisterForRemoteNotificationsWithDeviceToken: with a print("HERE") just to set a breakpoint
- Add the following code inside application:didFinishLaunchingWithOptions:, along with setting a breakpoint on the registerForRemoteNotifications line:

```swift
UNUserNotificationCenter.current().requestAuthorization(options: [.badge, .alert, .sound]) { granted, _ in
    DispatchQueue.main.async {
        UIApplication.shared.registerForRemoteNotifications()
    }
}
```
More info: I built a very basic test app from the Xcode templates; it loads onto the iPhone, and Xcode attaches to it successfully and enables debugging. So there is something my app is doing that prevents Xcode from attaching to it. My app:

- supports CarPlay
- plays audio
- does not use storyboards, except for the Launch storyboard
- uses split view navigation

Any suggestions or thoughts would be welcome.
Topic: Developer Tools & Services > SubTopic: Xcode