Core MIDI


Communicate with MIDI (Musical Instrument Digital Interface) devices using Core MIDI.

Posts under Core MIDI tag (11 posts)

Exploring Secure Enclave–backed biometric authorization between macOS and iPhone using public APIs (FaceBridge prototype)
Hi everyone, I’ve been working on an experimental prototype called FaceBridge that explores whether Secure Enclave–backed biometric authorization can be delegated between macOS and iPhone using only public Apple APIs. The goal of the project was to better understand the architectural boundaries of cross-device trust and approval flows that resemble Apple’s built-in Touch ID / Continuity authorization experiences.

FaceBridge implements a local authorization pipeline where:
• macOS generates a signed authorization request
• the request is delivered to a trusted nearby iPhone over BLE / the Network framework
• the iPhone verifies the sender’s identity
• Face ID approval is requested using LocalAuthentication
• the iPhone signs the approval response using Secure Enclave–backed keys
• macOS validates the response and unlocks a protected action

Security properties currently implemented:
• Secure Enclave–backed signing identities per device
• cryptographic device pairing and trust persistence
• replay protection using nonce + timestamp binding
• structured authorization request/response envelopes
• signed responder identity verification
• trusted-device registry model
• local encrypted transport over BLE and local network

This is intentionally not attempting to intercept or replace system-level Touch ID dialogs (App Store installs, Keychain prompts, loginwindow, etc.); instead it explores what is possible within application-level authorization boundaries using public APIs only.

The project is open source: https://github.com/wesleysfavarin/facebridge
Technical architecture write-up: https://medium.com/@wesleysfavarin/facebridge

I’m particularly interested in feedback around:
• recommended Secure Enclave identity lifecycle patterns
• best practices for cross-device trust persistence
• LocalAuthentication usage in delegated approval scenarios
• whether similar authorization models are expected to become more formally supported across Apple platforms in the future

Thanks in advance for any guidance or suggestions.
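As a concrete reference point for the identity-lifecycle question, here is a minimal CryptoKit sketch of a Secure Enclave–backed signing identity with nonce + timestamp binding. The function name and payload layout are illustrative, not FaceBridge's actual envelope format, and it requires real Secure Enclave hardware:

    import CryptoKit
    import Foundation

    // Sketch: a Secure Enclave-backed signing identity, in the spirit of the
    // pipeline described above. The private key never leaves the enclave.
    func signAuthorizationRequest(_ request: Data) throws -> (signature: Data, publicKey: Data) {
        let key = try SecureEnclave.P256.Signing.PrivateKey()
        var payload = request
        payload.append(contentsOf: UUID().uuidString.utf8)                 // nonce
        payload.append(contentsOf: "\(Date().timeIntervalSince1970)".utf8) // timestamp
        let signature = try key.signature(for: payload)
        return (signature.derRepresentation, key.publicKey.derRepresentation)
    }

In a real exchange the verifier supplies the nonce and checks the timestamp window; generating both on the signing side, as here, only illustrates the payload binding.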
Replies: 1 • Boosts: 0 • Views: 62 • Activity: 1d
MusicSequenceSetUserCallback not called during playback on macOS despite successful registration
Hello, I am developing a macOS app using AudioToolbox's MusicSequence and MusicPlayer APIs to play Standard MIDI Files. The MIDI playback works correctly and I can hear sound from the external MIDI device. However, the user callback registered via MusicSequenceSetUserCallback is never invoked during playback.

Details:
• Callback registration returns no error.
• MusicPlayer is properly started and prerolled.
• The callback is defined as a global function with the correct @convention(c) signature.
• I have tried commenting out MusicTrackSetDestMIDIEndpoint to avoid known callback suppression issues.
• The clientData pointer is passed and correctly unwrapped in the callback.
• A minimal reproducible example shows the same behavior.

Environment: macOS version [Tahoe 26.2], Xcode version [26.2]

Is it expected that MusicSequenceSetUserCallback callbacks may not be called in some cases? Are there additional steps or configurations required to ensure the callback is triggered during MIDI playback? Thank you for any advice or pointers.

Execute playTest() in the viewDidLoad() method of the ViewController.

    extension ViewController {
        private func playTest() {
            NewMusicSequence(&sequence)
            if let midiFileURL = Bundle.main.url(forResource: "etude", withExtension: "mid") {
                MusicSequenceFileLoad(sequence!, midiFileURL as CFURL, .midiType, MusicSequenceLoadFlags())
                NewMusicPlayer(&player)
                MusicPlayerSetSequence(player!, sequence!)
                MusicPlayerPreroll(player!)
                let status = MusicSequenceSetUserCallback(sequence!, musicSequenceUserCallback, Unmanaged.passUnretained(self).toOpaque())
                if status == noErr {
                    print("Callback registered successfully")
                } else {
                    print("Callback registration failed: \(status)")
                }
                MusicPlayerStart(player!)
            } else {
                print("MIDI File Not Found")
            }
        }
    }

The callback function was generated by Xcode and defined outside the ViewController.

    func musicSequenceUserCallback(
        clientData: UnsafeMutableRawPointer?,
        sequence: MusicSequence,
        track: MusicTrack,
        eventTime: MusicTimeStamp,
        eventData: UnsafePointer<MusicEventUserData>,
        startSliceBeat: MusicTimeStamp,
        endSliceBeat: MusicTimeStamp
    ) {
        print("User callback fired at eventTime: \(eventTime)")
        if let clientData = clientData {
            let controller = Unmanaged<ViewController>.fromOpaque(clientData).takeUnretainedValue()
            // Example usage to prove round-trip works (avoid strong side effects in callback)
            _ = controller.view // touch to silence unused warning if needed
            print("Callback has access to ViewController: \(controller)")
        } else {
            print("clientData was nil")
        }
    }
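For context: the AudioToolbox documentation ties this callback to user events (kMusicEventType_User) stored in a track, which ordinary Standard MIDI Files do not contain, so a sequence loaded straight from a .mid file may never trigger it. A minimal sketch of adding one (the track index and beat position are arbitrary):

    // Sketch: give the sequence a user event for the callback to fire on.
    var track: MusicTrack?
    if MusicSequenceGetIndTrack(sequence!, 0, &track) == noErr, let track = track {
        var userData = MusicEventUserData(length: 1, data: 0) // 1-byte payload
        MusicTrackNewUserEvent(track, 4.0, &userData)          // fires around beat 4
    }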
Replies: 1 • Boosts: 0 • Views: 510 • Activity: Feb ’26
Unique identifier of a MIDI device
Hello, I need to know what the unique identifier of a MIDI device (source/destination) is. Important note: I want to get the same ID when a device is reconnected (unplugged and then plugged in again). The main candidate is the kMIDIPropertyUniqueID property, but I don't know whether it meets the requirement above. An additional question: is it always available for any endpoint? There is also the kMIDIPropertyDeviceID property. What about it? And one more option is just the MIDIEndpointRef returned by MIDIGetSource or MIDIGetDestination. So what is the proper way to get an ID that persists between device reconnections?
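A minimal sketch of reading kMIDIPropertyUniqueID from an endpoint; note that the MIDIEndpointRef value itself is generally not guaranteed stable across reconnects, which is part of what the question is probing:

    import CoreMIDI

    // Sketch: read the unique ID of a MIDI endpoint.
    func uniqueID(for endpoint: MIDIEndpointRef) -> MIDIUniqueID? {
        var id: MIDIUniqueID = 0
        let status = MIDIObjectGetIntegerProperty(endpoint, kMIDIPropertyUniqueID, &id)
        return status == noErr ? id : nil
    }

    let source = MIDIGetSource(0) // first source, for illustration
    if let id = uniqueID(for: source) {
        print("Unique ID: \(id)")
    }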
Replies: 0 • Boosts: 0 • Views: 80 • Activity: Jan ’26
sysEx struct in CoreMIDI/MIDIMessages.h
The sysEx struct in the MIDIUniversalMessage struct has a channel member but the System Exclusive (7-Bit) Message doesn't have a channel field. The System Exclusive (7-Bit) Message has a # of bytes field but the sysEx struct doesn't have a nrOfBytes, byteCount or bytesUsed member. It looks like the channel member of the sysEx struct contains the number of used bytes. Is this a mistake in the header or did I misunderstand something?
Replies: 1 • Boosts: 0 • Views: 587 • Activity: Dec ’25
CoreMIDI: neither syslog nor unified logging works.
Hi, macOS (latest macOS, latest HW, but doesn't matter) seems to prevent CoreMIDI driver logging with standard logging procedures (syslog, unified logging). The only chance to log something is writing to a file at one of the rare write-accessible locations for CoreMIDI. How is this supposed to work? Any hint is highly appreciated. Thanks!
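For what it's worth, a minimal unified-logging sketch (the subsystem/category names are made up for illustration):

    import os

    // Sketch: unified logging with illustrative names; whether midiserver-hosted
    // driver code is permitted to emit these is exactly the open question here.
    let log = Logger(subsystem: "com.example.mididriver", category: "driver")
    log.error("Send() received \(1024, privacy: .public) bytes")

Watching from Terminal with log stream --predicate 'subsystem == "com.example.mididriver"' while the driver runs can help distinguish messages being filtered from messages never being emitted; error-level messages are less likely to be dropped by default filtering than debug-level ones.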
Replies: 3 • Boosts: 0 • Views: 343 • Activity: Oct ’25
CoreMIDI driver - flow control
Hi, when a CoreMIDI driver controls physical HW it is probably quite common to have to limit the amount of MIDI data received from the system. What comes to mind is to just delay returning control from the MIDIDriverInterface::Send() callback to the calling process. While the application trying to send MIDI really does stall until the callback returns, this seems only to be a side effect of a generally stalled CoreMIDI server. Between the callbacks the application can send as much MIDI data as it wants to CoreMIDI; its buffering seems to be endless... However, the HW might not be able to play out all the data. There seems to be no way to indicate an overflow/full-buffer situation back to the application/CoreMIDI. How is this supposed to work? Thanks, any hints or pointers are highly appreciated! Hagen.
Replies: 0 • Boosts: 0 • Views: 259 • Activity: Oct ’25
Destroy MIDIUMPMutableEndpoint again?
Is there a way to destroy MIDIUMPMutableEndpoint again? In my app, the user has a setting to enable and disable MIDI 2.0. If MIDI 2.0 should not be supported (or if iOS version < 18), it creates a virtual destination and a virtual source. And if MIDI 2.0 should be enabled, it instead creates a MIDIUMPMutableEndpoint, which itself creates the virtual destination and source automatically. So here is my problem: I didn't find any way to destroy the MIDIUMPMutableEndpoint again. There is a method to disable it (setEnabled:NO), but that doesn't destroy or hide the virtual destination and source. So when the user turns MIDI 2.0 support off, I will have two virtual destinations and sources, and cannot get rid of the 2.0 ones. What is the correct way to get rid of the MIDIUMPMutableEndpoint once it is created?
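For contrast, a sketch of the MIDI 1.0 fallback path described above, where the virtual endpoints can be torn down explicitly with MIDIEndpointDispose (client and endpoint names are illustrative); whether MIDIUMPMutableEndpoint has an equivalent is exactly the open question:

    import CoreMIDI

    // Sketch: the MIDI 1.0 path, where virtual endpoints are explicitly disposable.
    var client = MIDIClientRef()
    MIDIClientCreateWithBlock("MyApp" as CFString, &client, nil)

    var virtualSource = MIDIEndpointRef()
    var virtualDestination = MIDIEndpointRef()
    MIDISourceCreateWithProtocol(client, "MyApp Out" as CFString, ._1_0, &virtualSource)
    MIDIDestinationCreateWithProtocol(client, "MyApp In" as CFString, ._1_0, &virtualDestination) { _, _ in
        // handle the incoming MIDIEventList here
    }

    // When the user flips the setting off, these can be torn down explicitly:
    MIDIEndpointDispose(virtualSource)
    MIDIEndpointDispose(virtualDestination)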
Replies: 0 • Boosts: 0 • Views: 171 • Activity: Sep ’25
USB audio/midi interface with OS 26 Tahoe
The first beta install stopped recognizing a Yamaha USB audio interface. The next two installs brought the recognition back. The latest beta install doesn't fully recognize the interface: the MIDI implementation works, but the audio implementation isn't recognized in Audio MIDI Setup. This might be something to look at in the next beta release, or something audio interface vendors should look at in their next driver release.
Replies: 3 • Boosts: 0 • Views: 559 • Activity: Sep ’25
MIDI output from Standalone MIDI Processor Demo App to DAW
I am trying to get MIDI output from the AU Host demo app using the recent MIDI processor example. The processor works correctly in Logic Pro, but I cannot send MIDI from the AUv3 extension in standalone mode using the default host app to another program (e.g., Ableton). The MIDI manager, which is part of the standalone host app, works fine, and I can send MIDI using it directly; Ableton receives it without issues.

I have already set the midiOutputNames in the extension, and the midiOutBlock is mapped. However, the MIDI data from the AUv3 extension does not reach Ableton in standalone mode. I suspect the issue is that midiOutBlock might never be called in the plugin, or perhaps an input to the plugin is missing, which prevents it from sending MIDI. I am currently using the default routing. I have modified the MIDI manager such that it works well as described above.

Here is part of my code for SimplePlayEngine.swift and MIDIManager.swift for reference:

    @MainActor @Observable public class SimplePlayEngine {
        private let midiOutBlock: AUMIDIOutputEventBlock = { sampleTime, cable, length, data in
            return noErr
        }

        var scheduleMIDIEventListBlock: AUMIDIEventListBlock? = nil

        public init() {
            engine.attach(player)
            engine.prepare()
            setupMIDI()
        }

        private func setupMIDI() {
            if !MIDIManager.shared.setupPort(midiProtocol: MIDIProtocolID._2_0, receiveBlock: { [weak self] eventList, _ in
                if let scheduleMIDIEventListBlock = self?.scheduleMIDIEventListBlock {
                    _ = scheduleMIDIEventListBlock(AUEventSampleTimeImmediate, 0, eventList)
                }
            }) {
                fatalError("Failed to setup Core MIDI")
            }
        }

        func initComponent(type: String, subType: String, manufacturer: String) async -> ViewController? {
            reset()
            guard let component = AVAudioUnit.findComponent(type: type, subType: subType, manufacturer: manufacturer) else {
                fatalError("Failed to find component with type: \(type), subtype: \(subType), manufacturer: \(manufacturer)")
            }
            do {
                let audioUnit = try await AVAudioUnit.instantiate(
                    with: component.audioComponentDescription,
                    options: AudioComponentInstantiationOptions.loadOutOfProcess)
                self.avAudioUnit = audioUnit
                self.connect(avAudioUnit: audioUnit)
                return await audioUnit.loadAudioUnitViewController()
            } catch {
                return nil
            }
        }

        private func startPlayingInternal() {
            guard let avAudioUnit = self.avAudioUnit else { return }
            setSessionActive(true)
            if avAudioUnit.wantsAudioInput { scheduleEffectLoop() }
            let hardwareFormat = engine.outputNode.outputFormat(forBus: 0)
            engine.connect(engine.mainMixerNode, to: engine.outputNode, format: hardwareFormat)
            do {
                try engine.start()
            } catch {
                isPlaying = false
                fatalError("Could not start engine. error: \(error).")
            }
            if avAudioUnit.wantsAudioInput { player.play() }
            isPlaying = true
        }

        private func resetAudioLoop() {
            guard let avAudioUnit = self.avAudioUnit else { return }
            if avAudioUnit.wantsAudioInput {
                guard let format = file?.processingFormat else { fatalError("No AVAudioFile defined.") }
                engine.connect(player, to: engine.mainMixerNode, format: format)
            }
        }

        public func connect(avAudioUnit: AVAudioUnit?, completion: @escaping (() -> Void) = {}) {
            guard let avAudioUnit = self.avAudioUnit else { return }
            engine.disconnectNodeInput(engine.mainMixerNode)
            resetAudioLoop()
            engine.detach(avAudioUnit)

            func rewiringComplete() {
                scheduleMIDIEventListBlock = auAudioUnit.scheduleMIDIEventListBlock
                if isPlaying { player.play() }
                completion()
            }

            let hardwareFormat = engine.outputNode.outputFormat(forBus: 0)
            engine.connect(engine.mainMixerNode, to: engine.outputNode, format: hardwareFormat)
            if isPlaying { player.pause() }

            let auAudioUnit = avAudioUnit.auAudioUnit
            if !auAudioUnit.midiOutputNames.isEmpty {
                auAudioUnit.midiOutputEventBlock = midiOutBlock
            }

            engine.attach(avAudioUnit)
            if avAudioUnit.wantsAudioInput {
                engine.disconnectNodeInput(engine.mainMixerNode)
                if let format = file?.processingFormat {
                    engine.connect(player, to: avAudioUnit, format: format)
                    engine.connect(avAudioUnit, to: engine.mainMixerNode, format: format)
                }
            } else {
                let stereoFormat = AVAudioFormat(standardFormatWithSampleRate: hardwareFormat.sampleRate, channels: 2)
                engine.connect(avAudioUnit, to: engine.mainMixerNode, format: stereoFormat)
            }
            rewiringComplete()
        }
    }

and my MIDI Manager:

    @MainActor class MIDIManager: Identifiable, ObservableObject {
        func setupPort(midiProtocol: MIDIProtocolID, receiveBlock: @escaping @Sendable MIDIReceiveBlock) -> Bool {
            guard setupClient() else { return false }
            if MIDIInputPortCreateWithProtocol(client, portName, midiProtocol, &port, receiveBlock) != noErr {
                return false
            }
            for source in self.sources {
                if MIDIPortConnectSource(port, source, nil) != noErr {
                    print("Failed to connect to source \(source)")
                    return false
                }
            }
            setupVirtualMIDIOutput()
            return true
        }

        private func setupVirtualMIDIOutput() {
            let virtualStatus = MIDISourceCreate(client, virtualSourceName, &virtualSource)
            if virtualStatus != noErr {
                print("❌ Failed to create virtual MIDI source: \(virtualStatus)")
            } else {
                print("✅ Created virtual MIDI source: \(virtualSourceName)")
            }
        }

        func sendMIDIData(_ data: [UInt8]) {
            var packetList = MIDIPacketList()
            withUnsafeMutablePointer(to: &packetList) { ptr in
                let pkt = MIDIPacketListInit(ptr)
                _ = MIDIPacketListAdd(ptr, 1024, pkt, 0, data.count, data)
                if virtualSource != 0 {
                    let status = MIDIReceived(virtualSource, ptr)
                    if status != noErr {
                        print("❌ Failed to send MIDI data: \(status)")
                    } else {
                        print("✅ Sent MIDI data: \(data)")
                    }
                }
            }
        }
    }
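One detail that stands out in the snippet above: the midiOutBlock returns noErr without doing anything with the bytes it receives, so even when the AU does call it, nothing reaches the virtual source. A hedged sketch of a forwarding block, assuming MIDIManager exposes the virtualSource endpoint it created:

    // Sketch: forward the AU's MIDI 1.0 output to the virtual source so DAWs
    // can see it. Assumes MIDIManager.shared exposes its MIDIEndpointRef; real
    // code must also mind that this block runs off the main actor.
    let forwardingMIDIOutBlock: AUMIDIOutputEventBlock = { _, _, length, data in
        var packetList = MIDIPacketList()
        return withUnsafeMutablePointer(to: &packetList) { ptr in
            let packet = MIDIPacketListInit(ptr)
            _ = MIDIPacketListAdd(ptr, MemoryLayout<MIDIPacketList>.size, packet, 0, length, data)
            return MIDIReceived(MIDIManager.shared.virtualSource, ptr)
        }
    }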
Replies: 0 • Boosts: 0 • Views: 524 • Activity: Aug ’25
Creating RTP-MIDI Sessions via MIDINetworkSession C API (dlopen/dlsym) on macOS 15?
I’m an amateur developer working on a free utility for composers/producers, for which the macOS release needs to create and name RTP-MIDI sessions in Audio MIDI Setup from the command line (so I can ship a small C helper instead of telling users to click through the UI). Here’s what I’ve tried so far, without luck:
• Plist hacks: Injecting entries into ~/Library/Audio/MIDI Configurations/*.mcfg works when AMS is closed, but AMS immediately locks and reverts my changes when it’s open.
• CoreMIDI C API: I can create virtual ports with MIDISourceCreate, but attempting MIDIObjectGetDataProperty on the apple.midirtp.session plugin always returns err –10836.
• Obj-C & Swift: Loading MIDINetworkSession and calling defaultSession, init, setNetworkName: and setting enabled = YES doesn’t produce a new session object in the Network panel.
• dlopen/dlsym: I extracted the real CoreMIDI binary out of the dyld shared cache and tried binding _MIDINetworkSessionCreate, _SetName, _SetEnabled, etc., but all the symbols come back null or my tool segfaults.
• Plugin registration: I’ve pulled the factory UUID (70C9C5EA-7C65-11D8-B317-000393A34B5A) from /System/Library/Extensions/AppleMIDIRTPDriver.plugin/Contents/Info.plist and called CFPlugInRegisterFactories, but it still never exposes the session-creation calls.
At this point I’m convinced I’m either loading the wrong binary or missing one critical step in registering the RTP-MIDI plugin’s private API. Can anyone point me to:
• The exact path of the dylib or bundle that actually exports the MIDINetworkSessionCreate/MIDINetworkSessionSetName/MIDINetworkSessionSetEnabled symbols?
• A minimal working snippet (C or Obj-C) that reliably creates and names a Network-MIDI session?
Any pointers, sample code, or even ideas about where Apple hides this functionality on macOS 15 would be hugely appreciated. Thanks!
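For reference, a sketch of the public MIDINetworkSession surface as it behaves on iOS / Mac Catalyst; whether any of it functions on plain macOS 15 is exactly what the post found lacking:

    import CoreMIDI

    // Sketch: the public network-session API on iOS / Mac Catalyst.
    let session = MIDINetworkSession.default()
    session.isEnabled = true
    session.connectionPolicy = .anyone // advertise and accept any connection
    print("RTP-MIDI session \(session.networkName) on port \(session.networkPort)")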
Replies: 0 • Boosts: 1 • Views: 217 • Activity: Jun ’25
Making music….
Maybe a bit of a simplistic question, but…. Best ways to create music - programmatically - within an app? I have a few small, simple games that I would like to add some background music to. I’ve fiddled with AVAudio, SKAudio, and now AudioKit. I’m just not quite finding the holy grail of music generation. I don’t know if it’s more technique or tool. So I thought I’d hit up the pool of minds here for some suggestions…?
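One road that tends to come up for this (a sketch, not a recommendation over AudioKit): AVAudioEngine driving an AVAudioUnitSampler, which can also be fed from a MusicSequence for structured playback:

    import AVFoundation

    // Sketch: minimal programmatic note playback with AVAudioEngine + AVAudioUnitSampler.
    let engine = AVAudioEngine()
    let sampler = AVAudioUnitSampler()
    engine.attach(sampler)
    engine.connect(sampler, to: engine.mainMixerNode, format: nil)
    try engine.start() // sketch; handle the error properly in an app

    // A C-major arpeggio; load a SoundFont via loadSoundBankInstrument(at:program:bankMSB:bankLSB:)
    // for real timbres, otherwise the sampler falls back to a plain default voice.
    for (i, note) in [60, 64, 67, 72].enumerated() {
        DispatchQueue.main.asyncAfter(deadline: .now() + 0.25 * Double(i)) {
            sampler.startNote(UInt8(note), withVelocity: 80, onChannel: 0)
        }
    }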
Replies: 0 • Boosts: 0 • Views: 141 • Activity: Jun ’25