Core MIDI


Communicate with MIDI (Musical Instrument Digital Interface) devices using Core MIDI.

Core MIDI Documentation

Posts under Core MIDI tag

10 Posts
Post not yet marked as solved · 5 Replies · 207 Views
Hello, I have received 3 almost identical crash reports from the App Store. They all come from the same user, and they are spaced just ±45 seconds apart. This is the backtrace of the crashed thread:

Crashed Thread: 3
Exception Type: EXC_CRASH (SIGABRT)
Exception Codes: 0x0000000000000000, 0x0000000000000000
Termination Reason: Namespace SIGNAL, Code 6 Abort trap: 6
Terminating Process: Ssssssss [46033]

Thread 3 Crashed:
0 libsystem_kernel.dylib 0x00007ff81b90f196 __pthread_kill + 10
1 libsystem_pthread.dylib 0x00007ff81b946ee6 pthread_kill + 263 (pthread.c:1670)
2 libsystem_c.dylib 0x00007ff81b86dbdf __abort + 139 (abort.c:155)
3 libsystem_c.dylib 0x00007ff81b86db54 abort + 138 (abort.c:126)
4 libc++abi.dylib 0x00007ff81b901282 abort_message + 241
5 libc++abi.dylib 0x00007ff81b8f33fb demangling_terminate_handler() + 267
6 libobjc.A.dylib 0x00007ff81b5c67ca _objc_terminate() + 96 (objc-exception.mm:498)
7 libc++abi.dylib 0x00007ff81b9006db std::__terminate(void (*)()) + 6
8 libc++abi.dylib 0x00007ff81b900696 std::terminate() + 54
9 libdispatch.dylib 0x00007ff81b7a6047 _dispatch_client_callout + 28 (object.m:563)
10 libdispatch.dylib 0x00007ff81b7a87c4 _dispatch_queue_override_invoke + 800 (queue.c:4882)
11 libdispatch.dylib 0x00007ff81b7b5fa2 _dispatch_root_queue_drain + 343 (queue.c:7051)
12 libdispatch.dylib 0x00007ff81b7b6768 _dispatch_worker_thread2 + 170 (queue.c:7119)
13 libsystem_pthread.dylib 0x00007ff81b943c0f _pthread_wqthread + 257 (pthread.c:2631)
14 libsystem_pthread.dylib 0x00007ff81b942bbf start_wqthread + 15 (:-1)

In the backtrace of the main thread, I can see that the error is caught by the app delegate, which tries to display an alert, but obviously the message has no time to appear. Incidentally (though it's not my main question), I would like to know whether it would be possible in such a case to block the background thread for as long as the alert is displayed (e.g. using a dispatch queue)? ...
(many other related lines)
72 SSUuuuIIIIIIIIIUUuuuu 0x000000010e8089f2 +[NSAlert(Ssssssss) fatalError:] + 32
73 Ssssssss 0x000000010dd5e75b __universalExceptionHandler_block_invoke (in Ssssssss) (SQAppDelegate.m:421) + 333659
74 libdispatch.dylib 0x00007ff81b7a4d91 _dispatch_call_block_and_release + 12 (init.c:1518)
75 libdispatch.dylib 0x00007ff81b7a6033 _dispatch_client_callout + 8 (object.m:560)
76 libdispatch.dylib 0x00007ff81b7b2fcf _dispatch_main_queue_drain + 954 (queue.c:7794)
77 libdispatch.dylib 0x00007ff81b7b2c07 _dispatch_main_queue_callback_4CF + 31 (queue.c:7954)
78 CoreFoundation 0x00007ff81ba62195 __CFRUNLOOP_IS_SERVICING_THE_MAIN_DISPATCH_QUEUE__ + 9 (CFRunLoop.c:1780)
79 CoreFoundation 0x00007ff81ba21ebf __CFRunLoopRun + 2452 (CFRunLoop.c:3147)
80 CoreFoundation 0x00007ff81ba20ec1 CFRunLoopRunSpecific + 560 (CFRunLoop.c:3418)
81 HIToolbox 0x00007ff8254a5f3d RunCurrentEventLoopInMode + 292 (EventLoop.c:455)
82 HIToolbox 0x00007ff8254a5d4e ReceiveNextEventCommon + 657 (EventBlocking.c:384)
83 HIToolbox 0x00007ff8254a5aa8 _BlockUntilNextEventMatchingListInModeWithFilter + 64 (EventBlocking.c:171)
84 AppKit 0x00007ff81eabfb18 _DPSNextEvent + 858 (CGDPSReplacement.m:818)
85 AppKit 0x00007ff81eabe9c2 -[NSApplication(NSEvent) _nextEventMatchingEventMask:untilDate:inMode:dequeue:] + 1214 (appEventRouting.m:407)
86 AppKit 0x00007ff81eab1037 -[NSApplication run] + 586 (NSApplication.m:3432)
87 AppKit 0x00007ff81ea85251 NSApplicationMain + 817 (NSApplication.m:9427)
88 dyld 0x00007ff81b5ec41f start + 1903 (dyldMain.cpp:1165)

In all 3 reports, another thread indicates that it is sending or receiving MIDI data (it is exactly the same backtrace in all 3 reports):

Thread 1:
0 libsystem_kernel.dylib 0x00007ff81b908552 mach_msg2_trap + 10
1 libsystem_kernel.dylib 0x00007ff81b9166cd mach_msg2_internal + 78 (mach_msg.c:201)
2 libsystem_kernel.dylib 0x00007ff81b90f584 mach_msg_overwrite + 692 (mach_msg.c:0)
3 libsystem_kernel.dylib 0x00007ff81b90883a mach_msg + 19 (mach_msg.c:323)
4 CoreMIDI 0x00007ff834adfd50 XServerMachPort::ReceiveMessage(int&, void*, int&) + 94 (XMachPort.cpp:62)
5 CoreMIDI 0x00007ff834b118c5 MIDIProcess::MIDIInPortThread::Run() + 105 (MIDIClientLib.cpp:204)
6 CoreMIDI 0x00007ff834af9c44 CADeprecated::XThread::RunHelper(void*) + 10 (XThread.cpp:21)
7 CoreMIDI 0x00007ff834afae9f CADeprecated::CAPThread::Entry(CADeprecated::CAPThread*) + 77 (CAPThread.cpp:324)
8 libsystem_pthread.dylib 0x00007ff81b9471d3 _pthread_start + 125 (pthread.c:893)
9 libsystem_pthread.dylib 0x00007ff81b942bd3 thread_start + 15 (:-1)

I wonder if this MIDI activity may be related to the crash, even though it doesn't occur in the crashed thread. I also wonder whether the dispatch block at the start of the backtrace of thread 3 is the same one that leads to thread 1 (to open the NSAlert), which would mean that it runs after the error, or whether it is a dispatch block inside which the error occurs. Finally (attached: 2024-03-25_13-04-40.6314.crash), I wonder whether std::terminate() could indicate a problem in my C++ code, since part of the application is written in C++, and that would be a clue. More generally, is there anything in these backtraces that can help me find the problem? Any help greatly appreciated, thanks! -dp
Posted by -dp.
Post not yet marked as solved · 1 Reply · 370 Views
Hello everyone, I'm relatively new to iOS development, and I'm currently working on a Flutter plugin package. I want to use the AVFAudio package to load instrument sounds from an SF2 file into different channels. Specifically, I'd like to load individual instruments from the SF2 file onto separate channels. However, I've been struggling to find a way to achieve this. Could someone guide me on how to load SF2 instrument sounds into different channels using AVFAudio? I've tried various combinations of parameters (program number, soundbank MSB, and soundbank LSB), but none seem to work. If anyone has experience with AVFAudio and SF2 files, I'd greatly appreciate your help. Perhaps there's a proven approach or a way to determine the correct values for these parameters? Should I use a soundfont editor to inspect specific values within the SF2 file? Thank you in advance for any assistance! Best regards, Melih
Post not yet marked as solved · 0 Replies · 345 Views
Developing for iPhone/iPad/Mac. I have an idea for a music training app, but I need to know of supporting libraries for recognizing a musical note's fundamental frequency in close to real time (about 100 ms delay). Accuracy should be within a few cents (hundredths of a semitone). A search for "music" turned up the Core MIDI library -- fine if I want to take input from MIDI, but I want to be open to audio input too. And I found MusicKit, which seems to be a programmer's API for digging into the Apple Music catalog. Meta questions: Should I be using different search terms? Where are libraries listed? Who are the big names in third-party libraries?
Post not yet marked as solved · 0 Replies · 321 Views
I am writing a MIDI polyfill to bridge Core MIDI with Safari. This, in itself, is not a problem. The problem is that the Web Extension gets suspended the moment it's no longer actively called. This means that no callbacks triggered by MIDI changes from Core MIDI can be passed back to the web page. If I use a setInterval call in background.js, this keeps the extension alive somewhat, but the setInterval self-ping will eventually get aborted, making the extension suspend itself. I know of a fairly contrived workaround using the container application over XPC, but I am hoping there is a way to keep the Web Extension alive, or at least keep a thread alive in the same process as the Web Extension, or any such workaround. Setting background to "persistent": true does not seem to make any difference.
Posted by lerno.
Post not yet marked as solved · 0 Replies · 499 Views
Hi there, We're developing a product which has a BLE module that advertises itself as a BLE MIDI device. The goal for our iOS app is to have the phone auto-connect to the device it has already bonded with. Exactly like headphones: bond one time, and every time the headphones turn on, the phone automatically pairs/connects to them. At the moment, a new connection is required every time the device turns on and advertises. I've read on the Apple BLE documentation page that from iOS 16 or later "the system automatically reconnects Bluetooth Low Energy (BLE) MIDI peripherals when powered on, if the device supports pairing. Previously, it was necessary to use Audio MIDI Setup to establish BLE MIDI connections." (https://developer.apple.com/documentation/coremidi/midi_bluetooth/) However, neither our iPhones running iOS 16+ nor our macOS 13+ devices reconnect to the BLE MIDI device. How can I achieve this? As per the official BLE documentation, pairing is initiated by the central device (smartphone etc.), and the peripheral (the BLE MIDI device) should simply store the MAC address plus security information of the central device it is currently bonded with.
Post not yet marked as solved · 0 Replies · 457 Views
In previous iOS versions, if a MIDI device was plugged in, removed, and plugged in again, the unique IDs of its source and destination endpoints (and also of the device itself and its entities) would persist. So there would be a msgObjectAdded for a source, e.g. "Port 1", UID 123456, then a msgObjectRemoved for the same port, then, when plugged in again, another msgObjectAdded for "Port 1", UID 123456, i.e. it is the "same" port that was added, removed, and added again. And it would be on the "same" entity, which in turn would be on the "same" device. Now in iOS 17 the unique IDs do not persist. The second msgObjectAdded will come back with something like "Port 1", UID -438484, on an entity and a device that also have new, changed UIDs. Is this an intentional change? It seems odd that while the device is unplugged I can still find out all about it (name, entities, sources, destinations, all as before, except the device is now "offline"), yet when it is plugged back in I get brand-new versions of all that stuff.
Posted by sjenkins.
Post not yet marked as solved · 3 Replies · 926 Views
I've got an app that communicates with MIDI devices. This app worked fine for years, but since macOS 13.3 and iOS 16, users have complained about strange behaviors. After days of debugging I found out that the USB MIDI packets sent are sometimes invalid (wrapped in SysEx headers). Here is the extract used to send a single MIDI message:

UInt8 mBuffers[100];
MIDIPacketList* packetList = (MIDIPacketList*) mBuffers;
MIDIPacket* packet = MIDIPacketListInit(packetList);
MIDIPacketListAdd(packetList, sizeof(mBuffers), packet, 0, messageLength, (Byte *) messageData);

printf("Sending %i bytes ", messageLength);
for (int i = 0; i < messageLength; i++) {
    if (i > 0) printf(":");
    printf("%02X", (Byte)messageData[i]);
}
printf("\n");
fflush(stdout);

status = MIDISend(outputPortReference, endPointReference, packetList);

Here is the output of the program when the issue occurs:

Sending 3 bytes E0:00:00
Sending 3 bytes D0:00:00
Sending 3 bytes E1:00:00
Sending 3 bytes D0:1B:00
Sending 3 bytes E2:00:00
Sending 3 bytes D0:2B:00
Sending 3 bytes E3:00:00
Sending 3 bytes D0:3B:00
Sending 3 bytes D0:4A:00
Sending 3 bytes D0:5A:00
Sending 3 bytes D0:6A:00
Sending 3 bytes D0:79:00

The data rate is relatively high (roughly one message every 100 ms). However, when looking at the USB stack, the following data gets sent:

0e e0 00 00 -> Match
0d d0 00 00 -> Match
04 00 e1 00 -> WRONG - wrapped in SysEx
04 00 d0 1b -> WRONG - wrapped in SysEx
0f 00 00 00 -> Padding
0e e2 00 00 -> Match
0d d0 2b 00 -> Match
04 00 e3 00 -> WRONG - wrapped in SysEx
04 00 d0 3b -> WRONG - wrapped in SysEx
04 00 d0 4a -> WRONG - wrapped in SysEx
0f 00 00 00 -> Padding
0d d0 5a 00 -> Match
0f 00 00 00 -> Padding
0d d0 6a 00 -> Match
04 00 d0 79 -> WRONG - wrapped in SysEx
0f 00 00 00 -> Padding

I can no longer explain why this is happening and am looking for advice. Since I can't debug at that level on iOS, I can only guess that the root cause is the same there. Does anyone know about such an issue? Any ideas how to debug further?
Post not yet marked as solved · 1 Reply · 1.3k Views
I am trying to implement some data transfer mechanisms using Core MIDI, but am having a lot of difficulty sending UMPs with message type 5 (128-bit System Exclusive 8). I've been able to send various voice messages like note-on/off as well as System Exclusive 7 (UMP type 3) using both MIDISendSysex(_:) and MIDISendEventList(_:_:_:), but nothing seems to get through if I send a 128-bit message. If I send it to a MIDI 2.0 virtual destination directly, it works fine, but when I send it to the IAC driver bus or my MIDI-USB device, nothing gets through. I have scoured the internet and the Core MIDI docs looking for a solution, or even just some more information, but have found nothing. The messages themselves are formatted correctly according to the MIDI 2.0 specs. Is there something I'm missing? Any help is greatly appreciated.
Posted by hparish.
Post not yet marked as solved · 0 Replies · 765 Views
I'm having a problem understanding the MIDIEventPacket structure. An event packet can be represented as a timeStamp plus words: [word, word, ...]. Each word is a chunk of a message, and a MIDI message can consist of 1 or more words. In the documentation, an event packet is defined as "a series of simultaneous MIDI events". This would mean that words within a packet can be part of different events, i.e. given that words[0] is msgs[0].words[0], words[1] is either msgs[0].words[1] OR msgs[1].words[0]. BUT in the definition of the MIDIEventListAdd function, its words parameter is defined as "the new event, which may be a single MIDI event or a partial SysEx event", which would imply that all words within a packet constitute either a SysEx message or only one MIDI event. Unless this means that MIDIEventListAdd will add different events into the same packet if the timestamp provided is the same. The question basically boils down to this: if packets can contain multiple messages, there seems to be no explicit marker for the first and last words of each contained message. Am I correct that, unless the message is tagged as SysEx, I have to check the message-type tag of words[0] to determine the expected word count of msg[0]?
Posted by plaukiu.
Post not yet marked as solved · 0 Replies · 606 Views
Virtual MIDI ports created with Apple's Core MIDI framework are not listed in the MIDI Studio window of Apple's Audio MIDI Setup, unlike the ports of hardware and Bluetooth MIDI devices. Our Connect software (https://icubex.com/connect) creates virtual MIDI ports, using Core MIDI calls, that are accessible to many audio/MIDI programs including GarageBand, but these virtual MIDI ports do not appear in, and cannot be added to, MIDI Studio. Some programs can communicate with the MIDI ports listed by MIDI Studio yet don't recognize the virtual MIDI ports created by our Connect software, so they cannot communicate directly with our software. It's then necessary to use a program like MidiPipe (http://www.subtlesoft.square7.net/MidiPipe.html) to forward data between the virtual MIDI port and the IAC driver. How can we bypass the MidiPipe workaround and get our virtual MIDI ports recognized and listed by MIDI Studio like those of hardware and Bluetooth MIDI devices?
Posted by I-CubeX.