An object- and coroutine-based abstraction over the Java MIDI API, built on Kotlin coroutines & channels.
Note: this is still in an experimental stage, expect breaking changes until the first stable release. This README reflects the current development branch, not the released version.
- Type-safe MIDI events
- MIDI as consumable channels with Coroutines
- Minimal code required for listening
- Easy input & output
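The "type-safe MIDI events" idea can be sketched as a sealed hierarchy; the names below are illustrative only, not midifunk's actual API (the library's own types include `MidiData` and `NoteEvent`). Instead of decoding raw status bytes, consumers match exhaustively on event types:

```kotlin
// Hypothetical sketch of a type-safe event hierarchy: each raw MIDI status
// byte maps to a dedicated event type.
sealed class Event
data class NoteOn(val note: Int, val velocity: Int) : Event()
data class NoteOff(val note: Int) : Event()
data class ControlChange(val controller: Int, val value: Int) : Event()

// The compiler enforces that every event kind is handled.
fun describe(e: Event): String = when (e) {
    is NoteOn -> "note ${e.note} down at ${e.velocity}"
    is NoteOff -> "note ${e.note} up"
    is ControlChange -> "cc ${e.controller} = ${e.value}"
}

fun main() {
    println(describe(NoteOn(60, 127)))
}
```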
| Platform | Compatibility |
|---|---|
| JVM | ✅ |
| Windows | ✖️ (waiting on Windows MIDI Services) |
| Linux | ✖️ (planned after Windows) |
| macOS | ✖️ (no test device available) |
The version currently available on Maven Central is outdated; this README documents the upcoming release.
Gradle (Kotlin DSL):

```kotlin
implementation("dev.stashy.midifunk", "midifunk", "x.x.x")
```
```kotlin
val device = MidiDevice.list()[0]
device.input.open(coroutineScope).onEach { println(it) }
```
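For comparison, the raw `javax.sound.midi` pattern that this channel-based API wraps is callback-based: you attach a `Receiver` to a device's `Transmitter`. A minimal sketch of that pattern, driven here by sending a message to the receiver directly so no hardware is needed:

```kotlin
import javax.sound.midi.MidiMessage
import javax.sound.midi.Receiver
import javax.sound.midi.ShortMessage

// A Receiver that collects incoming messages; with real hardware you would
// attach it to a device's Transmitter via transmitter.setReceiver(...).
class CollectingReceiver : Receiver {
    val messages = mutableListOf<MidiMessage>()
    override fun send(message: MidiMessage, timeStamp: Long) {
        messages += message
    }
    override fun close() {}
}

fun main() {
    val collector = CollectingReceiver()
    // Simulate an incoming note-on (channel 0, middle C, full velocity).
    collector.send(ShortMessage(ShortMessage.NOTE_ON, 0, 60, 127), -1)
    println(collector.messages.size)
}
```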
```kotlin
val device = MidiDevice.list()[0]
val channel: SendChannel<MidiData> = device.output.open(coroutineScope)
channel.send(event)
```
```kotlin
NoteEvent.create {
    noteStatus = true
    note = Note.C(4)
    velocity = 127u
}
```
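For comparison, the equivalent untyped message in the raw Java MIDI API packs the same information into status and data bytes. Assuming the common convention where C4 (middle C) is MIDI note number 60:

```kotlin
import javax.sound.midi.ShortMessage

fun main() {
    // Note-on, channel 0, note number 60 (C4), velocity 127.
    val msg = ShortMessage(ShortMessage.NOTE_ON, 0, 60, 127)
    println(msg.command) // 144 (0x90, note-on)
    println(msg.data1)   // 60
    println(msg.data2)   // 127
}
```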
The top priority at the moment is to finalize the device input/output API. The eventual goal is to replace the Java MIDI backend entirely, which will also enable multiplatform support.
After the device API, virtual device support will be prioritized. This will most likely require the per-platform backends to be implemented first.
Although not strictly required, please try to adhere to the standard Kotlin coding conventions, as enforced by JetBrains IDEs. Other than that, feel free to submit pull requests; I will gladly review them.