ev3dev sound - support for MIDI in ALSA? #726
I think I might be pushing the limits of the EV3, but I'm trying a laser harp. I can't play anything myself, but my wife has some music theory background and plays piano. I gave her a MIDI keyboard some years ago and she used it to generate scores on an UbuntuStudio computer; it was her first contact with Linux and my first contact with the Linux sound system.
ev3dev kernel: Linux ev3dev 4.4.17-14-ev3dev-ev3
I already have 6 EV3 color sensors (and a MindSensors MUX, with another one on the way for a 7th sensor and an ultrasonic sensor to read the distance of the hand).
I'm using python.
I can already read the 6 sensors in Python. I can play tones (with Sound.tone), but of course it doesn't sound like a harp. So I tried Sound.play with some harp samples I found on the net. Sounds much better, but the samples are too long, so the "artist" has to wait for the end of each note before moving the hand to another "string". And it doesn't allow multiple notes at the same time.
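A minimal sketch of the tone approach, mapping each "string" to a note (the C-major scale choice and the ev3dev call are my assumptions; the MIDI-note-to-frequency conversion is the standard equal-temperament formula):

```python
# Map each of the 6 "strings" to a note and compute the frequency that
# Sound.tone() needs. The scale below is just an example choice.
C_MAJOR_MIDI = [60, 62, 64, 65, 67, 69]  # C4 D4 E4 F4 G4 A4


def midi_to_hz(note):
    """Equal-temperament conversion (A4 = MIDI note 69 = 440 Hz)."""
    return 440.0 * 2 ** ((note - 69) / 12.0)


def pluck(string_index, duration_ms=500):
    freq = midi_to_hz(C_MAJOR_MIDI[string_index])
    # On the brick (hypothetical wiring, runs only on ev3dev):
    # from ev3dev.ev3 import Sound
    # Sound.tone([(freq, duration_ms, 0)])
    return freq
```

The beeper is monophonic either way, so this only helps with pitch mapping, not with the polyphony problem.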
So I went for MIDI: it has polyphony, there are some Python libraries, and since it uses soundfonts, I installed timidity++ (a software MIDI synth). But it doesn't work; it seems to need a module, snd_seq, that doesn't exist.
Then I went for fluidsynth and mingus, a Python library that works with fluidsynth and ALSA. The soundfont that comes with fluidsynth is too big to load into memory, but I found two small soundfonts that load fine. On Ubuntu I can play some nice notes, even several at the same time. But on ev3dev, when mingus initializes fluidsynth, I immediately get a clicking noise (even before playing a note).
I suspect fluidsynth also needs the ALSA snd_seq module, but I found no evidence.
My "Plan B" is using the BrickPi, since it supports the EV3 color sensor (though I'm not sure it also supports the MindSensors MUX).
"Plan C" is much more drastic - use the EV3 as a MIDI instrument and send the MIDI commands to a Raspberry Pi running timidity++ or fluidsynth.
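The raw MIDI messages for "Plan C" are simple enough to build by hand: a note-on is three bytes (status 0x90 OR'd with the channel, then note number, then velocity). This sketch only constructs the bytes; how they reach the Raspberry Pi (serial, network socket) is left open:

```python
# Build standard 3-byte MIDI channel messages by hand.
def note_on(note, velocity=64, channel=0):
    # 0x90 = note-on status; data bytes are 7-bit, hence the masks.
    return bytes([0x90 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])


def note_off(note, channel=0):
    # 0x80 = note-off status; velocity 0 is conventional for a plain release.
    return bytes([0x80 | (channel & 0x0F), note & 0x7F, 0])
```

Any synth on the receiving end (timidity++ or fluidsynth) will understand these messages once they are delivered to its MIDI input.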
Update: fluidsynth seems to need snd_seq_hw
If I start just fluidsynth, it also makes clicking noises, but I get this output:
That's great, thanks!!
I also discovered that ALSA has "aseqnet" and "aconnect" commands that allow a client to send MIDI to a host, but they also require snd_seq, so every alternative I find ends at ALSA MIDI support.
If it doesn't work I can always use MQTT or some other network protocol to send the sensor values to my laptop or to a Raspberry Pi and play the music there. Even with some latency it will probably sound good enough (considering it's LEGO, of course).
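A minimal sketch of that network fallback using plain UDP (the host, port, and message format here are made up for illustration; MQTT via a library like paho-mqtt would look similar, just with a publish call instead of sendto):

```python
# Push raw sensor readings over UDP to a laptop/Raspberry Pi that does
# the actual synthesis. The JSON payload shape is an example.
import json
import socket


def send_reading(sock, addr, string_index, value):
    payload = json.dumps({"string": string_index, "value": value}).encode()
    sock.sendto(payload, addr)
    return payload


# Usage (hypothetical receiver address):
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# send_reading(sock, ("raspberrypi.local", 5005), 2, 17)
```

The receiver just decodes each datagram and triggers the corresponding note on its own synth, so all the heavy audio work stays off the EV3.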
I saw somewhere in ev3dev's GitHub that USB MIDI support is/was included. So I will also try my wife's MIDI keyboard... if it works, I might have to buy an EV3 just for my wife :)
Sorry to keep writing to a closed issue... just a last update (and thanks for advertising my harp on the front page).
fluidsynth seems too heavy for the EV3. I can start it from the command line and load a small soundfont, but there's too much clicking, and when starting it from Python with mingus I cannot pass parameters to reduce the sampling rate, so it doesn't work at all.
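For reference, when starting fluidsynth from the command line, the sample rate and audio driver can be set with standard options (the soundfont path below is just an example):

```shell
# Hypothetical invocation: ALSA audio driver, reduced sample rate.
# -a  audio driver, -r  sample rate in Hz, -g  gain
fluidsynth -a alsa -r 22050 -g 1.0 /usr/share/sounds/sf2/small.sf2
```

mingus calls fluidsynth through its library bindings, which is why these command-line options can't be passed from there.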
I installed python-rtmidi and mido and managed to choose the instrument and play notes through timidity in a similar way to mingus through fluidsynth. The sound is acceptable, but latency is awful: each note plays almost 15 seconds after the instruction. I uncommented all the performance options in timidity.cfg and even started timidity manually so I could tell it to use mono, but no luck.
So it seems that programmatically using MIDI on the EV3 is really too much, and my approach with MQTT is still the best, at least for now. mido also has a MIDI client/server feature; I'll try it one of these days.
Nevertheless having MIDI support in ev3dev is still a great thing, thanks for the effort!
The 15-second latency must be from a new Python process starting each time or something like that. I connected my keyboard via USB MIDI and used it to play sounds. The latency was still annoying, but less than 1 second. There is also a bunch of clicking due to a defect in the ev3dev sound driver. I've fixed this on BeagleBone/EVB but haven't backported it to the EV3 yet.
It will probably work better the other way around though. If you have a MIDI device like a keyboard, you can have the EV3 send MIDI messages to the other device and have the sound generated there. If you don't have any MIDI devices, a BeagleBone would make a nice substitute. BeagleBone has a realtime kernel for low latency and
I have a MIDI (USB) keyboard. It's just input, no output. But if the kernel recognizes it, I'll try to find a use for it as a controller for some animatronics.
Perhaps the 15 seconds come from the way mido uses python-rtmidi. I'll try just python-rtmidi.
Sorry to revive a closed issue, but if someone ever gets here looking for MIDI support... ev3dev now works quite well as a network MIDI input/output device with ALSA aseqnet.
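For anyone landing here, a rough sketch of the aseqnet/aconnect wiring (the hostname and port IDs below are examples; run `aconnect -l` on each machine to see the real ones):

```shell
# On the laptop/Pi running the synth: start a network MIDI server.
aseqnet

# On the EV3: connect to it as a client.
aseqnet laptop.local

# Then wire the network port to the synth's input port
# (128:0 and 129:0 are example IDs from "aconnect -l").
aconnect 128:0 129:0
```

Once connected, the EV3 side shows up as a regular ALSA sequencer port, so anything that speaks ALSA MIDI can send notes across the network.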
Thank you guys and Merry Xmas!