
ev3dev sound - support for MIDI in ALSA? #726

Closed
JorgePe opened this issue Aug 21, 2016 · 13 comments

@JorgePe (Contributor) commented Aug 21, 2016

Hello.

I think I might be pushing the limits of the EV3, but I'm trying to build a laser harp. I can't play anything myself, but my wife has some music theory background and plays piano. I gave her a MIDI keyboard some years ago, and she used it to generate scores on an Ubuntu Studio computer; that was her first contact with Linux and my first contact with the Linux sound system.

ev3dev kernel: Linux ev3dev 4.4.17-14-ev3dev-ev3

I already have 6 EV3 color sensors (and a MindSensors MUX, with another one on the way for a 7th sensor, plus an ultrasonic sensor to read the distance of the hand).

I'm using Python.

I can already read the 6 sensors in Python. I can play tones (with Sound.tone), but of course it doesn't sound like a harp. So I tried Sound.play with some harp samples I found on the net. It sounds much better, but the samples are too long, so the "artist" has to wait for the end of each note before moving her hand to another "string". And it doesn't allow multiple notes at the same time.

So I went for MIDI. It has polyphony, there are some Python libraries, and since it uses soundfonts I can easily change the instrument (my wife and I had the idea of using NFC tags to identify the "artist" or the "instrument").
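As a minimal sketch (not from the original post), mapping the laser harp's "strings" to MIDI note numbers could look like this; the scale choice and the helper names are my own assumptions:

```python
# Sketch: map 6 color-sensor "strings" to MIDI note numbers.
# The C-major tuning and these helper names are assumptions, not the
# author's actual code.

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def midi_number(name, octave):
    """Convert a note name + octave to a MIDI note number (C4 = 60)."""
    return 12 * (octave + 1) + NOTE_NAMES.index(name)

# Six "strings" tuned to a C-major scale starting at middle C.
STRINGS = [midi_number(n, 4) for n in ["C", "D", "E", "F", "G", "A"]]

print(STRINGS)  # [60, 62, 64, 65, 67, 69]
```

With a mapping like this, each sensor index just picks an entry from STRINGS, and changing the tuning is a one-line edit.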

I installed timidity++ (a software MIDI synth) but it doesn't work; it seems to need a module, snd_seq, that doesn't exist.

Then I went for fluidsynth and mingus, a Python library that works with fluidsynth and ALSA. The soundfont that comes with fluidsynth is too big to load into memory, but I found two small soundfonts that load fine. In Ubuntu I can play some nice notes, even several at the same time. But in ev3dev, when mingus initializes fluidsynth, I immediately get a clicking noise (even before playing a note):

from mingus.midi import fluidsynth
from mingus.containers.note import Note
fluidsynth.init("Concert_Harp.sf2", "alsa")
# instrument 46 is Harp
fluidsynth.set_instrument(1, 46)
fluidsynth.play_Note(Note("C-3"))

I suspect fluidsynth also needs the ALSA snd_seq module, but I found no evidence.
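One quick way to check that suspicion is to look for the ALSA sequencer device node, which only exists when snd_seq is loaded. A small sketch (the helper name is mine; /dev/snd/seq is the standard ALSA sequencer node):

```python
import os

def alsa_seq_available(dev="/dev/snd/seq"):
    """True if the ALSA sequencer device node exists (i.e. snd_seq is loaded)."""
    return os.path.exists(dev)

if alsa_seq_available():
    print("snd_seq is available")
else:
    print("snd_seq missing - fluidsynth/timidity MIDI input will fail")
```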

So...

  • Has anyone ever tried MIDI with ev3dev? Or some other method for polyphonic sound?
  • Is it possible to include the ALSA snd_seq module?

My "Plan B" is using the BrickPi, since it supports the EV3 color sensor (but I'm not sure it also supports the MindSensors MUX).

"Plan C" is much more drastic: use the EV3 as a MIDI instrument and send the MIDI commands to a Raspberry Pi running timidity++ or fluidsynth.

@JorgePe (Contributor, Author) commented Aug 21, 2016

Update: fluidsynth seems to need snd_seq_hw.

If I start fluidsynth on its own it also makes clicking noises, and I get this output:

robot@ev3dev:~$ fluidsynth --audio-driver=alsa Concert_Harp.sf2
(...)
ALSA lib seq_hw.c:457:(snd_seq_hw_open) open /dev/snd/seq failed: No such file or directory
fluidsynth: error: Error opening ALSA sequencer
Failed to create the MIDI thread; no MIDI input
will be available. You can access the synthesizer 
through the console.
@dlech (Member) commented Aug 21, 2016

I'll turn on some MIDI stuff in the next kernel release and we'll see what happens. I've always wanted to play around with MIDI on the EV3 but just never got around to it, so I'm curious to see what you come up with.

@JorgePe (Contributor, Author) commented Aug 21, 2016

That's great, thanks!!

I also discovered that ALSA has "aseqnet" and "aconnect" commands that allow a client to send MIDI to a host, but they also require snd_seq, so every alternative I find ends at ALSA MIDI support.

If it doesn't work, I can always use MQTT or some other network protocol to send the sensor values to my laptop or to a Raspberry Pi and play the music there. Even with some latency it will probably sound good enough (considering it's LEGO, of course).
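A sketch of the kind of payload that could go over MQTT for this. The topic name and JSON schema here are hypothetical, not from the original project; the actual publish step would use an MQTT client such as paho-mqtt and is omitted:

```python
import json

TOPIC = "harp/notes"  # hypothetical topic name

def note_payload(string_index, midi_note, velocity=100):
    """Encode one plucked 'string' as a small JSON message for MQTT."""
    return json.dumps({"string": string_index, "note": midi_note, "vel": velocity})

msg = note_payload(0, 60)
print(TOPIC, msg)
```

The receiver on the laptop or Raspberry Pi would decode the JSON and hand the note to its local synth, which keeps the EV3 side as light as possible.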

I saw somewhere on ev3dev's GitHub that MIDI USB support is/was included. So I will also try my wife's MIDI keyboard... if it works I might have to buy an EV3 just for my wife :)

@dlech dlech added the kernel label Aug 23, 2016

@JorgePe (Contributor, Author) commented Aug 23, 2016

A quick demo using MQTT to send the notes to my Ubuntu laptop:
https://youtu.be/eKW4_QbIFak

I'm afraid the EV3 might not have enough power to run a MIDI soft synth, but I hope to at least use ALSA to send the MIDI commands and drop the MQTT layer.

dlech added a commit to ev3dev/ev3-kernel that referenced this issue Aug 27, 2016

dlech added a commit to ev3dev/ev3-kernel that referenced this issue Sep 4, 2016

@dlech (Member) commented Sep 4, 2016

Fix released in 15-ev3dev kernel.

@dlech dlech closed this Sep 4, 2016

@JorgePe (Contributor, Author) commented Sep 5, 2016

@dlech I confirm: after tonight's update to '4.4.19-15-ev3dev-ev3' I have ALSA MIDI and I can run timidity and fluidsynth without errors related to missing modules.
No sound whatsoever, but that's another question :)

@dlech (Member) commented Sep 5, 2016

timidity file.mid --output-mono --sampling-freq=22500

\o/

@JorgePe (Contributor, Author) commented Sep 5, 2016

It works!!!
"Prepare for awesomeness"

@JorgePe (Contributor, Author) commented Sep 5, 2016

Adding this entry to '/etc/timidity/timidity.cfg':
opt s22.5kHz
is enough. There are other entries there for reducing CPU usage and choosing the soundfont; I will try them later.
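For reference, the Debian-packaged timidity.cfg ships a commented block of exactly those CPU-saving options. Uncommented, it looks roughly like this (option names as shipped by Debian; worth double-checking against your own file):

```
# Low-CPU options from the stock Debian /etc/timidity/timidity.cfg
# (shipped commented out; uncomment as needed):
opt s22.5kHz        # lower the sampling frequency
opt EFresamp=d      # disable resampling
opt EFreverb=d      # disable reverb
opt EFchorus=d      # disable chorus
opt EFdelay=d       # disable delay
opt fast-decay      # fast decay of notes
```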

@JorgePe (Contributor, Author) commented Sep 5, 2016

Sorry to keep writing to a closed issue... just one last update (and thanks for advertising my harp on the front page).

fluidsynth seems too heavy for the EV3. I can start it from the command line and load a small soundfont, but there's too much clicking, and when starting it from Python with mingus I cannot pass parameters to reduce the sampling rate, so it doesn't work at all.

I installed python-rtmidi and mido and managed to choose the instrument and play notes through timidity in a similar way to mingus through fluidsynth. The sound is acceptable but the latency is awful: each note plays almost 15 seconds after the instruction. I uncommented all the performance options in timidity.cfg and even started timidity manually so I could tell it to use mono, but no luck.
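Under the hood, picking the instrument and playing a note are just two short channel messages; mido/python-rtmidi wrap exactly these bytes. A pure-Python sketch (the helper names are mine, not the author's script):

```python
# Raw MIDI channel messages: program change (pick the instrument) and
# note-on (play a note).  These helper names are my own.

def program_change(channel, program):
    """Status 0xC0 | channel, followed by the program (instrument) number."""
    return bytes([0xC0 | (channel & 0x0F), program & 0x7F])

def note_on(channel, note, velocity=100):
    """Status 0x90 | channel, then note number and velocity."""
    return bytes([0x90 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])

# Harp is General MIDI program 46 (0-based), the same number mingus used.
print(program_change(0, 46).hex())  # c02e
print(note_on(0, 60).hex())         # 903c64
```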

So it seems that programmatically using MIDI on the EV3 is really too much, and my approach with MQTT is still the best, at least for now. mido also has a MIDI client/server feature; I will try it one of these days.

Nevertheless, having MIDI support in ev3dev is still a great thing. Thanks for the effort!

@dlech (Member) commented Sep 5, 2016

The 15-second latency must be from a new Python process starting each time, or something like that. I connected my keyboard via USB MIDI and used it to play sounds. The latency was still annoying, but less than 1 second. There is also a bunch of clicking due to a defect in the ev3dev sound driver. I've fixed this on BeagleBone/EVB but haven't backported it to the EV3 yet.

It will probably work better the other way around, though. If you have a MIDI device like a keyboard, you can have the EV3 send MIDI messages to the other device and have the sound generated there. If you don't have any MIDI devices, a BeagleBone would make a nice substitute. BeagleBone has a realtime kernel for low latency, and modprobe g_midi should turn the BeagleBone into a USB MIDI device via the peripheral port.

@JorgePe (Contributor, Author) commented Sep 5, 2016

I have a MIDI (USB) keyboard. It's input only, no output. But if the kernel recognizes it, I'll try to find a use for it as a controller for some animatronics.

Perhaps the 15 seconds come from the way mido uses python-rtmidi. I'll try just python-rtmidi.

@JorgePe (Contributor, Author) commented Dec 24, 2016

Sorry to revive a closed issue, but in case someone ever gets here looking for MIDI support: ev3dev now works quite well as a network MIDI input/output device with ALSA aseqnet.
I've made a MIDI robot that uses ALSA aseqnet to receive MIDI commands from my laptop, and a pygame-based script to parse those commands, reacting to the velocity (intensity) of notes on 3 different channels.
It's not the S3NSORH3ADS robotic band, but I'm pretty happy with my ANG3L.
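A sketch of what parsing those incoming 3-byte MIDI channel messages could look like; the function and field names are mine, not the original pygame-based script:

```python
# Sketch: decode a 3-byte MIDI channel message into its parts.
# (A note-on with velocity 0 is conventionally a note-off; ignored here
# for brevity.)  Names are my own, not the original script's.

def parse_midi(data):
    """Decode a 3-byte channel message into (kind, channel, note, velocity)."""
    status, note, velocity = data[0], data[1], data[2]
    kind = {0x90: "note_on", 0x80: "note_off"}.get(status & 0xF0, "other")
    return kind, status & 0x0F, note, velocity

# e.g. note-on, channel 2, middle C, velocity 100:
print(parse_midi(bytes([0x92, 60, 100])))  # ('note_on', 2, 60, 100)
```

A dispatcher reacting per channel would then just switch on the returned channel number and scale its response by the velocity.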

Thank you guys and Merry Xmas!
