Noise Machine #1
Dear Overtone community,
Just wanted to drop a line to say thanks for being such an inspiring and helpful bunch of folks. We were asked to put together a motion sensing musical instrument this summer for the Honolulu Museum of Art, Spalding House's exhibition on music. This is the code we created for our Noise Machine #1 installation.
Ingredients: Overtone, Bifocals, Quil
- Overtone: Awesome live coding environment for making sound with SuperCollider.
- Bifocals: Wrapper for the SimpleOpenNI Kinect library.
- Quil: Processing in Clojure!
See the project.clj file for all dependencies.
Thanks, and what is this?
The musical parts are done with Overtone, the motion detection via Kinect, and the visuals via Quil/Processing. The installation is placed next to a bona fide, real, wood and metal harpsichord. So, we used that connection to direct our sound generation. We found Chris Ford's harpsichord code on a Gist he put online. Big thanks for that and all of his helpful stuff (Leipzig and a talk on Functional Composition).
Also, special shout out to Sam Aaron for the awesome development tools (both for Quil and Overtone). We spent a bunch of time running through all the Overtone demos and sample code. We even took a look at Karsten Schmidt's great Resonate Overtone workshop files.
We had a ton of fun putting it together and the reception has been great. People are moving around and interacting with Overtone. What's especially rewarding for us is that it gets people talking and thinking about the technology, in the context of musical instruments and expression. It's a bit of an aside, but the real harpsichord was commissioned to be somewhat radical for its time. Times have changed!
Videos of people interacting with the installation:
Pics of the installation:
The real harpsichord:
This is what working in Emacs looks like on a wall:
Thanks to everyone!
Hardware:
- digital projector
- hdmi cables
- mac mini
- Kinect sensor (original model), with included power adapter
The launching point for the application is the `-main` function in core.clj. You'll see that we follow the structure of a typical Quil project. A Quil sketch is created and run (in `run-sketch`), followed by a "row row row your boat" intro. The main sketch has two interesting parts, `setup` and `draw`. We'll dive into those now.
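As a rough illustration, a Quil project with that shape might look like the following. This is only a hedged sketch: apart from `run-sketch`, the names and settings here are placeholders, not the installation's actual code.

```clojure
(ns noise-machine.sketch
  (:require [quil.core :as q]))

(defn setup []
  ;; one-time setup: frame rate and a black canvas
  (q/frame-rate 30)
  (q/background 0))

(defn draw []
  ;; called every frame; the layered visuals would be drawn here
  (q/background 0))

(defn run-sketch []
  ;; create and run the sketch, as described above
  (q/sketch
    :title "Noise Machine #1"
    :size [1280 720]
    :setup setup
    :draw draw))
```

Calling `(run-sketch)` from the REPL opens the sketch window, which is also how the Emacs workflow below drives it.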
The more interesting bits of state are:
- `grid-state` - Map keyed by `[col row]` vector, saving the detection state of each sector of the grid.
- `grid-sensors` - Star grid... sort of twinkly. Also contains a `:burst` key which retains what the burst looks like for that grid sector.
- `all-pixies` - List of all the randomized circles showing up to visualize sound (the circles).
- `note-grid` - Map (key: `[col row]`) containing the "notes" played in each sector of the grid.

The grid is the x,y plane of detection seen by the Kinect. Its state is determined by the distance of an object (if detected) away from the Kinect.
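To make the shape of that state concrete, here is a minimal sketch of a `[col row]`-keyed map. The exact keys and values are assumptions based on the description above, not the project's actual code.

```clojure
;; Illustrative grid state: each sector is keyed by a [col row]
;; vector and records whether motion was detected there.
(def grid-state
  {[0 0] {:active? false}
   [1 0] {:active? true}})

(defn activate
  "Mark the sector at [col row] as detected/active."
  [state cell]
  (assoc-in state [cell :active?] true))

(activate grid-state [0 0])
;; => {[0 0] {:active? true}, [1 0] {:active? true}}
```

Keying by `[col row]` keeps sector lookup a single `get`, which is handy when mapping Kinect depth readings onto grid sectors each frame.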
The drawing bits are a bit crazy. There are a few elements layered together. The visuals evolved with the sound and the (somewhat random) trajectory of the project. Here are the main components.
- grid sensors (flickering stars - denote grid and spacing)
- grid bursts (wobbly lines - motion)
- pixies (circles)
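A simplified sketch of how those layers might be composed inside `draw`. The helper names and drawing primitives here are illustrative only; the real project's drawing code is more elaborate.

```clojure
(ns noise-machine.draw-sketch
  (:require [quil.core :as q]))

(defn draw-sensor [{:keys [x y]}]
  ;; flickering star, reduced to a tiny dot
  (q/ellipse x y 2 2))

(defn draw-burst [{:keys [x y r]}]
  ;; wobbly motion line, reduced to a straight segment
  (q/line x y (+ x r) (+ y r)))

(defn draw-pixie [{:keys [x y d]}]
  ;; randomized sound circle
  (q/ellipse x y d d))

(defn draw-frame
  "Layer the three visual elements back to front."
  [sensors bursts pixies]
  (q/background 0)
  (doseq [s sensors] (draw-sensor s))
  (doseq [b bursts]  (draw-burst b))
  (doseq [p pixies]  (draw-pixie p)))
```

Drawing back to front each frame keeps the layering simple: clear the canvas, then stars, then bursts, then pixies on top.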
- Open core.clj in a buffer.
- `C-c C-k` - eval core buffer in REPL.
- `C-c M-n` - switch REPL to core namespace.
- `C-c C-e` - eval the `run-sketch` call.
TODO:
- add visuals
- flip screen
- rows >= 36 causes offset problem!
- full screen command, or on launch
- run as app
- make activated sensors fade until next activation
- refresh activated sensors on reactivation by increasing health
- closer to wall, make sensor dots more detailed, e.g. cluster of actors
- from time to time, close-up of closest dots to wall
- from time to time, close-up of a nice looking grid (that cara likes)
- turn off bottom or top rows via settings
- turn on/off mouse zoom/rotate controls via keystroke
This work is licensed under a Creative Commons Attribution-ShareAlike 3.0 Unported License.
Copyright © 2013 Pas de Chocolat, LLC