Light Recorder Deck

This is an audio-responsive lighting system developed in Pure Data and Processing, intended to run on a Raspberry Pi with a touch screen. The idea is that it can be operated independently with a small set of lights for workshops, or scaled up to a larger rig through DMX, MIDI, etc., to work with a lighting designer if needed. At present, all of the equipment can be obtained “off the shelf” for under £150, including some basic lights.

The framework was originally developed for the Augmented Gamelan project, and this new open source version is being created in collaboration with Joanne Cox for her Engage and Interact project funded by Help Musicians UK.

[Photo of the first prototype: a Raspberry Pi with a small touch screen showing red, green and blue colour sliders.]

This is being designed with improved access in mind. Although I make no claim to it being “accessible” in a generic sense, in the longer term it might provide an opportunity to explore different interfaces: touch screens, switch control, voice response and screen reading -- and maybe others that haven't been invented yet.

While there are plenty of DMX systems out there already, including some excellent free and open source projects for the Raspberry Pi, I’m interested in making this kind of control accessible through the tools that digital artists and instrument builders already use.

In particular, by running the audio in Pure Data, I hope we can find interesting ways to build audio-responsive units on top of existing code -- and perhaps integrate with sound processes to make this a combined audio/light synthesiser!
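
To illustrate the kind of mapping this makes possible (the analysis in this repository actually lives in the Pure Data patches under Audio_DMX_Pd, so the sketch below is not the project's code), here is a minimal Processing example that uses the standard processing.sound amplitude follower to turn microphone level into the 0-255 range a DMX dimmer channel expects:

```processing
// Illustrative only: follows the live microphone level and maps it to a
// 0-255 value, the range a single DMX dimmer channel expects.
// The real project does this analysis in Pure Data, not in Processing.
import processing.sound.*;

AudioIn mic;
Amplitude follower;

void setup() {
  size(200, 200);
  mic = new AudioIn(this, 0);        // first audio input device
  mic.start();
  follower = new Amplitude(this);
  follower.input(mic);
}

void draw() {
  float level = follower.analyze();  // envelope, roughly 0.0 to 1.0
  int intensity = int(constrain(level * 255, 0, 255));
  background(intensity);             // louder sound = brighter screen
  // In a full system this value would be written to a DMX channel here.
}
```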

As time goes on, I will upload a shopping list and some more detailed instructions on how to set this up on a Raspberry Pi. For now, I am using this repository for daily development to ensure that the code is open and public, but it remains undocumented!

Please note: while I make this stuff a lot, I don't really identify as a coder, and GitHub remains fairly new and impenetrable to me...so any feedback or advice on how to make this work would be most appreciated!!

How does it work?

No programming is required (or at least, that's the eventual idea). The programs are loaded onto an SD card so that the Raspberry Pi boots straight into them. Having chosen your lights and their DMX channels, colours and intensities can be set and linked to sounds through a set of sliders on the touch screen, as sketched below.
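
The real touch-screen interface is the sketch in the Processing_GUI folder, which I haven't reproduced here. Purely as a hypothetical sketch of the slider idea, the Processing example below draws three red/green/blue sliders and shows the resulting 0-255 channel values, which a full system would link to the audio analysis and send out over DMX:

```processing
// Illustrative sketch of the slider idea, not the GUI in Processing_GUI:
// three touch/mouse sliders set red, green and blue levels (0-255),
// which a full system would link to sounds and send to DMX channels.
float[] levels = {0.5, 0.5, 0.5};            // normalised R, G, B slider positions
color[] tints = {#FF0000, #00FF00, #0000FF};

void setup() {
  size(480, 320);                            // small window, touch-screen sized
}

void draw() {
  background(30);
  for (int i = 0; i < 3; i++) {
    float y = 60 + i * 90;
    stroke(200);
    line(40, y, width - 40, y);              // slider track
    noStroke();
    fill(tints[i]);
    float x = map(levels[i], 0, 1, 40, width - 40);
    ellipse(x, y, 40, 40);                   // slider handle
  }
  // Preview the mixed colour and show the 0-255 channel values.
  fill(levels[0] * 255, levels[1] * 255, levels[2] * 255);
  rect(width - 80, 20, 60, 60);
  fill(255);
  text(int(levels[0] * 255) + " " + int(levels[1] * 255) + " " + int(levels[2] * 255), 40, 30);
}

void mouseDragged() {
  int i = constrain((mouseY - 15) / 90, 0, 2);              // which slider row was touched
  levels[i] = constrain(map(mouseX, 40, width - 40, 0, 1), 0, 1);
}
```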
