We built a live performance tool that combines music analysis, MIDI controller input, and real-time LED light visuals.
Songs are analyzed for beat, mood, and segmentation using Gracenote's timeline API. The beat data is used to automatically slice each song into samples, which are mapped onto an 8x8 grid of buttons. Combined with the mood and segmentation data, the beats also drive unique real-time visuals.
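The beat-slicing step can be sketched roughly as follows. This is a minimal illustration, not the actual implementation: the beat timestamps stand in for whatever the analysis returns, and `slice_song` is a hypothetical helper that cuts the raw audio at beat boundaries and assigns one slice per pad on the 8x8 grid (wrapping around if there are more pads than slices).

```python
def slice_song(samples, sample_rate, beat_times, grid_size=64):
    """Cut audio at beat boundaries and map the slices onto a 64-pad grid.

    samples     -- raw audio samples (one channel, as a list or array)
    sample_rate -- samples per second
    beat_times  -- beat timestamps in seconds, ascending
    """
    # One slice per pair of consecutive beats.
    slices = [
        samples[int(start * sample_rate):int(end * sample_rate)]
        for start, end in zip(beat_times, beat_times[1:])
    ]
    # Assign slices to pads; wrap around if there are fewer slices than pads.
    return [slices[i % len(slices)] for i in range(grid_size)]
```

Pressing pad *i* on the controller would then simply play back `grid[i]`, which is what makes the device behave like a beat-synchronized sampler.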
Mixcandy plays like a sampler, a drum machine, and a synesthetic light instrument all in one.