Tangible interaction with moving images: kinetic user interfaces via real-time RGB-D image analysis
A speculative project investigating the affordances of corporeal interfaces for augmenting digital content with materiality. An exposed RGB-D tracking system translates spatial data into actionable user commands: the viewer's position in space controls the playback rate and direction of the video, establishing a direct link between the temporal qualities of the media and the spatial coordinates of the viewer. Meanwhile, rows and columns of pixels from the current video frame are translated into waveform samples and fed to an additive synthesizer, creating a direct feedback loop between sound, image, and the body.
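The two mappings described above could be sketched as plain functions, independent of the tracking and playback plumbing. This is an illustrative sketch, not the project's actual code: the function names, the assumed depth range (0.5 m to 4.5 m mapping to playback rates from -2x to +2x, pausing at the midpoint), and the choice of pixel values as partial amplitudes are all assumptions for the example.

```cpp
#include <algorithm>
#include <cmath>
#include <cstdint>
#include <vector>

// Hypothetical mapping from viewer depth (meters) to a signed playback
// rate. The near/far range and the [-2, +2] output span are illustrative;
// the midpoint of the range pauses the video (rate 0), nearer positions
// play backward, farther positions play forward.
float depthToPlaybackRate(float depthMeters,
                          float nearRange = 0.5f,
                          float farRange  = 4.5f) {
    float t = (depthMeters - nearRange) / (farRange - nearRange);
    t = std::min(std::max(t, 0.0f), 1.0f); // clamp to [0, 1]
    return (t - 0.5f) * 4.0f;              // [-2, +2]; 0 = paused
}

// Translate one row (or column) of 8-bit grayscale pixels into a single
// cycle of a waveform in [-1, 1], usable as oscillator amplitudes.
std::vector<float> rowToWaveform(const std::vector<uint8_t>& row) {
    std::vector<float> wave(row.size());
    for (size_t i = 0; i < row.size(); ++i)
        wave[i] = row[i] / 127.5f - 1.0f; // 0..255 -> -1..+1
    return wave;
}

// One way to feed those samples to an additive synth: treat each value
// as the amplitude of a sinusoidal partial over a base frequency f0.
float additiveSample(const std::vector<float>& amps, float f0, float t) {
    const float kTwoPi = 6.28318530718f;
    float s = 0.0f, norm = 0.0f;
    for (size_t k = 0; k < amps.size(); ++k) {
        s    += amps[k] * std::sin(kTwoPi * float(k + 1) * f0 * t);
        norm += std::fabs(amps[k]);
    }
    return norm > 0.0f ? s / norm : 0.0f; // normalize to avoid clipping
}
```

In an openFrameworks app, the playback rate would typically be applied each frame (e.g. via the video player's speed setting) and the waveform filled into the audio-out buffer, but those integration details are omitted here.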
Made with openFrameworks. Requires: