The file "field.pd" can be run in a current version of Pure Data (Pd-extended).
Grab the zipfile and play around!
The patch requires a video camera (USB or other) with a functioning driver to work correctly.
Imagine an environment not as an ambience but rather as a complex ecology where each element is equally responsive to the overall becoming of itself. Key to this responsiveness is the awareness of inside and outside collapsing - that each element is embedded in a constellation whose patterns exceed the scope of any one element's scale. It's realizing that we do not need to create immersive environments, because we are already embedded in them whether we choose to acknowledge it or not.
This installation proposal is an attempt to exploit that understanding by making the invisible dynamics of our actions in space immediately visible.
Using the camera's video data, PD will aggregate the velocity of movement in the space and generate a sound sequence whose tempo is adjusted in ratio to that velocity. The sound's tempo will therefore track the aggregated velocity of the room, moving from a near-drone to something of a 180 bpm techno track, depending on what is occurring in the space. The movement data will be captured by a video camera and interpreted by a series of hardware/software combinations. The sound will be played through a basic stereo or a series of speakers, ideally surrounding the room. A short, elucidative description will be available for participants.
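The core mapping can be sketched outside of PD as well. The following is a minimal Python illustration of the idea, not the actual patch: motion between frames is reduced to a single "velocity" number, which is then mapped linearly onto a tempo range. The velocity scale (`v_max`) and the tempo bounds (20 bpm as the near-drone floor, 180 bpm as the ceiling named in the proposal) are assumptions for illustration.

```python
def frame_velocity(prev_frame, frame):
    """Mean absolute pixel difference between two grayscale frames
    (equal-sized lists of rows of 0-255 ints) -- a simple proxy for
    how much movement occurred between the two frames."""
    total = 0
    count = 0
    for row_a, row_b in zip(prev_frame, frame):
        for a, b in zip(row_a, row_b):
            total += abs(a - b)
            count += 1
    return total / count if count else 0.0


def velocity_to_bpm(velocity, v_max=64.0, bpm_min=20.0, bpm_max=180.0):
    """Linearly map a velocity reading in [0, v_max] to a tempo in
    [bpm_min, bpm_max], clamping out-of-range readings. All three
    parameters are illustrative assumptions, not values from the patch."""
    v = max(0.0, min(velocity, v_max))
    return bpm_min + (bpm_max - bpm_min) * (v / v_max)


# A still room stays at the drone floor; movement raises the tempo.
still = [[10, 10], [10, 10]]
moved = [[74, 10], [10, 74]]  # two of four pixels changed by 64

print(velocity_to_bpm(frame_velocity(still, still)))  # -> 20.0
print(velocity_to_bpm(frame_velocity(still, moved)))  # -> 100.0
```

In the installation itself this mapping would run continuously over the live camera feed, with the resulting tempo value driving the sound sequence in PD.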