With the arrival of generative art models like DALL-E 2 and Midjourney, unique high-resolution images can be created from just a few words, creating a disconnect between the art and the artist. The EEG-Art project attempts to reconnect the artist to their art by using brain signals to drive the parameters of a visual space in real time. While the artist listens to music, their brain signals are processed in real time and used to control a visual space, effectively mapping the sound into a visual representation through the medium of brain signals.

The first solution uses Petal Metrics, the python-osc library, and Max/MSP to create an interactive art installation. Petal Metrics is a free application that streams EEG signals from the four channels of the Muse headset to a specified UDP port on any machine. To process the signals from the headset, we use the python-osc library to create an "OSC server" that listens on the same UDP port Petal Metrics streams OSC messages to. These messages (the raw brain signals) are parsed, processed, and sent to an "OSC client", which forwards the final signal values to another specified port: the same port Max receives them on via the "udpreceive" object. This pipeline is visualized below.
The processing of the EEG signals happens within the osc.py script.
- The (x, y) coordinates are calculated from the lateralization index between the left and right hemispheres of the brain. Specifically, the EEG values are streamed from the TP9 and TP10 channels, the electrodes located over the temporal lobes. When the x value is small, meaning more theta-band power in the left hemisphere, the x-coordinate moves toward the left side of the screen. In other words, the x-coordinate moves left when there is more left temporal lobe activity, which indicates language processing, and moves right when there is more right temporal lobe activity, which indicates sound/auditory processing.
- The attraction variables are calculated from alpha synchronization in the right and left hemispheres of the brain, as well as a beta/theta ratio; both indicate levels of attention and creativity. When these values are large, less creative thinking is taking place, and the artist is said to be in less of a "flow state". We visualized this flow state through the attraction parameters: when the artist is not in a flow state, the attraction is larger, contracting the particles; when the artist is in a flow state, the attraction parameters are smaller, so the particles flow across the screen more easily.
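The x-coordinate mapping in the first bullet can be sketched as follows. The exact formula in osc.py is not given above, so the standard (R - L)/(R + L) lateralization index and the linear screen mapping here are assumptions:

```python
def lateralization_index(power_left, power_right):
    """Standard lateralization index in [-1, 1]: negative when the left
    hemisphere dominates, positive when the right dominates. The exact
    formula used in osc.py may differ; this is an assumed form."""
    return (power_right - power_left) / (power_right + power_left)

def to_screen_x(theta_tp9, theta_tp10, width=1.0):
    """Map TP9/TP10 theta-band power onto a horizontal position in
    [0, width]: more left-temporal theta power pushes x toward the
    left edge of the screen."""
    li = lateralization_index(theta_tp9, theta_tp10)
    return width * (li + 1) / 2

# Dominant left-hemisphere theta (e.g. language processing) -> x near 0
print(to_screen_x(8.0, 2.0))  # 0.2
# Balanced hemispheres -> x at the center
print(to_screen_x(5.0, 5.0))  # 0.5
```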
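The flow-state mapping in the second bullet can likewise be sketched as a monotone function of the two attention markers. The linear form, the base and gain constants, and the marker scales below are all illustrative assumptions; only the direction of the relationship (larger markers, meaning less flow, giving stronger attraction) comes from the description above:

```python
def attraction(alpha_sync, beta_theta_ratio, base=1.0, gain=0.5):
    """Hypothetical mapping from the two markers to the particle attraction
    parameter: large alpha synchronization and beta/theta ratio (out of
    flow) -> strong attraction, contracting the particles; small markers
    (in flow) -> weak attraction, letting the particles drift freely."""
    return base + gain * (alpha_sync + beta_theta_ratio)

# Out of flow: high markers -> strong attraction, particles contract
print(attraction(alpha_sync=1.6, beta_theta_ratio=2.4))  # 3.0
# In flow: low markers -> weak attraction, particles flow across the screen
print(attraction(alpha_sync=0.2, beta_theta_ratio=0.4))  # 1.3
```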
The poster presented at the 2023 California Neurotechnology Conference, hosted by UCLA, is shown below.
Our final Max patch is a redesign of @AmazingMaxStuff's tutorial from his YouTube channel.