Exhausting A Crowd
Inspired by Georges Perec's classic 60-page piece of experimental literature, “An Attempt at Exhausting a Place in Paris”, written from a bench over three days in 1974, “Exhausting a Crowd” automates the task of completely describing the events of 12 hours in a busy public space. The work speaks to the potential of a perfectly automated future of surveillance, enabled by a distributed combination of machine and human intelligence. It is a beautiful record of the energy present in shared space, as well as a disturbing look at the potential for control in a dystopian environment. Commissioned by the V&A for "All of This Belongs to You".
This piece builds on my previous work, including "People Staring at Computers" and "keytweeter", which address the boundaries between public and private spaces and the way computers mediate those spaces. Most similar is “Conversnitch”, where I worked with Brian House to create a lamp that tweets overheard conversations in realtime using distributed human transcriptions. The potential for the hive mind to aid or derail investigations was made clear in the aftermath of the Boston Marathon bombings, where Reddit users took to vigilantism, trawling through thousands of images of the incident in an attempt to find the criminal, eventually settling on the wrong suspect.
In the work of others, I was inspired by David Rokeby's "Sorting Daemon", Rafael Lozano-Hemmer's "Subtitled Public", and Standish Lawder's "Necrology". Late in the process of developing the piece, someone shared "The Girl Chewing Gum" by John Smith, which became more confirmation than inspiration. Another interesting reference a friend sent was "'An Attempt at Exhausting an Augmented Place in Paris': Georges Perec, observer-writer of urban life, as a mobile locative media user" by Christian Licoppe.
The primary location inspiring this piece was Union Square at 14th Street in NYC, as viewed from the south side of the park. At any moment there may be anywhere from 10 people at midnight, to 100 on a cold afternoon, to 500 at lunch, or thousands for a protest. People are engaged in a variety of activities: playing chess, dancing, singing, chanting, panhandling, eating, kissing, walking through, or just waiting.
The decision to go with Piccadilly Circus came at the request of the V&A that I consider shooting in London. After exploring public spaces on Street View and Wikipedia, I eventually found this picture on Flickr by Andrew Murray:
After a lot of discussion with different businesses around Piccadilly Circus, we eventually found that Lillywhites (Sports Direct) was willing to let us shoot the piece.
Two big decisions were made over the course of the project: whether to present a live stream or a pre-recorded stream, and whether to use computer-assisted tags, or even computer-assisted targets based on pedestrian detection. The pre-recorded stream was essential to getting the effect of an abundance of notes at any moment, and we tried to create the feeling of it being "live" by removing almost all user interface elements that suggested otherwise. The computer-assisted tags were dropped because it felt more disturbing to know that every note left behind was left there by a real human clicking and typing.
With 4K footage there was some concern about privacy. Legally, there are no privacy restrictions on filming and broadcasting people in public spaces in the UK (with the exception of a few places like The Royal Square, Trafalgar Square, and the London Underground). But because this piece is about the crowd, not any specific individual, I wanted to avoid making any person's face clearly recognizable. In practice, almost all individuals appear at enough of a distance, and most internet connections cannot support the full 4K video bandwidth required to make out faces in the foreground.
All footage was recorded over 12 hours at 4K 30fps on a GoPro Hero 4 modified with a 12mm lens, using two Lexar 64GB high-speed microSD cards that were swapped every two hours while the GoPro ran off USB power. The GoPro outputs a sequence of short videos, which are then stripped of audio and concatenated with ffmpeg. Before being concatenated, the videos are copied to a temporary folder on the internal SSD, which cuts the processing time from days to minutes. Finally, all six videos (approximately two hours each) are uploaded to YouTube, which will accept videos up to 128GB or 11 hours after account verification. All the videos are added to a playlist, and YouTube handles the streaming and buffering.
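The strip-and-concatenate step could be sketched as a small Node/TypeScript helper. This is only an illustration of the technique, not the actual pipeline: the file names are hypothetical, and the real flags may differ, but the idea is ffmpeg's concat demuxer with `-an` to drop audio and `-c:v copy` to avoid re-encoding.

```typescript
// Sketch of the ffmpeg step described above (hypothetical file names).
// The concat demuxer reads a text file listing the input clips.

// Build the contents of the concat list file, one line per clip:
//   file 'GOPR0001.MP4'
function concatListFile(chunks: string[]): string {
  return chunks.map((f) => `file '${f}'`).join("\n");
}

// Build the ffmpeg argument list: strip audio, copy the video
// stream without re-encoding (minutes instead of days).
function buildConcatArgs(listFile: string, output: string): string[] {
  return [
    "-f", "concat",   // use the concat demuxer
    "-safe", "0",     // allow arbitrary paths in the list file
    "-i", listFile,
    "-an",            // drop the audio stream
    "-c:v", "copy",   // copy video, no re-encode
    output,
  ];
}

// Example: the short clips for one two-hour segment.
const chunks = ["GOPR0001.MP4", "GP010001.MP4", "GP020001.MP4"];
console.log(concatListFile(chunks));
console.log("ffmpeg " + buildConcatArgs("list.txt", "segment1.mp4").join(" "));
```

Copying the clips to the SSD first matters because `-c:v copy` makes the job I/O-bound rather than CPU-bound.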
The frontend is written in TypeScript, with type definitions managed by tsd. Install these with:

npm install -g typescript tsd@next
Then, to run locally, execute:

$ npm install
$ tsd reinstall
$ npm start
The app should now be running on localhost:5000.
To make TypeScript automatically recompile changed .ts files, run:

tsc -w --outDir public/compiled/ typescript/*
To attach to the remote database, you will need to set the PGSSLMODE environment variable:

export PGSSLMODE='require'

After making changes, you can deploy to Heroku:
$ git push heroku master
$ heroku open
If your computer is connected to AirPlay, the pause function is delayed (in order to sync the audio).
These are the credits for the piece as it was initially created for London:
EXHAUSTING A CROWD (2015)
by KYLE MCDONALD with JONAS JONGEJAN
COLLABORATION & SITE DEVELOPMENT / JONAS JONGEJAN
COMMISSIONED by VICTORIA AND ALBERT MUSEUM for ALL OF THIS BELONGS TO YOU
NICO TURNER / VIDEO
SPECIAL THANKS to CORINNA GARDNER, DAN JOYCE, HELLICAR & LEWIS