This session is for anyone who would like to explore music, visuals and creative coding for the web. We'll demonstrate types of data we can get from digital signal processing using interactive sketches in p5.js and the p5.sound library that builds upon the Web Audio API. We'll explore various methods to map this data onto meaningful visuals that enhance our experience of music.
## 1. Amplitude
## 2. Frequency
FFT (Fast Fourier Transform), and scaling the FFT
## 3. Pitch
Autocorrelation in the time domain to detect the fundamental frequency
## 4. Musical Timing
Syncing music to timestamped lyrics, and visualizations with the Spotify Audio Analysis API (formerly the Echo Nest API)
Participants may use whatever tools they wish, but the demos in this repo use the following libraries:
p5.js is a JavaScript library that starts with the original goal of Processing, to make coding accessible for artists, designers, educators, and beginners, and reinterprets this for today’s web.
p5.sound.js is an addon library that brings the Processing approach to the Web Audio API.
p5.dom.js is an addon library that helps us manipulate the DOM.
- Download this GitHub repo and build on the empty example sketch in the template folder; it links to the libraries.
- Running p5.js on a local server
- You'll need a text editor. Some options:
- Getting Started: Your First Sketch
p5.AudioIn - microphone! documentation | source code
p5.SoundFile - load and play .mp3 / .ogg files. documentation | source code
loadSound()
creates a SoundFile using a Web Audio API buffer. Call it during preload(), or pass it a callback, or use drag and drop.
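A minimal sketch of the preload() pattern. This assumes p5.js and p5.sound are included in the page via script tags; the file path 'assets/song.mp3' is a placeholder for your own audio file.

```javascript
// Hypothetical example: load and loop a sound file with p5.sound.
let song;

function preload() {
  // Calling loadSound() inside preload() guarantees the buffer is
  // fully loaded before setup() runs.
  song = loadSound('assets/song.mp3');
}

function setup() {
  createCanvas(400, 400);
  song.loop(); // start playback once everything is ready
}
```

The callback form, `loadSound('assets/song.mp3', onLoaded)`, is useful outside of preload(), e.g. when loading files dropped onto the canvas.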
p5.PeakDetect - detect beats and/or onsets within a frequency range. documentation | source code
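A sketch of how p5.PeakDetect is typically wired up: it reads from a p5.FFT each frame, so analyze() must be called before update(). The frequency range (20 Hz to 2 kHz) and threshold (0.2) below are just illustrative values, and the sketch assumes a sound is already playing through p5.sound.

```javascript
// Hypothetical example: flash the background when a peak is detected.
let fft, peakDetect;

function setup() {
  createCanvas(400, 400);
  fft = new p5.FFT();
  // Listen for peaks between 20 Hz and 2000 Hz, threshold 0.2
  // (illustrative values — tune these to your material).
  peakDetect = new p5.PeakDetect(20, 2000, 0.2);
}

function draw() {
  background(0);
  fft.analyze();          // PeakDetect needs a fresh FFT frame
  peakDetect.update(fft);
  if (peakDetect.isDetected) {
    background(255);      // flash white on a detected beat/onset
  }
}
```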
p5.Amplitude - Analyze volume (amplitude). documentation | source code
.getLevel()
returns a Root Mean Square (RMS) amplitude reading between 0.0 and 1.0, usually peaking at around 0.5.
.smooth()
smooths the amplitude reading by averaging with the previous analysis frame.
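To see what an RMS reading means, here is the underlying math as a plain-JavaScript sketch (not p5's actual implementation, just the computation the name describes): square each sample, average the squares, then take the square root.

```javascript
// Root Mean Square of an array of samples in [-1, 1].
function rms(samples) {
  const sumOfSquares = samples.reduce((sum, s) => sum + s * s, 0);
  return Math.sqrt(sumOfSquares / samples.length);
}
```

A full-scale square wave like [1, -1, 1, -1] gives 1.0 and silence gives 0.0; real music spends most of its time well below full scale, which is why getLevel() tends to peak around 0.5.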
p5.FFT - Analyze amplitude over time / frequency. documentation | source code
.analyze()
returns an array of amplitude readings from 0-255 across the frequency domain.
.waveform()
returns an array of amplitude readings from -1 to 1 along the time domain. demo | source
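Each element of the analyze() array corresponds to a frequency band. A sketch of the standard bin-to-frequency mapping for an FFT: bin i of an N-point FFT sits at i * sampleRate / N, and the N/2 returned bins span 0 Hz up to the Nyquist frequency (sampleRate / 2). The 44100 Hz rate and 1024-point size below are assumptions for illustration.

```javascript
// Map an FFT bin index to the frequency (Hz) it represents.
function binToFreq(bin, sampleRate, fftSize) {
  return bin * sampleRate / fftSize;
}
```

With a 44.1 kHz sample rate and a 1024-point FFT, bin 1 already sits at about 43 Hz, and the bins are linearly spaced from there, so most of them describe high frequencies. This linear spacing versus our logarithmic perception of pitch is why "scaling the FFT" (log or octave-band grouping) comes up when visualizing spectra.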
Music included in the demos/repo:
- Yacht - Summer Song (Instrumental) - See Mystery Lights Instrumentals Creative Commons BY-NC-SA
- Broke For Free - As Colorful As Ever - Layers - Creative Commons BY-NC
- Alaclair Ensemble - Twit Journalist - This Is America - Creative Commons BY-SA
- Peter Johnston - La ere gymnopedie (Erik Satie) - Best of Breitband Vol1
- Inara George - Q - Sargent Singles Vol 1 Creative Commons BY-NC-SA
- For more Creative Commons resources, check out the Free Music Archive's Guide to Online Audio Resources
- Pitch Detection - Web Audio Demo
- Another Approach to Beat Detection Using Web Audio API
- Making Audio Reactive Visuals w/ Web Audio API
- Marius Watz' Sound As Data workshop with Processing // blog post
- Echo Nest Remix API can get you beats and tatums; the regular API has more data about music/artists/songs.
- p5.gibber offers rapid music sequencing and synthesis; Gibber is also its own live coding environment.
- Tone.js is a JS library for composing interactive music.
- dancer.js is a JS library for audio visualizations.
- heartbeat.js is a JS library for working with MIDI.
Notation
- Optical Poem, Oskar Fischinger's 1938 visualization of Franz Liszt's "2nd Hungarian Rhapsody"
- Notations21
- Piano Phase (Alex Chen)
- George & Jonathan
- dennis.video, generative video by George ^
- Stephen Malinowski's Music Animation Machine
- Artikulation (Rainer Wehinger / Gyorgy Ligeti)
- animatednotation.com
- John Whitney
- Mark Fell - Skydancer
Interactive
Audio
- Cymatics
- Golan Levin, Zach Lieberman, Jaap Blonk, Joan La Barbara (w/ autocorrelation)
- Oscillator Art (TK Broderick)
- Music Makes You Travel (makio135)
- Ripple
- Ryoji Ikeda
Data Sonification
- Listening to the Data
- Listen to Wikipedia
- Metadata - Echo Nest's Map of Musical Styles
- Making Music with Tennis Data
- Sonifying the Flocking Algorithm (@b2renger)
Musical Form
Lyrics