
ThreeAudio.js

ThreeAudio helps you create music visualizations in Three.js, by exposing audio data in GLSL shaders.

It can be used directly with Three.js or as a tQuery plug-in.


ThreeAudio will read from an audio source and provide frequency/time data in the form of textures, as well as derived values for volume, bass, mid range and treble. ThreeAudio also includes a real-time beat detector based on autocorrelation.
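The autocorrelation idea behind the beat detector can be sketched as follows. This is a simplified illustration, not ThreeAudio's actual code: `detectPeriod`, its parameters and the impulse-train example are invented for this sketch. Given a history of per-frame energy levels, the lag with the strongest self-similarity corresponds to the beat period.

```javascript
// Sketch of autocorrelation-based period detection (illustrative only).
// signal: array of per-frame energy levels; minLag/maxLag: lag range in frames.
function detectPeriod(signal, minLag, maxLag) {
  var bestLag = minLag;
  var bestScore = -Infinity;
  for (var lag = minLag; lag <= maxLag; lag++) {
    var score = 0;
    for (var i = 0; i + lag < signal.length; i++) {
      score += signal[i] * signal[i + lag];
    }
    // Normalize by the number of overlapping samples.
    score /= (signal.length - lag);
    if (score > bestScore) {
      bestScore = score;
      bestLag = lag;
    }
  }
  return bestLag;
}

// Example: an impulse train with a period of 8 frames.
var signal = [];
for (var i = 0; i < 64; i++) signal.push(i % 8 === 0 ? 1 : 0);
console.log(detectPeriod(signal, 2, 16)); // → 8
```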

Use the included ThreeAudio.Material class to create ShaderMaterials that can read from the audio data.

Includes: microevent.js (Jerome Etienne), dsp.js (Corban Brook)

NOTE: ThreeAudio is still somewhat experimental, and only the Webkit Web Audio API is supported for now. Patches are welcome.

Demo (Chrome only!): http://acko.net/files/three-audio/

Builds:

  • ThreeAudio: microevent + core
  • ThreeAudio-tquery: microevent + core + tQuery plug-in
  • ThreeAudio.glsl.html: required GLSL shaders

Basic Usage

  1. Stand-alone

Create an audio source, load a file and request playback when ready.

var audioSource = (new ThreeAudio.Source()).load('/audio/file.mp3').play();

Create textures to hold the audio data, passing in the Three.js renderer and the audio source.

var audioTextures = new ThreeAudio.Textures(renderer, audioSource);

Create a material that uses the audio data, passing in the audio textures, a vertex/fragment shader program, as well as any other textures, uniforms and attributes you wish to use (as objects with key/value pairs). Specify a literal vertex/fragment program, or the ID of a script tag that contains the source code for the program.

var audioMaterial = new ThreeAudio.Material(audioTextures, vertexShader, fragmentShader);
// Or
var audioMaterial = new ThreeAudio.Material(audioTextures, vertexShader, fragmentShader, textures, uniforms, attributes);

Apply the material to a mesh and insert it into your scene. Use GridGeometry to get a plane with UV coordinates that are perfectly aligned with data samples.

// Sample entire data set
var geometry = new ThreeAudio.GridGeometry(audioTextures, 100, 100);
// OR: 16 frequency/time samples and 5 history samples
var geometry = new ThreeAudio.GridGeometry(audioTextures, 100, 100, 16, 5);

// Mesh
var audioMesh = new THREE.Mesh(geometry, audioMaterial);
scene.add(audioMesh);

Update the audio textures every frame before render.

audioTextures.update();

  2. tQuery

Create an audio source and start playing.

var audio = world.audio().load('/audio/file.mp3').play();

Create audio textures, make a material out of them with given shaders, and bind it to a mesh. Use .grid() on the material to get a planar grid ready for rendering.

var mesh = audio.textures().material(vertexShader, fragmentShader).grid().addTo(world);

Note: the textures are automatically updated on render. Each chained call above returns an intermediate ThreeAudio object, so you can break the chain to access any step directly.

Shaders

See shaders/shaders.glsl.html for an example shader that generates a 3d spectrum voiceprint.
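For a rough sense of how a shader consumes the audio data, here is a hypothetical vertex shader that displaces a grid by spectrum intensity. The `audioData` uniform name and the scale factor are assumptions made for this sketch, not ThreeAudio's actual interface; the real uniform names are defined in shaders/shaders.glsl.html. The `position`, `normal`, `uv` attributes and matrix uniforms are the standard ones Three.js supplies to ShaderMaterials.

```glsl
// Hypothetical sketch: displace grid vertices by audio level.
// "audioData" is an assumed sampler name for this example only.
uniform sampler2D audioData;
varying vec2 vUv;

void main() {
  vUv = uv;
  // x = frequency sample, y = history (time) sample.
  float level = texture2D(audioData, uv).r;
  vec3 displaced = position + normal * level * 10.0;
  gl_Position = projectionMatrix * modelViewMatrix * vec4(displaced, 1.0);
}
```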


Steven Wittens - http://acko.net/