Musical Forest, a WebVR Experiment

Musical Forest is a multiplayer musical experiment in virtual reality. It uses copresence (the synchronization of multiple people in a virtual space) to let people play music together in VR. WebSockets keep all connected players in sync in real time.
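The real-time sync can be pictured as a simple WebSocket broadcast: when one player plays a note, the server relays the event to every other connected client. The helper below is an illustrative sketch, not the project's actual protocol; the function name and event shape are ours.

```javascript
// Illustrative broadcast helper for WebSocket-based sync.
// `clients` is any iterable of WebSocket-like objects; OPEN mirrors WebSocket.OPEN.
const OPEN = 1;

function broadcastNote(clients, sender, event) {
  const message = JSON.stringify(event);
  for (const client of clients) {
    // Skip the player who triggered the note and any closed connections.
    if (client !== sender && client.readyState === OPEN) {
      client.send(message);
    }
  }
}
```

In a real Node.js backend the same loop would run over a WebSocket server's set of connected clients.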
Objects & Audio
There are three different shapes of musical objects corresponding to three different sets of sounds. Each set has six notes, chosen from a pentatonic scale.
- Spheres: Percussion (Conga, Woodblock, COWBELL!)
- Triangular Pyramids: Voice + Flute
- Cubes: Marimba
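The README does not say which six pentatonic notes are used; as an illustration only, six frequencies can be derived from a major pentatonic scale (root, 2nd, 3rd, 5th, 6th, octave) with equal-temperament math:

```javascript
// Semitone offsets of a major pentatonic scale, root through octave (assumed).
const PENTATONIC_SEMITONES = [0, 2, 4, 7, 9, 12];

// Equal temperament: each semitone multiplies frequency by 2^(1/12).
function pentatonicFrequencies(rootHz) {
  return PENTATONIC_SEMITONES.map((s) => rootHz * Math.pow(2, s / 12));
}
```

For example, pentatonicFrequencies(261.63) yields six frequencies from middle C up to the C an octave above.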
Sounds are positioned in 3D space using the Web Audio API’s PannerNode.
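A minimal sketch of that spatialization, assuming a standard AudioContext and an already-created source node (the function name is ours):

```javascript
// Route a source through a PannerNode placed at (x, y, z) in the scene.
function spatialize(audioCtx, source, { x, y, z }) {
  const panner = audioCtx.createPanner();
  panner.panningModel = 'HRTF';      // head-related transfer function panning
  panner.distanceModel = 'inverse';  // PannerNode's default attenuation model
  panner.positionX.value = x;        // positionX/Y/Z are AudioParams
  panner.positionY.value = y;
  panner.positionZ.value = z;
  source.connect(panner);
  panner.connect(audioCtx.destination);
  return panner;
}
```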
Headsets & Interaction Models
Musical Forest responsively adapts features depending on the capabilities of the VR Hardware.
Room-scale VR (tracked controllers)

Play: hit the shapes with your controller to hear their sounds. The volume of the sound changes depending on how hard you hit them.
Create: tap the trigger to create a new shape. Rotate the circular pad to change the note.
Rearrange: place the controller over an existing object, then press and hold the trigger to grab it. Move your controller and release the trigger to drop the object in a new spot. Hovering your controller over an object and rotating the circular pad changes the object's shape and its sound.
Navigation: move within the bounds of your room-scale environment to interact with the objects in the experience.
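The volume-by-hit-strength behavior could be implemented by mapping controller speed at the moment of impact to a gain value. This is a minimal sketch under that assumption; the linear mapping and the maxSpeed constant are ours, not the project's actual tuning.

```javascript
// Clamp controller speed (m/s) into [0, 1] and use it as the note's gain.
// maxSpeed is an assumed tuning constant: hits at or above it play at full volume.
function hitVelocityToGain(speedMetersPerSec, maxSpeed = 3) {
  return Math.min(Math.max(speedMetersPerSec / maxSpeed, 0), 1);
}
```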
Daydream

Play: hit the shapes with your controller to hear their sounds.
Navigation: point the Daydream controller at the ground and a circle will appear. Press the main button on the controller to teleport to that highlighted spot.
Cardboard

Play: gaze at an object to see it glow. Tap the interaction button to hear that object's sound.
Navigation: gaze at the ground and a circle will appear. Tap the interaction button to teleport to the highlighted spot.
Touchscreen (mobile)

Interaction: tap any object to hear its sound.
Navigation: gaze at the ground and a circle will appear. Tap anywhere to teleport to where the reticle is pointing.
Desktop

Interaction: use the mouse to click any object and hear its sound. The volume of the sound is dictated by the object's distance from the user.
Navigation: use the WASD keys on the keyboard. Click in empty space and drag the mouse to look around.
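Distance-based attenuation like this is what PannerNode's default "inverse" distance model provides. A standalone version of that formula, using the spec's default refDistance and rolloffFactor of 1 (whether the project changes these defaults is not stated here):

```javascript
// Web Audio "inverse" distance model: gain falls off as 1/distance beyond
// refDistance; distances closer than refDistance are not attenuated.
function inverseDistanceGain(distance, refDistance = 1, rolloffFactor = 1) {
  const d = Math.max(distance, refDistance);
  return refDistance / (refDistance + rolloffFactor * (d - refDistance));
}
```

For example, a sound at twice the reference distance plays at half gain.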
Backend

The backend is developed in Node.js. For a full overview of the technologies and libraries used, see the backend README.
Running the Frontend Code
Download the source code and install all dependencies by running npm install. To run the frontend, run npm start. This starts a local web server using budo that connects to the default backend server. If you have a local backend server running, append ?server=localhost to the URL.
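One way the frontend might read that ?server= override from the page URL; only the parameter name comes from this README, and the helper and default host are illustrative:

```javascript
// Return the backend host, preferring an explicit ?server= query parameter.
function backendHost(pageUrl, defaultHost) {
  const override = new URL(pageUrl).searchParams.get('server');
  return override || defaultHost;
}
```

For example, backendHost('http://localhost:9966/?server=localhost', 'musical-forest.example.com') returns 'localhost', while the same call without the query string falls back to the default host.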