
This project is no longer actively maintained by the Google Creative Lab but remains here in a read-only archive mode so that it can continue to assist developers who may find the examples helpful. We aren’t able to address all pull requests or bug reports, but outstanding issues will remain in read-only mode for reference purposes. Also, please note that some of the dependencies may not be up to date and there hasn’t been any QA done in a while, so your mileage may vary.

For more details on how archiving affects GitHub repositories, see this documentation.

We welcome users to fork this repository should there be more useful, community-driven efforts that can help continue what this project began.

Musical Forest

Musical Forest is a multiplayer musical experiment in virtual reality. It uses copresence, the synchronization of multiple people in a shared virtual space, to let people play music together in VR. WebSockets are used to keep all of the connected players in sync in real time.
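As a hedged illustration of that sync, the sketch below shows the kind of client-side message passing WebSockets enable here; the URL, message format, and `playShapeSound` helper are all hypothetical, not the project's actual protocol.

```js
// Hypothetical sketch of real-time sync over WebSockets: each "note hit"
// is sent to the server and replayed for every other connected player.
// The URL, message shape, and playShapeSound() helper are illustrative.
const socket = new WebSocket('wss://example-forest-server.example.com/room');

socket.addEventListener('open', () => {
  // Tell the server this player just hit shape 42 at 80% velocity.
  socket.send(JSON.stringify({ type: 'note-hit', shapeId: 42, velocity: 0.8 }));
});

socket.addEventListener('message', (event) => {
  const msg = JSON.parse(event.data);
  if (msg.type === 'note-hit') {
    playShapeSound(msg.shapeId, msg.velocity); // hypothetical local playback helper
  }
});
```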

Musical Forest, a WebVR Experiment.

Basic Interaction

Each shape triggers a note when it’s hit. Users can navigate the space, play notes, and hear and play along with all of the other players in the space at the same time.

Objects & Audio

There are three different shapes of musical objects corresponding to three different sets of sounds. Each set has six notes, chosen from a pentatonic scale.

  • Spheres: Percussion (Conga, Woodblock, COWBELL!)
  • Triangular Pyramids: Voice + Flute
  • Cubes: Marimba
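As a rough sketch of what such a six-note mapping could look like with Tone.js (using the current Tone.js API; the note choices and basic synth are illustrative, not the project's actual recorded sounds):

```js
// Hypothetical sketch: six notes of a C-major pentatonic scale played
// through a basic Tone.js synth (the project itself uses recorded sounds).
import * as Tone from 'tone';

const PENTATONIC = ['C4', 'D4', 'E4', 'G4', 'A4', 'C5'];
const synth = new Tone.Synth().toDestination();

// Play the note assigned to a shape; velocity (0..1) scales loudness.
function playShapeNote(noteIndex, velocity = 1) {
  synth.triggerAttackRelease(
    PENTATONIC[noteIndex % PENTATONIC.length], '8n', Tone.now(), velocity
  );
}
```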

Sounds are positioned in 3D space using the Web Audio API’s PannerNode.
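For example, a minimal Web Audio sketch of positioning a source with a PannerNode; the coordinates and oscillator test tone are arbitrary example values.

```js
// Minimal PannerNode example: place a test tone 2m to the right, at ear
// height, 3m in front of the listener. Coordinates are sample values.
const ctx = new AudioContext();
const panner = new PannerNode(ctx, {
  panningModel: 'HRTF',     // head-related transfer function for realistic 3D
  distanceModel: 'inverse', // volume falls off with distance
  positionX: 2,
  positionY: 1.5,
  positionZ: -3,
});

const osc = ctx.createOscillator();
osc.connect(panner).connect(ctx.destination);
osc.start();
```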

Headsets & Interaction Models

Musical Forest responsively adapts its features to the capabilities of the available VR hardware.
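A hedged sketch of how such adaptation can look with the WebVR 1.1 API (which A-Frame targeted at the time); the `enable…()` handlers are hypothetical stand-ins for the project's actual interaction modes.

```js
// Hypothetical capability check using the WebVR 1.1 API. The enable*()
// handlers are illustrative placeholders, not the project's real code.
navigator.getVRDisplays().then((displays) => {
  const display = displays[0];
  if (!display) {
    enableDesktopControls();       // mouse + WASD fallback
  } else if (display.capabilities.hasPosition) {
    enableRoomScaleControls();     // Vive/Oculus: positionally tracked controllers
  } else {
    enableGazeOrPointerControls(); // Daydream/Cardboard: orientation only
  }
});
```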

Vive/Oculus

Play: hit the shapes with your controller to hear their sounds. The volume changes depending on how hard you hit each shape (see the sketch below).
Create: tap the trigger to create a new shape. Rotate the circular pad to change the note.
Rearrange: place the controller over an existing object, then press and hold the trigger to grab it. Move your controller and release the trigger to drop the object in a new spot. Hovering your controller over an object and rotating the circular pad changes the type of shape and its sound.
Navigation: move within the bounds of your room-scale environment to interact with the objects in the experience.
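One plausible way to derive that hit volume, assuming the controller's speed is available from the tracking data; the scaling cap is an illustrative value.

```js
// Hypothetical mapping from controller speed (m/s) to a 0..1 hit
// velocity that can scale note volume; maxSpeed is an arbitrary cap.
function hitVelocity(controllerSpeed, maxSpeed = 3) {
  return Math.min(Math.max(controllerSpeed, 0) / maxSpeed, 1);
}
```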

Daydream

Play: Hit the shapes with your controller to hear their sounds.
Navigation: point the Daydream controller at the ground and a circle will appear. Press the main button on the controller to teleport to that highlighted spot.

Cardboard

Play: gaze at an object and it will glow. Tap the interaction button to hear that object’s sound.
Navigation: gaze at the ground and a circle will appear. Tap the interaction button to teleport to the highlighted spot.

Magic Window

Interaction: tap any object to hear its sound.
Navigation: gaze at the ground and a circle will appear. Tap anywhere to teleport to where the reticle is pointing.
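The teleport navigation described for Daydream, Cardboard, and Magic Window boils down to a raycast against the ground. Here is a hedged three.js sketch (A-Frame is built on three.js), where `ground` and `cameraRig` are illustrative names, not the project's actual scene objects.

```js
// Hypothetical teleport via ground raycast, using three.js (which
// A-Frame builds on). `ground` and `cameraRig` are illustrative names.
const raycaster = new THREE.Raycaster();

function tryTeleport(origin, direction) {
  raycaster.set(origin, direction); // ray from the controller or gaze
  const hits = raycaster.intersectObject(ground); // `ground` is the floor mesh
  if (hits.length > 0) {
    // Move the player's rig to the hit point, preserving eye height.
    cameraRig.position.set(hits[0].point.x, cameraRig.position.y, hits[0].point.z);
  }
}
```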

Desktop

Interaction: use the mouse to click any object to hear its sound. The volume of the sound is dictated by the object’s distance from the user.
Navigation: use the WASD keys on the keyboard to move. Use the mouse to look around by clicking in empty space and dragging.

Technologies Used

Frontend

Musical Forest uses A-Frame, which is built on the WebVR standard, and Tone.js for sound.
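By way of illustration, here is a minimal, hypothetical A-Frame component that plays a Tone.js note when its entity is clicked; the component name, schema, and note value are made up for the example.

```js
// Hypothetical A-Frame component wiring a click to a Tone.js note.
// Assumes A-Frame and Tone.js are loaded; names are illustrative.
AFRAME.registerComponent('playable-shape', {
  schema: { note: { type: 'string', default: 'C4' } },
  init() {
    const synth = new Tone.Synth().toDestination();
    this.el.addEventListener('click', () => {
      synth.triggerAttackRelease(this.data.note, '8n');
    });
  },
});
```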

Backend

The backend is developed in Node.js. For a full overview of the technologies and libraries used, see the backend readme.
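For a flavor of the broadcast pattern such a backend needs, here is a hedged sketch using the common `ws` library; the real backend's stack and protocol are documented in its own readme, so this is purely illustrative.

```js
// Hypothetical relay server using the `ws` library: every message a
// player sends is forwarded to all other connected players.
const WebSocket = require('ws');
const wss = new WebSocket.Server({ port: 8080 });

wss.on('connection', (socket) => {
  socket.on('message', (data) => {
    for (const client of wss.clients) {
      if (client !== socket && client.readyState === WebSocket.OPEN) {
        client.send(data.toString());
      }
    }
  });
});
```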

Running the Frontend Code

Download the source code and install all dependencies by running `npm install`. To run the frontend, run `npm start`; this starts a local web server using budo that connects to the default backend server. If you have a local backend server running, append `?server=localhost` to the URL.
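A hedged sketch of how a frontend can honor that `?server=` override; the default host and scheme logic are illustrative, not the project's actual code.

```js
// Hypothetical handling of the ?server= query parameter described above.
// The default host is a placeholder, not the project's real backend.
const params = new URLSearchParams(window.location.search);
const host = params.get('server') || 'default-backend.example.com';
const scheme = host.startsWith('localhost') ? 'ws' : 'wss';
const socket = new WebSocket(`${scheme}://${host}`);
```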

Acknowledgements

Manny Tan, Igor Clark, Yotam Mann, Alexander Chen, Jonas Jongejan, Jeremy Abel, Saad Moosajee, Alex Jacobo-Blonder, Ryan Burke, and many others at Google Creative Lab.