Join users from around the world in a musical forest. A WebVR Experiment.


Musical Forest

Musical Forest is a multiplayer musical experiment in virtual reality. It uses copresence, the synchronization of multiple people in a shared virtual space, to let people play music together in VR. WebSockets keep all connected players in sync in real time.
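
The real-time sync described above amounts to relaying small event messages between players. As a minimal sketch, a note-trigger event might be serialized like this; the field names and message shape are illustrative assumptions, not the project's actual protocol:

```javascript
// Hypothetical shape of a note-trigger message relayed over WebSockets.
// Field names are illustrative, not the project's actual protocol.
function encodeNoteEvent(playerId, shapeId, velocity) {
  return JSON.stringify({
    type: 'note',
    playerId,
    shapeId,
    velocity,          // 0..1, how hard the shape was hit
    t: Date.now()      // client timestamp, useful for ordering events
  });
}

function decodeNoteEvent(raw) {
  const msg = JSON.parse(raw);
  if (msg.type !== 'note') throw new Error('unexpected message type');
  return msg;
}
```

On receipt, each client would decode the event and trigger the corresponding shape's sound locally.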

Musical Forest, a WebVR Experiment.

Basic Interaction

Each shape triggers a note when it is hit. Users can navigate the space, play notes, and hear and play along with all of the other players in the space at the same time.

Objects & Audio

There are three different shapes of musical objects corresponding to three different sets of sounds. Each set has six notes, chosen from a pentatonic scale.

  • Spheres: Percussion (Conga, Woodblock, COWBELL!)
  • Triangular Pyramids: Voice + Flute
  • Cubes: Marimba
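
A six-note set drawn from a pentatonic scale can be derived from semitone offsets, as in this sketch. The major-pentatonic offsets and the C4 root are illustrative assumptions; the source does not specify which scale or root the project uses.

```javascript
// Build a six-note set from a pentatonic scale: the five scale degrees
// plus the octave of the root. Offsets are for a major pentatonic.
const PENTATONIC_OFFSETS = [0, 2, 4, 7, 9, 12];

function pentatonicFrequencies(rootHz) {
  // Equal temperament: each semitone multiplies frequency by 2^(1/12).
  return PENTATONIC_OFFSETS.map(semi => rootHz * Math.pow(2, semi / 12));
}

const notes = pentatonicFrequencies(261.63); // C4 root, chosen for illustration
```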

Sounds are positioned in 3D space using the Web Audio API’s PannerNode.
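
Among other things, a PannerNode attenuates a sound by its distance from the listener. Its default "inverse" distance model follows a formula given in the Web Audio API spec, reimplemented here purely for illustration (in the browser, the node computes this itself):

```javascript
// Gain applied by PannerNode's default "inverse" distance model,
// per the Web Audio API spec, reimplemented for illustration.
function inverseDistanceGain(distance, refDistance = 1, rolloffFactor = 1) {
  const d = Math.max(distance, refDistance); // no boost inside refDistance
  return refDistance / (refDistance + rolloffFactor * (d - refDistance));
}
// At or inside refDistance the gain is 1; beyond it, gain falls off ~1/d.
```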

Headsets & Interaction Models

Musical Forest responsively adapts its features to the capabilities of the VR hardware.

Vive/Oculus

Play: hit the shapes with your controller to hear their sounds. The volume of the sound changes depending on how hard you hit the shape.
Create: tap the trigger to create a new shape. Rotate the circular pad to change the note.
Rearrange: place the controller over an existing object, then press and hold the trigger to grab it. Move your controller and release the trigger to drop the object in a new spot. Hovering your controller over an object and rotating the circular pad changes the type of shape and its sound.
Navigation: move within the bounds of your room-scale environment to interact with the objects in the experience.
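
Mapping hit strength to volume could be as simple as scaling the controller's speed at the moment of impact, as in this sketch. The function name and tuning constants are hypothetical, not taken from the project's code:

```javascript
// Hypothetical linear mapping from controller speed (m/s) to note gain,
// clamped to [minGain, 1]. maxSpeed and minGain are assumed tuning values.
function hitVelocityToGain(speed, maxSpeed = 3, minGain = 0.1) {
  const g = Math.min(speed / maxSpeed, 1); // cap at full volume
  return Math.max(g, minGain);             // keep soft taps audible
}
```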

Daydream

Play: Hit the shapes with your controller to hear their sounds.
Navigation: point the Daydream controller at the ground and a circle will appear. Press the main button on the controller to teleport to that highlighted spot.
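
The teleport target can be found by intersecting the controller's pointing ray with the ground plane y = 0. In practice A-Frame's raycaster would typically handle this; the standalone math, as a sketch, looks like:

```javascript
// Intersect a pointing ray (origin + t * dir) with the ground plane y = 0.
// Returns the landing point, or null if the ray points at or above the horizon.
function groundIntersection(origin, dir) {
  if (dir.y >= 0) return null;        // not pointing downward
  const t = -origin.y / dir.y;        // solve origin.y + t * dir.y === 0
  return { x: origin.x + t * dir.x, y: 0, z: origin.z + t * dir.z };
}
```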

Cardboard

Play: gaze at an object and it will glow. Tap the interaction button to hear that object's sound.
Navigation: gaze at the ground and a circle will appear. Tap the interaction button to teleport to the highlighted spot.

Magic Window

Interaction: tap any object to hear its sound.
Navigation: gaze at the ground and a circle will appear. Tap anywhere to teleport to where the reticle is pointing.

Desktop

Interaction: use the mouse to click any object to hear its sound. The volume of the sound is dictated by the object’s distance from the user.
Navigation: use the WASD keys on the keyboard. Use the mouse to change the viewing direction by clicking in empty space and dragging.

Technologies Used

Frontend

Musical Forest uses A-Frame, a web framework built on the WebVR standard, and Tone.js for sound.

Backend

The backend is developed in Node.js. For a full overview of the technologies and libraries used, see the backend readme.

Running the Frontend Code

Download the source code and install all dependencies by running npm install. To run the frontend, run npm start; this starts a local web server using budo that connects to the default backend server. If you have a backend server running locally, append ?server=localhost to the URL.
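
The ?server= override described above can be read on the client with URLSearchParams, roughly as in this sketch. The function name, the ws:// scheme, and the default URL are illustrative assumptions, not the project's actual values:

```javascript
// Resolve which backend to connect to from the page's query string.
// The default URL and ws:// scheme here are illustrative assumptions.
function backendUrl(search, defaultUrl = 'wss://example.appspot.com') {
  const server = new URLSearchParams(search).get('server');
  return server ? 'ws://' + server : defaultUrl;
}
```

For example, loading the page with ?server=localhost would point the WebSocket connection at a locally running backend instead of the default.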

Acknowledgements

Manny Tan, Igor Clark, Yotam Mann, Alexander Chen, Jonas Jongejan, Jeremy Abel, Saad Moosajee, Alex Jacobo-Blonder, Ryan Burke, and many others at Google Creative Lab.