Natural movement algorithm for Oculus VR

NB! This project is no longer maintained. It has been replaced by https://github.com/taphos/vr-natural-movement

Oculus Natural Movement

This is an implementation of a character controller that lets the player move naturally in order to control the virtual character. Made for the Oculus Rift DK2; no additional hardware or sensors are required. Steps are detected from the head movement captured by the DK2 motion tracker camera.

Based on Oculus SDK version 0.8 and Oculus Unity Utilities 0.1.2
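The step detection described above could be sketched roughly as follows. This is an illustrative, hypothetical Unity C# example, not the project's actual code: the class name HeadBobStepDetector, its field names, and the tuning values are assumptions, and it presumes that a tracked HMD transform (e.g. the center eye anchor of an OVRCameraRig from the Oculus Unity Utilities) and a CharacterController are assigned in the Inspector. A step is counted each time the head's vertical bobbing reverses direction by more than a small threshold, and each counted step adds a burst of forward speed that decays until the next step.

```csharp
using UnityEngine;

// Hypothetical sketch only: detects steps from the vertical bobbing of the
// tracked head and converts step cadence into forward movement speed.
public class HeadBobStepDetector : MonoBehaviour
{
    public Transform head;                  // tracked HMD transform (assumption: OVR center eye anchor)
    public CharacterController controller;  // character to drive
    public float bobThreshold = 0.03f;      // vertical amplitude (m) counted as a step; tuning guess
    public float speedPerStep = 0.8f;       // forward speed added per detected step (m/s); tuning guess
    public float speedDecay = 1.5f;         // speed lost per second between steps; tuning guess

    float extremum;                         // highest/lowest head height since the last direction change
    bool movingUp = true;
    float currentSpeed;

    void Start()
    {
        extremum = head.localPosition.y;
    }

    void Update()
    {
        float height = head.localPosition.y;

        if (movingUp)
        {
            if (height > extremum)
            {
                extremum = height;                      // still rising: remember the peak
            }
            else if (extremum - height > bobThreshold)
            {
                movingUp = false;                       // crest passed: count one step
                extremum = height;
                currentSpeed += speedPerStep;
            }
        }
        else
        {
            if (height < extremum)
            {
                extremum = height;                      // still falling: remember the trough
            }
            else if (height - extremum > bobThreshold)
            {
                movingUp = true;                        // trough passed: wait for the next crest
                extremum = height;
            }
        }

        // Speed decays so the character coasts to a stop when the player stops stepping.
        currentSpeed = Mathf.Max(0f, currentSpeed - speedDecay * Time.deltaTime);

        // Move along the horizontal direction the player is looking.
        Vector3 forward = Vector3.ProjectOnPlane(head.forward, Vector3.up).normalized;
        controller.SimpleMove(forward * currentSpeed);
    }
}
```

With this kind of scheme, slow stepping produces occasional speed bursts that mostly decay away, while jogging in place keeps the speed topped up, which matches the behaviour described in the Controls section below.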

Controls

  • To control the game, the player should stand up facing the DK2 motion tracker camera.
  • The camera should be placed about 1-2 meters in front of the player, at eye level.
  • Using a tripod is recommended.
  • Slow movement is achieved by taking small steps inside the tracker range marked in game (see the sketch after this list).
  • Faster movement is achieved by jogging in place. Steps are detected from head movement, so a slightly hopping jog works best. It takes a bit of practice to find a comfortable jogging style.
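For the slow, in-range case from the list above, one simple approach is to map the HMD's horizontal displacement inside the tracking volume directly onto the character, so each small real step moves the character by the same small amount. This is again a hypothetical sketch rather than the project's code; TrackedRangeLocomotion and its fields are made-up names, and it assumes the same head transform and CharacterController references as the jog detector shown earlier.

```csharp
using UnityEngine;

// Hypothetical sketch only: while the player takes small real steps inside the
// marked tracker range, the head's horizontal displacement is applied 1:1 to
// the character, producing slow, precise movement.
public class TrackedRangeLocomotion : MonoBehaviour
{
    public Transform head;                  // tracked HMD transform (assumption)
    public CharacterController controller;  // character to drive

    Vector3 lastHeadPosition;

    void Start()
    {
        lastHeadPosition = head.localPosition;
    }

    void Update()
    {
        // Horizontal head displacement since the last frame, in tracking space.
        Vector3 delta = head.localPosition - lastHeadPosition;
        delta.y = 0f;
        lastHeadPosition = head.localPosition;

        // Apply the same displacement to the character, converted into world space.
        controller.Move(transform.TransformDirection(delta));
    }
}
```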

Check out the example game video, EvilMech Runner: https://www.youtube.com/watch?v=LALSXmYQxe0

Author

Filipp Keks