iOS VR app in Unity


Jim Peraino

This project is built as the capstone project for the Udacity Virtual Reality Nanodegree program. It is a prototype for a VR interface that tests methods of interaction and modes of perspectival awareness in a VR environment. Features listed below per Udacity's rubric.

Built for iOS using Unity and Google's Cardboard SDK

Video Walkthrough Link



  • Animation Achievement When the cube in the center of the scene is pressed, the spheres animate into new positions. From the user's point of view, the spheres appear simply to grow in size; however, when users move, they can see that the spheres have actually moved through space.
  • Locomotion Achievement Users can click on arrows to move left and right; the movement itself is animated with iTween.
  • Physics Achievement When the user looks at a sphere, it is dislodged and falls to the bottom of the scene, and a sound plays on each collision. At the end of gameplay, all remaining spheres fall.
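The gaze-triggered physics above could be sketched roughly as the following Unity script. This is a hypothetical reconstruction, not the project's actual code: the class name `GazeDislodge` and the `OnGazeEnter` hook are assumptions, standing in for however the Cardboard SDK's gaze/reticle system notifies the sphere. It assumes each sphere has a kinematic Rigidbody (so it stays in place until looked at) and an AudioSource holding the collision sound.

```csharp
using UnityEngine;

// Hypothetical sketch: a sphere that becomes a free physics body when gazed at.
// Assumes the gaze raycaster calls OnGazeEnter, and that the sphere has a
// kinematic Rigidbody plus an AudioSource with the collision clip assigned.
public class GazeDislodge : MonoBehaviour
{
    private Rigidbody body;
    private AudioSource audioSource;

    void Start()
    {
        body = GetComponent<Rigidbody>();
        body.isKinematic = true;          // hold the sphere in place until gazed at
        audioSource = GetComponent<AudioSource>();
    }

    // Called by the gaze/reticle system when the user looks at this sphere.
    public void OnGazeEnter()
    {
        body.isKinematic = false;         // hand the sphere over to gravity
    }

    void OnCollisionEnter(Collision collision)
    {
        audioSource.Play();               // play a sound on each impact
    }
}
```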


  • Gamification Achievement The goal of the game is to make as many spheres drop as possible within the allotted time. A scoreboard shows the time remaining and the number of spheres remaining in the scene; fewer spheres remaining means a better result.
  • Alternative Storyline Achievement At the beginning of the game, the user chooses between two experiences: an easier version with fewer spheres and a harder version with more.
  • 3D Modeling Achievement The arrows for navigation were modeled in Rhino. Additionally, the sphere locations were generated by modeling an archway in Rhino and then using Grasshopper to export the point locations, which were fed into the objectMaker script.
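The Grasshopper-to-Unity pipeline above could be sketched as follows. This is an illustrative guess at what the objectMaker script does, not the actual implementation: the class and field names are assumptions, and it supposes the exported points arrive as a text file with one comma-separated `x,y,z` triple per line, which may not match the real export format.

```csharp
using UnityEngine;

// Hypothetical sketch of the objectMaker idea: instantiate a sphere at each
// point location exported from Grasshopper. Assumes the points are stored in
// a TextAsset with one "x,y,z" triple per line.
public class ObjectMaker : MonoBehaviour
{
    public GameObject spherePrefab;   // sphere to place at each archway point
    public TextAsset pointData;       // Grasshopper export, e.g. "1.0,2.5,0.3"

    void Start()
    {
        foreach (string line in pointData.text.Split('\n'))
        {
            string[] parts = line.Trim().Split(',');
            if (parts.Length != 3) continue;   // skip blank or malformed lines

            Vector3 position = new Vector3(
                float.Parse(parts[0]),
                float.Parse(parts[1]),
                float.Parse(parts[2]));

            Instantiate(spherePrefab, position, Quaternion.identity, transform);
        }
    }
}
```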

Challenges (500 points):

  • User Testing Achievement
    • Two rounds of user testing were held to develop the user interface and to improve user experience.
    • The first round focused on locomotion and tested several different types of movement. While earlier iterations avoided tweening in an attempt to prevent motion sickness, users reported that animating the movement made it clearer what was happening as their perspective shifted, and did not cause motion sickness.
    • The second round of user testing focused on the UI, and resulted in user controls that move with the user as they change positions rather than staying in a globally fixed location.
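The second user-testing result above, controls that travel with the user rather than sitting at a fixed world position, could be sketched like this. The class name, the `Reposition` method, and the idea of calling it after a locomotion tween completes are all assumptions for illustration, not the project's actual code.

```csharp
using UnityEngine;

// Hypothetical sketch: a UI panel that repositions itself in front of the
// user after they move, instead of staying at a globally fixed location.
// Assumes "head" is the Cardboard camera transform.
public class FollowUserUI : MonoBehaviour
{
    public Transform head;        // the VR camera transform
    public float distance = 2f;   // how far in front of the user to sit

    // Call this when a locomotion tween finishes, so the controls
    // re-anchor themselves relative to the user's new position.
    public void Reposition()
    {
        Vector3 forward = head.forward;
        forward.y = 0f;                       // keep the panel level with the user
        forward.Normalize();

        transform.position = head.position + forward * distance;
        transform.rotation = Quaternion.LookRotation(forward);
    }
}
```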