
Interactions

ludos1978 edited this page Jun 28, 2018 · 2 revisions

Interactions with UI elements can be performed with several input devices. Devices that have been used are the Vive controllers, Leap Motion touch interaction, gaze-based control, and mouse controls.

The VR interactions are based on three basic events, plus some numbered events that have not been tested:

  • Grab
  • Use
  • Touch
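To illustrate how these three events relate, here is a minimal sketch of an event model in Python. This is not the NetXr API (which is Unity/C#); all class and method names below are invented for illustration.

```python
# Illustrative sketch of the three basic interaction events (Grab, Use, Touch).
# Names are hypothetical; NetXr itself implements this in Unity/C#.
from enum import Enum, auto

class InteractionEvent(Enum):
    GRAB = auto()   # attach the object to the controller
    USE = auto()    # activate the object
    TOUCH = auto()  # contact / hover on the object

class Interactable:
    """A world object that reacts to the basic interaction events."""
    def __init__(self, name):
        self.name = name
        self.log = []  # record received events for demonstration

    def handle(self, event):
        # A real implementation would attach, activate, or highlight here.
        self.log.append(event)

class InteractionDispatcher:
    """Routes an event from an input device to the currently hit object."""
    def dispatch(self, target, event):
        target.handle(event)

cube = Interactable("cube")
dispatcher = InteractionDispatcher()
dispatcher.dispatch(cube, InteractionEvent.TOUCH)
dispatcher.dispatch(cube, InteractionEvent.GRAB)
dispatcher.dispatch(cube, InteractionEvent.USE)
```

In the real system the dispatcher's role is played by the input controllers on the camera-rig prefabs, which raycast into the scene and forward the events to the hit object.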

Which button triggers each event is defined in `NetXr/Prefabs/CameraRigs/.../...Controller` (Mouse, Vive, Leap).

Each Input Controller Setup offers the following options:

  • UI raycast always enabled
  • Physics raycast always enabled
  • When to show the UI ray (only on UI hit, always, or never)
  • When to show the physics ray (always or never)
  • How physics collision detection is performed (as a raycast, a parabolic ray, or a spherecast)
  • The names of the child GameObjects used as the UI and physics ray sources, the spherecast source, and the attachment point for objects
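The options above can be pictured as a small configuration record. The sketch below is illustrative Python, not the actual inspector settings; all field names are assumptions, and in practice these values are edited on the controller prefabs in the Unity inspector.

```python
# Illustrative sketch of an input-controller configuration mirroring the
# options listed above. Field and enum names are invented for illustration.
from dataclasses import dataclass
from enum import Enum, auto

class RayVisibility(Enum):
    NEVER = auto()
    ALWAYS = auto()
    ONLY_ON_UI_HIT = auto()  # applies to the UI ray only

class CastMode(Enum):
    RAYCAST = auto()
    PARABOLIC = auto()
    SPHERECAST = auto()

@dataclass
class InputControllerSetup:
    ui_raycast_always_enabled: bool = True
    physics_raycast_always_enabled: bool = False
    ui_ray_visibility: RayVisibility = RayVisibility.ONLY_ON_UI_HIT
    physics_ray_visibility: RayVisibility = RayVisibility.NEVER
    cast_mode: CastMode = CastMode.RAYCAST
    # Names of child GameObjects resolved at runtime (hypothetical defaults):
    ui_ray_source: str = "UiRaySource"
    physics_ray_source: str = "PhysicsRaySource"
    sphere_cast_source: str = "SphereCastSource"
    attachment_point: str = "AttachmentPoint"

# Example: a controller that uses a parabolic pointer for teleport-style aiming.
setup = InputControllerSetup(cast_mode=CastMode.PARABOLIC)
```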

For each event you can define which key, button, or touch triggers it:

  • which button activates UI raycasting (can disable an always-enabled UI raycast)
  • which button activates physics raycasting
  • buttons to switch between the ray, parabola, and sphere casting modes
  • the UI press activation button
  • the use event for world objects
  • the attach/grab event for world objects
  • the touch event for world objects
  • additional events for world objects
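A per-device binding table like the one described above can be sketched as a simple mapping from buttons to events, including a button that cycles through the casting modes. This is an illustrative Python sketch under assumed names; the real bindings are configured on the controller prefabs.

```python
# Illustrative sketch of per-event button bindings, including a button that
# cycles the ray / parabola / sphere casting modes. All names are invented.
CAST_MODES = ["raycast", "parabolic", "spherecast"]

class ButtonBindings:
    def __init__(self, bindings):
        # bindings maps a button name (e.g. "trigger") to an event name.
        self.bindings = bindings
        self.cast_mode_index = 0  # start in plain raycast mode

    def on_button(self, button):
        """Return the event fired by a button press, handling mode switching."""
        event = self.bindings.get(button)
        if event == "switch_cast_mode":
            self.cast_mode_index = (self.cast_mode_index + 1) % len(CAST_MODES)
            return CAST_MODES[self.cast_mode_index]
        return event

# Hypothetical binding set for a Vive-style controller:
vive = ButtonBindings({
    "trigger": "ui_press",
    "grip": "grab",
    "trackpad": "use",
    "menu": "switch_cast_mode",
})
```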

Additional manual callbacks for the device buttons can be registered as well (see also Event-System-(Advanced)).
