
Programming Guide


When an onAirVR server application is launched and the first scene is loaded, the first awakened AirVRCameraRig instantiates an AirVRServer instance and starts it up. Each AirVRCameraRig then registers itself with the AirVRCameraRigManager in the scene (if no AirVRCameraRigManager exists in the scene, one is instantiated automatically). Then,

  1. The onAirVR app on a mobile VR device connects to the onAirVR server application.
  2. AirVRServer establishes a session and informs AirVRCameraRigManager.
  3. AirVRCameraRigManager then finds an available AirVRCameraRig, and
  4. binds the AirVRCameraRig to the session.
  5. Data from the client - such as the HMD orientation and input device values - is applied to the AirVRCameraRig through the session.
  6. Meanwhile, the AirVRCameraRig renders video frames using its child Unity cameras, then encodes and sends the frames back to the client.

Figure 3.

If you load a new scene containing AirVRCameraRigs,

  1. the AirVRCameraRigs in the old scene are unbound from their current sessions, then
  2. AirVRCameraRigManager tries to bind the AirVRCameraRigs in the new scene to those sessions.

Figure 4.
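
Loading such a scene requires nothing onAirVR-specific as far as the binding flow above is concerned. Below is a minimal sketch using standard Unity scene loading; "NextRoom" is a hypothetical scene name.

using UnityEngine;
using UnityEngine.SceneManagement;

// Minimal sketch : loading a new scene containing AirVRCameraRigs.
// AirVRCameraRigManager unbinds the rigs of the old scene and rebinds the
// sessions to rigs in the new scene automatically.
public class SceneSwitcher : MonoBehaviour {
    public void SwitchScene() {
        SceneManager.LoadScene("NextRoom"); // hypothetical scene name
    }
}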


You can configure the server settings, including the network configuration and rendering framerate, in Project Settings.

Figure 5.

License File : string
  • The path of the onAirVR server license file which the built executable will use. (In the editor, Assets/onAirVR/Server/Editor/Misc/onairvr.license is always used. See “Build” for details.)
Max Client Count : int
  • The maximum number of clients that can be connected simultaneously
Port : int
  • The onAirVR server port number
Adaptive Frame Rate : bool
  • Whether to adjust the application framerate dynamically to the video framerate requested by the connected client (see below)
Minimum Frame Rate : int
  • The minimum rendering framerate when no client is connected

Each client may have its own appropriate video framerate (though only Oculus Quest is officially supported), so the onAirVR server needs to vary its rendering framerate to match the current client. If Adaptive Frame Rate is enabled in Project Settings, the onAirVR server turns VSync off and dynamically adjusts the application framerate to the video framerate requested by the client.
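
This effectively amounts to the following two Unity calls, which the server applies internally; this is an illustrative sketch only, and 72 is just a hypothetical framerate a client might request.

using UnityEngine;

// Illustration only : what enabling Adaptive Frame Rate effectively does internally.
public class AdaptiveFrameRateSketch : MonoBehaviour {
    void Apply(int requestedFrameRate /* e.g. 72, as requested by the client */) {
        QualitySettings.vSyncCount = 0;                    // turn VSync off so targetFrameRate takes effect
        Application.targetFrameRate = requestedFrameRate;  // match the client's requested video framerate
    }
}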


The onAirVR server acts like a video streaming server which streams real-time rendered video frames to clients, so two or more clients can connect and play simultaneously. To make a scene with multiple players, you just need to:

  1. Put two or more AirVRCameraRig instances in the scene, and
  2. ensure that Max Client Count in Project Settings is two or more.

Then, when a session is established for a client, AirVRCameraRigManager randomly selects one of the available AirVRCameraRigs in the scene and binds it to the session. Alternatively, if you implement AirVRCameraRigManager.EventHandler, AirVRCameraRigManager asks you to select one of the AirVRCameraRigs through AirVRCameraRigManager.EventHandler.AirVRCameraRigWillBeBound() (see the event handler example below).

Figure 6.

Note

There is a limitation on the number of encoding sessions depending on your graphics card. For example, NVIDIA GeForce graphics cards allow up to two encoding sessions due to licensing restrictions. In that case, you must not make your content available to more than two clients simultaneously.


There are two main components that raise events you might be interested in - AirVRServer and AirVRCameraRigManager. To handle these events, you need to:

  1. Implement the AirVRServer.EventHandler interface and assign it to AirVRServer.Delegate, and/or
  2. Implement the AirVRCameraRigManager.EventHandler interface and assign it to AirVRCameraRigManager.managerOnCurrentScene.Delegate.

Please see “API References” for details.

using System.Collections.Generic;
using UnityEngine;

// Example implementation of AirVRServer.EventHandler
public class ServerEventHandler : MonoBehaviour, AirVRServer.EventHandler {
    void Awake() {
        AirVRServer.Delegate = this;
    }

    public void AirVRServerFailed(string reason) {
        Debug.Log(reason);
    }

    public void AirVRServerClientConnected(int clientHandle) {}

    public void AirVRServerClientDisconnected(int clientHandle) {}
}

// Example implementation of AirVRCameraRigManager.EventHandler
public class CameraRigEventHandler : MonoBehaviour, AirVRCameraRigManager.EventHandler {
    void Awake() {
        AirVRCameraRigManager.managerOnCurrentScene.Delegate = this;
    }

    public void AirVRCameraRigWillBeBound(int clientHandle, AirVRClientConfig config, List<AirVRCameraRig> availables, out AirVRCameraRig selected) {
        // select the first available camera rig, or none if no rig is available
        selected = availables.Count > 0 ? availables[0] : null;
    }

    public void AirVRCameraRigActivated(AirVRCameraRig cameraRig) {}

    public void AirVRCameraRigDeactivated(AirVRCameraRig cameraRig) {}

    public void AirVRCameraRigHasBeenUnbound(AirVRCameraRig cameraRig) {}
}

Using the AirVRInput class, you can get the values of the Oculus Touch controllers bound to an AirVRCameraRig. As you can see in “API References”, you can use AirVRInput in the same manner as UnityEngine.Input, except that AirVRInput methods take an AirVRCameraRig as an argument.

Please read “Raw Mapping” in the Oculus Quest documentation (Map Controllers) to see how each axis or button is mapped to Oculus Touch.
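
As a hedged sketch of the usage pattern only - the method and button identifiers below are assumptions, so check “API References” for the actual ones:

using UnityEngine;

// Hypothetical usage sketch : AirVRInput mirrors UnityEngine.Input, but each
// method takes the target AirVRCameraRig as an argument. The method and button
// names below are assumptions; see "API References" for the actual identifiers.
public class ControllerLogger : MonoBehaviour {
    public AirVRCameraRig cameraRig;

    void Update() {
        if (AirVRInput.GetButtonDown(cameraRig, "Fire1")) { // hypothetical method/button name
            Debug.Log("a button was pressed on the controller bound to this rig");
        }
    }
}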


Like the camera viewpoint or the mouse pointer in a desktop UI environment, the hand controllers are used as pointing devices in mobile VR UI. The onAirVR Event System integrates them into the Unity Event System, and you can use it in the same manner as the Unity Event System. The table below describes which onAirVR Event System component corresponds to which Unity Event System component.

| Component | UnityEngine.EventSystems | onAirVR |
| --- | --- | --- |
| Event system | EventSystem | AirVREventSystem |
| Input module | StandaloneInputModule, TouchInputModule, etc. | AirVRInputModule |
| Raycaster on Canvas | GraphicRaycaster | AirVRGraphicRaycaster (works on a Canvas in world space only) |
| Raycaster in physics | PhysicsRaycaster | AirVRStereoCameraRig.Raycast Physics (the left/right hand anchors raycast against physics colliders) |
| Pointer | UnityEngine.Camera, mouse pointer, etc. | AirVRStereoCameraRig.Event System Responsive (the left/right hand controllers interact with the event system) |

Please see the “B. Event System” sample scene for a detailed example of how to use the onAirVR Event System.
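
For a rough picture of how the pieces fit together, here is a hedged runtime sketch using the component names from the table above. In practice you would add these components in the Inspector, as the sample scene does.

using UnityEngine;

// Hedged illustration : wiring the onAirVR Event System components at runtime.
// Normally these are set up in the editor; see the "B. Event System" sample scene.
public class AirVREventSystemSetup : MonoBehaviour {
    public Canvas worldSpaceCanvas;   // AirVRGraphicRaycaster works on a world space Canvas only

    void Awake() {
        // replaces EventSystem + StandaloneInputModule of the Unity Event System
        var eventSystem = new GameObject("AirVR Event System");
        eventSystem.AddComponent<AirVREventSystem>();
        eventSystem.AddComponent<AirVRInputModule>();

        // replaces GraphicRaycaster on the world space Canvas
        worldSpaceCanvas.gameObject.AddComponent<AirVRGraphicRaycaster>();
    }
}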

Figure 13.


AirVRServerAudioOutputRouter routes the stereo audio rendered by the Unity audio engine to connected clients. It must be attached to a GameObject which also has a UnityEngine.AudioListener attached.

Note that AirVRStereoCameraRig.Send Audio automatically creates an AirVRServerAudioOutputRouter at the CenterEyeAnchor. However, for advanced use cases, you may want to disable AirVRStereoCameraRig.Send Audio and manually add the router component to any other transform.

There are several options for input and output:

  • Input
    • AudioListener : routes the stereo audio heard by Unity’s AudioListener to clients
    • AudioPlugin : routes the stereo audio passing through the “AirVR Server Audio Output” audio plugin in a Unity AudioMixer to clients
  • Output
    • All : broadcasts audio to all connected clients
    • One : routes audio to one specific client

For advanced audio applications, you might want to read the code of AirVRServerAudioOutputRouter, which takes raw audio data from the Unity audio engine and sends it to clients using AirVRServer.SendAudioFrame(). You can use this method to send your own raw audio directly, for example when you are using a third-party or your own audio engine.
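
For example, here is a hedged sketch of sending your own raw audio. The parameter list of AirVRServer.SendAudioFrame() shown below is an assumption, so refer to the AirVRServerAudioOutputRouter source for the actual signature.

using UnityEngine;

// Hedged sketch : forwarding Unity's interleaved stereo buffer to clients.
// The SendAudioFrame() parameter list shown here is an assumption; refer to
// the AirVRServerAudioOutputRouter source for the actual signature.
[RequireComponent(typeof(AudioListener))]
public class CustomAudioSender : MonoBehaviour {
    void OnAudioFilterRead(float[] data, int channels) {
        // data is the interleaved audio just rendered by the Unity audio engine
        AirVRServer.SendAudioFrame(data, data.Length / channels, channels); // hypothetical signature
    }
}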

Note

The “AirVR Server Audio Output” audio plugin uses an “audio renderer ID” to send audio data to an AirVRCameraRig. When you set AudioPlugin as the input, you must (1) expose the RendererID parameter of the plugin, then (2) set its name to the exposedRendererIDParameterName field of AirVRServerAudioOutputRouter.

Figure 13.5.


This feature is going to be deprecated. Please use the default model, Head only.
