This is an example HITL application that allows a user to interact with a scene, controlling a human avatar via mouse/keyboard or a VR headset. A policy-driven Spot robot also interacts with the scene. This is a proof of concept, not meant for rigorous evaluation or data collection.

Note: The robot policy used in this example application will randomly move around and pick up objects. You can use this application as a starting point for building your own VR application.
# Pick_throw_vr HITL application
```bash
HABITAT_SIM_LOG=warning MAGNUM_LOG=warning \
python examples/hitl/pick_throw_vr/pick_throw_vr.py
```
See `config/pick_throw_vr.yaml`. You can also use the configs in `experiment` as overrides, e.g.:

```bash
python examples/hitl/pick_throw_vr/pick_throw_vr.py +experiment=headless_server
```
## VR

The human avatar can optionally be controlled from VR. In this mode, the Pick_throw_vr app must still be run on a headed desktop machine, and it still offers a 3D window and some limited keyboard/mouse controls. However, it also acts as a server that communicates with our Unity-based VR client (below), which immerses the VR user in the Habitat environment.
The system is composed of the following components:
- The server: the Pick_throw_vr app.
- The client: a Unity app that can be run from within the Unity Editor or deployed to a VR headset.
| Server requirements | Notes |
|---|---|
| habitat-sim | Use a nightly conda build, or build from source. Bullet is required. |
| Datasets | After installing habitat-sim, run the following command from the root habitat-lab directory: `python -m habitat_sim.utils.datasets_download --uids hab3-episodes habitat_humanoids hab_spot_arm ycb hssd-hab --data-path data/` |
| hssd-models | Required for processing datasets for Unity. Clone it anywhere; its location is specified later as a command-line argument. |
| Client requirements | Notes |
|---|---|
| VR headset | We recommend Quest 3 (best) or Quest Pro with ~300 MB of free storage. Make sure that developer mode is activated. More complex HSSD scenes may run poorly or not at all on Quest 2. Other VR headsets supported by Unity should also work. |
| siro_hitl_unity_client | Note that a Unity license may be required by your organization. Follow these installation instructions. |
The standard keyboard/mouse launch command-line arguments can be used, with the following differences:

- The `habitat_hitl.networking.enable=True` config override launches the Pick_throw_vr app as a server, allowing a remote client (e.g. a VR headset) to connect and control the human avatar.

```bash
HABITAT_SIM_LOG=warning MAGNUM_LOG=warning \
python examples/hitl/pick_throw_vr/pick_throw_vr.py \
habitat_hitl.networking.enable=True
```
We also have an experimental headless server:

```bash
python examples/hitl/pick_throw_vr/pick_throw_vr.py \
+experiment=headless_server
```
Because the Unity application is a remote client, it must have its own copy of the 3D models used for rendering scenes. Habitat's 3D models are not directly compatible with Unity and must be simplified to achieve acceptable performance on VR devices. Therefore, a script is provided so that you can process your datasets and add them to your Unity project. The dataset processing script requires the latest Magnum binaries, which should be installed separately from Habitat as described below.
#### Mac

Magnum is easiest to install on Mac via Homebrew.

- Follow the magnum-bindings installation instructions.
  - In addition to `corrade`, `magnum`, and `magnum-bindings`, you may need `magnum-plugins`.
- Test your install: `python -c "from magnum import math, meshtools, scenetools, trade"`
  - Beware that Homebrew installs Python packages like magnum to its own Python location, not your current conda environment's Python.
  - Depending on how Homebrew has installed Python, you may need to use `python3` instead of `python`.
#### Linux

It is recommended that you create a new `conda` environment for the Magnum binaries so that they can be reused in the future without interfering with Habitat.

- Navigate to the Magnum CI.
- Select the latest green workflow run.
- Scroll down to "Artifacts".
- Download the binaries that match your system (e.g. on Linux: `magnum-tools-v2020.06-...-linux-x64`).
- Extract them to a convenient location.
- Create a new `conda` environment: `conda create --name magnum python=3.10`
- Navigate to the `site-packages` directory of your new environment, e.g. `~/anaconda/envs/magnum/lib/python3.10/site-packages/`.
- Create a `magnum.pth` file in this directory.
- Add the absolute path of `magnum-tools`' `python` folder to this file, e.g.: `/home/USER/Documents/magnum-tools/linux-x64/python/`
- The Magnum libraries will now be included upon activating your `magnum` environment. You can validate the install by checking that the following commands don't return errors:

```bash
conda activate magnum
python -c "from magnum import math, meshtools, scenetools, trade"
```
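The `magnum.pth` step above can also be scripted. A minimal sketch, assuming the `magnum-tools` archive was extracted to the example location from the instructions (both paths are placeholders you must adjust for your machine):

```python
import site
from pathlib import Path


def write_magnum_pth(site_packages: Path, magnum_python_dir: Path) -> Path:
    """Write a magnum.pth file so the interpreter appends the
    magnum-tools python/ folder to sys.path at startup."""
    pth_file = site_packages / "magnum.pth"
    # A .pth file simply lists one directory path per line.
    pth_file.write_text(str(magnum_python_dir) + "\n")
    return pth_file


if __name__ == "__main__":
    # Placeholder path to the extracted magnum-tools archive.
    magnum_python = Path.home() / "Documents/magnum-tools/linux-x64/python"
    # site-packages of the currently active environment.
    site_packages = Path(site.getsitepackages()[0])
    print(write_magnum_pth(site_packages, magnum_python))
```

Run this with the `magnum` conda environment active so the file lands in that environment's `site-packages`.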
To process the dataset, navigate to your `habitat-lab` root directory and run the following command:

```bash
python ./scripts/unity_dataset_processing/unity_dataset_processing.py \
--hssd-hab-root-dir data/scene_datasets/hssd-hab \
--hssd-models-root-dir path_to/hssd-models/objects \
--scenes 105515448_173104512
```

The transformed assets will be output to `data/hitl_simplified_data`.
In Unity, open the project and use `Tools/Update Data Folder...`. In the dialog window, paste the path to the generated `data/hitl_simplified/data` into the `External Data Path` field. The resources will then be imported into Unity.
At this point, you should be able to run the HITL app remotely from the Unity Editor. To validate that everything is in place, follow these steps:

- Start the HITL tool by running the server launch command above from the root `habitat-lab` directory.
- In the Unity Editor, load the `GfxReplayPlayerScene` scene.
- Press Play.
- After a short while, your Unity client will connect to your local server instance. You can navigate in the Unity viewport, and the movements will be reflected on the server.
  - Use WASD and the mouse to navigate.
  - With some familiarity, you can use the XR Device Simulator (see on-screen help).
If the application works correctly from the Unity Editor, you may now deploy it to a Quest headset.

- Quest is an Android device. In the Unity Editor, go to `Build Settings`. From the platform list, select `Android`, then `Switch Platform`.
- Plug your Quest into your machine via USB.
  - A popup will show up in your Quest headset asking you to authorize the computer.
- Still in `Build Settings`, refresh the device list, then look for your specific Quest device in the dropdown menu and select it.
- Click `Build and Run` and ensure that it completes without error. You'll be prompted for a build save location; any location will do.
- Put on your headset. The app may already be running; if not, you can find the application `siro_hitl_vr_client` in your applications list.
- The application won't connect to the server yet. Follow the steps below to enable the connection.
Upon launching, the server starts listening for incoming connections. The client attempts to connect to the addresses listed in `Android/data/com.meta.siro_hitl_vr_client/files/config.txt`, rotating between the addresses periodically until a connection is established.

- Make sure that your Quest is connected to your machine via USB.
  - A popup will show up in your Quest headset asking you to authorize the computer.
- Navigate to `Android/data/com.meta.siro_hitl_vr_client/files/config.txt`.
- Put your server IP addresses there (they can be found using `hostname -i`, for example).
- Save and restart `siro_hitl_vr_client`.
  - You may now disconnect the USB cable.
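For illustration, here is a sketch of what `config.txt` might contain with two candidate server addresses. The addresses below are placeholders, and the assumption that the file lists one address per line should be checked against the siro_hitl_unity_client documentation:

```
192.168.1.42
10.0.0.17
```

The client cycles through the listed addresses until one of them accepts the connection.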
See the troubleshooting notes below if the connection fails.

See the on-screen help for information about controls.

- Use `T` to toggle between server-controlled mouse/keyboard controls and client-controlled VR controls.
See the troubleshooting steps on siro_hitl_unity_client if you have issues deploying the client to your VR device.

- Make sure that your server firewall allows incoming connections on port `8888`.
- Check that the Unity client's `config.txt` file lists the correct address. See this section.
- Make sure that both devices are on the same network.
  - Corporate networks may introduce additional hurdles. To circumvent these, you can use your phone's wifi hotspot or a separate router.
- Check that only one server is running on your PC.
- If your server runs on a Mac, consider disabling Retina resolution. You can use displayplacer to achieve this.
  - If you need to mirror your screen, do it before using the tool.
  - Use `displayplacer list` to see all supported display modes.
  - Find a mode that does not use Retina and runs at 60 FPS.
  - Apply the mode using `displayplacer "id:X mode:Y"`.
- If using a laptop, make sure that power is connected.
- If running on a busy network, consider using your phone's wifi hotspot or a separate router.
- Performance is poor on Quest Pro. Consider using smaller scenes, or increasing mesh-simplification aggressiveness in the dataset processing tool. See decimate.py.
- The VR device transforms are currently sent from the client to the server at a lower frequency than the simulation rate. This causes grabbed objects to jitter; this is currently expected behavior.
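When diagnosing connection failures, it can help to first verify basic TCP reachability of the server before debugging the app itself. A minimal Python sketch, assuming the port `8888` mentioned above; the address in the example is a placeholder:

```python
import socket


def can_reach(host: str, port: int = 8888, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


if __name__ == "__main__":
    # Placeholder address: replace with your server's IP (e.g. from `hostname -i`).
    print(can_reach("192.168.1.42"))
```

Run this from the same network as the headset; if it prints `False`, check the firewall and the server address before suspecting the client.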