
Commit

Merge branch 'main' of github.com:facebookresearch/habitat-lab into zf/get-cam-transform-fix
An HPCaaS user for priparashar committed Apr 10, 2024
2 parents 0f2d71f + 1ff9b5f commit 5c1c555
Showing 62 changed files with 2,885 additions and 1,354 deletions.
19 changes: 16 additions & 3 deletions .circleci/config.yml
@@ -235,6 +235,7 @@ jobs:
while [ ! -f ~/miniconda/pytorch_installed ]; do sleep 2; done # wait for Pytorch
pip install -e habitat-lab
pip install -e habitat-baselines
pip install -e habitat-hitl
- save_cache:
key: conda-{{ checksum "habitat-lab/.circleci/config.yml" }}-{{ checksum "./date" }}
background: true
@@ -270,8 +271,18 @@ jobs:
. activate habitat; cd habitat-lab
export PYTHONPATH=.:$PYTHONPATH
export MULTI_PROC_OFFSET=0 && export MAGNUM_LOG=quiet && export HABITAT_SIM_LOG=quiet
python -m pytest --cov-report=xml --cov-report term --cov=./
python -m pytest test/ --cov-report=xml --cov-report term --cov=./
- codecov/upload
- run:
name: Run HITL tests
no_output_timeout: 60m
command: |
export PATH=$HOME/miniconda/bin:/usr/local/cuda/bin:$PATH
. activate habitat; cd habitat-lab
export PYTHONPATH=.:$PYTHONPATH
export MULTI_PROC_OFFSET=0 && export MAGNUM_LOG=quiet && export HABITAT_SIM_LOG=quiet
python -m habitat_sim.utils.datasets_download --uids hab3-episodes hab3_bench_assets habitat_humanoids hab_spot_arm ycb --data-path data/ --no-replace --no-prune
python -m pytest habitat-hitl/test
- run:
name: Run baseline training tests
no_output_timeout: 30m
@@ -280,9 +291,9 @@
. activate habitat; cd habitat-lab
export PYTHONPATH=.:$PYTHONPATH
export MULTI_PROC_OFFSET=0 && export MAGNUM_LOG=quiet && export HABITAT_SIM_LOG=quiet
# This is a flag that enables test_test_baseline_training to work
# This is a flag that enables test_baseline_training to work
export TEST_BASELINE_SMALL=1
python -m pytest test/test_baseline_training.py -s
python -m pytest test/test_baseline_training.py -s
- run:
name: Run Hab2.0 benchmark
no_output_timeout: 30m
@@ -324,6 +335,8 @@ jobs:
python -c 'import habitat; print("habitat version:", habitat.__version__)'
pip install habitat-baselines/
python -c 'import habitat_baselines; print("habitat_baselines version:", habitat_baselines.__version__)'
pip install habitat-hitl/
python -c 'import habitat_hitl; print("habitat_hitl version:", habitat_hitl.__version__)'
- run: &build_sdist_and_bdist
name: Build sdist and bdist
command: |
2 changes: 2 additions & 0 deletions docs/conf.py
@@ -103,6 +103,7 @@
"pages/habitat-lab-tdmap-viz.rst",
"pages/habitat2.rst",
"pages/view-transform-warp.rst",
"pages/metadata-taxonomy.rst",
]

PLUGINS = [
@@ -137,6 +138,7 @@
("Habitat Lab TopdownMap Visualization", "habitat-lab-tdmap-viz"),
("Habitat 2.0 Overview", "habitat2"),
("View, Transform and Warp", "view-transform-warp"),
("'user_defined' Metadata Taxonomy", "metadata-taxonomy"),
],
),
("Classes", "classes", []),
113 changes: 113 additions & 0 deletions docs/pages/metadata-taxonomy.rst
@@ -0,0 +1,113 @@
Taxonomy of "user_defined" configurations in habitat-lab
########################################################

This resource page outlines the expected taxonomy of metadata fields and systems in habitat-lab that leverage the non-official "user_defined" Configuration fields for objects, stages, and scenes.

As outlined on the `Using JSON Files to configure Attributes <https://aihabitat.org/docs/habitat-sim/attributesJSON.html#user-defined-attributes>`_ doc page, "user_defined" attributes provide a generic, reserved JSON configuration node which can be filled with user data. The intent was that no "officially supported" metadata would use this field, leaving it open for arbitrary user metadata. However, several prototype and bleeding-edge features actively leverage this system. The purpose of this doc page is to enumerate those known uses and their taxonomy, to guide further development and avoid conflicts with ongoing or future work.


`Receptacles`_
==============

Who: Stages, RigidObjects, and ArticulatedObjects.
Where: stage_config.json, object_config.json, ao_config.json, scene_instance.json (overrides)

What: sub_config with key string containing "receptacle\_". "receptacle_mesh\_" defines a TriangleMeshReceptacle while "receptacle_aabb\_" defines a bounding box (AABB) Receptacle. See the `parse_receptacles_from_user_config <https://github.com/facebookresearch/habitat-lab/blob/main/habitat-lab/habitat/datasets/rearrange/samplers/receptacle.py>`_ function.

Example:

.. code:: python

    "user_defined": {
        "receptacle_mesh_table0001_receptacle_mesh": {
            "name": "table0001_receptacle_mesh",
            "parent_object": "0a5df6da61cd2e78e972690b501452152309e56b",  # handle of the parent ManagedObject's template
            "parent_link": "table0001",  # if attached to an ArticulatedLink, the parent link's name
            "position": [0, 0, 0],  # position of the receptacle in the parent's local space
            "rotation": [1, 0, 0, 0],  # orientation (quaternion) of the receptacle in the parent's local space
            "scale": [1, 1, 1],  # scale of the receptacle in the parent's local space
            "up": [0, 0, 1],  # up vector for the receptacle in the parent's local space (used for tilt culling and placement snapping)
            "mesh_filepath": "table0001_receptacle_mesh.glb"  # filepath for the receptacle's mesh asset (.glb with triangulated faces expected)
        }
    }

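
The receptacle prefixes above are what consumers key on. As a rough illustration (a sketch only, not the actual ``parse_receptacles_from_user_config`` implementation; the ``receptacle_aabb_counter_top`` key is hypothetical), sub-configs can be classified by key prefix:

```python
# Illustrative sketch: classify "user_defined" sub-config keys by the
# reserved receptacle prefixes described above. The real habitat-lab
# parser (parse_receptacles_from_user_config) does much more than this.
def classify_receptacle_keys(user_defined: dict) -> dict:
    """Map each receptacle sub-config key to a receptacle type name."""
    result = {}
    for key in user_defined:
        if key.startswith("receptacle_mesh_"):
            result[key] = "TriangleMeshReceptacle"
        elif key.startswith("receptacle_aabb_"):
            result[key] = "AABBReceptacle"
    return result

example = {
    "receptacle_mesh_table0001_receptacle_mesh": {},
    "receptacle_aabb_counter_top": {},  # hypothetical AABB receptacle key
    "scene_filter_file": "ignored.json",  # non-receptacle keys are skipped
}
print(classify_receptacle_keys(example))
```
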
`Scene Receptacle Filter Files`_
================================

Who: Scene Instances
Where: scene_instance.json
What: filepath (relative to dataset root directory) to the file containing Receptacle filter strings for the scene.

Example:

.. code:: python

    "user_defined": {
        "scene_filter_file": "scene_filter_files/102344022.rec_filter.json"
    }

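
Because the filter filepath is stored relative to the dataset root directory, a consumer must join the two before loading the filter file. A minimal sketch (``resolve_scene_filter_path`` and the ``data/fpss`` root are hypothetical, for illustration only):

```python
import os

# Hypothetical helper: resolve a scene's receptacle filter file path,
# which "user_defined" stores relative to the dataset root directory.
def resolve_scene_filter_path(dataset_root: str, user_defined: dict):
    rel = user_defined.get("scene_filter_file")
    if rel is None:
        return None  # this scene has no receptacle filter file
    return os.path.join(dataset_root, rel)

path = resolve_scene_filter_path(
    "data/fpss",
    {"scene_filter_file": "scene_filter_files/102344022.rec_filter.json"},
)
print(path)
```
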
`Object States`_
================

Who: RigidObjects and ArticulatedObjects
Where: object_config.json, ao_config.json, scene_instance.json (overrides)

What: sub_config containing any fields which pertain to the ObjectStateMachine and ObjectStateSpec logic. The exact taxonomy is in flux; consider this key reserved.

.. code:: python

    "user_defined": {
        "object_states": {
        }
    }

`Marker Sets`_
==============

Who: RigidObjects and ArticulatedObjects
Where: object_config.json, ao_config.json, scene_instance.json (overrides)

What: sub_config containing named sets of 3D points defined for various purposes (e.g. handles, faucets).

.. code:: python

    "user_defined": {
        "marker_sets": {
            "handle_marker_sets": {  # these are handles for opening an ArticulatedObject's links
                0: {  # these marker sets are attached to link_id "0"
                    "handle_0": {  # this is a set of 3D points
                        0: [x, y, z],  # we index because JSON needs a dict and Configuration cannot digest lists
                        1: [x, y, z],
                        2: [x, y, z]
                    },
                    ...
                },
                ...
            },
            "faucet_marker_set": {  # these are faucet points on sinks, in object local space
                0: {  # these marker sets are attached to link_id "0"; "-1" implies base link or rigid object
                    0: [x, y, z],  # this is a faucet
                    ...
                },
                ...
            }
        }
    }

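
As noted above, point sets are stored as integer-keyed dicts only because JSON needs a dict and Configuration cannot digest lists. A consumer will typically convert them back to ordered lists; a sketch (``marker_set_to_list`` and the concrete coordinates are illustrative, standing in for the ``[x,y,z]`` placeholders):

```python
# Convert an integer-keyed marker set (the on-disk JSON form) back into
# an ordered list of 3D points, sorted by index.
def marker_set_to_list(indexed_points: dict) -> list:
    return [indexed_points[i] for i in sorted(indexed_points)]

# Keys may arrive in any order; sorting by index restores the sequence.
handle_0 = {0: [0.1, 0.2, 0.3], 2: [0.7, 0.8, 0.9], 1: [0.4, 0.5, 0.6]}
print(marker_set_to_list(handle_0))
# [[0.1, 0.2, 0.3], [0.4, 0.5, 0.6], [0.7, 0.8, 0.9]]
```
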
`ArticulatedObject "default link"`_
======================================

Who: ArticulatedObjects
Where: ao_config.json

What: The "default" link (integer index) is the one link which should be used if only one joint can be actuated. For example, the largest or most accessible drawer or door. Cannot be base link (-1).

.. code:: python

    "user_defined": {
        "default_link": 5  # the link id which is "default"
    }

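
A consumer reading this field might validate it as follows. This is a hypothetical helper (``get_default_link`` is not a habitat-lab API); the constraint that the default link cannot be the base link (-1) comes from the description above:

```python
# Hypothetical accessor for the "default_link" user_defined field.
# Enforces that the value is a valid link index and never the base link (-1).
def get_default_link(user_defined: dict, num_links: int):
    link = user_defined.get("default_link")
    if link is None:
        return None  # no default link specified for this ArticulatedObject
    if not (0 <= link < num_links):
        raise ValueError(f"default_link {link} is not a valid link index")
    return link

print(get_default_link({"default_link": 5}, num_links=8))  # 5
```
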
1 change: 0 additions & 1 deletion examples/hitl/basic_viewer/basic_viewer.py
@@ -73,7 +73,6 @@ def _update_lookat_pos(self):
self._get_camera_lookat_pos(),
radius,
mn.Color3(255 / 255, 0 / 255, 0 / 255),
24,
)

@property
6 changes: 3 additions & 3 deletions examples/hitl/pick_throw_vr/pick_throw_vr.py
@@ -79,7 +79,9 @@ def __init__(self, app_service: AppService):
assert not self._app_service.hitl_config.camera.first_person_mode

self._nav_helper = GuiNavigationHelper(
self._app_service, self.get_gui_controlled_agent_index()
self._app_service,
self.get_gui_controlled_agent_index(),
user_index=0,
)
self._throw_helper = GuiThrowHelper(
self._app_service, self.get_gui_controlled_agent_index()
@@ -483,12 +485,10 @@ def _get_target_object_positions(self):
)

def _draw_circle(self, pos, color, radius):
num_segments = 24
self._app_service.gui_drawer.draw_circle(
pos,
radius,
color,
num_segments,
)

def _add_target_object_highlight_ring(
8 changes: 4 additions & 4 deletions examples/hitl/rearrange/rearrange.py
@@ -84,7 +84,9 @@ def __init__(
)

self._nav_helper = GuiNavigationHelper(
self._app_service, self.get_gui_controlled_agent_index()
self._app_service,
self.get_gui_controlled_agent_index(),
user_index=0,
)
self._episode_helper = self._app_service.episode_helper

@@ -121,7 +123,7 @@ def _update_grasping_and_set_act_hints(self):
color = mn.Color3(0, 255 / 255, 0) # green
goal_position = self._goal_positions[self._held_target_obj_idx]
self._app_service.gui_drawer.draw_circle(
goal_position, end_radius, color, 24
goal_position, end_radius, color
)

self._nav_helper.draw_nav_hint_from_agent(
@@ -138,7 +140,6 @@
can_place_position,
self._can_grasp_place_threshold,
mn.Color3(255 / 255, 255 / 255, 0),
24,
)

if self._app_service.gui_input.get_key_down(GuiInput.KeyNS.SPACE):
@@ -272,7 +273,6 @@ def _update_task(self):
can_grasp_position,
self._can_grasp_place_threshold,
mn.Color3(255 / 255, 255 / 255, 0),
24,
)

def get_gui_controlled_agent_index(self):
14 changes: 14 additions & 0 deletions examples/hitl/rearrange_v2/README.md
@@ -12,6 +12,15 @@ git clone --branch articulated-scenes --single-branch --depth 1 https://huggingf
mv fphab fpss
```

To test the Habitat-LLM episodes in `rearrange_v2`, you'll need to download and unzip the following [episode dataset](https://drive.google.com/file/d/1zFCBiWE_XFY0Ry9CZOV_NF_rfxBw1y-F/view?usp=sharing) in the Habitat-Lab root directory. In addition, you'll need the YCB, GSO, AI2THOR, and ABO object assets, which can be downloaded with the following commands:

```bash
cd data
git clone https://huggingface.co/datasets/ai-habitat/OVMM_objects objects --recursive
cd objects
git checkout 3893a735352b92d46505f35d759553f5fc82a39b
```

## Data directory

Run `rearrange_v2` from the Habitat-lab root directory. It will expect `data/` for Habitat-lab data, and it will also look for `examples/hitl/rearrange_v2/app_data/demo.json.gz` (included alongside source files in our git repo).
@@ -33,6 +42,11 @@ Headless server:
python examples/hitl/rearrange_v2/rearrange_v2.py +experiment=headless_server
```

To test Habitat-LLM episodes with a user-controlled humanoid, use:
```bash
python examples/hitl/rearrange_v2/rearrange_v2.py --config-name lang_rearrange_humanoid_only
```

## Controls
See on-screen help text. In addition, press `1` or `2` to select an episode.

@@ -0,0 +1,40 @@
# @package _global_

defaults:
- language_rearrange
- hitl_defaults
- _self_

habitat:
# various config args to ensure the episode never ends
environment:
max_episode_steps: 0
iterator_options:
# For the demo, we want to showcase the episodes in the specified order
shuffle: False

habitat_baselines:
# todo: document these choices
eval:
should_load_ckpt: False
rl:
agent:
num_pool_agents_per_type: [1, 1]
policy:


habitat_hitl:
window:
title: "Rearrange"
width: 1300
height: 1000
gui_controlled_agents:
- agent_index: 0
lin_speed: 10.0
ang_speed: 300
hide_humanoid_in_gui: True
camera:
first_person_mode: True
data_collection:
save_filepath_base: my_session
save_episode_record: True
