The repository https://github.com/NVIDIA-Omniverse/Orbit provides example reinforcement learning environments for Isaac Orbit.
These environments can be easily loaded and configured by calling a single function provided by this library. This function also makes it possible to configure the environment from command line arguments (see Running an RL environment) or from its parameters as a Python dictionary.
Note
The command line arguments have priority over the function parameters.
Note
Isaac Orbit environments implement a functionality to get their configuration from the command line. Because of this, setting the headless option from the trainer configuration will not work; instead, it is necessary to invoke the script as follows: python script.py --headless
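The priority of command line arguments over function parameters can be sketched as follows. This is an illustrative example, not skrl's actual implementation: the resolve_task_name function is hypothetical, and it simply shows how a loader can prefer a --task argument over the value passed in code.

```python
# Illustrative sketch (NOT skrl's actual implementation) of how a loader
# can give command line arguments priority over function parameters.
import argparse


def resolve_task_name(task_name: str = "") -> str:
    """Return the task name, preferring the --task command line argument."""
    parser = argparse.ArgumentParser()
    parser.add_argument("--task", type=str, default=None)
    # parse_known_args ignores unrelated arguments (e.g. --headless)
    args, _ = parser.parse_known_args()
    if args.task is not None:
        return args.task  # the command line argument has priority
    return task_name  # fall back to the function parameter
```

With this scheme, running `python main.py --task Isaac-Cartpole-v0` overrides any task name passed as a function parameter.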
Function parameters
# import the environment loader
from skrl.envs.torch import load_isaac_orbit_env
# load environment
env = load_isaac_orbit_env(task_name="Isaac-Cartpole-v0")
Command line arguments (priority)
# import the environment loader
from skrl.envs.torch import load_isaac_orbit_env
# load environment
env = load_isaac_orbit_env()
Run the main script passing the configuration as command line arguments. For example:
python main.py --task Isaac-Cartpole-v0
skrl.envs.torch.loaders.load_isaac_orbit_env