Intro to Reinforcement Learning term project
- Drive folder : https://drive.google.com/drive/u/1/folders/1IWBaSYP6VH7vti9cb_MgK4lh1kEbcMzq
- First meeting minutes : https://docs.google.com/document/d/1b4IXfd1U2ebf4b-UMGvzG1uoiOsij3C3kAkuit6AaRk/edit
- Project proposal : https://www.overleaf.com/8236317519rqfskxkjrkyn#9f34f4
- https://arxiv.org/abs/2107.04034 (Rapid Motor Adaptation - DP 2021)
- https://cs.gmu.edu/~xiao/Research/Verti-Wheelers/ (the only learning-based off-road navigation work we know of so far)
- https://arxiv.org/pdf/2107.08325.pdf (Imitation + Reinforcement learning example)
- https://drive.google.com/file/d/1YFdlU5zrgtFw-yxGwb9kknlHN_dyCmfw/view?usp=sharing (Wenli's slides, relevant work)
- https://theairlab.org/offroad/ (Airlab's work on off-road driving)
- Clone this repo
git clone https://github.com/dvij542/uneven-terrain-driver.git
- Download and install Unity Hub from here : https://unity.com/unity-hub . Currently, only Windows (7+) and macOS support the Unity game engine; Ubuntu 22.04 support is a WIP
- Open Unity Hub and navigate to the Installs tab in the left bar. Click on Install Editor and install Unity 2021.3 (NOTE : My version was 2021.3.24f1 and it now looks to be 2021.3.31f1; hopefully there are only minor changes/bug fixes from 24f1 to 31f1)
- Click on New project. This will open a window as shown below
- Once you are in the Unity editor window, close it and open the project from Unity Hub, pointing to this GitHub repo. Make sure the editor version is 2021.3. Initialization will take some time the first time. This should open the editor window as shown here :-
Follow these instructions : Link . You might need to install the C# dev tools if VSCode doesn't install them by default when adding the plugin. Autocomplete should work if everything went well. To test, just double-click on one of the scripts in Unity; that should open the VSCode project. Check that the code is well formatted by VSCode, and try typing something to check autocomplete.
The easiest way to read the code is to go directly to individual objects and see what scripts are attached to them.
- If you click on the Terrain object in the left pane, it should have a procedural_heightmap.cs script attached to it as shown below. Just double-click on the script reference and that should take you to the script in VSCode. This is the script used to generate the procedurally random heightmap and the features (trees, rocks, grass, etc.)
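The actual generation logic is the C# in procedural_heightmap.cs; purely as a rough sketch of the general idea (summing random noise at several spatial frequencies and bilinearly upsampling it), here is a short Python illustration. All names below are made up for illustration and are not the repo's code:

import numpy as np

def fractal_heightmap(size=257, octaves=5, seed=0):
    # Sum bilinearly-upsampled random grids at increasing frequency / decreasing amplitude.
    rng = np.random.default_rng(seed)
    heightmap = np.zeros((size, size))
    amplitude = 1.0
    for octave in range(octaves):
        cells = 2 ** (octave + 1) + 1                   # coarse grid resolution for this octave
        coarse = rng.random((cells, cells))
        idx = np.linspace(0, cells - 1, size)
        lo = np.minimum(np.floor(idx).astype(int), cells - 2)
        frac = idx - lo
        # Bilinear upsampling of the coarse grid to the full resolution.
        rows = coarse[lo] * (1 - frac)[:, None] + coarse[lo + 1] * frac[:, None]
        fine = rows[:, lo] * (1 - frac)[None, :] + rows[:, lo + 1] * frac[None, :]
        heightmap += amplitude * fine
        amplitude *= 0.5                                # higher frequencies contribute less
    return heightmap / heightmap.max()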
You can directly use the pre-built environments inside Build-ubuntu and Build-windows. Some additional custom-built environments can be found here: https://drive.google.com/drive/u/1/folders/1IWBaSYP6VH7vti9cb_MgK4lh1kEbcMzq
- Sim-to-real transfer to the RC car
Follow this Link
- Make sure you change the permissions of the Unity directory from Read-Only.
- Install Linux Build Support in Unity Hub
- Follow the above link and choose Linux as the target platform while building
conda create -n terrain_env python=3.8
conda activate terrain_env
python3.8 -m pip install mlagents==0.28.0
python3.8 -m pip install mlagents-envs==0.28.0
Ref: Link. Run the following commands
pip install mlagents==0.28.0
pip install mlagents-envs==0.28.0
cd Build-ubuntu
sudo chmod 777 *
cd ..
export PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION=python
mlagents-learn config.yaml --env "Build-ubuntu/exec" --run-id ubuntu_try15 --num-envs 10 --resume
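If you want to sanity-check the build from Python before (or instead of) launching full mlagents-learn training, the mlagents-envs low-level API can drive the same binary. A minimal sketch, assuming mlagents-envs==0.28.0 and a continuous action space; check the behavior spec the build actually exposes:

import numpy as np
from mlagents_envs.environment import UnityEnvironment
from mlagents_envs.base_env import ActionTuple

# Load the built environment (same binary used by mlagents-learn above).
env = UnityEnvironment(file_name="Build-ubuntu/exec", no_graphics=True)
env.reset()
behavior_name = list(env.behavior_specs)[0]
spec = env.behavior_specs[behavior_name]
print("Behavior:", behavior_name)
print("Observation shapes:", [obs.shape for obs in spec.observation_specs])

for _ in range(100):
    decision_steps, terminal_steps = env.get_steps(behavior_name)
    # Random continuous actions, just to verify the sim steps and returns observations.
    action = ActionTuple(continuous=np.random.uniform(
        -1, 1, (len(decision_steps), spec.action_spec.continuous_size)))
    env.set_actions(behavior_name, action)
    env.step()
env.close()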
You can also interact with the simulator as a ROS2 node. Full ROS2 integration is a WIP. To get started, follow the given steps:-
- Install ROS2 (Foxy, Iron or any other version)
- Install dependencies :-
sudo apt-get install ros-<ROS_VERSION>-ackermann-msgs
sudo apt-get install ros-<ROS_VERSION>-sensor-msgs
sudo apt-get install ros-<ROS_VERSION>-geometry-msgs
- If on Linux, also install Mono (mono-project), which is required for reading images within the application
- Change permissions of the ros2-env built environment with :-
cd ros2-env
sudo chmod 777 *
cd ..
- Your params.yaml file should be in the same directory you run from. It should contain the path to the heightmap, the spawn position, and which sensors to use (see the sketch after the run commands below). Run 'ros2_interface.py' :-
source /opt/ros/<ROS_VERSION>/setup.bash
python3 ros2_interface.py
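The exact keys of params.yaml are defined by ros2_interface.py, so check that script for the authoritative names. Purely to illustrate the kind of structure described above (all key names below are hypothetical; needs PyYAML), a params.yaml could look like what this snippet writes out:

import yaml

# Hypothetical keys, shown only to illustrate the structure described above --
# check ros2_interface.py for the names it actually expects.
params = {
    "heightmap_path": "heightmaps/terrain_0.png",   # path to the heightmap to load
    "spawn_position": [0.0, 0.5, 0.0],              # where to spawn the car
    "sensors": ["camera", "lidar", "imu"],          # which sensors to publish
}
with open("params.yaml", "w") as f:
    yaml.safe_dump(params, f)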
Running 'ros2_interface.py' should open a new simulator session with the ROS2 wrapper. You can see the available topics with 'ros2 topic list' in a new terminal; it should list all sensor topics, the pose, velocity and acceleration topics, and the /cmd topic on which the simulator listens for acceleration and throttle commands.
- (optional) You can launch the example keyboard teleop script under Scripts/keyboard_controller.py, which publishes to the '/cmd' topic from the arrow keys to control the car. You may need to install pygame to run the script
source /opt/ros/<ROS_VERSION>/setup.bash
python3 Scripts/keyboard_controller.py
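If you would rather drive the car programmatically than from the keyboard, a minimal rclpy publisher to '/cmd' could look like the sketch below. This assumes '/cmd' carries ackermann_msgs/msg/AckermannDriveStamped (suggested by the ackermann-msgs dependency above, but confirm the actual type with 'ros2 topic info /cmd' or by reading keyboard_controller.py):

import rclpy
from rclpy.node import Node
from ackermann_msgs.msg import AckermannDriveStamped  # ASSUMED message type of /cmd

class CmdPublisher(Node):
    def __init__(self):
        super().__init__("cmd_publisher")
        self.pub = self.create_publisher(AckermannDriveStamped, "/cmd", 10)
        self.timer = self.create_timer(0.05, self.publish_cmd)  # publish at 20 Hz

    def publish_cmd(self):
        msg = AckermannDriveStamped()
        msg.drive.acceleration = 0.5      # gentle forward acceleration
        msg.drive.steering_angle = 0.0    # drive straight
        self.pub.publish(msg)

def main():
    rclpy.init()
    rclpy.spin(CmdPublisher())

if __name__ == "__main__":
    main()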
- (optional) You can open rviz2 to visualize the sensor readings
source /opt/ros/<ROS_VERSION>/setup.bash
rviz2