
Deep Reinforcement Learning for Real Autonomous Mobile Robot Navigation in Social Environments

Simulation

Setup Packages and Libraries

cd simulation
git clone https://github.com/sybrenstuvel/Python-RVO2.git
cd Python-RVO2
pip install -r requirements.txt
python setup.py build
python setup.py install
cd ..
  • Install crowd_sim and crowd_nav as an editable pip package:
pip install -e .
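As a quick sanity check after the steps above (not part of the original instructions), the stdlib-only snippet below verifies that rvo2, crowd_sim and crowd_nav are importable from the current Python environment:

```python
import importlib.util

def check_install(modules):
    """Report whether each named module can be found on the current Python path."""
    return {name: importlib.util.find_spec(name) is not None for name in modules}

# Modules installed by the steps above.
for name, found in check_install(["rvo2", "crowd_sim", "crowd_nav"]).items():
    print(f"{name}: {'ok' if found else 'MISSING'}")
```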

After installing all the packages, run the commands below to train and test the RL model.

  1. Train a policy.
python train.py --policy scr

Sample output of the trained model:

video_e0.mp4
  2. Test policies with 500 test cases.
python acceptance_test_score.py --policy scr --model_dir data/output --phase test

Score Report:

2022-05-04 14:18:57, INFO: Using device: cpu
2022-05-04 14:18:57, INFO: Policy: SCR w/ global state
2022-05-04 14:18:57, INFO: human number: 5
2022-05-04 14:18:57, INFO: Not randomize human's radius and preferred speed
2022-05-04 14:18:57, INFO: Square width: 10.0, circle width: 4.0
2022-05-04 14:18:57, INFO: Agent is visible and has holonomic kinematic constraint
2022-05-04 14:49:25, INFO: TEST has success rate: 0.94, collision rate: 0.06, nav time: 9.06, total reward: 0.3574
2022-05-04 14:49:25, INFO: Frequency of being in danger: 3.61, average min separate distance in danger: 0.08
2022-05-04 14:49:25, INFO: Collision cases: 16 21 42 56 60 75 95 112 115 116 129 150 152 166 212 218 220 238 370 392 404 407 426 434 436 442 459 464 469 483 489 490
2022-05-04 14:49:25, INFO: Timeout cases:
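When aggregating results over several runs, the summary line of the report can be scraped programmatically. A minimal sketch (the regex assumes the exact metric names shown in the log above):

```python
import re

def parse_metrics(line):
    """Pull the named numeric metrics out of a single test-log line."""
    pairs = re.findall(
        r"(success rate|collision rate|nav time|total reward): ([0-9.]+)", line)
    return {name: float(value) for name, value in pairs}

log = ("2022-05-04 14:49:25, INFO: TEST has success rate: 0.94, "
       "collision rate: 0.06, nav time: 9.06, total reward: 0.3574")
print(parse_metrics(log))
```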

  3. Visualize a test case.
python acceptance_test_visualisation.py --policy scr --model_dir data/output --phase test --visualize --test_case 0

Test Case 0

test0.mov

Test Case 490

test490.mov
  4. Occupancy map visualization:

scr

Hardware

Setup Packages and Libraries

  1. Install ROS Noetic.
  2. Create and build a catkin workspace and download the codes into src/:
mkdir -p ~/catkin_ws/src
cd ~/catkin_ws/
catkin build
source devel/setup.bash
cd src
git clone https://github.com/Ashutosh1510/DeepRL_Robot_Nav

Start the Navigation

  1. Before starting the navigation, make sure your PC is connected to the TurtleBot3 and the LiDAR sensor (https://emanual.robotis.com/docs/en/platform/turtlebot3/appendix_lds_01/#lds-01).
  2. Bring up the TurtleBot3:
roslaunch turtlebot3_bringup turtlebot3_robot.launch
  3. Build a map of your environment using the gmapping package:
roslaunch turtlebot3_navigation move_base.launch
roslaunch turtlebot3_navigation amcl.launch

Then push or tele-operate the robot (e.g., with turtlebot3_teleop_key) to explore the environment and build a map. You can watch the map being built in real time in RViz. To save the map, open a new terminal and run:

mkdir -p ~/catkin_ws/src/DeepRl-Planning/deepRL_ros/map
rosrun map_server map_saver -f ~/catkin_ws/src/DeepRl-Planning/deepRL_ros/map/new_map
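map_saver writes two files: new_map.pgm (the occupancy grid image) and new_map.yaml (its metadata). The metadata is a flat list of `key: value` lines, so it can be inspected without a YAML library. A minimal sketch with illustrative values (your new_map.yaml will differ):

```python
def parse_map_yaml(text):
    """Parse the flat `key: value` metadata file written by map_saver."""
    meta = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(":")
        meta[key.strip()] = value.strip()
    return meta

# Illustrative metadata, not the actual saved map.
sample = """\
image: new_map.pgm
resolution: 0.050000
origin: [-10.000000, -10.000000, 0.000000]
negate: 0
occupied_thresh: 0.65
free_thresh: 0.196
"""
meta = parse_map_yaml(sample)
print(meta["image"], float(meta["resolution"]))  # resolution is metres per cell
```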
  4. Start navigation using our proposed policy to reach the goal location:
roslaunch deeprl_ros deepRL_navigation.launch

Output

rl1

rl2

rl3
