Deep Learning based wall/corridor following P3AT robot (ROS, Tensorflow 2.0)


Deep Navigation

1. Introduction

Feeding laser data into a neural network to generate velocity commands? Sounds like a simple idea, but it is not that easy. I created this repository to try different approaches to this problem, but I couldn't find the time to focus on the model itself. Maybe you can create a better model?

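To make the idea concrete, here is a minimal sketch of a network that maps a laser scan to a velocity command. This is a NumPy illustration of the concept only, not the repository's actual model; the layer sizes, the input normalization, and the (linear, angular) output convention are all assumptions.

```python
import numpy as np

def init_mlp(n_in=360, n_hidden=64, seed=0):
    """Random weights for a 2-layer MLP: laser scan -> (linear, angular) velocity.
    Sizes are illustrative assumptions, not the repo's architecture."""
    rng = np.random.default_rng(seed)
    return {
        "W1": rng.normal(0.0, 0.1, (n_in, n_hidden)),
        "b1": np.zeros(n_hidden),
        "W2": rng.normal(0.0, 0.1, (n_hidden, 2)),
        "b2": np.zeros(2),
    }

def predict(params, scan, max_range=20.0):
    """Normalize ranges to [0, 1], run the MLP, squash outputs into [-1, 1]."""
    x = np.clip(scan, 0.0, max_range) / max_range
    h = np.tanh(x @ params["W1"] + params["b1"])
    return np.tanh(h @ params["W2"] + params["b2"])  # (linear, angular)

params = init_mlp()
scan = np.full(360, 5.0)        # fake scan: walls 5 m away in every direction
v, w = predict(params, scan)    # candidate velocity command
```

A trained version of something like this is what deep_navigation.launch would run in the loop: subscribe to the laser scan, run `predict`, publish the result as a velocity command.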

2. Installation

  1. Install ROS Noetic.

  2. Clone the repository into ~/catkin_ws/src folder and build the workspace:

    $ cd ~/catkin_ws/src
    $ git clone https://github.com/salihmarangoz/deep_navigation
    $ cd ..
    $ rosdep install --from-paths src --ignore-src -r -y
    $ catkin build --cmake-args -DCMAKE_BUILD_TYPE=RelWithDebInfo
  3. Install Python dependencies (ToDo)

3. Running

$ source /opt/ros/noetic/setup.bash; source ~/catkin_ws/devel/setup.bash
$ roscore
  1. For creating a dataset:

    # Each in a different terminal. Do not forget to source the ROS setup.bash files
    $ roslaunch deep_navigation simulation.launch
    $ roslaunch deep_navigation create_dataset.launch
    $ roslaunch deep_navigation ros_navigation.launch
  2. For training the network:

    • notebooks/example.ipynb (open with Jupyter Notebook)
  3. For running the network:

    # Each in a different terminal. Do not forget to source the ROS setup.bash files
    $ roslaunch deep_navigation simulation.launch
    $ roslaunch deep_navigation deep_navigation.launch
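For reference, a dataset recorded this way boils down to (scan, velocity) pairs that the notebook can train on. The snippet below is a hedged sketch of loading such pairs from a CSV file; the exact storage format used by create_dataset.launch is an assumption here (360 range values followed by linear and angular velocity per row), so adapt the column layout to whatever the recorder actually writes.

```python
import numpy as np
import tempfile, os

def load_dataset(path, n_beams=360):
    """Assumed CSV layout: n_beams laser ranges, then (linear, angular) velocity."""
    data = np.loadtxt(path, delimiter=",", ndmin=2)
    return data[:, :n_beams], data[:, n_beams:n_beams + 2]

# Demo with synthetic rows (real rows would come from create_dataset.launch).
rows = np.hstack([np.random.uniform(0.08, 20.0, (5, 360)),   # fake scans
                  np.random.uniform(-1.0, 1.0, (5, 2))])     # fake velocity labels
with tempfile.NamedTemporaryFile("w", suffix=".csv", delete=False) as f:
    np.savetxt(f, rows, delimiter=",")
scans, cmds = load_dataset(f.name)
os.remove(f.name)
```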

4. Designing a World

  1. Run gazebo:
$ gazebo
  2. In the Gazebo GUI: Edit -> Building Editor
  3. After building your world, save the design into the models folder in this repository.
  4. In the world folder, copy deep_parkour.world and modify line 14 (change deep_parkour to your world's model name).
  5. In simulation.launch, set start_fake_mapping to false to enable mapping. Also set world_file to its new value.
  6. Run simulation.launch and ros_navigation.launch to explore the whole world.
  7. After mapping the world, save the map with the following command:
$ rosrun map_server map_saver # Files may be saved into the home folder
  8. Copy the map files into the world folder, modify the map_file parameter in simulation.launch, and set start_fake_mapping back to true.
  9. Rename the files and then modify the first line of the .yaml file.
  10. Done.
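For reference, the .yaml file written by map_saver follows map_server's standard format, and its first line must point at the renamed image file. The values below are typical placeholders, not taken from this repository:

```yaml
image: my_world_map.pgm      # first line: must match the renamed .pgm file
resolution: 0.050000         # meters per pixel
origin: [-10.0, -10.0, 0.0]  # [x, y, yaw] of the lower-left corner
negate: 0
occupied_thresh: 0.65
free_thresh: 0.196
```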

5. Simulator Parameters

5.1. Lidar Model

Modify file deep_navigation/models/custom_p3at/model.sdf

<ray>
  <scan>
    <horizontal>
      <samples>360</samples>          <!-- 1040!!! -->
      <resolution>2</resolution>
      <min_angle>-3.14</min_angle> <!-- 90deg: -1.570796 -->
      <max_angle>3.14</max_angle>  <!-- 90deg: 1.570796  -->
    </horizontal>
  </scan>
  <range>
    <min>0.08</min>
    <max>20.0</max>
    <resolution>0.01</resolution>
  </range>
  <noise>
    <type>gaussian</type>
    <mean>0.01</mean>
    <stddev>0.005</stddev>
  </noise>
</ray>
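To get a feel for what these parameters imply for the data the network sees, the small NumPy sketch below applies the configured Gaussian noise (mean 0.01, stddev 0.005) and range limits (0.08 to 20.0 m) to an ideal scan. It mirrors the values in model.sdf but is only an approximation of what Gazebo does internally, not its implementation.

```python
import numpy as np

def simulate_lidar(true_ranges, mean=0.01, stddev=0.005,
                   r_min=0.08, r_max=20.0, seed=0):
    """Add Gaussian noise and clip to the sensor's limits (values from model.sdf)."""
    rng = np.random.default_rng(seed)
    noisy = true_ranges + rng.normal(mean, stddev, true_ranges.shape)
    return np.clip(noisy, r_min, r_max)

scan = simulate_lidar(np.full(360, 5.0))   # 360 samples, as configured above
```

Note that the noise mean of 0.01 gives every beam a small systematic positive bias, on top of the random spread from the stddev.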

5.2. Simulator Speed

Modify file deep_navigation/worlds/custom.world

For parameter real_time_update_rate:

  • 1000 runs the simulation in real time.
  • 2000 runs it 2x faster.
  • 5000 runs it 5x faster, and so on.

<!-- Simulator -->
<physics name="ode_100iters" type="ode">
  <real_time_update_rate>1000</real_time_update_rate>
  <ode>
    <solver>
      <type>quick</type>
      <iters>100</iters>
    </solver>
  </ode>
</physics>