Autonomous-Exploration-IvLabs

A Project on exploring autonomously and simultaneously mapping the environment using Turtlebot3.


Description 🤖

This project uses the Robot Operating System (ROS) to move a Turtlebot autonomously through an unexplored area with the help of a LiDAR while simultaneously building a map of the area.

Turtlesim 🐢

The turtlesim node is used to understand the basic functionality of the rospy library, which is used throughout this project, and the velocity control of a robot.

Tracing some common shapes -

  • Circle
  • Square
  • Square Spiral
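Tracing these shapes comes down to publishing constant or time-varying velocity commands. As a minimal sketch (pure Python with no ROS dependency; in the project these velocities would be sent as geometry_msgs/Twist messages through rospy), constant linear and angular velocities trace a circle of radius v/ω:

```python
import math

def simulate_unicycle(v, omega, dt=0.001, duration=None):
    """Integrate unicycle kinematics under constant linear (v) and
    angular (omega) velocity; returns the traced (x, y) path."""
    if duration is None:
        duration = 2 * math.pi / abs(omega)  # one full revolution
    x, y, theta = 0.0, 0.0, 0.0
    path = [(x, y)]
    for _ in range(int(duration / dt)):
        x += v * math.cos(theta) * dt
        y += v * math.sin(theta) * dt
        theta += omega * dt
        path.append((x, y))
    return path

# Constant v = 1.0 and omega = 0.5 trace a circle of radius v / omega = 2.
path = simulate_unicycle(v=1.0, omega=0.5)
```

A square follows from alternating straight segments (omega = 0) with in-place 90-degree turns (v = 0), and a spiral from slowly ramping v while holding omega constant.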

Applying a P controller to control the turtles -

  • Go-to-goal
  • Leader Follower
  • Formation Control
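A go-to-goal P controller scales the linear velocity by the distance error and the angular velocity by the heading error. A minimal closed-loop sketch (the gains kp_lin and kp_ang are illustrative, not the project's tuned values):

```python
import math

def p_control_step(pose, goal, kp_lin=1.0, kp_ang=4.0):
    """One P-controller step: returns (linear, angular) velocity
    commands driving a turtle at pose = (x, y, theta) toward goal."""
    x, y, theta = pose
    gx, gy = goal
    distance = math.hypot(gx - x, gy - y)
    heading = math.atan2(gy - y, gx - x)
    # Wrap the heading error into [-pi, pi] before applying the gain.
    error = math.atan2(math.sin(heading - theta), math.cos(heading - theta))
    return kp_lin * distance, kp_ang * error

def go_to_goal(pose, goal, dt=0.01, tol=0.01, max_steps=5000):
    """Simulate the closed loop until the turtle reaches the goal."""
    x, y, theta = pose
    for _ in range(max_steps):
        if math.hypot(goal[0] - x, goal[1] - y) < tol:
            break
        v, w = p_control_step((x, y, theta), goal)
        x += v * math.cos(theta) * dt
        y += v * math.sin(theta) * dt
        theta += w * dt
    return x, y
```

Leader-follower reduces to the same controller with the leader's current pose as the moving goal.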

Turtlebot3 Simulation and Mapping

We first performed teleoperation (operating the robot manually through the keyboard) and both mapping techniques in simulation, and then implemented the same on hardware.

Teleoperation -

Hector mapping -

Hector SLAM uses previous scan results to estimate the current state of the system, so any drift from the beginning is carried forward and results in a random rotation and translation of the map frame relative to other ground-truth frames.

Gmapping -

Docker

To run the code on the Turtlebot2, we used ROS Melodic installed in a Docker container.

To set up the Docker container and connect to the Turtlebot2:

  • Install Docker on your device
  • Extract the Turtlebot2.zip file provided in this repository and open it in VS Code
  • Build the container, resolving any errors that come up
  • In a new terminal, run bash .devcontainer/post_create_commands.sh
  • Connect the Kobuki cable to your PC and, in a new terminal, run roslaunch turtlebot_bringup minimal.launch

Results

We finally deployed our code on the Turtlebot2 and, after resolving some issues and refactoring the code, obtained the following results -

Avoiding Obstacles -

The ranges list in the LiDAR data has a length of 720. The central region is defined by the first 40 and last 40 values, which correspond to 20 degrees to the left and right of the robot's forward direction. The left and right regions are defined by the next 140 values from the beginning and end of the list, respectively. The average range is computed for each of these regions; if it falls below the robot's safe distance from an obstacle in any region, the robot turns or moves in another direction.
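The region logic above can be sketched as follows. The split of the 720-value ranges list matches the description in the text; the exact index ordering (whether increasing indices sweep left or right) and the returned command names are assumptions for illustration:

```python
def avoidance_decision(ranges, safe_dist=0.5):
    """Split a 720-element LiDAR scan into centre/left/right regions,
    average each, and pick a motion. Assumes increasing indices sweep
    counter-clockwise (left), as in a standard ROS LaserScan."""
    assert len(ranges) == 720

    def avg(region):
        return sum(region) / len(region)

    # Centre: 20 deg either side of forward = first 40 + last 40 values.
    centre = ranges[:40] + ranges[-40:]
    left = ranges[40:180]      # next 140 values from the beginning
    right = ranges[-180:-40]   # next 140 values from the end

    if avg(centre) >= safe_dist:
        return "forward"
    if avg(left) >= safe_dist:
        return "turn_left"
    if avg(right) >= safe_dist:
        return "turn_right"
    return "reverse"
```

In the project this decision would run inside the LaserScan subscriber callback, mapping each command to a Twist message.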

As LiDAR was the only sensor used in this project, only Hector mapping was possible (Gmapping additionally requires odometry).

Maps of a Corner of Lab -

Maps of other parts -

Looking Forward to -

  • Removing Distortions in the maps
  • Applying Path Planning Algorithms
