A ROS package for running carpet localisation on a ROS enabled robot.
Includes gazebo simulation of a representative carpet world for testing.
For an overview of the broader project, see the wiki.
Figure: Roomba in the target localisation environment. Localisation is achieved with reference to a map of the carpet color pattern throughout the office.
The build process is tested on ROS Noetic / Ubuntu Focal.

create_robot is also built from source, as the gazebo simulation uses create_description, which is not currently released for Noetic.
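The sources are pulled in via a vcstool `.repos` file. A minimal sketch of the shape `workspace.repos` might take is shown below; the exact repository URLs and versions here are assumptions, so check the actual file in the repository:

```yaml
# hypothetical workspace.repos sketch -- URLs/versions are illustrative only
repositories:
  carpet_localisation:
    type: git
    url: https://github.com/tim-fan/carpet_localisation_ros.git
    version: main
  create_robot:
    type: git
    url: https://github.com/AutonomyLab/create_robot.git
    version: melodic
```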
```bash
# apt deps:
sudo apt install python3-vcstool python3-catkin-tools

# clone sources:
mkdir -p catkin_ws/src && cd catkin_ws/src
curl https://raw.githubusercontent.com/tim-fan/carpet_localisation_ros/main/workspace.repos | vcs import

# install rosdeps:
rosdep install --from-paths carpet_localisation/ create_robot/create_description/ map_to_odom_publisher/ rviz_textured_quads/ --ignore-src --rosdistro noetic

# disable building packages which fail on noetic:
touch create_robot/create_bringup/CATKIN_IGNORE
touch create_robot/create_driver/CATKIN_IGNORE

# build:
cd ..
catkin config --extend /opt/ros/noetic/
catkin build
```
To launch the simulated carpet world:
```bash
source devel/setup.bash
roslaunch carpet_localisation carpet_simulation.launch
```
You should see a gazebo world like the following:
If you have an xbox-style controller connected, you will be able to drive the simulated carpet robot around the environment.
Now launch localisation:
```bash
roslaunch carpet_localisation carpet_world_localisation.launch
```
And visualisation:
```bash
roslaunch carpet_localisation rviz.launch
```
RViz will show a localisation dashboard with the carpet camera feed, images annotated with classification results, and the current particle filter state:
- find a better way of determining current pose from particle cloud (mean of oldest 20%?)
- expose particle filter config params (n_particles, noise params, update thresholds)
- dynamic reconfigure support
- parameter optimisation (w.r.t. accuracy against ground truth in gazebo)
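The first item above could be prototyped roughly as follows. This is only a sketch, assuming each particle carries an age counter (updates survived since spawning); the function and array names are hypothetical, not part of the package:

```python
import numpy as np

def pose_from_oldest_particles(poses, ages, weights, fraction=0.2):
    """Estimate pose as the weighted mean of the oldest `fraction` of particles.

    poses:   (N, 3) array of [x, y, theta]
    ages:    (N,) particle ages (filter updates survived) -- assumed available
    weights: (N,) normalised particle weights
    """
    n_keep = max(1, int(len(poses) * fraction))
    idx = np.argsort(ages)[-n_keep:]        # indices of the oldest particles
    w = weights[idx] / weights[idx].sum()   # renormalise their weights
    x, y = w @ poses[idx, :2]
    # circular mean for heading, to handle wrap-around at +/- pi
    theta = np.arctan2(w @ np.sin(poses[idx, 2]), w @ np.cos(poses[idx, 2]))
    return np.array([x, y, theta])
```

The idea is that long-surviving particles have repeatedly matched observations, so averaging only those should be more robust than a mean over the whole (possibly multi-modal) cloud.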