This is Project 2 of the course ECE 276A: Sensing & Estimation in Robotics at UCSD, taught by Professor Nikolay Atanasov.
The project is based on data collected by a differential-drive robot with specifications provided in the documentation file. The robot collects wheel encoder, IMU, LiDAR, and Kinect data.
The wheel encoder and IMU data are used to compute a dead-reckoning trajectory for the robot.
This trajectory is then combined with the LiDAR data to produce a rough scan of the room.
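A minimal sketch of the differential-drive dead-reckoning update described above (function and parameter names are illustrative, not the project's actual code):

```python
import numpy as np

def dead_reckon(x, y, theta, v, omega, dt):
    """One dead-reckoning step for a differential-drive robot.

    v     -- linear velocity derived from the wheel encoders (m/s)
    omega -- yaw rate from the IMU gyroscope (rad/s)
    dt    -- time between samples (s)
    """
    # Advance the pose along the current heading, then rotate.
    x = x + v * dt * np.cos(theta)
    y = y + v * dt * np.sin(theta)
    theta = theta + omega * dt
    return x, y, theta
```

Iterating this step over the encoder/IMU stream yields the dead-reckoning trajectory; its drift is what the particle filter later corrects.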
The results are then improved using particle filter SLAM: Gaussian noise is introduced into the motion model and 100 particles are simulated at each step, with resampling triggered when the effective number of particles drops below 10. An occupancy grid is built showing the room traversed by the robot.
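The resampling criterion above can be sketched as follows: the effective number of particles is N_eff = 1 / Σ w², and resampling happens only when it falls below the threshold (names and the plain multinomial resampler here are illustrative assumptions, not the project's implementation):

```python
import numpy as np

def resample_if_degenerate(particles, weights, n_eff_threshold=10.0):
    """Resample particles when N_eff = 1 / sum(w^2) drops below a threshold.

    particles -- (N, 3) array of [x, y, theta] hypotheses
    weights   -- (N,) normalized importance weights
    """
    n_eff = 1.0 / np.sum(weights ** 2)
    if n_eff < n_eff_threshold:
        # Draw N new particles proportionally to their weights,
        # then reset the weights to uniform.
        idx = np.random.choice(len(weights), size=len(weights), p=weights)
        particles = particles[idx]
        weights = np.full(len(weights), 1.0 / len(weights))
    return particles, weights
```

With 100 uniformly weighted particles N_eff is 100, so no resampling occurs; when a few particles carry almost all of the weight, N_eff collapses toward 1 and the filter resamples.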
Finally, using the trajectory obtained, the Kinect RGBD data is used to project images onto the map for texture mapping.
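The texture-mapping step boils down to back-projecting each depth pixel into 3D and transforming it into the world frame with the estimated camera pose. A minimal pinhole-model sketch (the intrinsic matrix `K` and pose `T_world_cam` are placeholders, not the project's calibration values):

```python
import numpy as np

def depth_pixel_to_world(u, v, depth, K, T_world_cam):
    """Back-project pixel (u, v) with the given depth into the world frame.

    K           -- 3x3 camera intrinsic matrix
    T_world_cam -- 4x4 homogeneous camera-to-world transform
    """
    # Ray through the pixel, scaled by depth, gives the point in camera frame.
    xyz_cam = depth * (np.linalg.inv(K) @ np.array([u, v, 1.0]))
    # Move into the world frame with the SLAM pose estimate.
    xyz_world = T_world_cam @ np.append(xyz_cam, 1.0)
    return xyz_world[:3]
```

Each world point is then dropped into the corresponding occupancy-grid cell and colored with the matching RGB pixel.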
Set up the data for the robot's sensors as follows:
- Download the Kinect dataset from this Google Drive link
- Place the sensor datasets in the folder `data` and the RGBD image data in a folder named `dataRGBD`, with the following file structure:
```
.
├── data
│   ├── Encoder20.npz
│   ├── Encoder21.npz
│   ├── Hokuyo20.npz
│   ├── Hokuyo21.npz
│   ├── Imu20.npz
│   ├── Imu21.npz
│   ├── Kinect20.npz
│   └── Kinect21.npz
└── dataRGBD
    ├── Kinect20.npz
    └── Kinect21.npz
```
- Install the following packages (assuming numpy and matplotlib are already installed):

```
pip install transforms3d
```
- Run the file `project_2_final.py`:

```
python project_2_final.py
```