These are my implementations from the Udacity course "Sensor Fusion". The first sensor I implemented is a basic lidar sensor utilizing the PCL framework; the second one for environmental modelling is the radar sensor, and last but not least the camera sensor. In the end product, data from these sensors will be fused to track multiple cars on the road, estimating their positions and speeds.
Lidar sensing gives us high-resolution data by sending out thousands of laser signals. These lasers bounce off objects and return to the sensor, where we can determine how far away objects are by timing how long it takes for the signal to return. We can also tell a little bit about the object that was hit by measuring the intensity of the returned signal. Each laser ray is in the infrared spectrum and is sent out at many different angles, usually over a 360 degree range. While lidar sensors give us very accurate 3D models of the world around us, they are currently very expensive, upwards of $60,000 for a standard unit.
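As a minimal sketch of this time-of-flight principle (the function name and the example timing below are illustrative, not taken from the course code):

```cpp
#include <iostream>

// Speed of light in m/s.
constexpr double kSpeedOfLight = 299792458.0;

// Range from a single laser return: the pulse travels to the target
// and back, so the one-way distance is half the round-trip time times c.
double rangeFromTimeOfFlight(double roundTripSeconds) {
    return kSpeedOfLight * roundTripSeconds / 2.0;
}

int main() {
    // A return after ~400 ns corresponds to a target roughly 60 m away.
    std::cout << rangeFromTimeOfFlight(400e-9) << " m\n";  // ~59.96 m
    return 0;
}
```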
Radar data is typically very sparse and limited in range, but it can directly tell us how fast an object is moving in a certain direction. This ability makes radar a very practical sensor for tasks like adaptive cruise control, where it's important to know how fast the car in front of you is traveling. Radar sensors are also very affordable and nowadays common in newer cars.
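Radar gets this velocity directly from the Doppler shift of the reflected wave. The sketch below shows the relation; the 77 GHz carrier frequency is an assumption (typical for automotive radar), not a value from this repository:

```cpp
#include <iostream>

constexpr double kSpeedOfLight = 299792458.0;

// Radial velocity from the Doppler shift of a reflected radar wave:
// v = (f_doppler * c) / (2 * f_carrier).
// The 77 GHz default is an assumed automotive carrier frequency.
double radialVelocity(double dopplerShiftHz, double carrierHz = 77e9) {
    return dopplerShiftHz * kSpeedOfLight / (2.0 * carrierHz);
}

int main() {
    // A ~5.1 kHz Doppler shift at 77 GHz corresponds to a closing
    // speed of about 10 m/s between the radar and the target.
    std::cout << radialVelocity(5136.0) << " m/s\n";
    return 0;
}
```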
Camera TODO
Sensor fusion: by combining lidar's high-resolution imaging with radar's ability to measure the velocity of objects, we can get a better understanding of the surrounding environment than we could using either sensor alone.
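As a toy illustration of why the combination helps (all names here are hypothetical, and a real pipeline would use a Kalman filter to weight each measurement by its uncertainty):

```cpp
#include <iostream>

// Hypothetical fused state for one tracked car; not from the repo.
struct Track {
    double x;  // position along the lane, from lidar (m)
    double v;  // radial velocity, from radar (m/s)
};

// Predict where the tracked car will be after dt seconds using a
// constant-velocity model. Lidar alone would need two scans to
// estimate v; radar measures it directly in a single return.
double predictPosition(const Track& t, double dt) {
    return t.x + t.v * dt;
}

int main() {
    Track car{20.0, 8.3};  // 20 m ahead, moving away at 8.3 m/s
    std::cout << "in 0.5 s: " << predictPosition(car, 0.5) << " m\n";
    return 0;
}
```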
$> sudo apt install libpcl-dev
$> cd ~
$> git clone https://github.com/bonakdarf92/SensorFusion.git
$> cd SensorFusion
$> mkdir build && cd build
$> cmake ..
$> make
$> ./environment
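To sanity-check the libpcl-dev installation independently of this repository, a minimal standalone example like the one below (not part of the repo; build it against PCL, e.g. via CMake's find_package(PCL)) creates a small cloud and downsamples it with a voxel grid, a typical first step when processing dense lidar scans:

```cpp
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl/filters/voxel_grid.h>
#include <iostream>

int main() {
    // Build a small synthetic cloud instead of loading a file, so the
    // example runs without any data from the repository.
    pcl::PointCloud<pcl::PointXYZ>::Ptr cloud(new pcl::PointCloud<pcl::PointXYZ>);
    for (float x = 0.0f; x < 1.0f; x += 0.01f)
        cloud->push_back(pcl::PointXYZ(x, 0.0f, 0.0f));

    // Downsample: one representative point per 10 cm voxel.
    pcl::VoxelGrid<pcl::PointXYZ> grid;
    grid.setInputCloud(cloud);
    grid.setLeafSize(0.1f, 0.1f, 0.1f);
    pcl::PointCloud<pcl::PointXYZ> filtered;
    grid.filter(filtered);

    std::cout << cloud->size() << " -> " << filtered.size() << " points\n";
    return 0;
}
```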