
This project aims to leverage existing LIDAR data used in SLAM alongside the You Only Look Once (YOLO) network for efficient object localization, classification, and tracking.

Easy Object-Conscious SLAM

Esteban Padilla Cerdio


About the Project

Simultaneous Localization and Mapping (SLAM) allows a robot to understand where it is in relation to its environment while simultaneously building a map of its surroundings. Object-Conscious SLAM (OCSLAM) introduces an additional layer of knowledge by allowing the robot to locate and classify objects within this world. I propose a simple method that leverages the LIDAR information already being used for regular SLAM, in combination with the You Only Look Once (YOLO) network, for locating, classifying, and tracking objects.
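One way the LIDAR/YOLO fusion described above could work is to map a YOLO bounding box's horizontal center onto a bearing angle via the camera's field of view, then read the range at that bearing directly from the LIDAR scan. The sketch below is a hypothetical illustration of that idea, not the project's actual implementation; the function name, parameters, and the assumption that the LIDAR and camera are aligned and share an origin are all mine.

```python
import math

def detection_to_object_pose(bbox_center_x, image_width, camera_fov_deg,
                             lidar_ranges, lidar_angle_min,
                             lidar_angle_increment):
    """Estimate an object's 2D position in the robot frame.

    Hypothetical sketch: converts a YOLO bounding-box center pixel to a
    bearing using the camera's horizontal FOV, then looks up the range
    of the nearest LIDAR beam (scan assumed aligned with the camera).
    """
    # Pixel offset from image center, normalized to [-0.5, 0.5].
    norm = bbox_center_x / image_width - 0.5
    # Bearing in radians; positive is to the robot's left.
    bearing = -norm * math.radians(camera_fov_deg)
    # Index of the LIDAR beam closest to that bearing, clamped to range.
    idx = round((bearing - lidar_angle_min) / lidar_angle_increment)
    idx = max(0, min(idx, len(lidar_ranges) - 1))
    r = lidar_ranges[idx]
    # Object position in the robot frame (x forward, y left).
    return r * math.cos(bearing), r * math.sin(bearing)
```

Tracking could then be layered on top by associating successive position estimates of the same class, e.g. with a nearest-neighbor match in map coordinates.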

System Architecture

OCSLAM Architecture

Demonstration Video

Easy Object-Conscious SLAM

Presentation Video

OCSLAM Presentation
