Our approach relies on three main sub-systems:
- The Unmanned Aerial Vehicle (UAV):
  - Patrols a preset area using a Proportional Integral Derivative (PID) controller (see the PID sketch after this list).
  - Records video and pose.
  - Sends both to the base station.
- The base station:
  - Analyses the video feed frame by frame with a neural network (NN).
  - The NN detects trash in each frame.
  - Filters out the trash-positive frames.
  - Clusters their locations with K-means and stores the cluster locations in a CSV file (see the clustering sketch after this list).
- The Unmanned Ground Vehicle (UGV):
  - Reads the cluster locations and drives to each of them (see the navigation sketch after this list).
  - Scans the vicinity of each cluster for accurate trash detection.
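
A minimal sketch of the PID control idea behind the UAV patrol, in plain Python. The gains, setpoint, and altitude example are illustrative placeholders, not the values or interface of the actual UAV nodes:

```python
class PID:
    """Minimal PID controller: output = Kp*e + Ki*integral(e) + Kd*de/dt."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        self.integral += error * dt
        # Skip the derivative term on the very first sample.
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


# Example: hold a hypothetical target altitude of 2 m (gains are illustrative only).
altitude_pid = PID(kp=1.2, ki=0.05, kd=0.3)
current_altitude = 1.5
thrust_cmd = altitude_pid.update(error=2.0 - current_altitude, dt=0.05)
```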
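A sketch of the base station's clustering step, assuming the trash-positive detections arrive as (x, y) UAV positions and that scikit-learn is available; the input data, cluster count, and output filename are assumptions, not the repository's exact pipeline:

```python
import csv

from sklearn.cluster import KMeans  # assumed dependency; any K-means implementation works

# Hypothetical input: (x, y) UAV positions of frames the NN flagged as trash-positive.
trash_positions = [(1.0, 2.1), (1.1, 2.0), (5.2, 5.0), (5.0, 5.3), (1.2, 1.9)]

kmeans = KMeans(n_clusters=2, n_init=10).fit(trash_positions)

# Store one cluster location per row for the UGV to read later.
with open("clusters.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["x", "y"])
    writer.writerows(kmeans.cluster_centers_.tolist())
```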
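A sketch of how the UGV side could read that CSV and visit each cluster, assuming a standard `move_base` setup navigating in the `map` frame; the action name, frame id, and CSV path are assumptions, not necessarily what botmove.py does:

```python
import csv

import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

rospy.init_node("cluster_visitor")
client = actionlib.SimpleActionClient("move_base", MoveBaseAction)
client.wait_for_server()

# Read the cluster locations produced by the base station (path is assumed).
with open("clusters.csv") as f:
    clusters = [(float(r["x"]), float(r["y"])) for r in csv.DictReader(f)]

for x, y in clusters:
    goal = MoveBaseGoal()
    goal.target_pose.header.frame_id = "map"
    goal.target_pose.header.stamp = rospy.Time.now()
    goal.target_pose.pose.position.x = x
    goal.target_pose.pose.position.y = y
    goal.target_pose.pose.orientation.w = 1.0  # orientation is arbitrary here
    client.send_goal(goal)
    client.wait_for_result()
    # On arrival the UGV would scan the vicinity for trash before moving on.
```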
Technologies used:
- ROS
- Python
- XML
- Files related to the simulation are located in the `lidar` package.
- Files related to trash detection and clustering are located in the `vision` package.
Procedure to launch the simulation:
- Make sure you have installed the models required by trash.world from https://github.com/osrf/gazebo_models.
- Make sure the .world files are present in the worlds folder of your ROS package, and that all the models are present in the ~/.gazebo/models folder, for the launch file to work.
- Update the following files:
  - droneobsavoid.py
  - botmove.py
  - dronespiral.py
  - UAVpathRecorderNode.py
Command to launch the simulation:

```
roslaunch lidar sample.launch
```

Change `lidar` to the name of your ROS package.