Package to control MAVs using Visual Odometry Systems and Computer Vision
This package assumes you have a Visual Odometry package running, such as `orb_slam2_ros` or `svo` (these are the ones already tested).
- The `control` node sends the actual velocity commands to the MAV, according to a PID controller that stabilizes the drone at the position established by the `/viscon/set_position` topic (a minimal sketch follows this list)
- `manual_control` is a teleop_twist_keyboard based node that controls the MAV via keyboard on a terminal, with options to:
  - Append positions to a trajectory
  - Save the trajectory to a file - this file will later be used by the `head` node to set positions
  - Activate and deactivate autonomous control (both the `head` and `control` nodes)
- `head` reads a trajectory from a file and publishes local positions to the `/viscon/set_position` topic
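For reference, here is a minimal sketch of what a position-holding node like `control` could look like. Only the `/viscon/set_position` topic comes from this package; the odometry topic, the output topic, the message types and the gains below are assumptions for illustration, not the actual implementation.

```python
#!/usr/bin/env python
# Minimal sketch of a PID position controller, NOT the actual viscon control node.
# Assumptions: odometry arrives on /odom (nav_msgs/Odometry), velocity commands
# go out on /cmd_vel (geometry_msgs/Twist), and the setpoint is a PoseStamped.
import rospy
from geometry_msgs.msg import PoseStamped, Twist
from nav_msgs.msg import Odometry


class SimplePID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


class PositionController:
    def __init__(self):
        self.setpoint = None
        # Gains are illustrative only
        self.pid_x = SimplePID(0.5, 0.0, 0.1)
        self.pid_y = SimplePID(0.5, 0.0, 0.1)
        self.pid_z = SimplePID(0.5, 0.0, 0.1)
        self.cmd_pub = rospy.Publisher("/cmd_vel", Twist, queue_size=1)
        rospy.Subscriber("/viscon/set_position", PoseStamped, self.setpoint_cb)
        rospy.Subscriber("/odom", Odometry, self.odom_cb)

    def setpoint_cb(self, msg):
        self.setpoint = msg.pose.position

    def odom_cb(self, msg):
        if self.setpoint is None:
            return
        pos = msg.pose.pose.position
        dt = 1.0 / 30.0  # assume ~30 Hz odometry for simplicity
        cmd = Twist()
        cmd.linear.x = self.pid_x.update(self.setpoint.x - pos.x, dt)
        cmd.linear.y = self.pid_y.update(self.setpoint.y - pos.y, dt)
        cmd.linear.z = self.pid_z.update(self.setpoint.z - pos.z, dt)
        self.cmd_pub.publish(cmd)


if __name__ == "__main__":
    rospy.init_node("simple_position_controller")
    PositionController()
    rospy.spin()
```

In the real package this role is played by `control.py`, driven by the setpoints that `head` publishes.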
Procedure for the Tello (using visual odometry):

- Run the tello driver with `roslaunch tello_driver tello_node.launch`
- Set the camera parameters to a lower image quality (frequency is key to visual odometry algorithms) with `rosrun dynamic_reconfigure dynparam load /tello/tello [viscon]/config/dump.yaml`
- Run manual control with `roslaunch viscon manual.launch` and take off by pressing '{'
- Run the visual odometry algorithm (in this case, orb_slam2_ros) with `roslaunch orb_slam2_ros orb_slam2_tello.launch` (it needs to be cloned from the Skyrats repository)
- Watch it working in the rqt GUI with `rosrun rqt rqt -d [orb_slam2_ros]/ros/config/rviz_config.rviz`
- Save positions on the manual controller with 's'
- Save trajectory with 'f'
- Run `rosrun viscon head.py` (see the sketch after this procedure)
- Run `rosrun viscon control.py`
- Finally, press 'a' on the manual controller to enable autonomous mode!
Have fun!
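As a rough illustration of what the `head` step does, here is a sketch of a node that reads a trajectory file and publishes each waypoint to `/viscon/set_position`. The file format (one `x y z` line per waypoint), the message type and the rate are assumptions, not the actual `head.py`.

```python
#!/usr/bin/env python
# Sketch of a head-style node: read a saved trajectory and publish it as
# setpoints on /viscon/set_position. File format and message type are assumed.
import rospy
from geometry_msgs.msg import PoseStamped


def load_trajectory(path):
    """Parse a text file with one 'x y z' waypoint per line (assumed format)."""
    waypoints = []
    with open(path) as f:
        for line in f:
            parts = line.split()
            if len(parts) >= 3:
                waypoints.append(tuple(float(v) for v in parts[:3]))
    return waypoints


if __name__ == "__main__":
    rospy.init_node("trajectory_head")
    pub = rospy.Publisher("/viscon/set_position", PoseStamped, queue_size=1)
    waypoints = load_trajectory(rospy.get_param("~trajectory_file", "trajectory.txt"))
    rospy.sleep(1.0)  # give subscribers time to connect
    rate = rospy.Rate(0.2)  # move to the next waypoint every 5 seconds (assumption)
    for x, y, z in waypoints:
        if rospy.is_shutdown():
            break
        msg = PoseStamped()
        msg.header.stamp = rospy.Time.now()
        msg.pose.position.x, msg.pose.position.y, msg.pose.position.z = x, y, z
        pub.publish(msg)
        rate.sleep()
```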
We can also control our MAV based on its position relative to an object detected by its camera:
- Run the tello driver with `roslaunch tello_driver tello_node.launch`
- Set the camera parameters to a lower image quality (frequency is key to visual odometry algorithms) with `rosrun dynamic_reconfigure dynparam load /tello/tello [viscon]/config/dump.yaml`
- Run manual control with `roslaunch viscon manual.launch` and take off by pressing '{'
- Run cv_detection's `h_node` with `rosrun cv_detection h_node`
  - This node detects an H in the webcam image and publishes its position in `/cv_detection/h_detection` as a custom message `H_info.msg`, which contains (a sketch of a subscriber to this message follows this list):
    - `detected`: boolean that shows if the H was detected
    - `center_x`: x coordinate of the H's center
    - `center_y`: y coordinate of the H's center
    - `area_ratio`: ratio between the H's area and that of the entire image
- Run the `control` node with `rosrun viscon cv_control.py`
  - To use dynamic_reconfigure, run `rosrun rqt_gui rqt_gui -s reconfigure` along with the `simple_control.py` node
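To give an idea of how the `H_info` message can be consumed, here is a sketch of a subscriber that centers the drone over the detected H with a simple proportional law. The import path for `H_info`, the normalized image coordinates, the output topic and the gains are assumptions; the real logic lives in `cv_control.py`.

```python
#!/usr/bin/env python
# Sketch of a consumer for /cv_detection/h_detection, NOT the actual cv_control.py.
# Assumptions: H_info lives in the cv_detection package, center_x/center_y are
# normalized to [0, 1], and velocity commands go out on /cmd_vel.
import rospy
from geometry_msgs.msg import Twist
from cv_detection.msg import H_info  # assumed import path for the custom message


class HFollower:
    def __init__(self):
        self.cmd_pub = rospy.Publisher("/cmd_vel", Twist, queue_size=1)
        rospy.Subscriber("/cv_detection/h_detection", H_info, self.h_cb)

    def h_cb(self, msg):
        cmd = Twist()
        if msg.detected:
            # Proportional correction toward the image center (gains are illustrative)
            cmd.linear.y = 0.5 * (0.5 - msg.center_x)  # left/right
            cmd.linear.x = 0.5 * (0.5 - msg.center_y)  # forward/backward
            # Descend slowly once the H fills enough of the image
            cmd.linear.z = -0.2 if msg.area_ratio < 0.3 else 0.0
        self.cmd_pub.publish(cmd)


if __name__ == "__main__":
    rospy.init_node("h_follower")
    HFollower()
    rospy.spin()
```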
Procedure for MAVROS (using simulation package)
- Run the `simulate.sh` script with `rosrun simulation simulate.sh`
  - Check if the last line is `roslaunch simulation H_world.launch`
- Run `roslaunch viscon cv_control.launch` - it depends on cv_detection and runs:
  - cv_detection's `h_node`
  - viscon's `run_h_mission.py`
  - viscon's `cv_control.py`