Follow Turtlebot Exercise

Goal

The goal of this exercise is to implement the logic that makes a quadrotor follow a turtlebot moving on the ground.

World

Requirements

As this is a drone exercise, you will additionally need to install the jderobot-assets, dronewrapper and rqt_drone_teleop packages. They can be installed with:

sudo apt-get install ros-kinetic-drone-wrapper ros-kinetic-rqt-drone-teleop ros-kinetic-jderobot-assets

There is an additional dependency on MAVROS and PX4 that we are in the process of simplifying; for the moment, just use the script provided here.

Apart from these, as this exercise also requires a ground robot, you will need the rqt_ground_robot_teleop package. We are in the process of making it available through apt-get; until then, the best method is to clone it into your catkin_ws and build it, as shown below.
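
A rough sketch of that workflow follows; the repository URL is a placeholder for wherever rqt_ground_robot_teleop is hosted.

cd ~/catkin_ws/src
git clone <rqt_ground_robot_teleop repository URL>
cd ~/catkin_ws
catkin_make
source devel/setup.bash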

Execution

To launch the exercise, simply use the following command from this directory:

roslaunch follow_turtlebot.launch

An easy way to find values for the color filtering is the colorTuner tool provided with your JdeRobot installation. After launching the previous command, start colorTuner in a separate terminal as follows:

colorTuner colorTuner.conf

Solution

To solve the exercise, edit the my_solution.py file and insert your control logic into it. Your code goes inside the execute function, between the Insert your code here comments; a sketch of one possible approach is given after the API section below.

my_solution.py

def execute(event):
  global drone
  img_frontal = drone.get_frontal_image()
  img_ventral = drone.get_ventral_image()
  # Both the above images are cv2 images
  ################# Insert your code here #################################

  set_image_filtered(img_frontal)
  set_image_threshed(img_ventral)

  #########################################################################

API

  • set_image_filtered(cv2_image): Shows a filtered version of the camera image in the GUI
  • set_image_threshed(cv2_image): Shows a thresholded image in the GUI
  • drone.get_frontal_image() : Returns the latest image from the frontal camera as a cv2_image
  • drone.get_ventral_image() : Returns the latest image from the ventral camera as a cv2_image
  • drone.get_position(): Returns the position of the drone as a numpy array [x, y, z]
  • drone.get_orientation(): Returns the roll, pitch and yaw of the drone as a numpy array [roll, pitch, yaw]
  • drone.get_roll(): Returns the roll of the drone
  • drone.get_pitch(): Returns the pitch of the drone
  • drone.get_yaw(): Returns the yaw of the drone
  • drone.set_cmd_vel(vx, vy, vz, az): Commands the linear velocity of the drone along the x, y and z axes and the angular velocity around z, in its body-fixed frame
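
A minimal sketch of one way to use this API is shown below. It is not the reference solution: the HSV bounds are placeholders you would obtain with colorTuner, the gains KP_XY and Z_REF are assumed values, and the mapping between image axes and body-frame axes may need sign adjustments for your camera mounting.

import cv2
import numpy as np

# Placeholder HSV bounds; replace with values found using colorTuner.
LOWER_HSV = np.array([0, 100, 100])
UPPER_HSV = np.array([10, 255, 255])
KP_XY = 0.005   # proportional gain (assumed value)
Z_REF = 3.0     # desired altitude in metres (assumed value)

def execute(event):
  global drone
  img_ventral = drone.get_ventral_image()

  # Segment the turtlebot by colour in HSV space.
  hsv = cv2.cvtColor(img_ventral, cv2.COLOR_BGR2HSV)
  mask = cv2.inRange(hsv, LOWER_HSV, UPPER_HSV)
  set_image_filtered(img_ventral)
  set_image_threshed(mask)

  # Locate the centroid of the detected blob via image moments.
  m = cv2.moments(mask)
  if m["m00"] > 0:
    cx = m["m10"] / m["m00"]
    cy = m["m01"] / m["m00"]
    h, w = mask.shape[:2]
    # Pixel error between the blob centroid and the image centre.
    err_x = cx - w / 2.0
    err_y = cy - h / 2.0
    # Proportional control: move towards the turtlebot, hold altitude near Z_REF.
    vz = 0.5 * (Z_REF - drone.get_position()[2])
    drone.set_cmd_vel(-KP_XY * err_y, -KP_XY * err_x, vz, 0)
  else:
    # Turtlebot not visible: hover in place.
    drone.set_cmd_vel(0, 0, 0, 0)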

Demonstrative video (in Spanish)

https://www.youtube.com/watch?v=uehDVlBzpmU
