This GitHub repository documents a Winter 2022 Mechanical Engineering capstone project at Ryerson University describing the design, simulation, and construction of an unmanned aerial manipulator (UAM). In this repo you will find information on our project, the ROS software architecture and code, Gazebo simulation files, and CAD files for 3D printing.
This work represents the efforts of the following individuals over a three-month period: Jill Aghyourli Zalat, Muhammad Ali, Isaac Correa, and Matthew Drummond-Stoyles.
A Kakute-F7 flight controller is used for manual control of the drone.
A Raspberry Pi Zero 2 is used as the onboard computer. To run ROS and interface with the ground computer, we installed Ubuntu Server on the Pi (Ubuntu running on a command line) rather than Raspbian. Instructions for doing so can be found here.
For image recognition, a Raspberry Pi Camera v1.3 was used.
For control of three servo motors mounted on the arm, the PCA9685 servo driver board was used.
Most parts were 3D printed on an Ender 3 Pro using PLA filament, including the arm links, gripper, drone-to-arm interface, drone legs, and drone guards. CAD files for 3D printing, as well as for viewing the whole assembly, can be found in the folder "CAD Files".
ArduPilot was chosen as the firmware for the drone's flight controller. Due to time constraints, the drone was only able to be controlled manually.
The arm control is run jointly by the ground computer and the robotic arm using the ROS architecture shown below. The script `object_tracking.py` publishes arm angles as ROS topics based on what is seen in the video feed. The Raspberry Pi subscribes to these topics and moves the corresponding servo motors in the script `arm_client.py`.
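As a rough illustration of this split, the mapping from a detected object centre to servo angles might look like the following. This is a sketch, not the project's actual code: the linear mapping is an assumption, though the 1280x720 frame matches the raspicam launch file used later.

```python
# Hypothetical sketch of the tracking-to-angle mapping; the linear mapping
# is an illustrative assumption, not taken from object_tracking.py.
FRAME_W, FRAME_H = 1280, 720  # matches camerav1_1280x720.launch below

def pixel_to_angles(cx, cy, frame_w=FRAME_W, frame_h=FRAME_H):
    """Map an object's pixel centre to pan/tilt servo angles in degrees.

    A centred object maps to the 90-degree neutral position; offsets
    scale linearly out to the 0/180-degree servo limits.
    """
    pan = 180.0 * cx / frame_w
    tilt = 180.0 * cy / frame_h
    return pan, tilt

# object_tracking.py would publish the result on a ROS topic (e.g. via
# rospy.Publisher), and arm_client.py on the Pi would subscribe to the
# same topic and drive the corresponding servos.
```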
To run ROS, ROS Noetic (ROS-Base installation) was installed on both the ground computer and the Raspberry Pi, using these instructions.
Once ROS is set up, the IP addresses should be configured on both the Pi and the computer. For this project, the Pi was chosen as the ROS master. To set this up, run the following.
On the computer:
- Open the bash configuration file:
```
nano ~/.bashrc
```
- Scroll to the end of the file (can be done using CTRL+END). Add or edit the following lines:
```
export ROS_MASTER_URI=http://PI_IP_ADDRESS:11311
export ROS_IP=COMPUTER_IP_ADDRESS
export ROS_HOSTNAME=COMPUTER_IP_ADDRESS
```
- Save the file and exit, then run `source ~/.bashrc`.
On the Pi:
- Open the bash configuration file:
```
nano ~/.bashrc
```
- Scroll to the end of the file (can be done using CTRL+END). Add or edit the following lines:
```
export ROS_MASTER_URI=http://PI_IP_ADDRESS:11311
export ROS_IP=PI_IP_ADDRESS
export ROS_HOSTNAME=PI_IP_ADDRESS
```
- Save the file and exit, then run `source ~/.bashrc`.
You can find the IP address of the computer or Pi by running `ifconfig` in a terminal.
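A mismatched `ROS_MASTER_URI` is a common cause of "unable to contact master" errors, so it can help to print what each machine thinks its configuration is. The following is a small sanity-check sketch, not part of the project's scripts:

```python
import os

def ros_net_config():
    """Collect the ROS networking variables set in ~/.bashrc above."""
    keys = ("ROS_MASTER_URI", "ROS_IP", "ROS_HOSTNAME")
    return {k: os.environ.get(k, "<unset>") for k in keys}

if __name__ == "__main__":
    # Run this on both the Pi and the computer; ROS_MASTER_URI must match.
    for key, value in ros_net_config().items():
        print(f"{key}={value}")
```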
In one terminal on the Pi:
- Clone `raspicam_node` into `catkin_ws/src`, then build and source the workspace:
```
git clone https://github.com/UbiquityRobotics/raspicam_node
cd ..
catkin_make
source devel/setup.bash
```
- Run a launch file for `raspicam_node` (parameters might need to be adjusted if there is high delay in the camera feed):
```
roslaunch raspicam_node camerav1_1280x720.launch
```
To ensure the camera is working, run the `roslaunch` command above on the Pi, then on the ground computer run `rqt_image_view`. A window should open showing the live camera feed.
In another terminal on the Pi:
- Ensure the PCA9685 servo driver module is wired up to the Pi properly, then enable I2C on the Pi.
- Install Adafruit's PCA9685 servo driver library. Run the test code given in the link to ensure everything is functional; you should see the servo motors move when running it.
- Ensure the folder `jimm_arm_control` is in the `catkin_ws/src` directory, and that you have run `catkin_make` and `source devel/setup.bash` in the `catkin_ws` directory.
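For reference, driving a servo through the PCA9685 boils down to converting an angle into a 12-bit pulse count. The sketch below assumes typical 150-600 tick endpoints at a 60 Hz refresh rate; the project's calibration values may differ, and the Adafruit calls shown in the comment require the library and I2C setup from the steps above:

```python
def angle_to_pulse(angle, min_pulse=150, max_pulse=600):
    """Convert a 0-180 degree angle to a 12-bit PCA9685 tick count.

    The 150/600 endpoints are common defaults for hobby servos at 60 Hz;
    real hardware usually needs per-servo calibration.
    """
    angle = max(0.0, min(180.0, angle))
    return int(min_pulse + (max_pulse - min_pulse) * angle / 180.0)

# On the Pi, with Adafruit's library installed and I2C enabled:
#   import Adafruit_PCA9685
#   pwm = Adafruit_PCA9685.PCA9685()
#   pwm.set_pwm_freq(60)                       # 60 Hz servo refresh rate
#   pwm.set_pwm(0, 0, angle_to_pulse(90.0))    # centre the servo on channel 0
```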
On the ground computer:
- Ensure the folder `jimm_arm_control` is in the `catkin_ws/src` directory, and that you have run `catkin_make` and `source devel/setup.bash` in the `catkin_ws` directory.
- Ensure OpenCV is installed.
- In a terminal, run `rosrun jimm_arm_control object_tracking.py`. After a few moments the camera feed should open. When the payload is moved closer to the camera, a green contour is drawn, and the coordinates of the object's center, as well as its area, are printed in red.
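The behaviour described (contour, centre coordinates, area) follows the standard OpenCV pattern. The sketch below shows the measurement side with plain NumPy, with the OpenCV drawing calls in comments; the area cutoff is an assumption rather than the script's actual parameter:

```python
import numpy as np

def payload_metrics(mask, area_cutoff=500):
    """Return (cx, cy, area) of the white blob in a binary mask, or None.

    A stand-in for the cv2.findContours / cv2.moments pipeline implied by
    object_tracking.py's behaviour; the area cutoff is an assumed proxy
    for "payload close enough to the camera".
    """
    ys, xs = np.nonzero(mask)
    area = xs.size
    if area < area_cutoff:
        return None  # object too small or too far away to act on
    return int(xs.mean()), int(ys.mean()), area

# With OpenCV, the drawing described above would look roughly like:
#   cv2.drawContours(frame, [contour], -1, (0, 255, 0), 2)      # green outline
#   cv2.putText(frame, f"({cx},{cy}) area={area}", (cx, cy),
#               cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 0, 255), 2)  # red text
```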
In another terminal on the Pi:
- Run `rosrun jimm_arm_control arm_client.py`.
Now the arm servos should move in response to the payload seen in the image feed.
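On the receiving side, it is good practice to clamp incoming angle commands before driving hardware. The sketch below is illustrative only; the topic name and message type are assumptions, not the project's actual `arm_client.py`:

```python
def clamp_angles(angles, lo=0.0, hi=180.0):
    """Clamp incoming angle commands to the servos' safe travel range."""
    return [min(hi, max(lo, float(a))) for a in angles]

# A minimal subscriber in the spirit of arm_client.py might look like:
#   import rospy
#   from std_msgs.msg import Float64MultiArray
#   def on_angles(msg):
#       for channel, angle in enumerate(clamp_angles(msg.data)):
#           pass  # drive PCA9685 channel `channel` to `angle` degrees
#   rospy.init_node("arm_client")
#   rospy.Subscriber("arm_angles", Float64MultiArray, on_angles)
#   rospy.spin()
```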
Credit goes to Intelligent Quads for their resources on Gazebo.
- Install ArduPilot and MAVProxy. This also creates a SITL simulation using ArduPilot and acts as a ground station, which is later used to send commands to the drone.
- Next, install Gazebo, a robotics simulator, and the ArduPilot Gazebo plugin, which is needed to interface with models in Gazebo and enable communication with ArduPilot.
- Next, install the ROS Noetic distribution and MAVROS, then install the ROS plugins using `sudo apt install ros-noetic-gazebo-ros ros-noetic-gazebo-plugins`.
- In a terminal, ensure no Gazebo servers are running in the background:
```
killall gzserver
```
- In a new terminal window, spawn the default drone in the SITL simulation in a custom gym environment:
```
roslaunch iq_sim myworld.launch
```
- Once Gazebo opens with the quadcopter spawned in the gym environment, click on the drone, open the pose option in the left-hand panel, and change the z coordinate to 1.1. Click on the drone again to deselect it.
- In a new terminal, start the ArduCopter SITL simulation and the MAVProxy ground station:
```
~/startsitl.sh
```
- In a new terminal, run MAVROS, which allows communication between ArduPilot and ROS:
```
roslaunch iq_sim apm.launch
```
- In a new terminal, to make the quadcopter move along the desired path and reach specific waypoints (written in the Python script `drone_path.py`), run:
```
rosrun iq_gnc drone_path.py
```
- Finally, in the terminal running the ground station, wait until the following lines appear in the MAV console window (this may take a few minutes):
```
APM: EKF2 IMU0 Origin set to GPS
APM: EKF2 IMU1 Origin set to GPS
```
Then type `mode stabilize` and subsequently `mode guided` in the command window. This will arm the drone and enable it to move along the desired path. It will appear in the MAV console window as in the figure below.
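The waypoint-following logic in a script like `drone_path.py` typically amounts to sending a position setpoint and advancing once the drone is within a tolerance of it. The sketch below shows only that bookkeeping; the waypoints and tolerance are invented for illustration, and the real script would send each target through MAVROS:

```python
import math

# Illustrative square path at 3 m altitude; not the project's actual waypoints.
WAYPOINTS = [(0, 0, 3), (5, 0, 3), (5, 5, 3), (0, 5, 3), (0, 0, 3)]

def reached(current, target, tol=0.3):
    """True when the drone is within `tol` metres of the target waypoint."""
    return math.dist(current, target) < tol

def advance(index, current, waypoints=WAYPOINTS, tol=0.3):
    """Move to the next waypoint index once the current one is reached."""
    if index < len(waypoints) and reached(current, waypoints[index], tol):
        index += 1
    return index

# The real node would publish waypoints[index] as a MAVROS position
# setpoint on each loop iteration until advance() runs past the last one.
```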