Visual-servoing based navigation for monitoring row-crop fields.

Visual Crop Row Navigation


This is a visual-servoing based robot navigation framework tailored for navigating in row-crop fields. It uses the images from two on-board cameras and exploits the regular crop-row structure present in the fields for navigation, without performing explicit localization or mapping. It allows the robot to follow the crop-rows accurately and handles the switch to the next row seamlessly within the same framework.

This implementation uses C++ and ROS and has been tested in different environments, both in simulation and in the real world, on diverse robotic platforms.

This work has been developed @ IPB, University of Bonn.

Check out the video of our robot following this approach to navigate on a test row-crop field.



Features

  • No maps or localization required.
  • Runs on embedded controllers with limited processing power (Odroid, Raspberry Pi).
  • Simulation environment in Gazebo.
  • Robot and cameras agnostic.

Robotic setup

This navigation framework is designed for mobile robots equipped with two cameras, one facing the front and one the back of the robot, as illustrated in the picture below.

[Figures: AgriBot 3D model and on-board camera view]

A complete Gazebo simulation package is provided in the agribot_robot repository, including simulated row-crop fields and a robot model for testing the navigation framework.

[Figures: Husky robot navigating in the Gazebo simulation]


Dependencies

  • C++11
  • catkin
  • opencv >= 2.4
  • Eigen >= 3.3

How to build and run

  1. Clone the package into your catkin workspace:
cd ~/catkin_ws/src
git clone
  2. Build the package:
cd ~/catkin_ws
catkin build visual-crop-row-navigation
  3. Run a ROS driver to stream images from the robot's cameras, for example usb_cam.
  4. Run the visual-servoing navigation:
roslaunch visual-crop-row-navigation visualservoing.launch

Successfully tested using:

  • Ubuntu 16.04
  • ROS Kinetic

Test data

Download the bagfile used for our experiments here.


This work has partly been supported by the German Research Foundation under Germany’s Excellence Strategy, EXC-2070 - 390732324 (PhenoRob).
