isarlab-department-engineering/ARDVO
ARD-VO: Agricultural Robot Dataset of Vineyards and Olive groves

ARD-VO is an extensive real-world dataset that supports the development of solutions and algorithms for precision farming technologies.

License

ARD-VO is released under the following License:

Attribution-NonCommercial-ShareAlike 3.0 (CC BY-NC-SA 3.0).

This means it is possible:

  • to copy, distribute, display, and perform the work.
  • to make derivative works.

Under the following conditions:

  • Attribution: you must give the original author credit.
  • Non-Commercial: you may not use this work for commercial purposes.
  • Share Alike: if you alter, transform, or build upon this work, you may distribute the resulting work only under a license identical to this one.

If you use ARD-VO in an academic work, please cite:

@article{crocetti2023ard,
  title={ARD-VO: Agricultural robot data set of vineyards and olive groves},
  author={Crocetti, Francesco and Bellocchio, Enrico and Dionigi, Alberto and Felicioni, Simone and Costante, Gabriele and Fravolini, Mario L and Valigi, Paolo},
  journal={Journal of Field Robotics},
  year={2023},
  publisher={Wiley Online Library}
}

1. Project Agrobot

The AGROBOT project was funded by the Umbria Region PSR program 2014-2020, Focus Area 2A. It aims to develop and demonstrate, in real application contexts, the technologies necessary to automate crop scouting and monitoring operations, mainly for olive groves and vineyards. The explicit objective of the project is to use state-of-the-art technologies from mobile robotics and image processing to reduce the time and cost of regularly monitoring the physiological and phytosanitary state of crops. The project outcome is a prototype vehicle able to drive autonomously, equipped with sensing equipment to carry out agriculturally meaningful monitoring on-line.

2. The robotic platform

We collected the dataset with a robotic platform called "Agrobot". The body specs and sensor equipment of the platform are reported below.

Body Specs

Robot Body Measurements

Agrobot - body measurements:

| Measure | Value | Measure | Value | Measure | Value |
|---------|-------|---------|-------|---------|-------|
| a | 1.30 m | b | 0.77 m | c | 0.80 m |
| d | 0.20 m | e | 0.77 m | f | 0.74 m |
| g | ≈ 0.24 m | weight | ≈ 700 kg | | |

Tyres: Genial Tyre Agri Line 6.5/80 R13

Sensors, Equipments and connection overview

Agrobot - Sensors displacement; Agrobot - Sensors connection

  • Multispectral camera: RedEdge MX.
  • 360° LiDAR: Velodyne Puck LITE.
  • Inertial and position measurement unit: Swift Duro Inertial.
  • Front camera rig (two units, left and right): Blackfly S BFS-PGE-04S2C-C camera module with FIFO-0420MM C-mount lens.
  • DC brushless motor inverters: Roboteq HBL2360A.
  • X90 mobile controller: X90 mobile control system.
  • Main computer: Intel i7-9700E CPU with an NVIDIA GeForce RTX 2060 GPU and 32 GB of DDR4 2666 MHz RAM in two SO-DIMM slots. Two storage systems are installed: a 256 GB PCI Express x4 NVMe 1.3 drive for the operating system (Ubuntu 20.04 LTS, 64 bit) and two 2 TB SATA SSD 2.5" disks for data.
  • Ethernet switch: ORing TGXPS-1080-M12-24V series.

3. Data collection campaign

ARD-VO was collected with an Unmanned Ground Vehicle (UGV) equipped with heterogeneous sensors that capture information essential for robot localization and plant monitoring tasks. It comprises sequences gathered in 11 experimental sessions between August and October 2021, navigating the UGV for several kilometers across four cultivation fields in Umbria, a region of central Italy: (a)-(b) vineyards, (c)-(d) olive crops.

Vineyards and olive crops used to gather data:

| Alias name | Crop variety | lat (N), lon (E) | # Sessions | Dates (dd Month yyyy) |
|---|---|---|---|---|
| Vynrd A | Grechetto Todi G5 | 43.004491, 12.294889 | 1 | 3 September 2021 |
| Vynrd B | Grechetto Todi G5 | 42.812355, 12.418741 | 2 | 4 August 2021; 1 September 2021 |
| OlvCs A | Moraiolo | 42.967206, 12.407057 | 4 | 14, 23, 30 September 2021; 13 October 2021 |
| OlvCs B | Moraiolo, Leccino, Frantoiano | 42.961702, 12.412744 | 4 | 14, 23, 30 September 2021; 13 October 2021 |

4. Dataset

For each session, two sets of data are available: the first comes from the sensors connected to the onboard computer, the second only from the multispectral camera, whose streams are independently geotagged using the GPS of the MicaSense RedEdge kit.

LINK DATASET DOWNLOAD

http://sira.diei.unipg.it/supplementary/public/Datasets/ARD-VO/

4.1 ROS bags for onboard-device recorded data

We used ROS Noetic to handle the data from the devices directly connected to the onboard unit, while the multispectral images are stored directly on the SSD memory of the RedEdge camera. Since the duration of a session may vary between one and two hours, each session is split into a number of shorter sequences.

The rosbag package records these topics and messages in a single file that can later be played back to reproduce the experiment. The following table describes the collected data, the message types, and the topics available in this dataset.

| Topic | Message Type | Description |
|---|---|---|
| /flir_adk/front/left/image_raw, /flir_adk/front/right/image_raw | sensor_msgs/Image | Raw images from the frontal FLIR cameras. |
| /gps/duro/current_pose | geometry_msgs/PoseStamped | (x, y, z) metric position and orientation (quaternion). |
| /gps/duro/fix | sensor_msgs/NavSatFix | GPS latitude/longitude with variance. |
| /gps/duro/imu | sensor_msgs/Imu | Angular velocity and acceleration about the (x, y, z) axes. |
| /gps/duro/odom | nav_msgs/Odometry | Estimates of position and velocity with respect to the reference frame. |
| /gps/duro/rollpitchyaw | geometry_msgs/Vector3 | Orientation. |
| /velodyne_points | sensor_msgs/PointCloud2 | Velodyne laser-scanned points transformed into the original frame of reference. |
| /agrobot/Inverter/HBL2360A_L, /agrobot/Inverter/HBL2360A_R | diagnostic_msgs/DiagnosticArray | Diagnostic messages from the inverters. |
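As a minimal example of consuming these data offline, the sketch below accumulates the planar distance traveled from a sequence of (x, y) positions such as those carried by /gps/duro/odom. The position tuples are assumed to have already been extracted from the bag (e.g., with the rosbag Python API); the path values are illustrative.

```python
import math

def distance_traveled(positions):
    """Sum Euclidean planar distances between consecutive (x, y) positions.

    `positions` is assumed to be a chronologically ordered list of (x, y)
    tuples, e.g. extracted from nav_msgs/Odometry messages on /gps/duro/odom.
    """
    total = 0.0
    for (x0, y0), (x1, y1) in zip(positions, positions[1:]):
        total += math.hypot(x1 - x0, y1 - y0)
    return total

# Hypothetical path along the legs of a 3-4-5 right triangle
path = [(0.0, 0.0), (3.0, 0.0), (3.0, 4.0)]
print(distance_traveled(path))  # → 7.0
```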

4.1.1 Extracted data examples

RGB AND LIDAR

Below are some example RGB images and laser scans extracted from the ARD-VO dataset.
The first two rows contain the RGB images collected by the left (a-f) and the right (g-l) cameras, respectively.
The third row includes examples of laser scans. The first three columns (a-c, g-i, m-o) contain samples from the sequences gathered in the olive crops (OlvCs-A,B), while the latter three (d-f, j-l, p-r) come from those collected in the vineyards (Vynrd-A,B).

RGB images and Lidar example images

INVERTERS

The topics related to the inverters contain useful diagnostic data and flags provided by the inverters. Some flags are coded according to the manufacturer (Roboteq) specs; to decode them, you need the inverter manual referenced in the previous table. Each inverter has two channels (CH1, CH2) that drive the front and rear wheels of one side (left or right) of the robot. The gathered data also include the motor RPM and the power consumption.

Inverters data example

Note: Each wheel is connected through a kinematic chain with an 80:1 transmission ratio to a 2 kW three-phase brushless BLDC motor that can reach 4300 rpm and 4.6 Nm of torque.
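Roboteq-style fault flags are typically published as an integer bitmask. The sketch below shows a generic way to decode such a mask; the flag names and bit positions here are purely hypothetical placeholders, and the authoritative mapping is the one in the Roboteq HBL2360A manual.

```python
# Hypothetical bit layout for illustration only; consult the Roboteq
# HBL2360A manual for the real fault-flag definitions.
FAULT_BITS = {
    0: "overheat",
    1: "overvoltage",
    2: "undervoltage",
    3: "short_circuit",
}

def decode_fault_flags(raw):
    """Return the names of the fault bits set in the integer `raw`."""
    return [name for bit, name in FAULT_BITS.items() if raw & (1 << bit)]

print(decode_fault_flags(0b0101))  # → ['overheat', 'undervoltage']
```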

DURO IMU AND GPS-RTK

The Duro inertial device provides GNSS + INS positioning with centimeter-level accuracy at an update frequency of up to 10 Hz. The latitude/longitude measurements with variance are provided on a dedicated topic. Note that the /gps/duro/imu topic, which contains the IMU messages, does not include the orientation but only accelerations and angular velocities. The orientation is computed by the IMU firmware and published on the following topics: /gps/duro/rollpitchyaw, /gps/duro/current_pose, and /gps/duro/odom. The image below shows an example of accelerations and angular velocities extracted from a sequence.
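Since the orientation appears both as a quaternion (in /gps/duro/current_pose and /gps/duro/odom) and as Euler angles (in /gps/duro/rollpitchyaw), the standard quaternion-to-roll/pitch/yaw conversion below can be used to cross-check the two representations. This is a self-contained sketch; extracting the quaternion from the messages is assumed to happen elsewhere.

```python
import math

def quaternion_to_rpy(x, y, z, w):
    """Convert a unit quaternion (x, y, z, w) to roll, pitch, yaw in radians."""
    roll = math.atan2(2.0 * (w * x + y * z), 1.0 - 2.0 * (x * x + y * y))
    sinp = 2.0 * (w * y - z * x)
    # Clamp to +/- 90 degrees near the gimbal-lock singularity
    pitch = math.copysign(math.pi / 2, sinp) if abs(sinp) >= 1 else math.asin(sinp)
    yaw = math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))
    return roll, pitch, yaw

# A rotation of 90 degrees about the z axis (pure yaw)
r, p, y = quaternion_to_rpy(0.0, 0.0, math.sin(math.pi / 4), math.cos(math.pi / 4))
print(round(y, 6))  # → 1.570796 (pi / 2)
```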

Imu data example

4.1.2 Post-processing

Due to voltage oscillations, shocks, network congestion, and ROS node miscommunications, we experienced spurious and corrupted frames among the raw RGB image sets.
In order to produce a high-quality dataset, the collected video sequences were, therefore, subjected to a frame-by-frame automated check to ensure their integrity and guarantee frames equally spaced in time.
As a result, the image streams in the postprocessed sequences are free from corrupted data and characterized by a framerate in the range of 8-10 FPS.
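An integrity check of the kind described can be sketched as follows: given the capture timestamps of a sequence, flag inter-frame gaps larger than a tolerance around the expected frame period. The threshold and period used here are illustrative assumptions, not the authors' exact parameters.

```python
def find_timestamp_gaps(stamps, expected_dt=0.1, tolerance=0.5):
    """Return the indices where the inter-frame gap exceeds
    expected_dt * (1 + tolerance).

    `stamps` is an ordered list of capture times in seconds; with a
    10 FPS target, expected_dt is 0.1 s.
    """
    limit = expected_dt * (1.0 + tolerance)
    return [
        i
        for i, (t0, t1) in enumerate(zip(stamps, stamps[1:]))
        if (t1 - t0) > limit
    ]

# One dropped frame between 0.2 s and 0.4 s
stamps = [0.0, 0.1, 0.2, 0.4, 0.5]
print(find_timestamp_gaps(stamps))  # → [2]
```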

The unprocessed (raw) sequences are also provided in separate folders for completeness and further studies.

4.2 Multispectral data

We employed a RedEdge MX camera.

4.2.1 Images data

The module shoots five different images with a fixed resolution of 1280x960 pixels at 1 Hz, corresponding to different bands:

  • Blue, 475 (32)
  • Green, 560 (27)
  • Red, 668 (14)
  • Red Edge, 717 (12)
  • Near IR, 842 (57)

with the following convention: Band-name, CenterWavelength (Bandwidth), in nanometers.

The following figure shows an example of a set of images referring to the same subject.

Micasense multi-bands images

4.2.2 Calibration

Before each experimental session, radiometric calibration is performed to compensate for sensor black-level, sensor sensitivity, sensor gain and exposure settings, and lens vignette effects. The radiometric model normalizes the pixel value to the range 0 to 1 by dividing the raw digital number for the pixel by 2^N, where N is the number of bits in the image; in this case the images are 16-bit, so 2^N = 65536.
This normalization applies to both pixel and black-level values.
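The normalization step can be expressed directly in code. This is a trivial sketch of the division only; applying the full radiometric model with gain and vignette compensation would follow the MicaSense calibration procedure.

```python
def normalize_dn(raw_dn, bits=16):
    """Normalize a raw digital number to [0, 1) by dividing by 2**bits.

    For the 16-bit RedEdge MX images, the divisor is 2**16 = 65536.
    """
    return raw_dn / float(2 ** bits)

print(normalize_dn(32768))  # → 0.5
```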

4.2.3 Data

The camera embeds five optimized imagers, each with its own sensor and filter. The camera module also has a stand-alone external GPS to geotag the images. All the information is stored in the image metadata (EXIF), which can be extracted with dedicated tools or programming libraries.

Note: the images are not synchronized with the rest of the data. Nonetheless, the metadata associated with the images provides GPS and timestamp information that can be used to align them with the other sensors. Since, depending on the needs, there might be different strategies to achieve this alignment, we preferred to keep the images as provided by the sensor, leaving users free to implement the post-processing and alignment procedures as deemed appropriate.
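One simple alignment strategy is nearest-neighbor matching of timestamps once both streams are expressed on a common clock. The sketch below assumes that common clock has already been established from the EXIF data; estimating the clock offset itself is left out.

```python
import bisect

def nearest_timestamp(ros_stamps, image_stamp):
    """Return the index of the ROS timestamp closest to `image_stamp`.

    `ros_stamps` must be sorted in ascending order; both inputs are
    seconds on a common clock.
    """
    i = bisect.bisect_left(ros_stamps, image_stamp)
    candidates = [j for j in (i - 1, i) if 0 <= j < len(ros_stamps)]
    return min(candidates, key=lambda j: abs(ros_stamps[j] - image_stamp))

ros_stamps = [10.0, 10.5, 11.0, 11.5]
print(nearest_timestamp(ros_stamps, 10.7))  # → 1
```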

4.2.4 Examples of stand-alone study

The availability of isolated geotagged multispectral images allows studies not directly related to robotics to be performed without downloading the entire dataset.
The following set of images shows, as an example, the NDVI maps computed for the four cultivations using just the multispectral images.

Ndvi map example
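NDVI maps like those above combine the Red and Near-IR bands per pixel as (NIR - Red) / (NIR + Red). A minimal per-pixel sketch, assuming already-normalized reflectance values:

```python
def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index for one pixel.

    `nir` and `red` are reflectance values (e.g. normalized RedEdge MX
    Near IR and Red band pixels); `eps` guards against division by zero.
    """
    return (nir - red) / (nir + red + eps)

print(round(ndvi(0.6, 0.2), 3))  # → 0.5
```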

4.3 Data organization

ARD-VO
├── Processed
│   ├── Olvcs A
│   │   ├── 13_Oct_2021
│   │   │   ├── 2021-10-13-12-43-46_clean.bag
│   │   │   └── 2021-10-13-12-53-28_clean.bag
│   │   ├── ....
│   │   └── 30_Sep_2021
│   ...
│   └── Vynrd B
│       ├── 01_Sep_2021
│       │   ├── 2021-09-01-10-49-44_clean.bag
│       │   ├── ....
│       │   └── 2021-09-01-12-25-09_clean.bag
│       └── 04_Aug_2021
├── Raw
│   ├── Olvcs A
│   │   ├── 13_Oct_2021
│   │   │   ├── 2021-10-13-12-43-46.bag
│   │   │   └── 2021-10-13-12-53-28.bag
│   │   ├── ....
│   │   └── 30_Sep_2021
│   ...
│   └── Vynrd B
│       ├── 01_Sep_2021
│       │   ├── 2021-09-01-10-49-44.bag
│       │   ├── ....
│       │   └── 2021-09-01-12-25-09.bag
│       └── 04_Aug_2021
└── Multispectral
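The bag filenames encode the recording start time as YYYY-MM-DD-hh-mm-ss, so sessions can be indexed by timestamp when walking this tree. A small parsing sketch; the filename convention is inferred from the listing above.

```python
from datetime import datetime

def bag_start_time(filename):
    """Parse the recording start time encoded in a bag filename,
    e.g. '2021-10-13-12-43-46_clean.bag' -> datetime(2021, 10, 13, 12, 43, 46).
    Works for both the raw ('.bag') and processed ('_clean.bag') variants.
    """
    stem = filename.rsplit("/", 1)[-1]
    stem = stem.removesuffix(".bag").removesuffix("_clean")
    return datetime.strptime(stem, "%Y-%m-%d-%H-%M-%S")

print(bag_start_time("Processed/Olvcs A/13_Oct_2021/2021-10-13-12-43-46_clean.bag"))
# → 2021-10-13 12:43:46
```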

5. Partners

partners_agrobot

The project consortium comprises the Engineering Department, University of Perugia; the Agricultural, Food and Environmental Sciences Department, University of Perugia; the Institute of Life Sciences, Sant’Anna School of Advanced Studies; Cratia srl; Assoprol Umbria soc. coop. Agr; the β€œCiuffelli Einaudi” Technical Agricultural Institute; and Infomobility srl.
