Herearii Metuarea, Abdoul djalil Ousseni hamza, Walter Guerra†, Andrea Patocchi, Lidia Lozano, Shauny Van Hoye, Francois Laurens, Jeremy Labrosse, Pejman Rasti, David Rousseau†.
† project lead
DeepPhenoTree is designed as a tool for the automatic detection of phenological stages associated with flowering, fruitlet, and fruit at harvest time from images, using deep learning–based object detection models.
This napari plugin was generated with copier using the napari-plugin-template.
DeepPhenoTree – Apple Edition: a multi-site apple phenology RGB annotated dataset with deep learning baseline models. Herearii Metuarea, Abdoul djalil Ousseni hamza, Walter Guerra, Andrea Patocchi, Lidia Lozano, Shauny Van Hoye, Francois Laurens, Jeremy Labrosse, Pejman Rasti, David Rousseau.
Herearii Metuarea; Abdoul djalil Ousseni hamza; Lou Decastro; Jade Marhadour; Oumaima Karia; Lorène Masson; Marie Kourkoumelis-Rodostamos; Walter Guerra; Francesca Zuffa; Francesco Panzeri; Andrea Patocchi; Lidia Lozano; Shauny Van Hoye; Marijn Rymenants; François Laurens; Jeremy Labrosse; Pejman Rasti; David Rousseau, 2026, "DeepPhenoTree - Apple Edition", https://doi.org/10.57745/NORPF1, Recherche Data Gouv, V5, UNF:6:FyJNuJx4BVZxWuG8hI4gEw== [fileUNF]
You can install deepphenotree via pip:
pip install deepphenotree
If napari is not already installed, you can install deepphenotree with napari and Qt via:
pip install "deepphenotree[all]"
To install the latest development version:
pip install git+https://github.com/hereariim/deepphenotree.git
A GPU is required for acceptable processing times when running the models (especially RT-DETR). Please visit the official PyTorch website to get the appropriate installation command: 👉 https://pytorch.org/get-started/locally
Example: GPU (CUDA 12.1)
pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu121
from deepphenotree._sample_data import DeepPhenoTreeData
# Flowering data
data_flower = DeepPhenoTreeData('Flowering')
images = data_flower.data   # Shape: (5120, 5120, 3, 4)
country = data_flower.names # ['Belgium', 'Italy', 'Spain', 'Switzerland']
# Fruitlet data
data_fruitlet = DeepPhenoTreeData('Fruitlet')
# Fruit data
data_fruit = DeepPhenoTreeData('Fruit')
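The sample-data array stacks one RGB image per site along the last axis, with site names in the matching order. The sketch below shows how a single site's image can be pulled out of such a stack; it uses a small dummy NumPy array in place of `DeepPhenoTreeData`, and the `site_image` helper is hypothetical, not part of the plugin API.

```python
import numpy as np

# Dummy stand-in for DeepPhenoTreeData: per-site RGB images stacked along
# the last axis, shape (H, W, 3, n_sites), as in the snippet above.
images = np.zeros((64, 64, 3, 4), dtype=np.uint8)
names = ['Belgium', 'Italy', 'Spain', 'Switzerland']

def site_image(images, names, site):
    """Return the (H, W, 3) RGB image for one named site from the stack."""
    return images[..., names.index(site)]

img_spain = site_image(images, names, 'Spain')
print(img_spain.shape)  # (64, 64, 3)
```

With the real data, the same indexing applies to `data_flower.data` and `data_flower.names`.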
from deepphenotree.inference import YoloInferencer
image = ...  # your RGB image as a NumPy array of shape (H, W, 3)
# Flowering task
infer = YoloInferencer("Flowering")
bbx = infer.predict_boxes(image)
# Fruitlet task
infer = YoloInferencer("Fruitlet")
bbx = infer.predict_boxes(image)
# Fruit task
infer = YoloInferencer("Fruit")
bbx = infer.predict_boxes(image)
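The README does not specify the format returned by `predict_boxes`. Assuming it yields an (N, 4) array of pixel (x1, y1, x2, y2) corners, a small post-processing helper like the one below can count detections and compute their areas; this is a sketch under that assumption, not part of the plugin API.

```python
import numpy as np

def summarize_boxes(boxes):
    """Count boxes and compute their pixel areas.

    Assumes `boxes` is an (N, 4) array of (x1, y1, x2, y2) corners — a guess
    at the `predict_boxes` output format; adapt if it differs.
    """
    boxes = np.asarray(boxes, dtype=float).reshape(-1, 4)
    widths = boxes[:, 2] - boxes[:, 0]
    heights = boxes[:, 3] - boxes[:, 1]
    return len(boxes), widths * heights

# Dummy predictions standing in for `infer.predict_boxes(image)`
bbx = np.array([[10, 10, 30, 40], [0, 0, 5, 5]])
n, areas = summarize_boxes(bbx)
print(n, areas)  # 2 [600.  25.]
```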
This plugin performs targeted image inference on user-provided images. Three detection tasks can be run via dedicated buttons: flowering, fruitlet, and fruit detection. The plugin returns the coordinates of bounding boxes around detected objects, and a message reports the number of detected boxes. Several developments are ongoing; feel free to contact us with requests or suggestions.
Drag and drop an RGB image onto the napari window, or select one of the sample images provided by the plugin:
File > Open Sample > DeepPhenoTree > images
Note: the images available in Open Sample > DeepPhenoTree correspond to the test data associated with the models provided in this plugin.
Click the corresponding button to run inference on the image:
- Flowering: detects all objects from bud development to flowering (BBCH 00 to 69).
- Fruitlet: detects developing fruit (BBCH 71 to 77).
- Fruit: detects all fruit at harvest time (BBCH 81 to 89).
Bounding boxes are displayed in the Flowering layer for flowering, the Fruitlet layer for fruitlet, and the Fruit layer for fruit.
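For users scripting their own display, napari Shapes layers accept rectangles as arrays of four (row, col) corner points. The sketch below converts (x1, y1, x2, y2) pixel boxes to that vertex format; the box layout is an assumption about the plugin's output, and the helper itself is hypothetical.

```python
import numpy as np

def boxes_to_napari_rectangles(boxes):
    """Convert (x1, y1, x2, y2) pixel boxes to napari rectangle vertices.

    napari Shapes rectangles are arrays of four (row, col) corner points.
    The (x1, y1, x2, y2) input layout is an assumption; adjust the indexing
    if your boxes are ordered (y1, x1, y2, x2) instead.
    """
    rects = []
    for x1, y1, x2, y2 in np.asarray(boxes, dtype=float).reshape(-1, 4):
        rects.append(np.array([[y1, x1], [y1, x2], [y2, x2], [y2, x1]]))
    return rects

rects = boxes_to_napari_rectangles([[10, 20, 50, 60]])
print(rects[0])  # four (row, col) corners of the rectangle
# In napari: viewer.add_shapes(rects, shape_type='rectangle', edge_color='red')
```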
DeepPhenoTree consists of an RT-DETR model trained on the DeepPhenoTree dataset.
The trained models used in this project are not publicly available. They are part of ongoing research and collaborative projects, and therefore cannot be distributed at this time.
However, the codebase is provided to ensure reproducibility and transparency of the proposed methodology.
Standard deviation is computed over 5-fold cross-validation. Overall (4 sites) denotes the aggregated evaluation across the four experimental sites (Switzerland, Belgium, Spain, and Italy).
| Dataset | Location | Precision | Recall | mAP@.5 | mAP@.5:.95 |
|---|---|---|---|---|---|
| Flowering | Overall (4 sites) | 0.69 ± 0.01 | 0.58 ± 0.02 | 0.65 ± 0.02 | 0.37 ± 0.02 |
| | Switzerland | 0.73 ± 0.02 | 0.60 ± 0.04 | 0.68 ± 0.03 | 0.40 ± 0.04 |
| | Belgium | 0.72 ± 0.02 | 0.63 ± 0.03 | 0.69 ± 0.03 | 0.40 ± 0.03 |
| | Spain | 0.66 ± 0.01 | 0.53 ± 0.05 | 0.60 ± 0.03 | 0.30 ± 0.02 |
| | Italy | 0.69 ± 0.04 | 0.61 ± 0.03 | 0.67 ± 0.04 | 0.40 ± 0.04 |
| Fruitlet | Overall (4 sites) | 0.85 ± 0.02 | 0.73 ± 0.02 | 0.82 ± 0.02 | 0.53 ± 0.01 |
| | Switzerland | 0.86 ± 0.04 | 0.78 ± 0.04 | 0.84 ± 0.06 | 0.56 ± 0.04 |
| | Belgium | 0.83 ± 0.03 | 0.65 ± 0.04 | 0.77 ± 0.04 | 0.52 ± 0.14 |
| | Spain | 0.86 ± 0.02 | 0.72 ± 0.03 | 0.81 ± 0.03 | 0.52 ± 0.03 |
| | Italy | 0.88 ± 0.01 | 0.80 ± 0.01 | 0.88 ± 0.01 | 0.61 ± 0.01 |
| Fruit | Overall (4 sites) | 0.87 ± 0.01 | 0.79 ± 0.01 | 0.86 ± 0.01 | 0.57 ± 0.01 |
| | Switzerland | 0.86 ± 0.03 | 0.80 ± 0.02 | 0.87 ± 0.02 | 0.59 ± 0.01 |
| | Belgium | 0.90 ± 0.01 | 0.84 ± 0.01 | 0.90 ± 0.01 | 0.63 ± 0.02 |
| | Spain | 0.86 ± 0.02 | 0.75 ± 0.02 | 0.84 ± 0.02 | 0.51 ± 0.03 |
| | Italy | 0.88 ± 0.02 | 0.84 ± 0.03 | 0.90 ± 0.02 | 0.66 ± 0.02 |
DeepPhenoTree – Apple Edition, a multi-site, multi-variety RGB image dataset dedicated to the classification of key apple tree phenological stages.
This work was supported by the PHENET project. The authors also acknowledge IDRIS for providing access to high-performance computing resources.
Imhorphen team, bioimaging research group 42 rue George Morel, Angers, France
- Herearii Metuarea, herearii.metuarea@univ-angers.fr
- Abdoul-Djalil Ousseini Hamza, abdoul-djalil.ousseini-hamza@inrae.fr
- Pr David Rousseau, david.rousseau@univ-angers.fr
Contributions are very welcome. Tests can be run with tox; please ensure the coverage at least stays the same before you submit a pull request.
Distributed under the terms of the GNU LGPL v3.0 license, "deepphenotree" is free and open source software.
If you encounter any problems, please file an issue along with a detailed description.
If you use the DeepPhenoTree plugin in your research, please use the following BibTeX entry.
Not available