
Efficient and Accurate Transformer-Based 3D Shape Completion and Reconstruction of Fruits for Agricultural Robots

This repo contains the code for the fruit completion and reconstruction method proposed in our ICRA'24 paper, which you can find at this link.

The main contribution of this paper is a novel approach for completing 3D shapes that combines template matching with deep learning. First, we use a 3D sparse convolutional backbone to extract point-wise features. We then aggregate these features into vertex features and feed them to a transformer decoder that iteratively deforms our template. This architecture allows us to estimate the complete 3D shape of a fruit even when only a partial point cloud is available.
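Below is a minimal, hypothetical sketch of this idea in plain PyTorch, not the code in this repository: a small PointNet-style MLP stands in for the sparse convolutional backbone, template vertex queries cross-attend to the point-wise features in a transformer decoder, and predicted per-vertex offsets iteratively deform the template. All class, function, and parameter names here are illustrative.

    # Simplified sketch of template deformation driven by point-wise features
    # (NOT the authors' implementation; a dense MLP replaces the sparse backbone).
    import torch
    import torch.nn as nn


    class TemplateDeformerSketch(nn.Module):
        def __init__(self, feat_dim: int = 128, num_heads: int = 4,
                     num_layers: int = 2, num_refinement_steps: int = 3):
            super().__init__()
            self.num_refinement_steps = num_refinement_steps
            # Stand-in for the sparse 3D convolutional backbone: per-point MLP.
            self.point_encoder = nn.Sequential(
                nn.Linear(3, 64), nn.ReLU(), nn.Linear(64, feat_dim))
            # Embeds template vertex coordinates into query features.
            self.vertex_embed = nn.Linear(3, feat_dim)
            decoder_layer = nn.TransformerDecoderLayer(
                d_model=feat_dim, nhead=num_heads, batch_first=True)
            self.decoder = nn.TransformerDecoder(decoder_layer, num_layers=num_layers)
            # Regresses a 3D offset for every template vertex.
            self.offset_head = nn.Linear(feat_dim, 3)

        def forward(self, partial_points: torch.Tensor,
                    template_vertices: torch.Tensor) -> torch.Tensor:
            # partial_points: (B, N, 3), template_vertices: (B, V, 3)
            point_feats = self.point_encoder(partial_points)       # (B, N, C)
            vertices = template_vertices
            for _ in range(self.num_refinement_steps):
                queries = self.vertex_embed(vertices)               # (B, V, C)
                # Vertex queries cross-attend to the point-wise features.
                refined = self.decoder(tgt=queries, memory=point_feats)
                vertices = vertices + self.offset_head(refined)     # deform template
            return vertices  # (B, V, 3) estimate of the complete shape


    if __name__ == "__main__":
        model = TemplateDeformerSketch()
        partial = torch.rand(2, 1024, 3)    # partial fruit point cloud
        template = torch.rand(2, 642, 3)    # e.g. an icosphere template
        completed = model(partial, template)
        print(completed.shape)              # torch.Size([2, 642, 3])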

How to Install

We tested our code on a system with Ubuntu 22.04 and CUDA 11.8.

For compatibility reasons, we recommend creating a conda environment with Python 3.9:
conda create --name tcore python=3.9 && conda activate tcore

Installing Python package prerequisites:

sudo apt install build-essential python3-dev libopenblas-dev
pip3 install -r requirements.txt

Installing MinkowskiEngine:

pip3 install -U git+https://github.com/NVIDIA/MinkowskiEngine -v --no-deps
NB: At the moment, MinkowskiEngine is not compatible with Python 3.10+, see this issue

Installing PyTorch3D:

pip3 install "git+https://github.com/facebookresearch/pytorch3d.git"

To set up the code, run the following command in the root directory of the repository:

pip3 install -U -e .
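
Optionally, a quick sanity check (a minimal sketch, assuming the installation above succeeded) confirms that the main dependencies import correctly and that CUDA is visible:

    # Hypothetical installation sanity check, not part of the repo.
    import torch
    import MinkowskiEngine as ME
    import pytorch3d

    print("PyTorch:", torch.__version__, "| CUDA available:", torch.cuda.is_available())
    print("MinkowskiEngine:", ME.__version__)
    print("PyTorch3D:", pytorch3d.__version__)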

How to Run


Train

Run python tcore/scripts/train_model.py to train our approach; training parameters are specified in the config file tcore/config/model.yaml.

You can use --model_cfg_path <path-to-cfg> to specify a different configuration file.
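
For example, assuming you copied the default configuration to a hypothetical file tcore/config/my_model.yaml and adjusted it, training with that configuration would look like:

python tcore/scripts/train_model.py --model_cfg_path tcore/config/my_model.yaml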

Test

Run python tcore/scripts/evaluate_model.py --w <path-to-checkpoint> to run inference and compute metrics on the data directory specified in tcore/config/model.yaml.

You can use --model_cfg_path <path-to-cfg> to specify a different configuration file.
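
For example, to evaluate the pretrained checkpoint downloaded in the next section while keeping the default configuration:

python tcore/scripts/evaluate_model.py --w checkpoints/pretrained_model.ckpt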

Running our Approach on Sample Data


For running the demo of our approach, we assume that you are using Ubuntu 22.04 with a CUDA-capable device, but the scripts can be adapted to other platforms. We also assume that you are in the root directory of the repository. We provide a small sample dataset for testing this repo.

  1. Download and extract the sample data: sh scripts/download_data.sh
  2. Download the checkpoint of our trained model: sh scripts/download_checkpoint.sh

These commands will download the dataset and the checkpoint into ./data/ and ./checkpoints, respectively.

  3. Run the inference on the data: python tcore/scripts/demo.py --w checkpoints/pretrained_model.ckpt

You should get the following image on your machine:

How to Cite

If you use this repo, please cite as:

@inproceedings{magistri2024icra,
author = {F. Magistri and R. Marcuzzi and E.A. Marks and M. Sodano and J. Behley and C. Stachniss},
title = {{Efficient and Accurate Transformer-Based 3D Shape Completion and Reconstruction of Fruits for Agricultural Robots}},
booktitle = {Proc.~of the IEEE Intl.~Conf.~on Robotics \& Automation (ICRA)}, 
year = 2024,
}