ConvPoint: Generalizing discrete convolutions for unstructured point clouds


Updates

Major performance update: by reformulating the convolutional layer using matrix multiplications, memory consumption has been greatly reduced.
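For intuition, here is a minimal PyTorch sketch of this reformulation (an illustration, not the repository's actual layer code): neighbor features and the spatially derived weights are combined with two batched matrix multiplications instead of explicit per-neighbor loops, which avoids large intermediate tensors.

import torch

def matmul_point_conv(neigh_feats, spatial_weights, kernel):
    # neigh_feats:     (B, N, K, Cin)  features of the K neighbors of each output point
    # spatial_weights: (B, N, K, M)    weights tying each neighbor to M kernel elements
    #                                  (in ConvPoint they come from an MLP on relative positions)
    # kernel:          (M * Cin, Cout) learned weights, shared across all points
    B, N, K, Cin = neigh_feats.shape
    M = spatial_weights.shape[-1]
    # First matmul: aggregate neighbors onto the kernel elements -> (B, N, M, Cin)
    agg = torch.matmul(spatial_weights.transpose(2, 3), neigh_feats)
    # Second matmul: apply the learned kernel -> (B, N, Cout)
    return torch.matmul(agg.reshape(B, N, M * Cin), kernel)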

Major interface update: the spatial relations are now computed inside the network class. The framework is thus easier to use and more flexible.
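A hypothetical sketch of what this means in practice (the actual class and argument names differ; see network_classif.py and network_seg.py): the caller passes only features and point coordinates, and the neighborhoods are computed inside forward().

import torch
import torch.nn as nn

class PointNetworkSketch(nn.Module):
    # Hypothetical illustration only: spatial relations are computed inside the network.
    def __init__(self, conv_layer, k=16):
        super().__init__()
        self.conv = conv_layer
        self.k = k

    def knn_indices(self, pts):
        # Brute-force kNN for illustration; the repository uses NanoFLANN or Scikit-learn.
        dists = torch.cdist(pts, pts)                       # (B, N, N) pairwise distances
        return dists.topk(self.k, largest=False).indices    # (B, N, k) neighbor indices

    def forward(self, feats, pts):
        idx = self.knn_indices(pts)   # computed here, not by the caller
        return self.conv(feats, pts, idx)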

Introduction

This repository provides Python scripts for point cloud classification and segmentation. The library is implemented with PyTorch.

A preprint of the paper is available on arXiv:
http://arxiv.org/abs/1904.02375

License

The code is released under a dual license depending on the application, research or commercial. The research license is GPLv3. See the license file.

Citation

If you use this code in your research, please consider citing (the citation will be updated once the 3DOR proceedings are released):

@article{boulch2019generalizing,
  title={Generalizing discrete convolutions for unstructured point clouds},
  author={Boulch, Alexandre},
  journal={arXiv preprint arXiv:1904.02375},
  year={2019}
}

Dependencies

  • Pytorch
  • Scikit-learn for confusion matrix computation and efficient neighbor search
  • TQDM for progress bars
  • PlyFile
  • H5py

All these dependencies can be installed via conda in an Anaconda environment, or via pip.

The library

Nearest neighbor module

The nearest_neighbors directory contains a very small wrapper for NanoFLANN with OpenMP. To compile the module:

cd nearest_neighbors
python setup.py install --home="."
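Once compiled, the module can be imported directly from Python. A hypothetical usage sketch (the exact function name and signature are assumptions, check the nearest_neighbors module after compilation):

import numpy as np
import nearest_neighbors

pts = np.random.rand(2048, 3).astype(np.float32)
# Assumed interface: K nearest neighbor indices for each query point, OpenMP-parallel
indices = nearest_neighbors.knn(pts, pts, 16, omp=True)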

If you do not want to use this C++/Python wrapper, you can still use the previous, Python-only version of the nearest neighbors computation based on Scikit-learn and Multiprocessing (slower). To do so, add the following lines at the beginning of your main script (e.g., modelnet_classif.py):

from global_tags import GlobalTags
GlobalTags.legacy_layer_base(True)
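The legacy path relies on Scikit-learn's neighbor search. For reference, a Python-only kNN query looks roughly like this (an illustration, not the repository's exact code):

import numpy as np
from sklearn.neighbors import NearestNeighbors

pts = np.random.rand(2048, 3)
nn_search = NearestNeighbors(n_neighbors=16, algorithm="kd_tree").fit(pts)
_, indices = nn_search.kneighbors(pts)   # (2048, 16) neighbor indices per point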

Usage

We provide scripts for training on several point cloud datasets:

ModelNet40

Training

python modelnet_classif.py --rootdir path_to_modelnet40_data

Testing

For testing with one tree per shape:

python modelnet_classif.py --rootdir path_to_modelnet40_data --savedir path_to_statedict_directory --test

For testing with more than one tree per shape (this code is not optimized and is very slow):

python modelnet_classif.py --rootdir path_to_modelnet40_data --savedir path_to_statedict_directory --test --ntree 2
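Conceptually, testing with several trees ensembles the predictions obtained from several spatial samplings of the same shape. A hypothetical sketch, assuming a model that maps a point tensor to class logits:

import torch

def multi_tree_predict(model, samplings):
    # samplings: list of (1, N, 3) point tensors, one per spatial tree of the same shape
    logits = torch.stack([model(pts) for pts in samplings])  # (ntree, 1, num_classes)
    return logits.mean(dim=0).argmax(dim=-1)                 # vote by averaging the logits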

ShapeNet

Data preparation

The scripts for downloading and preparing the point clouds are from the PointCNN repository (https://github.com/yangyanli/PointCNN). They are located in the data_conversions folder.

python3 ./download_datasets.py -d shapenet_partseg -f path_to_directory
python3 ./prepare_partseg_data.py -f path_to_shapenet_partseg

Training

The training script is shapenet_seg.py:

python shapenet_seg.py --savedir path_to_save_directory --rootdir path_to_shapenet_partseg

Testing

  • Inference

Testing is a two-step process. First, run inference with the shapenet_seg.py script:

python shapenet_seg.py --savedir path_to_model_directory  --rootdir path_to_shapenet_partseg --test

You can also use the --ply flag to generate PLY files for result visualization.
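The generated files can be inspected with the PlyFile dependency. A minimal reading sketch (the file name and the stored per-point properties are assumptions, they depend on what the script writes):

from plyfile import PlyData

ply = PlyData.read("prediction.ply")
vertices = ply["vertex"]                  # structured array of per-point properties
print(vertices.data.dtype.names)          # list the properties actually stored
x, y, z = vertices["x"], vertices["y"], vertices["z"]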

The previous command line produces results with a single spatial tree. To test with multiple spatial trees, use the --ntree flag:

python shapenet_seg.py --savedir path_to_model_directory --rootdir path_to_shapenet_partseg --test --ntree 4

  • Scores

The score computation scripts are also adapted from the PointCNN repository (https://github.com/yangyanli/PointCNN).

python shapenet_seg_eval.py --rootdir path_to_shapenet_partseg --preddir path_to_predictions

You can also compute the part-average scores using the --part_av flag.
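As a reminder of what a part-averaged score is: the IoU is computed for each part label of a shape category and then averaged over those parts. A schematic illustration (not the evaluation script's exact code):

import numpy as np

def part_average_iou(pred, gt, part_ids):
    # pred, gt: integer part labels per point; part_ids: the parts of this shape category
    ious = []
    for p in part_ids:
        inter = np.sum((pred == p) & (gt == p))
        union = np.sum((pred == p) | (gt == p))
        ious.append(inter / union if union > 0 else 1.0)  # absent part counts as a perfect match
    return float(np.mean(ious))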

S3DIS

Data preparation

Data is prepared using the prepare_s3dis_label.py script in the data_conversions folder.

Training

For training on area 2:

python s3dis_seg.py --rootdir path_to_data_processed/ --area 2 --savedir path_to_save_directory

Testing

For testing on area 2:

python s3dis_seg.py --rootdir path_to_data_processed --area 2 --savedir path_to_save_directory --test

| Class    | Area 2 |
| -------- | ------ |
| clutter  | 0.40 |
| ceiling  | 0.88 |
| floor    | 0.96 |
| wall     | 0.79 |
| beam     | 0.20 |
| column   | 0.41 |
| door     | 0.62 |
| window   | 0.43 |
| table    | 0.38 |
| chair    | 0.26 |
| sofa     | 0.01 |
| bookcase | 0.39 |
| board    | 0.10 |
| OA       | 0.81 |
| Av. IoU  | 0.45 |
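The reported average IoU is simply the mean of the thirteen per-class scores above:

ious = [0.40, 0.88, 0.96, 0.79, 0.20, 0.41, 0.62, 0.43, 0.38, 0.26, 0.01, 0.39, 0.10]
print(round(sum(ious) / len(ious), 2))   # 0.45, matching the Av. IoU row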

Semantic8

Code to be released
