
ConvPoint: Generalizing discrete convolutions for unstructured point clouds

(Figure: SnapNet products)


Major performance update: by reformulating the convolutional layer using matrix multiplications, memory consumption has been greatly reduced.
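The matrix-multiplication idea can be illustrated with a minimal NumPy sketch (this is not the library's actual layer: names, shapes, and the uniform neighbor weighting below are assumptions made for illustration):

```python
import numpy as np

def conv_as_matmul(features, neighbor_idx, weights):
    """Toy point convolution written as a single matrix multiplication.

    features:     (N, C_in)         one feature row per point
    neighbor_idx: (N, K)            indices of the K neighbors of each point
    weights:      (K * C_in, C_out) learned weights

    Instead of looping over the K neighbors (K small matmuls and large
    intermediate buffers), the gathered neighborhood is flattened and
    multiplied once, which is the memory-saving idea.
    """
    N, K = neighbor_idx.shape
    C_in = features.shape[1]
    gathered = features[neighbor_idx]            # (N, K, C_in)
    return gathered.reshape(N, K * C_in) @ weights

# Tiny usage example
rng = np.random.default_rng(0)
feats = rng.normal(size=(100, 8))                # 100 points, 8 channels
idx = rng.integers(0, 100, size=(100, 16))       # 16 neighbors per point
W = rng.normal(size=(16 * 8, 32))                # project to 32 channels
out = conv_as_matmul(feats, idx, W)              # (100, 32)
```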

Major interface update: the spatial relations are now computed in the network class. This makes the framework easier to use and more flexible.


This repository provides Python scripts for point cloud classification and segmentation. The library is implemented with PyTorch.

A preprint of the paper can be found on arXiv: https://arxiv.org/abs/1904.02375


Code is released under a dual license depending on the application: research or commercial. The research license is GPLv3. See the license file.


If you use this code in your research, please consider citing (the citation will be updated once the 3DOR proceedings are released):

@article{boulch2019generalizing,
  title={Generalizing discrete convolutions for unstructured point clouds},
  author={Boulch, Alexandre},
  journal={arXiv preprint arXiv:1904.02375},
  year={2019}
}


Dependencies

  • PyTorch
  • Scikit-learn, for confusion matrix computation and efficient nearest neighbors search
  • TQDM, for progress bars
  • PlyFile
  • H5py

All these dependencies can be installed via conda in an Anaconda environment, or via pip.
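For example, with pip (these are the PyPI package names; for PyTorch itself the platform-specific command from pytorch.org may be preferable):

```shell
pip install torch scikit-learn tqdm plyfile h5py
```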

The library

Nearest neighbor module

The nearest_neighbors directory contains a very small wrapper for NanoFLANN with OpenMP. To compile the module:

cd nearest_neighbors
python setup.py install --home="."
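For illustration, here is what the module computes, sketched with the Scikit-learn fallback mentioned below (the wrapper's own Python API is not shown here, and the variable names are assumptions; the NanoFLANN version does the same K-nearest-neighbors query in C++ with OpenMP, much faster):

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

# Query the K nearest neighbors of every point in a cloud.
pts = np.random.rand(1000, 3).astype(np.float32)   # 1000 random 3D points
K = 16
nn = NearestNeighbors(n_neighbors=K, algorithm="kd_tree").fit(pts)
_, indices = nn.kneighbors(pts)   # indices: (1000, 16) int array,
                                  # sorted by increasing distance
```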

If you do not want to use this C++/Python wrapper, you can still use the previous version of the nearest neighbors computation, based on Scikit-learn and Multiprocessing (Python only, slower). To do so, add the following lines at the start of your main script:

from global_tags import GlobalTags


We provide scripts for training on several point cloud datasets:

ModelNet40

For training:

python --rootdir path_to_modelnet40_data


For testing with one tree per shape:

python --rootdir path_to_modelnet40_data --savedir path_to_statedict_directory --test

For testing with more than one tree per shape (this code is not optimized and is very slow):

python --rootdir path_to_modelnet40_data --savedir path_to_statedict_directory --test --ntree 2


ShapeNet

Data preparation

The scripts for downloading and preparing the point clouds are from the PointCNN repository (https://github.com/yangyanli/PointCNN). They are in the data_conversions folder.

python3 ./ -d shapenet_partseg -f path_to_directory
python3 ./ -f path_to_shapenet_partseg


The training script is:

python --savedir path_to_save_directory --rootdir path_to_shapenet_partseg


  • Inference

Testing is a two-step process. First, run inference with the script:

python --savedir path_to_model_directory  --rootdir path_to_shapenet_partseg --test

You can also use the --ply flag to generate PLY files for result visualization.
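A PLY export of predictions can be sketched with the standard library alone (this is not the script's own writer, and the color palette and function name are assumptions; the ASCII PLY header below follows the published format and opens in MeshLab or CloudCompare):

```python
def save_colored_ply(filename, points, colors):
    """Write points [(x, y, z), ...] with colors [(r, g, b), ...] (0-255)
    to an ASCII PLY file."""
    with open(filename, "w") as f:
        f.write("ply\nformat ascii 1.0\n")
        f.write(f"element vertex {len(points)}\n")
        f.write("property float x\nproperty float y\nproperty float z\n")
        f.write("property uchar red\nproperty uchar green\nproperty uchar blue\n")
        f.write("end_header\n")
        for (x, y, z), (r, g, b) in zip(points, colors):
            f.write(f"{x} {y} {z} {r} {g} {b}\n")

# Example: color two points according to their predicted class label
palette = {0: (255, 0, 0), 1: (0, 255, 0)}     # hypothetical class colors
pts = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
labels = [0, 1]
save_colored_ply("preds.ply", pts, [palette[l] for l in labels])
```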

The previous command produces results with a single spatial tree. To test with multiple spatial trees, use the --ntree flag:

python --savedir path_to_model_directory  --rootdir path_to_shapenet_partseg --test --ntree 4
  • Scores

The score computation scripts are also adapted from the PointCNN repository (https://github.com/yangyanli/PointCNN).

python --rootdir path_to_shapenet_partseg --preddir path_to_predictions

You can also compute the part-averaged scores using the --part_av flag.
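The metrics these scripts report can be sketched as follows (a minimal NumPy version, not the scripts' actual code; function and variable names are assumptions): overall accuracy is the fraction of correctly labeled points, and per-class IoU is TP / (TP + FP + FN) read off a confusion matrix.

```python
import numpy as np

def scores(y_true, y_pred, num_classes):
    """Overall accuracy and per-class IoU from a confusion matrix."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    cm = np.zeros((num_classes, num_classes), dtype=np.int64)
    np.add.at(cm, (y_true, y_pred), 1)    # rows: ground truth, cols: prediction
    tp = np.diag(cm).astype(float)
    fp = cm.sum(axis=0) - tp              # predicted as class c but wrong
    fn = cm.sum(axis=1) - tp              # truly class c but missed
    oa = tp.sum() / cm.sum()
    iou = tp / np.maximum(tp + fp + fn, 1)   # guard against empty classes
    return oa, iou

oa, iou = scores([0, 0, 1, 1, 2, 2], [0, 1, 1, 1, 2, 0], 3)
# oa = 4/6; iou = [1/3, 2/3, 1/2]
```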


S3DIS

Data preparation

Data is prepared using the script in the data_conversions folder.


For training on area 2:

python --rootdir path_to_data_processed/ --area 2 --savedir path_to_save_directory


For testing on area 2:

python --rootdir path_to_data_processed --area 2 --savedir path_to_save_directory --test
| Class    | Area 2 |
|----------|--------|
| clutter  | 0.40 |
| ceiling  | 0.88 |
| floor    | 0.96 |
| wall     | 0.79 |
| beam     | 0.20 |
| column   | 0.41 |
| door     | 0.62 |
| window   | 0.43 |
| table    | 0.38 |
| chair    | 0.26 |
| sofa     | 0.01 |
| bookcase | 0.39 |
| board    | 0.10 |
| OA       | 0.81 |
| Av. IoU  | 0.45 |
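As a sanity check, the reported average IoU is the unweighted mean of the 13 per-class IoUs in the table:

```python
ious = [0.40, 0.88, 0.96, 0.79, 0.20, 0.41, 0.62, 0.43,
        0.38, 0.26, 0.01, 0.39, 0.10]
avg_iou = sum(ious) / len(ious)   # 5.83 / 13 ≈ 0.448, reported as 0.45
```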


Code to be released
