
# Efficient and Scalable Point Cloud Generation with Sparse Point-Voxel Diffusion Models

Paper | Project Page | Video | Lightning Version

This repository contains the official implementation for our publication: "Efficient and Scalable Point Cloud Generation with Sparse Point-Voxel Diffusion Models."

## News

- 12/8/2024: arXiv submission of the SPVD preprint.
- 12/9/2024: Release of SPVD Lightning, replacing the custom pclab library with PyTorch Lightning ⚡

## Installation

### 1. Set Up an Anaconda Environment

We recommend using Anaconda to manage your Python environment.

```bash
conda create --name spvd python=3.9
conda activate spvd
```

### 2. Clone the Repository

```bash
git clone https://github.com/JohnRomanelis/SPVD.git
```

### 3. Install PyTorch and Other Python Libraries

We have tested our code with PyTorch 2.0 and CUDA 11.8. You can install a compatible version with the following command:

```bash
conda install pytorch==2.0.0 torchvision==0.15.0 torchaudio==2.0.0 pytorch-cuda=11.8 -c pytorch -c nvidia
```

You can install most of the remaining required libraries from `requirements.txt` by running:

```bash
pip install -r requirements.txt
```

### 4. Install pclab

pclab is a helper library based on the fast.ai Practical Deep Learning, Part 2 course.

Note: Make sure you have installed PyTorch before installing pclab, so that the correct version is used.

1. Clone the pclab repository:

   ```bash
   git clone https://github.com/JohnRomanelis/pclab.git
   ```

2. Navigate into the pclab directory:

   ```bash
   cd pclab
   ```

3. Install pclab (this will automatically install the required dependencies):

   ```bash
   pip install -e .
   ```

### 5. Install TorchSparse

1. TorchSparse depends on the Google Sparse Hash library. On Ubuntu, install it with:

   ```bash
   sudo apt-get install libsparsehash-dev
   ```

2. Clone the torchsparse repository:

   ```bash
   git clone https://github.com/mit-han-lab/torchsparse.git
   ```

3. Navigate into the torchsparse directory:

   ```bash
   cd torchsparse
   ```

4. Install torchsparse:

   ```bash
   pip install -e .
   ```

### 6. Install Chamfer Distance and Earth Mover's Distance

- Chamfer

  1. Navigate to the SPVD/metrics/chamfer_dist directory:

     ```bash
     cd SPVD/metrics/chamfer_dist
     ```

  2. Run:

     ```bash
     python setup.py install --user
     ```

- EMD

  1. Navigate to the SPVD/metrics/PyTorchEMD directory:

     ```bash
     cd SPVD/metrics/PyTorchEMD
     ```

  2. Run:

     ```bash
     python setup.py install
     ```

  3. Copy the built extension into the current directory:

     ```bash
     cp ./build/lib.linux-x86_64-cpython-310/emd_cuda.cpython-310-x86_64-linux-gnu.so .
     ```

If this last command raises an error, list the directories inside `build` and replace the directory name in the command with the one on your machine matching `lib.linux-x86_64-cpython-*` (the `cpython-310` part depends on your Python version).
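As a convenience, the copy step above can be written with a glob so it works regardless of your Python minor version. This is a sketch, not part of the official instructions; run it from inside `SPVD/metrics/PyTorchEMD` after the build has completed:

```shell
# Copy the built EMD CUDA extension next to the Python sources.
# The glob matches whatever cpython version the build produced.
for so in ./build/lib.linux-x86_64-cpython-*/emd_cuda*.so; do
  if [ -e "$so" ]; then cp "$so" .; fi
done
```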

## Experiments

You can replicate all the experiments from our paper using the notebooks provided in the experiments folder. Below is a catalog of the experiments featured in our paper, along with brief descriptions.

A more comprehensive list, including additional comments and experiments, is available here.

Note: All `#export` directives in the notebooks are processed by the `utils/notebook2py.py` script, which exports the marked notebook cells to .py scripts.
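To illustrate the idea behind this export mechanism, here is a minimal sketch of such a converter. It is not the actual `utils/notebook2py.py` (which may differ in details); it simply shows the convention: code cells whose first line is `#export` get their bodies collected into a single `.py` module.

```python
import json
from pathlib import Path

def notebook_to_py(nb_path, out_path):
    """Collect code cells whose first line is '#export' into one .py file.
    Sketch only: the real utils/notebook2py.py may handle more cases."""
    nb = json.loads(Path(nb_path).read_text())
    chunks = []
    for cell in nb.get("cells", []):
        if cell.get("cell_type") != "code":
            continue
        src = "".join(cell.get("source", []))
        if src.lstrip().startswith("#export"):
            # Drop the '#export' marker line, keep the rest of the cell.
            body = src.split("\n", 1)[1] if "\n" in src else ""
            chunks.append(body.rstrip())
    Path(out_path).write_text("\n\n".join(chunks) + "\n")
```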

## Data

For generation, we use the same version of ShapeNet as PointFlow. Please refer to their instructions for downloading the dataset.
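For orientation, a minimal loader for the PointFlow ShapeNet dump might look like the sketch below. The `ShapeNetCore.v2.PC15k/<synset_id>/<split>/*.npy` layout, with each file holding an `(N, 3)` point array, is an assumption based on the public PointFlow release; check the downloaded data before relying on it.

```python
import numpy as np
from pathlib import Path

def load_category(root, synset_id, split="train", n_points=2048, seed=0):
    """Load and subsample point clouds for one ShapeNet category.
    Assumes the PointFlow layout <root>/<synset_id>/<split>/*.npy,
    each file an (N, 3) array of points (layout is an assumption)."""
    rng = np.random.default_rng(seed)
    clouds = []
    for f in sorted((Path(root) / synset_id / split).glob("*.npy")):
        pc = np.load(f)  # (N, 3) points for one shape
        idx = rng.choice(pc.shape[0], n_points, replace=False)
        clouds.append(pc[idx].astype(np.float32))
    return np.stack(clouds)  # (num_shapes, n_points, 3)
```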

For completion, we use PartNet. Download the data from the official PartNet website. To process the data, see the PartNetDataset notebook.

## Citation

If you find this work useful in your research, please consider citing:

```bibtex
@misc{romanelis2024efficientscalablepointcloud,
      title={Efficient and Scalable Point Cloud Generation with Sparse Point-Voxel Diffusion Models},
      author={Ioannis Romanelis and Vlassios Fotis and Athanasios Kalogeras and Christos Alexakos and Konstantinos Moustakas and Adrian Munteanu},
      year={2024},
      eprint={2408.06145},
      archivePrefix={arXiv},
      primaryClass={cs.CV},
      url={https://arxiv.org/abs/2408.06145},
}
```