Simba: Towards High-Fidelity and Geometrically-Consistent Point Cloud Completion via Transformation Diffusion
Simba is a deep learning framework for high-quality 3D point cloud completion. By leveraging transformation diffusion, it produces completions with high fidelity and geometric consistency. This repository provides the implementation, datasets, and tools needed to reproduce the results presented in our work.
- 2025-11-22 Initial release of Simba repository and pretrained models.
- 2025-11-08 Simba accepted at AAAI 2026.
- High-fidelity 3D point cloud completion.
- Geometrically consistent results using transformation diffusion.
- Support for multiple datasets, including PCN, ShapeNet and KITTI.
- Modular and extensible codebase for research and development.
We provide the model weights trained on the PCN dataset for Stage 2 (Simba). You can download them via Google Drive.
| Model | Download Link |
|---|---|
| Simba (Stage 2) | [Google Drive] |
- CUDA 12.1 compatible GPU
- Anaconda or Miniconda installed
- Python 3.10
Create and activate a new conda environment:
conda create --name Simba python=3.10
conda activate Simba

Install PyTorch with CUDA 12.1 support:
conda install pytorch==2.3.1 torchvision==0.18.1 torchaudio==2.3.1 pytorch-cuda=12.1 -c pytorch -c nvidia

Install required Python packages and custom wheels:
pip install -r requirements.txt
pip install --upgrade https://github.com/unlimblue/KNN_CUDA/releases/download/0.2/KNN_CUDA-0.2-py3-none-any.whl

Install PyTorch3D compatible with your PyTorch version:
conda install https://anaconda.org/pytorch3d/pytorch3d/0.7.8/download/linux-64/pytorch3d-0.7.8-py310_cu121_pyt231.tar.bz2

Install specialized neural network components by downloading the required .whl files from the following links:
- Mamba SSM v1.2.1 (File: mamba_ssm-1.2.1+cu122torch2.3cxx11abiFALSE-cp310-cp310-linux_x86_64.whl)
- Causal Conv1D v1.2.1 (File: causal_conv1d-1.2.1+cu122torch2.3cxx11abiFALSE-cp310-cp310-linux_x86_64.whl)
After downloading the specified files, install them using:
pip install mamba_ssm-1.2.1+cu122torch2.3cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
pip install causal_conv1d-1.2.1+cu122torch2.3cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
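As an optional sanity check, you can verify that the core dependencies import correctly and that CUDA is visible before building the extensions; the exact version strings printed will depend on your setup:

```bash
# Optional: confirm the CUDA-enabled stack and the custom wheels import cleanly.
python -c "import torch, pytorch3d, mamba_ssm, causal_conv1d; print(torch.__version__, torch.version.cuda, torch.cuda.is_available())"
```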
Compile various 3D processing extensions by running the provided install.sh script:

bash install.sh

The script will:
- Navigate to each extension directory.
- Run the installation command (python setup.py install).
- Log the progress and errors to install_extensions.log.
If any step fails, the script will stop and display an error message.
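For reference, the loop below is a minimal sketch of what such a script does, assuming the extensions live under an extensions/ directory; the actual directory names in this repository may differ, so treat it as illustration rather than the real install.sh:

```bash
#!/usr/bin/env bash
# Minimal sketch of the extension build loop described above (not the actual install.sh).
# Assumption: the CUDA/C++ extensions live under extensions/; adjust to the real layout.
LOG=install_extensions.log
: > "$LOG"                      # start a fresh log

for ext in extensions/*/; do
    echo "==> Installing ${ext}" | tee -a "$LOG"
    # Navigate into each extension directory and build/install it.
    (cd "$ext" && python setup.py install) >> "$LOG" 2>&1
    if [ $? -ne 0 ]; then
        echo "Installation failed in ${ext}; see ${LOG} for details." >&2
        exit 1                  # stop on the first error, as described above
    fi
done
echo "All extensions installed."
```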
Details about the datasets used in this project can be found in DATASET.md.
Simba employs a two-stage training process to achieve high-fidelity and geometrically consistent point cloud completion. Below are the details for each stage:
In the first stage, the SymmGT model is trained to generate high-quality intermediate representations. Use the following command to train the SymmGT model:
CUDA_VISIBLE_DEVICES=0,1 bash ./scripts/dist_train.sh 2 13232 \
--config ./cfgs/PCN_models/SymmGT.yaml \
--exp_name SymmGT_stage_1

The trained model weights will be saved in the experiment/SymmGT_stage_1/ directory.
In the second stage, the Simba model is trained using the pretrained weights from the first stage. Update the pretrain field in cfgs/PCN_models/Simba.yaml to point to the path of the trained SymmGT model (e.g., experiment/SymmGT_stage_1/best_model.pth). Then, run the following command:
CUDA_VISIBLE_DEVICES=0,1 bash ./scripts/dist_train.sh 2 13232 \
--config ./cfgs/PCN_models/Simba.yaml \
--exp_name Simba_stage_2

Alternatively, you can use the automated training script train.sh, which handles both stages and automatically sets the pretrained model path for the second stage:
bash train.sh

This script will:
- Train the SymmGT model in the first stage.
- Automatically retrieve the best model weights from the first stage and use them for training the Simba model in the second stage.
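For orientation, the flow of train.sh corresponds roughly to the sketch below. The checkpoint filename and the way the pretrain field is written into Simba.yaml (here via sed) are assumptions; consult the actual script for the exact mechanism.

```bash
#!/usr/bin/env bash
# Rough sketch of the two-stage flow automated by train.sh (not the actual script).
set -e

# Stage 1: train SymmGT.
CUDA_VISIBLE_DEVICES=0,1 bash ./scripts/dist_train.sh 2 13232 \
    --config ./cfgs/PCN_models/SymmGT.yaml \
    --exp_name SymmGT_stage_1

# Assumption: point the 'pretrain' field of the Simba config at the best stage-1 weights.
STAGE1_CKPT=experiment/SymmGT_stage_1/best_model.pth
sed -i "s|^pretrain:.*|pretrain: ${STAGE1_CKPT}|" ./cfgs/PCN_models/Simba.yaml

# Stage 2: train Simba initialized from the stage-1 weights.
CUDA_VISIBLE_DEVICES=0,1 bash ./scripts/dist_train.sh 2 13232 \
    --config ./cfgs/PCN_models/Simba.yaml \
    --exp_name Simba_stage_2
```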
To run inference with a pretrained model:
python tools/inference.py \
--config <config_file> \
--checkpoint <checkpoint_file> \
--input <input_path> \
--output <output_path>
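For example, a concrete invocation with the released PCN checkpoint might look like the following; the checkpoint, input, and output paths are placeholders, so substitute the files you actually downloaded:

```bash
# Example invocation with placeholder paths (adjust to your local files).
python tools/inference.py \
    --config ./cfgs/PCN_models/Simba.yaml \
    --checkpoint ./checkpoints/Simba_stage2_PCN.pth \
    --input ./demo/partial_input.pcd \
    --output ./demo/completed/
```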
To train a model from scratch:

bash scripts/train.sh \
--config <config_file> \
--output <output_dir>

This project is licensed under the MIT License.
This project is inspired by PoinTr and SymmCompletion.
If you find this work useful, please consider citing:
@inproceedings{zhang2026simba,
title={Simba: Towards High-Fidelity and Geometrically-Consistent Point Cloud Completion via Transformation Diffusion},
author={Lirui Zhang and Zhengkai Zhao and Zhi Zuo and Pan Gao and Jie Qin},
booktitle={Proceedings of the AAAI Conference on Artificial Intelligence (AAAI)},
year={2026},
url={https://arxiv.org/abs/2511.16161},
note={To appear}
}