
PyTorch Implementation of Push-the-Boundary

This repo is the PyTorch implementation of Push-the-Boundary: Boundary-aware Feature Propagation for Semantic Segmentation of 3D Point Clouds.

[Overview figure]

Baseline Reference

Two networks, PointNet++ and KP-Conv, are adopted as baselines in our work.

Installation

The code for both baselines has been tested on Ubuntu 20.04 with Python 3.8. Install the following dependencies:

  • numpy
  • scikit-learn 0.23.2
  • pytorch 1.7.1
  • cudatoolkit 10.1

For the KP-Conv backbone, you also need to compile the C++ extension modules in cpp_wrappers. Open a terminal in that folder and run:

    sh compile_wrappers.sh

Indoor Semantic Segmentation (S3DIS)

Data Preparation

Download the data from the data link.

To run the PointNet++ backbone, use the data extracted from pointnet_data_s3dis/stanford_indoor3d.zip. Unzip the data and put the .npy files under the folder PointNet2_Backbone/data_s3dis/. The point clouds are pre-processed and contain the following fields (see the loading sketch after this list):

  • coordinate, i.e., x, y, z
  • color, i.e., r, g, b
  • label
  • normal, i.e., nx, ny, nz
  • boundary, i.e., 0 for interior and 1 for boundary
  • direction, i.e., dx, dy, dz
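
For reference, here is a minimal sketch of loading one pre-processed room in Python. The file name and the column order are assumptions based on the field list above; check the repo's data loader for the actual layout.

    import numpy as np

    # Hypothetical example: load one pre-processed S3DIS room. The file name
    # and column order are assumptions based on the field list above.
    pts = np.load("PointNet2_Backbone/data_s3dis/Area_1_office_1.npy")

    xyz       = pts[:, 0:3]    # coordinates: x, y, z
    rgb       = pts[:, 3:6]    # colors: r, g, b
    label     = pts[:, 6]      # semantic class label
    normal    = pts[:, 7:10]   # normals: nx, ny, nz
    boundary  = pts[:, 10]     # 0 for interior, 1 for boundary
    direction = pts[:, 11:14]  # direction vector: dx, dy, dz

    print(xyz.shape, int(boundary.sum()), "boundary points")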

To run the KP-Conv backbone, use the data extracted from kpconv_data_s3dis/s3dis.zip. The scenes are stored in .ply format and contain the same fields. You can also find the point clouds subsampled with the default voxel size of 5 cm. Unzip the data and put both the original and the subsampled point clouds under the folder KPConv_Backbone/data_s3dis/. Note that there is an additional field, dis_boundary, which denotes the distance from each point to its closest boundary point; this field is not used in our final network.
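
A quick way to inspect these .ply scenes is with the plyfile package; a minimal sketch, assuming the property names match the field list above (the file name is a placeholder):

    from plyfile import PlyData  # pip install plyfile
    import numpy as np

    # Hypothetical example: inspect one pre-processed scene. The file name
    # and property names are assumptions based on the description above.
    ply = PlyData.read("KPConv_Backbone/data_s3dis/Area_1.ply")
    v = ply["vertex"].data

    xyz = np.stack([v["x"], v["y"], v["z"]], axis=1)
    dis_boundary = v["dis_boundary"]  # distance to the closest boundary point (unused)
    print(xyz.shape, float(dis_boundary.mean()))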

Running with the PointNet++ Baseline

Train the model using:

    python train_semseg_boundary.py

Test the model using:

    python test_semseg_boundary.py --log_dir your_resulted_log --test_area 5

where --log_dir points to your trained model directory and --test_area selects the held-out test area.

Running with the KP-Conv Baseline

Train the model using:

    python train_S3DIS_boundary.py

Test the model using:

    python test_models.py

In line 12 of test_models.py, you can specify your model directory.

Outdoor Semantic Segmentation (SensatUrban)

Data Preparation

Download the data from the data link and use the data extracted from data_sensaturban.zip. The scenes are stored in .ply format. Unzip the data and put both the original and the subsampled point clouds under the folder KPConv_Backbone/data_sensat/. The subsampled point clouds are pre-processed and contain the following fields:

  • coordinate, i.e., x, y, z
  • color, i.e., r, g, b
  • label
  • boundary, i.e., 0 for interior and 1 for boundary
  • direction, i.e., dx, dy, dz

Note that normal information is included but not used in our final network. Label, boundary, and direction information is not available for the test set, i.e., Birmingham blocks 2 and 8, and Cambridge blocks 15, 16, 22, and 27.
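
For convenience, the test blocks can be collected in one place; a minimal sketch, assuming the SensatUrban file-naming convention (lowercase city name plus block index, which is an assumption here):

    # Hypothetical example: the SensatUrban test blocks named above, which
    # ship without the label, boundary, and direction fields. The file-name
    # convention (e.g., "birmingham_block_2.ply") is an assumption.
    TEST_BLOCKS = [
        "birmingham_block_2", "birmingham_block_8",
        "cambridge_block_15", "cambridge_block_16",
        "cambridge_block_22", "cambridge_block_27",
    ]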

Running with the KP-Conv Baseline

Train the model using:

    python train_SensatUrban_boundary.py

Test the model using:

    python test_models.py

In line 12 of test_models.py, you can specify your model directory.

Visualization

The segmentation outputs are stored as .ply files containing the pointwise predictions of boundaries, directions, and semantic classes. They can be visualized with various software packages (e.g., Easy3D, CloudCompare, MeshLab).
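
For a quick look without a dedicated viewer, a minimal sketch using Open3D (not a dependency of this repo; the result path is a placeholder):

    import open3d as o3d  # pip install open3d

    # Hypothetical example: view a predicted scene; the file path is a placeholder.
    # Open3D reads the coordinates and colors; the extra prediction fields are
    # better inspected in CloudCompare or with plyfile.
    pcd = o3d.io.read_point_cloud("results/Area_5_pred.ply")
    o3d.visualization.draw_geometries([pcd])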

Citation

If you use (part of) the code or approach in a scientific work, please cite our paper:

@inproceedings{du2022pushboundary,
  title={Push-the-Boundary: Boundary-aware Feature Propagation for Semantic Segmentation of 3D Point Clouds},
  author={Du, Shenglan and Ibrahimli, Nail and Stoter, Jantien and Kooij, Julian and Nan, Liangliang},
  booktitle={International Conference on 3D Vision (3DV)},
  year={2022}
}
