Blazingly fast encoding for neural networks based on permutohedral lattices

Permutohedral Encoding

This repository contains the official implementation of permutohedral encoding, originally proposed in PermutoSDF: Fast Multi-View Reconstruction with Implicit Surfaces using Permutohedral Lattices.

Permutohedral encoding maps positions from a low-dimensional space to high-dimensional feature vectors. The features can then be decoded by a small MLP into density, SDF, RGB, or whatever quantity is needed. The encoding works by interpolating features from a multi-resolution permutohedral hash-map. It is similar in spirit to the hash-grid encoding of InstantNGP, but the number of lattice vertices accessed per lookup scales linearly with input dimensionality instead of exponentially. This makes it significantly faster to optimize, especially in higher dimensions.
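To illustrate the general idea of multi-resolution hash encoding, here is a hypothetical 1D toy version in NumPy. This is a simplified sketch, not the library's implementation: the table size, level count, scales, and hashing scheme are made up for illustration, and a real permutohedral lattice interpolates over simplex vertices rather than a 1D segment.

```python
import numpy as np

# Toy 1D multi-resolution hash encoding (illustrative only).
rng = np.random.default_rng(0)
capacity, nr_levels, feat = 16, 4, 2
tables = rng.normal(size=(nr_levels, capacity, feat))  # one hash table per level
scales = np.geomspace(1.0, 0.1, num=nr_levels)         # coarse to fine

def encode(x):
    feats = []
    for lvl, s in enumerate(scales):
        g = x / s                       # position in grid units at this level
        lo = int(np.floor(g))
        w = g - lo                      # linear interpolation weight
        f_lo = tables[lvl, hash(lo) % capacity]      # hashed vertex features
        f_hi = tables[lvl, hash(lo + 1) % capacity]
        feats.append((1 - w) * f_lo + w * f_hi)
    return np.concatenate(feats)        # shape: (nr_levels * feat,)

print(encode(0.37).shape)  # (8,)
```

Each level sees the input at a different resolution, and the concatenated per-level features form the final encoding, which is the same structure the real library produces in higher dimensions.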

Usage

The permutohedral encoding can be used directly in PyTorch:

import torch
import numpy as np
import permutohedral_encoding as permuto_enc

# create the encoding
pos_dim = 3
capacity = pow(2, 18)
nr_levels = 24
nr_feat_per_level = 2
coarsest_scale = 1.0
finest_scale = 0.0001
scale_list = np.geomspace(coarsest_scale, finest_scale, num=nr_levels)
encoding = permuto_enc.PermutoEncoding(
    pos_dim, capacity, nr_levels, nr_feat_per_level, scale_list
)

# create random points on the GPU
nr_points = 1000
points = torch.rand(nr_points, 3).cuda()

# encode
features = encoding(points)

# use the features, for example as input to an MLP
# sdf = mlp(features)

A more complete example of how to use the permutohedral encoding can be found in ./examples/train_toy_example.py.
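To sketch the decoding step hinted at above, here is a hypothetical small MLP consuming the encoded features. It assumes the concatenated feature vector has nr_levels * nr_feat_per_level = 24 * 2 = 48 dimensions (an assumption based on the configuration above), and uses random tensors as a stand-in for the actual encoding output:

```python
import torch

# Hypothetical decoder: a small MLP mapping encoded features to one SDF
# value per point. The 48-dim input is assumed from nr_levels=24 levels
# with nr_feat_per_level=2 features each.
in_dim = 24 * 2
mlp = torch.nn.Sequential(
    torch.nn.Linear(in_dim, 64),
    torch.nn.ReLU(),
    torch.nn.Linear(64, 64),
    torch.nn.ReLU(),
    torch.nn.Linear(64, 1),  # one scalar (e.g. SDF) per point
)

features = torch.rand(1000, in_dim)  # stand-in for encoding(points)
sdf = mlp(features)
print(sdf.shape)  # torch.Size([1000, 1])
```

In practice the encoding and the MLP are optimized jointly, with gradients flowing through the interpolated hash-table features.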

Install

It can be installed as a Python package with:

$ git clone --recursive https://github.com/RaduAlexandru/permutohedral_encoding
$ cd permutohedral_encoding
$ make  # installs the package for the current user

This requires that PyTorch and CUDA are installed.

Performance

The permutohedral lattice scales linearly with input dimensionality, in contrast to the exponential scaling of cubical voxels. This makes it particularly attractive for problems dealing with 3D or 4D data, and higher dimensions are also readily supported.
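The scaling argument can be checked with simple arithmetic (a quick sketch, not a benchmark):

```python
# Lattice vertices accessed per query point, per resolution level:
# a permutohedral simplex in d dimensions has d + 1 vertices,
# while a cubical voxel has 2**d corners.
simplex = {d: d + 1 for d in range(2, 9)}
cubical = {d: 2 ** d for d in range(2, 9)}

for d in sorted(simplex):
    print(f"d={d}: simplex={simplex[d]:3d}  cubical={cubical[d]:4d}")
```

At d=3 the gap is 4 vs 8 vertices; by d=8 it is 9 vs 256, which is why the advantage grows quickly with dimension.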

The script ./examples/performance.py compares the throughput of the permutohedral encoding against the cubical hash-grid encoding from tiny-cuda-nn. Note that we currently only support full floating-point precision, so for a fair comparison tiny-cuda-nn should be compiled with its half-precision operations disabled. Additionally, tiny-cuda-nn supports up to 4 input dimensions by default; this can be increased by uncommenting the corresponding lines in its source.

Citation

@inproceedings{rosu2023permutosdf,
  title     = {PermutoSDF: Fast Multi-View Reconstruction with Implicit Surfaces using Permutohedral Lattices},
  author    = {Radu Alexandru Rosu and Sven Behnke},
  booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
  year      = {2023}
}

License

Permutohedral Encoding is provided under the terms of the MIT license (see LICENSE).
