This page is under construction...
An implementation of a Geometry-aware Mesh Convolutional Network, based on *Semi-Supervised Classification with Graph Convolutional Networks* [ICLR 2017].
- torch 1.7.0
- torch-geometric 1.7.1
- scipy 1.6.2
- numpy 1.19.2
```python
import torch.nn as nn

from util.layer import MeshConv


class MeshNet(nn.Module):
    def __init__(self, mesh):
        super(MeshNet, self).__init__()
        # Stack of mesh convolutions: 6-d input features -> 3-d vertex positions
        self.model = nn.Sequential(
            MeshConv(6, 32, mesh),
            nn.BatchNorm1d(32),
            nn.LeakyReLU(),
            MeshConv(32, 128, mesh),
            nn.BatchNorm1d(128),
            nn.LeakyReLU(),
            MeshConv(128, 128, mesh),
            nn.BatchNorm1d(128),
            nn.LeakyReLU(),
            MeshConv(128, 32, mesh),
            nn.BatchNorm1d(32),
            nn.LeakyReLU(),
            MeshConv(32, 16, mesh),
            nn.BatchNorm1d(16),
            nn.LeakyReLU(),
            nn.Linear(16, 3),
        )

    def forward(self, x):
        out = self.model(x)
        return out
```
- $A \in \{0, 1\}^{n\times n}$ : Adjacency matrix
- $D \in \mathbb{R}^{n \times n}$ : Degree matrix
- $L \in \mathbb{R}^{n \times n}$ : Graph Laplacian matrix
- $L^{sym} = D^{-\frac{1}{2}} L D^{-\frac{1}{2}} = I_N - D^{-\frac{1}{2}} A D^{-\frac{1}{2}}$ : Symmetrically normalized graph Laplacian
- $\hat{L} = \frac{2}{\lambda_{max}} L^{sym} - I_N$ : Scaled graph Laplacian
- $X \in \mathbb{R}^{n \times d}$ : Vertex feature matrix (input to each layer)
- $W \in \mathbb{R}^{d \times d^{\prime}}$ : Learnable weight matrix
- $\sigma$ : Activation function (e.g., ReLU, softmax)
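The normalization and scaling above can be sketched in NumPy on a toy graph (the adjacency matrix here is a made-up example, not data from this repo):

```python
import numpy as np

# Toy 4-vertex graph given by its binary adjacency matrix A.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

n = A.shape[0]
D = np.diag(A.sum(axis=1))              # degree matrix
L = D - A                               # combinatorial graph Laplacian
D_inv_sqrt = np.diag(1.0 / np.sqrt(np.diag(D)))
L_sym = D_inv_sqrt @ L @ D_inv_sqrt     # equals I_N - D^{-1/2} A D^{-1/2}

lam_max = np.linalg.eigvalsh(L_sym).max()
L_hat = (2.0 / lam_max) * L_sym - np.eye(n)  # scaled Laplacian, spectrum in [-1, 1]
```

The scaling maps the eigenvalues of $L^{sym}$ from $[0, \lambda_{max}]$ into $[-1, 1]$, which is what Chebyshev-polynomial filtering assumes.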
Replace the graph Laplacian with the mesh Laplacian:
- $L \in \mathbb{R}^{n \times n}$ : Mesh Laplacian matrix
- $D_{ii} = L_{ii}$ : Degree matrix (continuous)
- $A = D - L$ : Adjacency matrix (continuous)
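Recovering the continuous degree and adjacency matrices from a mesh Laplacian is a one-liner each. The weights below are hypothetical toy values; in practice $L$ would come from the mesh geometry (e.g., cotangent weights):

```python
import numpy as np

# Toy symmetric edge weights for a 3-vertex mesh (hypothetical values).
w = np.array([[0.0, 0.5, 0.2],
              [0.5, 0.0, 0.8],
              [0.2, 0.8, 0.0]])
L = np.diag(w.sum(axis=1)) - w   # mesh Laplacian: L_ii = sum_j w_ij, L_ij = -w_ij

D = np.diag(np.diag(L))          # continuous degree matrix: D_ii = L_ii
A = D - L                        # continuous (real-valued) adjacency matrix
```

Unlike the binary graph adjacency, this $A$ carries real-valued weights, so the convolution becomes aware of the mesh geometry.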
Assign a 6-dimensional feature to each vertex and train MeshConv to restore the original mesh from a smoothed mesh. Compare the performance of GCNConv and MeshConv.
- Learning rate: 0.001
- Epochs: 1000
- Metric: $\epsilon$, the MSE loss of vertex positions
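The metric $\epsilon$ is the mean squared error over all vertex coordinates; a minimal sketch with made-up vertex positions:

```python
import numpy as np

# Hypothetical predicted and ground-truth vertex positions (n x 3).
pred = np.array([[0.0, 0.0, 1.0],
                 [1.0, 0.0, 0.0]])
gt   = np.array([[0.0, 0.0, 0.9],
                 [1.1, 0.0, 0.0]])

# MSE averaged over all vertices and all 3 coordinates,
# matching torch.nn.MSELoss with its default mean reduction.
eps = np.mean((pred - gt) ** 2)
```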
- MeshConv outperforms GCNConv on both test meshes.
- MeshConv restores the bumpy sphere (bottom row) more accurately than GCNConv.
| Input | GCNConv | MeshConv | Ground truth |
| :---: | :---: | :---: | :---: |
| ![]() | ![]() | ![]() | ![]() |
| --- | 0.008221 | 0.007452 | --- |
| ![]() | ![]() | ![]() | ![]() |
| --- | 0.09511 | 0.05705 | --- |