

diffpool

This is the repository for the paper Hierarchical Graph Representation Learning with Differentiable Pooling (NeurIPS 2018).

Recently, graph neural networks (GNNs) have revolutionized the field of graph representation learning through effectively learned node embeddings, and achieved state-of-the-art results in tasks such as node classification and link prediction. However, current GNN methods are inherently flat and do not learn hierarchical representations of graphs—a limitation that is especially problematic for the task of graph classification, where the goal is to predict the label associated with an entire graph. Here we propose DIFFPOOL, a differentiable graph pooling module that can generate hierarchical representations of graphs and can be combined with various graph neural network architectures in an end-to-end fashion. DIFFPOOL learns a differentiable soft cluster assignment for nodes at each layer of a deep GNN, mapping nodes to a set of clusters, which then form the coarsened input for the next GNN layer. Our experimental results show that combining existing GNN methods with DIFFPOOL yields an average improvement of 5–10% accuracy on graph classification benchmarks, compared to all existing pooling approaches, achieving a new state-of-the-art on four out of five benchmark data sets.
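The coarsening step described above can be sketched in a few lines. This is an illustrative NumPy sketch, not the repo's actual API (the repository implements DIFFPOOL in PyTorch in `train.py`/`encoders.py`); the function name and signature here are hypothetical. Given node embeddings Z from an embedding GNN and raw cluster scores from a separate assignment GNN, a row-wise softmax yields the soft assignment S, and the coarsened features and adjacency for the next layer are S^T Z and S^T A S:

```python
import numpy as np

def diffpool_coarsen(A, Z, S_logits):
    """One DIFFPOOL-style coarsening step (illustrative sketch).

    A:        (n, n) adjacency matrix of the current graph
    Z:        (n, d) node embeddings from an embedding GNN
    S_logits: (n, k) raw cluster scores from an assignment GNN
    Returns the coarsened embeddings (k, d) and adjacency (k, k).
    """
    # Soft cluster assignment: row-wise softmax over the k clusters,
    # shifted by the row max for numerical stability.
    S = np.exp(S_logits - S_logits.max(axis=1, keepdims=True))
    S = S / S.sum(axis=1, keepdims=True)

    X_coarse = S.T @ Z       # cluster embeddings: (k, d)
    A_coarse = S.T @ A @ S   # cluster-level adjacency: (k, k)
    return X_coarse, A_coarse
```

Because S is soft and differentiable, gradients flow through both the embedding and assignment GNNs, so the whole hierarchy is trained end to end.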

Paper link: https://arxiv.org/pdf/1806.08804.pdf
