DeepGCNs: Can GCNs Go as Deep as CNNs?

In this work, we present new ways to successfully train very deep GCNs. We borrow concepts from CNNs, mainly residual/dense connections and dilated convolutions, and adapt them to GCN architectures. Through extensive experiments, we show the positive effect of these deep GCN frameworks.
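As a rough illustration of the residual idea, the sketch below shows a GCN layer wrapped with a CNN-style skip connection in PyTorch. This is a minimal sketch, not the repository's actual gcn_lib modules (whose names and signatures differ); the wrapped graph convolution is a hypothetical placeholder for any operator such as EdgeConv or MRGCN.

```python
import torch
import torch.nn as nn

class ResGraphBlock(nn.Module):
    """Minimal sketch of a residual GCN block: y = GCN(x) + x.

    `graph_conv` is a hypothetical placeholder for any graph convolution
    (e.g. EdgeConv, MRGCN); it must preserve the feature dimension so the
    skip connection can be added element-wise.
    """
    def __init__(self, graph_conv: nn.Module):
        super().__init__()
        self.graph_conv = graph_conv

    def forward(self, x, edge_index):
        # Residual connection borrowed from CNNs (ResNet-style): the block
        # learns a residual on top of the identity mapping, which is what
        # allows very deep GCN stacks to keep training stably.
        return self.graph_conv(x, edge_index) + x
```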

[Project](https://deepgcns.org) [Paper] [Slides] [Tensorflow Code] [Pytorch Code]

Overview

We conduct extensive experiments to show how different components (#Layers, #Filters, #Nearest Neighbors, Dilation, etc.) affect DeepGCNs. We also provide ablation studies on different types of deep GCNs (MRGCN, EdgeConv, GraphSAGE and GIN).
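For intuition on the Dilation component: dilated k-NN enlarges a node's receptive field by computing k × d nearest neighbors and keeping every d-th one. The snippet below is a minimal, self-contained sketch of that idea, not the gcn_lib implementation.

```python
import torch

def dilated_knn(points: torch.Tensor, k: int = 16, dilation: int = 2) -> torch.Tensor:
    """Sketch of dilated k-NN on a point cloud of shape (N, C).

    Finds the k * dilation nearest neighbors of every point and keeps
    every `dilation`-th of them, so the receptive field grows without
    increasing the number of neighbors actually aggregated.
    Returns neighbor indices of shape (N, k).
    """
    # Pairwise Euclidean distances, shape (N, N).
    dists = torch.cdist(points, points)
    # Indices of the k * dilation closest points (including the point itself).
    _, idx = dists.topk(k * dilation, dim=-1, largest=False)
    # Dilation: keep every `dilation`-th neighbor.
    return idx[:, ::dilation]

# Example: 1024 points with 3 coordinates each.
neighbors = dilated_knn(torch.randn(1024, 3), k=16, dilation=2)
print(neighbors.shape)  # torch.Size([1024, 16])
```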

For further information and details, please contact Guohao Li and Matthias Müller.

Requirements

Install the environment by running:

source deepgcn_env_install.sh
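The repository also includes a deepgcn.yml file with environment details; assuming it is a standard conda environment specification (an assumption, not verified here), an equivalent setup could be:

conda env create -f deepgcn.yml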

Code Architecture

.
├── misc                    # Misc images
├── utils                   # Common useful modules
├── gcn_lib                 # gcn library
│   ├── dense               # gcn library for dense data (B x C x N x 1)
│   └── sparse              # gcn library for sparse data (N x C)
├── examples 
│   ├── sem_seg_dense       # code for point cloud semantic segmentation on S3DIS (data type: dense)
│   ├── sem_seg_sparse      # code for point cloud semantic segmentation on S3DIS (data type: sparse)
│   ├── part_sem_seg        # code for part segmentation on PartNet
│   └── ppi                 # code for node classification on PPI dataset
└── ...
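The dense/sparse comments above refer to two tensor layouts for the same point cloud features. A minimal sketch of how one maps to the other (illustrative only; the repository's data loaders may differ):

```python
import torch

# A single point cloud with N points and C feature channels.
N, C, B = 1024, 6, 1

# Sparse layout (gcn_lib/sparse): one (N, C) matrix of node features.
sparse_feats = torch.randn(N, C)

# Dense layout (gcn_lib/dense): batched as (B, C, N, 1), i.e. the point
# cloud is treated like a 1-pixel-wide image so Conv2d-style operators
# can run over it.
dense_feats = sparse_feats.t().unsqueeze(0).unsqueeze(-1)  # (1, C, N, 1)
assert dense_feats.shape == (B, C, N, 1)

# Going back from dense to sparse: drop the trailing dims and transpose.
recovered = dense_feats.squeeze(-1).squeeze(0).t()         # (N, C)
assert torch.equal(recovered, sparse_feats)
```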

How to train, test and evaluate our models

Please see the README.md of each task inside the examples folder for details. All information about the code, data, and pretrained models can be found there.

Citation

Please cite our paper if you find anything helpful:

@InProceedings{li2019deepgcns,
    title={DeepGCNs: Can GCNs Go as Deep as CNNs?},
    author={Guohao Li and Matthias Müller and Ali Thabet and Bernard Ghanem},
    booktitle={The IEEE International Conference on Computer Vision (ICCV)},
    year={2019}
}
@misc{li2019deepgcns_journal,
    title={DeepGCNs: Making GCNs Go as Deep as CNNs},
    author={Guohao Li and Matthias Müller and Guocheng Qian and Itzel C. Delgadillo and Abdulellah Abualshour and Ali Thabet and Bernard Ghanem},
    year={2019},
    eprint={1910.06849},
    archivePrefix={arXiv},
    primaryClass={cs.CV}
}

License

MIT License

Acknowledgement

Thanks to Guocheng Qian for the implementation of the PyTorch version.
