
Transformer for Graph Classification (PyTorch and TensorFlow)


This repository provides the implementation of our graph transformer as described in our paper "Universal Self-Attention Network for Graph Classification", where we leverage the transformer self-attention network to learn graph representations.
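As a rough illustration of this idea, here is a minimal PyTorch sketch (a toy under stated assumptions, not the architecture from the paper; the class name, layer counts, and mean pooling are all illustrative choices): self-attention is applied across the nodes of a graph, and the resulting node embeddings are pooled into a single vector for graph classification.

    import torch
    import torch.nn as nn

    class GraphTransformerSketch(nn.Module):
        """Toy model: self-attention over node features, pooled to a graph embedding."""
        def __init__(self, feature_dim, ff_hidden_size, num_classes,
                     num_layers=2, num_heads=4):
            super().__init__()
            layer = nn.TransformerEncoderLayer(
                d_model=feature_dim, nhead=num_heads,
                dim_feedforward=ff_hidden_size)
            self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)
            self.classifier = nn.Linear(feature_dim, num_classes)

        def forward(self, node_features):
            # node_features: (num_nodes, feature_dim) for a single graph.
            # nn.TransformerEncoder expects (seq_len, batch, d_model).
            h = self.encoder(node_features.unsqueeze(1))  # (num_nodes, 1, d)
            graph_emb = h.mean(dim=0).squeeze(0)          # pool over nodes -> (d,)
            return self.classifier(graph_emb)             # class logits

    # Toy usage: one graph with 6 nodes and 16-dimensional features.
    model = GraphTransformerSketch(feature_dim=16, ff_hidden_size=64, num_classes=2)
    logits = model(torch.randn(6, 16))

Note that, unlike text transformers, no positional encoding is applied here, since the nodes of a graph carry no natural order.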

Usage

News

  • 04-05-2021: Release a variant (in PyTorch 1.5.0) to train a fully-connected graph transformer, by leveraging the transformer directly on all nodes of a given graph.

  • 17-05-2020: Release a PyTorch 1.5.0 implementation.

  • 11-12-2019: Release a TensorFlow 1.14 implementation.

Training

  • Variant 1: Leveraging the transformer on sampled neighbors of each node (a sketch contrasting the two variants follows these commands):

      $ python train_UGT_Sup.py --dataset IMDBBINARY --batch_size 4 --ff_hidden_size 1024 --fold_idx 1 --num_neighbors 8 --num_epochs 50 --num_timesteps 4 --learning_rate 0.0005 --model_name IMDBBINARY_bs4_fold1_1024_8_idx0_4_1
    
      $ python train_UGT_Sup.py --dataset PTC --batch_size 4 --ff_hidden_size 1024 --fold_idx 1 --num_neighbors 16 --num_epochs 50 --num_timesteps 3 --learning_rate 0.0005 --model_name PTC_bs4_fold1_1024_16_idx0_3_1
    
  • Variant 2: Leveraging the transformer directly on all nodes to train a fully-connected graph transformer:

      $ python train_pytorch_Full_GT.py --dataset PTC --ff_hidden_size 1024 --fold_idx 1 --num_epochs 50 --num_timesteps 3 --learning_rate 0.0005 --model_name PTC_fold1_1024_idx0_1
    
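The two variants differ only in what the self-attention runs over: Variant 1 attends within a sampled neighbor set per node, while Variant 2 attends over all nodes of the graph at once. The sketch below illustrates a Variant-1-style input construction in PyTorch (illustrative only; sample_neighbor_sets is a hypothetical helper, not part of this repository, and sampling with replacement is an assumption):

    import torch
    import torch.nn as nn

    def sample_neighbor_sets(adj, num_neighbors):
        # Hypothetical helper: for each node, return the node itself plus
        # `num_neighbors` neighbors sampled with replacement from its row of
        # the adjacency matrix (falling back to the node itself if isolated).
        sets = []
        for v in range(adj.size(0)):
            neigh = adj[v].nonzero(as_tuple=False).view(-1)
            if neigh.numel() == 0:
                neigh = torch.tensor([v])
            picks = neigh[torch.randint(len(neigh), (num_neighbors,))]
            sets.append(torch.cat([torch.tensor([v]), picks]))
        return torch.stack(sets)              # (num_nodes, num_neighbors + 1)

    adj = torch.tensor([[0, 1, 1],
                        [1, 0, 0],
                        [1, 0, 0]])           # toy 3-node graph
    x = torch.randn(3, 16)                    # node features
    attn = nn.MultiheadAttention(embed_dim=16, num_heads=4)

    # Variant 1: one attention sequence per node, over its sampled neighbors.
    idx = sample_neighbor_sets(adj, num_neighbors=4)   # (3, 5)
    seqs = x[idx].transpose(0, 1)                      # (5, 3, 16): seq, batch, dim
    out, _ = attn(seqs, seqs, seqs)
    node_repr = out[0]                                 # updated centre-node embeddings

    # Variant 2: a single sequence containing all nodes (fully connected),
    # i.e. attention over x directly, as in the earlier sketch.

Sampling keeps the per-node attention cost fixed by --num_neighbors, while the fully-connected variant's cost grows quadratically with graph size.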

Requirements

  • Python 3.x
  • TensorFlow 1.14 & Tensor2Tensor 1.13
  • PyTorch 1.5.0
  • NetworkX 2.3
  • scikit-learn 0.21.2

Cite

Please cite the paper whenever our graph transformer is used to produce published results or incorporated into other software:

@article{Nguyen2019UGT,
	author={Dai Quoc Nguyen and Tu Dinh Nguyen and Dinh Phung},
	title={Universal Self-Attention Network for Graph Classification},
	journal={arXiv preprint arXiv:1909.11855},
	year={2019}
}

License

As a free open-source implementation, Graph-Transformer is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. All other warranties including, but not limited to, merchantability and fitness for purpose, whether express, implied, or arising by operation of law, course of dealing, or trade usage are hereby disclaimed. I believe that the programs compute what I claim they compute, but I do not guarantee this. The programs may be poorly and inconsistently documented and may contain undocumented components, features or modifications. I make no guarantee that these programs will be suitable for any application.

Graph-Transformer is licensed under the Apache License 2.0.
