
GRPE: Relative Positional Encoding for Graph Transformer (Oral, MLDD Workshop at ICLR 2022)

Official implementation of GRPE. Our model achieves the second-best result on the PCQM4Mv2 dataset of the OGB-LSC leaderboard.

Quick Start

Prepare the environment

conda env create --file environment.yaml
conda activate chemprop

Prepare pretrained weights

Download the pretrained weights from https://drive.google.com/drive/folders/1Oc3Ox0HAoJ5Hrihfp5-jFvStPIfFQAf9?usp=sharing, create a folder named pretrained_weight, and place the downloaded files in it.

Reproduce results

Please check {dataset-name}.sh for detailed commands to reproduce the results.

Hardware requirements

  • Four GPUs (A100, 80 GiB) are required to run the experiments for PCQM4M, PCQM4Mv2, PCBA, and HIV.
  • One GPU is required to run the experiments for MNIST and CIFAR10.

Molecule Finger-Print

Installation

pip install git+https://github.com/lenscloth/GRPE

# Install the PyTorch & PyTorch Geometric builds that match your environment
pip install torch==1.9.0+cu111 torchvision==0.10.0+cu111 torchaudio==0.9.0 -f https://download.pytorch.org/whl/torch_stable.html
pip install torch-scatter torch-sparse torch-cluster torch-spline-conv torch-geometric -f https://data.pyg.org/whl/torch-1.9.0+cu111.html

Example

from grpe.pretrained import load_pretrained_fingerprint

# Load the pretrained GRPE fingerprint model on GPU
fingerprint_model = load_pretrained_fingerprint(cuda=True)
finger = fingerprint_model.generate_fingerprint(
    [
        "CC(=O)NCCC1=CNc2c1cc(OC)cc2",  # SMILES string of the input molecule
    ],
    fingerprint_stack=5,
)  # 1 x 3840 PyTorch tensor
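The returned fingerprints are ordinary PyTorch tensors, so two molecules can be compared with standard vector similarity. A minimal sketch of cosine similarity, using plain Python lists to stand in for the `.tolist()` output of the 1 x 3840 fingerprints (the `cosine_similarity` helper below is illustrative, not part of the GRPE API):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length fingerprint vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy low-dimensional vectors standing in for flattened fingerprints
fp1 = [0.1, 0.3, 0.5, 0.2]
fp2 = [0.1, 0.3, 0.5, 0.2]
fp3 = [0.5, -0.2, 0.0, 0.1]

print(cosine_similarity(fp1, fp2))  # identical vectors give similarity ~ 1.0
print(cosine_similarity(fp1, fp3))  # dissimilar vectors score lower
```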

Citation

Please use the BibTeX entry below:

@inproceedings{park2022grpe,
  title={GRPE: Relative Positional Encoding for Graph Transformer},
  author={Park, Wonpyo and Chang, Woong-Gi and Lee, Donggeon and Kim, Juntae and Hwang, Seungwon},
  booktitle={ICLR2022 Machine Learning for Drug Discovery},
  year={2022}
}
