Graph Transformer Architecture. Source code for "A Generalization of Transformer Networks to Graphs", DLG-AAAI'21.
Universal Graph Transformer Self-Attention Networks (TheWebConf WWW 2022) (PyTorch and TensorFlow)
Recipe for a General, Powerful, Scalable Graph Transformer
The official implementation for ICLR23 spotlight paper "DIFFormer: Scalable (Graph) Transformers Induced by Energy Constrained Diffusion"
Source code for GNN-LSPE (Graph Neural Networks with Learnable Structural and Positional Representations), ICLR 2022
Official PyTorch code for Structure-Aware Transformer.
[ICLR 2023] One Transformer Can Understand Both 2D & 3D Molecular Data (official implementation)
Code for AAAI2020 paper "Graph Transformer for Graph-to-Sequence Learning"
Deep learning toolkit for Drug Design with Pareto-based Multi-Objective optimization in Polypharmacology
[AAAI2023] A PyTorch implementation of PDFormer: Propagation Delay-aware Dynamic Long-range Transformer for Traffic Flow Prediction.
Official Code Repository for the paper "Accurate Learning of Graph Representations with Graph Multiset Pooling" (ICLR 2021)
SignNet and BasisNet
Code for our paper "Attending to Graph Transformers"
[ICDE'2023] When Spatio-Temporal Meet Wavelets: Disentangled Traffic Forecasting via Efficient Spectral Graph Attention Networks
[SIGIR'2023] "GFormer: Graph Transformer for Recommendation"
Video Graph Transformer for Video Question Answering (ECCV'22)
Implementation for the paper: Representation Learning on Knowledge Graphs for Node Importance Estimation
[NeurIPS'23 Spotlight] Learning Probabilistic Symmetrization for Architecture Agnostic Equivariance (LPS), in PyTorch
An unofficial implementation of Graph Transformer (Masked Label Prediction: Unified Message Passing Model for Semi-Supervised Classification) - IJCAI 2021
Triplet Graph Transformer