A collection of important graph embedding, classification and representation learning papers with implementations.
PyTorch implementation of the Graph Attention Network model by Veličković et al. (2017, https://arxiv.org/abs/1710.10903)
Graph Attention Networks (https://arxiv.org/abs/1710.10903)
KGAT: Knowledge Graph Attention Network for Recommendation, KDD2019
A PyTorch implementation of "Capsule Graph Neural Network" (ICLR 2019).
Keras implementation of the graph attention networks (GAT) by Veličković et al. (2017; https://arxiv.org/abs/1710.10903)
DeepInf: Social Influence Prediction with Deep Learning
PyTorch implementation of MTAD-GAT (Multivariate Time-Series Anomaly Detection via Graph Attention Networks) by Zhao et al. (2020, https://arxiv.org/abs/2009.02040).
An implementation of "MixHop: Higher-Order Graph Convolutional Architectures via Sparsified Neighborhood Mixing" (ICML 2019).
[ICDE2023] A PyTorch implementation of Self-supervised Trajectory Representation Learning with Temporal Regularities and Travel Semantics Framework (START).
PyTorch code for ICPR 2020 paper "DAG-Net: Double Attentive Graph Neural Network for Trajectory Forecasting"
ECML 2019: Graph Neural Networks for Multi-Label Classification
Fake news detector based on the content and users associated with it using BERT and Graph Attention Networks (GAT).
Heterogeneous Graph Attention Networks for Early Detection of Rumors on Twitter (IJCNN 2020)
A context-aware, visual-attention-based training pipeline for object detection from a webpage screenshot.
This project is a scalable unified framework for deep graph clustering.
[NLPCC 2020] Sentence Constituent-Aware Aspect-Category Sentiment Analysis with Graph Attention Networks
Source code for paper "Conversational Question Answering over Knowledge Graphs with Transformer and Graph Attention Networks"
GraphCON (ICML 2022)
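Most of the repositories above build on the same core operation from the GAT paper (Veličković et al., 2017): score each edge with a shared attention vector, normalize the scores over each node's neighborhood with a softmax, and aggregate the transformed neighbor features. A minimal NumPy sketch of a single attention head is below; it is an illustration of the mechanism, not code taken from any of the listed implementations, and the function name and signature are hypothetical.

```python
import numpy as np

def gat_layer(H, A, W, a, alpha=0.2):
    """Single-head GAT layer sketch (after Veličković et al., 2017).

    H: (N, F) node features; A: (N, N) adjacency matrix (nonzero where an
    edge exists, self-loops included); W: (F, F') weight matrix;
    a: (2*F',) attention vector; alpha: LeakyReLU negative slope.
    """
    Z = H @ W                       # (N, F') linearly transformed features
    N = Z.shape[0]
    # Raw attention logits: e[i, j] = LeakyReLU(a^T [Z_i || Z_j])
    e = np.empty((N, N))
    for i in range(N):
        for j in range(N):
            s = np.concatenate([Z[i], Z[j]]) @ a
            e[i, j] = s if s > 0 else alpha * s
    # Mask non-edges, then softmax over each node's neighborhood
    e = np.where(A > 0, e, -np.inf)
    e = e - e.max(axis=1, keepdims=True)   # numerical stability
    att = np.exp(e)
    att = att / att.sum(axis=1, keepdims=True)
    return att @ Z                  # (N, F') attention-weighted aggregation
```

In practice the papers stack several such heads (concatenating or averaging their outputs) and apply a nonlinearity between layers; the PyTorch and Keras repositories listed above implement vectorized, sparse versions of this same computation.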