- Data Science and Analytic Thrust, Information Hub, HKUST(GZ), Guangzhou
- Zhihu: https://www.zhihu.com/people/peijieDong
- Homepage: https://pprp.github.io
- Google Scholar: https://scholar.google.com/citations?user=TqS6s4gAAAAJ
Attention
A summary of papers on visual attention. Related code will be released gradually, based on Jittor.
Improving Convolutional Networks via Attention Transfer (ICLR 2017)
🍀 PyTorch implementations of various attention mechanisms, MLPs, re-parameterization and convolution modules, helpful for further understanding the papers. ⭐⭐⭐
The official PyTorch implementation of the ICML 2021 paper "SimAM: A Simple, Parameter-Free Attention Module for Convolutional Neural Networks".
Code for the CVPR 2021 paper on coordinate attention.
A PyTorch implementation of Stand-Alone Self-Attention in Vision Models.
We design an effective Relation-Aware Global Attention (RGA) module for CNNs to globally infer the attention.
Official PyTorch Implementation for "Rotate to Attend: Convolutional Triplet Attention Module." [WACV 2021]
Code/Model release for NIPS 2017 paper "Attentional Pooling for Action Recognition"
Code for ECA-Net: Efficient Channel Attention for Deep Convolutional Neural Networks
A list of efficient attention modules.
Pyramidal Convolution: Rethinking Convolutional Neural Networks for Visual Recognition (https://arxiv.org/pdf/2006.11538.pdf)
Awesome List of Attention Modules and Plug&Play Modules in Computer Vision
Dual Attention Network for Scene Segmentation (CVPR 2019)
Keras Attention Layer (Luong and Bahdanau scores).
GCNet: Non-local Networks Meet Squeeze-Excitation Networks and Beyond
Implementation of Non-local Block.
Official PyTorch code for "BAM: Bottleneck Attention Module" (BMVC 2018) and "CBAM: Convolutional Block Attention Module" (ECCV 2018)
[ICCV W] Contextual Convolutional Neural Networks (https://arxiv.org/pdf/2108.07387.pdf)
PyTorch code for our paper : "SRM : A Style-based Recalibration Module for Convolutional Neural Networks" (https://arxiv.org/abs/1903.10829)
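Among the modules listed above, SimAM is notable for having a closed-form, parameter-free gating rule, so it can be sketched in a few lines. The following is a minimal NumPy sketch following the energy formula in the ICML 2021 paper (not the official code, which is linked above; `e_lambda` is the paper's regularization constant):

```python
import numpy as np

def simam(x, e_lambda=1e-4):
    """Parameter-free SimAM attention over a feature map of shape (B, C, H, W).

    Each location's importance is the inverse of its closed-form energy,
    passed through a sigmoid and used to gate the input.
    """
    b, c, h, w = x.shape
    n = h * w - 1  # number of "other" neurons per channel
    mu = x.mean(axis=(2, 3), keepdims=True)          # per-channel spatial mean
    d = (x - mu) ** 2                                # squared deviation
    v = d.sum(axis=(2, 3), keepdims=True) / n        # per-channel variance
    e_inv = d / (4 * (v + e_lambda)) + 0.5           # inverse energy
    return x * (1.0 / (1.0 + np.exp(-e_inv)))        # sigmoid gating
```

Because the gate is a sigmoid of a non-negative quantity, it lies strictly between 0.5 and 1, so SimAM rescales features without introducing any learnable parameters.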