PyTorch implementation of ResNeSt: Split-Attention Networks [1].
This implementation exists mainly to aid my own understanding of the ResNeSt architecture, in particular the radix-major formulation of the Split-Attention bottleneck block.
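Below is a minimal sketch of what the radix-major Split-Attention operation can look like in PyTorch. This is not the code from this repository: the `SplitAttention` name, the channel/group ordering, and the hyperparameter defaults are assumptions made for illustration, and the surrounding bottleneck (1x1 convs, downsampling, the radix=1 sigmoid special case) is omitted.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SplitAttention(nn.Module):
    """Split-Attention, radix-major layout (a sketch, not this repo's code).

    Channels are laid out as [radix][cardinality][channels_per_group], so
    splits that share a radix index are contiguous and can be recovered
    with a single view.
    """

    def __init__(self, channels, radix=2, cardinality=1, reduction=4):
        super().__init__()
        self.radix = radix
        self.cardinality = cardinality
        inter_channels = max(channels * radix // reduction, 32)
        # 3x3 grouped conv producing radix * channels feature maps;
        # group index is assumed to be radix-major: r * cardinality + k.
        self.conv = nn.Conv2d(channels, channels * radix, kernel_size=3,
                              padding=1, groups=cardinality * radix, bias=False)
        self.bn = nn.BatchNorm2d(channels * radix)
        # Two 1x1 grouped convs implement the per-cardinal-group attention MLP.
        self.fc1 = nn.Conv2d(channels, inter_channels, 1, groups=cardinality)
        self.bn_fc = nn.BatchNorm2d(inter_channels)
        self.fc2 = nn.Conv2d(inter_channels, channels * radix, 1,
                             groups=cardinality)

    def forward(self, x):
        x = F.relu(self.bn(self.conv(x)))
        b, rc, h, w = x.shape
        c = rc // self.radix
        # Radix-major view: (batch, radix, channels, H, W); fuse the radix
        # splits by summation, then pool to a per-channel descriptor.
        splits = x.view(b, self.radix, c, h, w)
        gap = F.adaptive_avg_pool2d(splits.sum(dim=1), 1)
        att = self.fc2(F.relu(self.bn_fc(self.fc1(gap))))
        # r-softmax: softmax over the radix splits within each cardinal group.
        att = att.view(b, self.cardinality, self.radix, c // self.cardinality)
        att = F.softmax(att, dim=2)
        # Back to radix-major channel order, then weight and fuse the splits.
        att = att.transpose(1, 2).reshape(b, self.radix, c, 1, 1)
        return (splits * att).sum(dim=1)
```

For example, `SplitAttention(64, radix=2, cardinality=2)` maps a `(N, 64, H, W)` tensor to a tensor of the same shape.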
- Requirements: docker and docker-compose.
- Only dilation=1 is supported.
- Evaluate the model (see the evaluation sketch after this list).
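A minimal evaluation-loop sketch, assuming a torchvision-style ImageNet validation folder; the `resnest50` import, the checkpoint filename, and the dataset path are hypothetical placeholders, and the actual entry points in this repository may differ.

```python
import torch
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

# Hypothetical import; the constructor's name/location in this repo may differ.
from resnest import resnest50

# Standard ImageNet preprocessing for evaluation.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])
val_set = datasets.ImageFolder("path/to/imagenet/val", transform=preprocess)
val_loader = DataLoader(val_set, batch_size=64, num_workers=4)

model = resnest50(num_classes=1000)
model.load_state_dict(torch.load("checkpoint.pth", map_location="cpu"))
model.eval()

# Top-1 accuracy over the validation set.
correct = total = 0
with torch.no_grad():
    for images, labels in val_loader:
        preds = model(images).argmax(dim=1)
        correct += (preds == labels).sum().item()
        total += labels.numel()
print(f"top-1 accuracy: {correct / total:.4%}")
```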
[1] ResNeSt: Split-Attention Networks. Hang Zhang, Chongruo Wu, Zhongyue Zhang, Yi Zhu, Zhi Zhang, Haibin Lin, Yue Sun, Tong He, Jonas Mueller, R. Manmatha, Mu Li, Alexander Smola. https://arxiv.org/abs/2004.08955