cagdasbas/swin-td-bu-attention

Top-Down Bottom-Up Attention on top of Swin Transformer

Implementation of top-down bottom-up attentional modules on top of the Swin Transformer object detection network.
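As a conceptual sketch only (not the repo's actual modules), the idea can be illustrated as combining a bottom-up, stimulus-driven saliency map with a top-down, task-driven relevance map to reweight backbone features. All names, shapes, and the multiplicative combination below are illustrative assumptions:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def combined_attention(features, task_query):
    """Illustrative top-down/bottom-up gating (not the repo's exact design).

    features:   (H*W, C) flattened feature map from the backbone
    task_query: (C,)     task-driven vector, e.g. a class embedding
    """
    # Bottom-up saliency: driven purely by feature magnitude at each location
    bottom_up = softmax(np.linalg.norm(features, axis=1))  # (H*W,)
    # Top-down relevance: similarity of each location to the task query
    top_down = softmax(features @ task_query)              # (H*W,)
    # Multiplicative combination, renormalized to one attention map
    attn = bottom_up * top_down
    attn = attn / attn.sum()
    # Reweight the features by the combined attention
    return attn[:, None] * features

rng = np.random.default_rng(0)
feats = rng.standard_normal((49, 96))  # e.g. a 7x7 window, 96 channels (Swin-T stage 1)
query = rng.standard_normal(96)
out = combined_attention(feats, query)
print(out.shape)  # (49, 96)
```

The real modules operate inside the Swin detection pipeline; this only shows the gating intuition.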

Installation:

  • Follow the official installation doc from here.
  • Download the pretrained model from here.
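After installation, a quick sanity check can confirm the environment is ready. The package names below are assumptions based on the mmdetection-based Swin object detection setup; adjust them if the official doc lists different dependencies:

```python
import importlib.util

# Hedged post-install check: these package names assume the
# mmdetection-based Swin detection stack from the official doc.
required = ["torch", "mmcv", "mmdet"]
status = {pkg: importlib.util.find_spec(pkg) is not None for pkg in required}
for pkg, ok in status.items():
    print(f"{pkg}: {'found' if ok else 'MISSING'}")
```

Any "MISSING" line means that dependency still needs to be installed before running the commands below.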

How to run

Test

Single image test

python -m swin_td_bu_att demo.jpg configs/td_bu_attention/topdown_bottomup_attentional_swin.py swin_tiny_patch4_window7_224.pth --device cuda --out-file result.jpg

Evaluate the whole test set:

python swin_td_bu_att/eval.py

Train

python swin_td_bu_att/train.py

Current implementation progress:

  • Implement modules
  • Implement functioning test code
  • Fix training parameters in config file
  • Implement missing training functionality
  • Decrease the training loss
