Hi, thanks for your interest in our work.
I think replacing the original transformer with Swin-T may reduce computational complexity somewhat, but it will not bring a performance gain, because shifted-window attention is an approximation of full attention.
Instead, we have tried replacing ResNet with Swin-T as the backbone, and this does bring much better performance.
Hi, when you used Swin-T as the backbone, which layers of Swin-T did you use? Could you share the STARK code with Swin-T as the backbone? Thank you very much!
Hi,
Thanks for your great work! What is the effect of replacing the transformer with a Swin Transformer?
Looking forward to your reply~