Thanks to the authors for releasing the code for their work! I noticed that the visualization part uses the attention rollout method, and the results look good. Have you tried using the attention-rollout maps, on both the plain transformer baseline and on TS-CAM, to compute the final localization performance? Does it bring an improvement, or is it used only as a visualization reference because there is no gain in accuracy?
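For readers unfamiliar with the method being discussed: attention rollout (Abnar & Zuidema, 2020) propagates attention through the layers of a transformer by multiplying the per-layer attention matrices, mixing in the identity to account for residual connections. A minimal NumPy sketch, not the repo's actual implementation (shapes and the 0.5/0.5 residual mixing follow the original paper's convention):

```python
import numpy as np

def attention_rollout(attentions):
    """Compute the attention rollout matrix.

    attentions: list of (num_tokens, num_tokens) row-stochastic attention
                maps, one per layer (already averaged over heads).
    Returns a (num_tokens, num_tokens) rollout matrix.
    """
    num_tokens = attentions[0].shape[0]
    rollout = np.eye(num_tokens)
    for attn in attentions:
        # Mix in the identity to model the residual branch, then
        # renormalize rows so the matrix stays row-stochastic.
        attn = 0.5 * attn + 0.5 * np.eye(num_tokens)
        attn = attn / attn.sum(axis=-1, keepdims=True)
        rollout = attn @ rollout
    return rollout

# Toy example: 2 layers, 4 tokens (token 0 playing the role of [CLS]).
rng = np.random.default_rng(0)
layers = [rng.random((4, 4)) for _ in range(2)]
layers = [a / a.sum(axis=-1, keepdims=True) for a in layers]
R = attention_rollout(layers)
cls_map = R[0, 1:]  # [CLS] row over patch tokens -> localization map
```

The `[CLS]` row of the rollout matrix, reshaped to the patch grid, is what serves as the localization map in the visualization being asked about.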
Hi, thanks for your interest in our work. With attention rollout, the localization results on ILSVRC2012 are about the same as the method used in the paper, but on CUB-200-2011 attention rollout performs worse: GT-known localization is only 83.8, versus 87.8 for the method in the paper.
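For context, the GT-known metric quoted above counts a localization as correct when the box extracted from the map has IoU ≥ 0.5 with the ground-truth box, ignoring the classification result. A minimal sketch of the common WSOL evaluation protocol (the thresholding ratio and the simple bounding-box extraction here are assumptions for illustration, not the paper's exact settings):

```python
import numpy as np

def bbox_from_map(loc_map, thr_ratio=0.5):
    """Threshold the map at a fraction of its max (common WSOL practice)
    and return the bounding box of the foreground as (x1, y1, x2, y2)."""
    mask = loc_map >= thr_ratio * loc_map.max()
    ys, xs = np.where(mask)
    return int(xs.min()), int(ys.min()), int(xs.max()), int(ys.max())

def iou(a, b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1 + 1) * max(0, iy2 - iy1 + 1)
    area = lambda r: (r[2] - r[0] + 1) * (r[3] - r[1] + 1)
    return inter / (area(a) + area(b) - inter)

def gt_known_correct(loc_map, gt_box, iou_thr=0.5):
    """GT-known: correct if the extracted box overlaps the ground-truth
    box with IoU >= 0.5, regardless of the predicted class."""
    return iou(bbox_from_map(loc_map), gt_box) >= iou_thr
```

The GT-known accuracy reported above (83.8 vs 87.8 on CUB-200-2011) would be the fraction of test images for which `gt_known_correct` returns `True`.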
I see, thanks for the reply!