As the title says: could you provide an 8-GPU finetune script for training on COCO-CN? Ideally one whose results don't degrade by much :)
We'll put that together.
Also, I'd like to ask: can 8 GPUs (A100 80G) reproduce the finetune results reported in the paper for flickr30k-cn and coco-cn, e.g. by tuning hyperparameters or using gradient accumulation?
Hi, the coco-cn finetune script has been updated in the latest code as run_scripts/coco-cn_finetune_vit-b-16_rbt-base.sh. Its hyperparameters are already adapted to 8x A100, with grad-checkpointing enabled. Once the dataset is prepared, you only need to change MASTER_ADDR to localhost and it should run directly. You can test with the checkpoint that performs best on the validation set; the results should be fairly consistent with the 32-GPU results in the paper. For reproducing the flickr30k-cn results on 8x A100, we currently have one empirical set of hyperparameters: batchsize=400 lr=1e-5 max_epochs=16 warmup=20, which comes close to the 32-GPU results; in principle further hyperparameter tuning could improve on this. This is offered for reference, and better hyperparameter configurations are welcome.
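On the gradient-accumulation idea mentioned above: accumulating several micro-batch gradients before each optimizer step lets an 8-GPU node emulate the larger effective batch of a 32-GPU run. A minimal pure-Python toy (a scalar "model" with SGD; not code from the Chinese-CLIP repo, the numbers are purely illustrative):

```python
# Toy illustration of gradient accumulation: average accum_steps
# micro-batch gradients, then take one optimizer step, so the
# effective batch size is accum_steps times the micro-batch size.
def sgd_with_accumulation(grads, lr, accum_steps):
    """Apply one SGD update per accum_steps micro-batch gradients."""
    w = 0.0
    buf, weights_after_step = 0.0, []
    for i, g in enumerate(grads, 1):
        buf += g / accum_steps            # average over the accumulation window
        if i % accum_steps == 0:
            w -= lr * buf                 # single optimizer step per window
            weights_after_step.append(w)
            buf = 0.0                     # reset the accumulator
    return weights_after_step

# Four micro-batch gradients, accumulated in pairs: two optimizer steps.
steps = sgd_with_accumulation([1.0, 1.0, 2.0, 2.0], lr=0.1, accum_steps=2)
print(len(steps))  # 2 optimizer steps from 4 micro-batches
```

In a real framework this corresponds to calling `backward()` on each micro-batch but stepping the optimizer only every `accum_steps` iterations.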