
Could you provide a training script for COCO-CN? #53

Closed
ddNTP opened this issue Feb 13, 2023 · 3 comments

Comments


ddNTP commented Feb 13, 2023

As the title says, could you provide an 8-GPU finetune script for COCO-CN, ideally one whose results are not much worse than the reported ones? :)

@yangapku (Member)

We'll put one together.

ddNTP (Author) commented Feb 13, 2023

Also, I'd like to ask: can 8 GPUs (A100 80G) reproduce the finetune results reported in the paper for Flickr30K-CN and COCO-CN, for example by tuning hyperparameters and using gradient accumulation?

DtYXs (Collaborator) commented Feb 20, 2023

Hi, the COCO-CN finetune script has been added to the latest code as run_scripts/coco-cn_finetune_vit-b-16_rbt-base.sh. Its hyperparameters are already adapted to 8x A100, and grad-checkpointing is enabled. Once the dataset is prepared, you should be able to run it directly after changing MASTER_ADDR to localhost. You can evaluate with the checkpoint that performs best on the validation set; the results should be fairly consistent with the 32-GPU results in the paper.
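For reference, a minimal single-node launch might look like the sketch below. It assumes the script defines MASTER_ADDR near the top and takes the dataset root as its first argument, like the other entries under run_scripts/, so please check your copy of the script before running; DATAPATH is a placeholder.

```bash
# Sketch only: point MASTER_ADDR at localhost for a single-node run, then launch on 8 GPUs.
sed -i 's/^MASTER_ADDR=.*/MASTER_ADDR=localhost/' run_scripts/coco-cn_finetune_vit-b-16_rbt-base.sh
# DATAPATH is a placeholder for your prepared dataset root.
bash run_scripts/coco-cn_finetune_vit-b-16_rbt-base.sh ${DATAPATH}
```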
For reproducing the Flickr30K-CN results on 8x A100, we currently have an empirical set of hyperparameters: batch size 400, lr=1e-5, max_epochs=16, warmup=20, which gets fairly close to the 32-GPU results. In principle further hyperparameter tuning could improve on this. It is provided for reference, and better hyperparameter configurations are welcome.
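As a sketch of how those settings might be plugged into a copy of the run script (the variable names below follow the pattern of the existing run scripts and are assumptions; map them to the exact names and flags in your checkout):

```bash
# Empirical 8x A100 settings suggested above for Flickr30K-CN (sketch only; verify
# against the actual variable names in your copy of the run script).
batch_size=400   # whether this is per-GPU or global depends on the script; treat as an assumption
lr=1e-5
max_epochs=16
warmup=20
# Grad-checkpointing (as enabled in the COCO-CN script) helps fit this batch size in 80G memory.
```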

yangapku closed this as completed Mar 1, 2023