
gained 0.094 for eval_top1, and 0.33 for eval_top5, after 36-epoch training on 8 gpus #56

Open
CheerM opened this issue Jun 1, 2021 · 4 comments

Comments

@CheerM

CheerM commented Jun 1, 2021

Hi, would you mind releasing the training log for T2t-vit-t-14 trained with 8 GPUs? I tried to rerun the script for training T2t-vit-t-14 with 8 GPUs. It reached 0.094 eval_top1 and 0.33 eval_top5 after 36 epochs. It seems to be converging too slowly.

@lucastononrodrigues

I don't know about the convergence over time, but it should take 310 epochs to reach the paper's results.

@yuanli2333
Collaborator

> Hi, would you mind releasing the training log for T2t-vit-t-14 trained with 8 GPUs? I tried to rerun the script for training T2t-vit-t-14 with 8 GPUs. It reached 0.094 eval_top1 and 0.33 eval_top5 after 36 epochs. It seems to be converging too slowly.

Hi, the released log of T2t-vit-t-14 was produced with 8 GPUs. It's normal if your results are slightly higher or lower than the logs.

@WangChen0902

> Hi, would you mind releasing the training log for T2t-vit-t-14 trained with 8 GPUs? I tried to rerun the script for training T2t-vit-t-14 with 8 GPUs. It reached 0.094 eval_top1 and 0.33 eval_top5 after 36 epochs. It seems to be converging too slowly.

Hello, have you solved this? I have the same problem, and the loss doesn't decrease.

@imkzh

imkzh commented Mar 10, 2022

Same here: training T2t-vit-t-14 on 3 GPUs with -b 64, after 80 epochs: top-1 acc = 0.095%, top-5 acc = 0.301%.

There seems to have been no improvement from epoch 20 (top1 = 0.093, top5 = 0.3199) through epoch 80.
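For what it's worth, one thing to double-check when rerunning a multi-GPU recipe with a different GPU count is the effective (global) batch size and its matching learning rate. A minimal sketch of the common linear scaling rule; the base values below are illustrative assumptions, not this repo's actual defaults:

```python
# Linear LR scaling rule: when the effective (global) batch size changes,
# scale the base learning rate proportionally. The base_lr and base_batch
# values here are assumptions for illustration only.

def scaled_lr(base_lr: float, base_batch: int, per_gpu_batch: int, num_gpus: int) -> float:
    """Return a learning rate scaled linearly with the effective batch size."""
    effective_batch = per_gpu_batch * num_gpus
    return base_lr * effective_batch / base_batch

# Example: a recipe tuned for 8 GPUs x 64 images (effective batch 512).
# Rerunning on 3 GPUs with -b 64 gives an effective batch of only 192,
# so the LR would be scaled down accordingly.
lr_8gpu = scaled_lr(5e-4, 512, 64, 8)  # unchanged: 5e-4
lr_3gpu = scaled_lr(5e-4, 512, 64, 3)  # reduced for the smaller batch
print(lr_8gpu, lr_3gpu)
```

If the script instead keeps the 8-GPU learning rate while the effective batch shrinks, training can become unstable or stall near random accuracy, which is consistent with the flat curves reported above.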
