
Pretraining time #5

Closed
jzhang38 opened this issue May 1, 2022 · 1 comment

Comments

jzhang38 commented May 1, 2022

Hi, thanks for your solid work!

Would you mind sharing some pretraining time statistics? For example, how long does it take to pretrain on Kinetics-400 using 64 V100s?

yztongzhan (Collaborator) commented May 1, 2022

Hi @jzhang38! Thanks for your question! It takes about 27 hours to pre-train our VideoMAE (ViT-B) for 800 epochs on Kinetics-400 using 64 Tesla V100 GPUs. If you find that GPU utilization is not high enough, please reduce the batch size to alleviate the pressure on your CPUs and I/O. Setting --batch_size to 16 can also give favorable results.
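For anyone applying this suggestion, a launch command along the following lines is a minimal sketch. Only the --batch_size flag comes from this thread; the run_mae_pretraining.py entry point and all other flags and variable names are illustrative assumptions modeled on a typical multi-node PyTorch pretraining setup, not the repo's confirmed interface.

```bash
# Hypothetical multi-node launch sketch. Only --batch_size is taken from
# the comment above; the entry point and remaining flags are assumptions
# based on a typical torch.distributed pretraining setup.
OMP_NUM_THREADS=1 python -m torch.distributed.launch \
    --nproc_per_node=8 --nnodes=8 \
    --node_rank="${NODE_RANK}" --master_addr="${MASTER_ADDR}" \
    run_mae_pretraining.py \
    --data_path "${DATA_PATH}" \
    --model pretrain_videomae_base_patch16_224 \
    --epochs 800 \
    --batch_size 16  # per-GPU; lower values ease CPU and I/O pressure
```

For scale, 27 hours on 64 GPUs works out to roughly 1,700 GPU-hours, or about two minutes of wall-clock time per epoch.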
