Hi, thanks for your solid work!
Would you mind sharing some pretraining time statistics? For example, how long does it take to pretrain on Kinetics-400 using 64 V100s?

Hi @jzhang38! Thanks for your question! It takes about 27 hours to pre-train our VideoMAE (ViT-B) for 800 epochs on Kinetics-400 using 64 Tesla V100 GPUs. If you find that GPU utilization is not high enough, please reduce the batch size to relieve the pressure on your CPUs and I/O. Setting --batch_size to 16 can also give favorable results.
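
As a concrete illustration of the batch-size suggestion, here is a minimal sketch of a single-node distributed launch. Only `--batch_size` is confirmed by this thread; the script name, model name, and remaining flags are assumptions modeled on typical VideoMAE-style pretraining scripts, so check the repository's example scripts for the exact invocation.

```bash
# Hypothetical single-node pretraining launch with a reduced per-GPU batch
# size. Only --batch_size comes from the thread above; the script name and
# other flags are assumptions -- verify them against the repo's scripts.
OMP_NUM_THREADS=1 python -m torch.distributed.launch --nproc_per_node=8 \
    run_mae_pretraining.py \
    --model pretrain_videomae_base_patch16_224 \
    --data_path /path/to/kinetics400/train.csv \
    --epochs 800 \
    --batch_size 16 \
    --output_dir ./output
```

Lowering the per-GPU batch size reduces how many clips each dataloader step must decode, which eases the CPU and I/O bottleneck the reply above describes, at the cost of more optimizer steps per epoch.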