Hello author,
I saw you mentioned that 8 GPUs were used for pre-training on the 9.5k UCF101 video clips. May I ask what the total pre-training time is for UCF101 over 3200 epochs with batch size 192 (the setting in your paper)?
Appreciate it!
Hi @itsmag11! It takes about 30 hours to pre-train VideoMAE on UCF101 for 3200 epochs.
Also, the pre-trained models and scripts for UCF101 are now available!
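For anyone estimating their own runs, the reported figures imply a rough throughput. A back-of-envelope sketch (the exact clip count of 9,537 for the UCF101 training split is an assumption based on the "9.5k" mentioned above, not a number confirmed in this thread):

```python
# Back-of-envelope throughput implied by the numbers in this thread.
# Assumption: UCF101 training split has ~9,537 clips ("9.5k" above).
clips = 9537
epochs = 3200
batch_size = 192
hours = 30
gpus = 8

total_samples = clips * epochs                 # clips processed over all epochs
steps = total_samples / batch_size             # optimizer steps at batch size 192
clips_per_sec = total_samples / (hours * 3600) # aggregate throughput

print(f"total optimizer steps: {steps:,.0f}")
print(f"throughput: {clips_per_sec:.0f} clips/s total, "
      f"{clips_per_sec / gpus:.0f} clips/s per GPU")
```

This works out to roughly 283 clips/s across the 8 GPUs (about 35 clips/s per GPU), so scaling the epoch count or dataset size scales wall-clock time approximately linearly.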