It takes approximately 10 days on a single GPU to reproduce the numbers in our paper. That said, you can achieve satisfying results with a much smaller number of training iterations.
Hello! Thanks for your excellent work!
I would like to ask how long it takes to train the generalization model on GeForce RTX 3090 GPUs.