Well done! I am quietly wondering: using 4 TITAN 2080 GPUs with 12 GB each, can I train this model, or will I run into out-of-memory errors?

You'll likely be able to train the model using AlphaFold's initial training settings (256 crop size, 128 MSA sequences, 1024 extra MSA sequences), which work even on our 11 GB 2080 Ti's. You'll probably need to keep the `clear_cache_between_blocks` option enabled in the extra MSA stack, which comes with a minor performance hit. Training with the full fine-tuning settings, though (crops of size 384, extra MSAs of size 5120, etc.), is currently out of the question with just 12 GB. We're working on fused or low-memory versions of components of the model, so that may change in the future, but even then, 12 GB will probably be a stretch. Of course, the model is fully configurable, so you may also reduce the number of blocks, etc., if you want; a sketch of what such overrides could look like is below.