
Load pretrained jax weights #263

Merged
merged 4 commits into from Feb 4, 2023

Conversation

@l-bick commented Jan 25, 2023

Added an option to load pretrained JAX weights (such as those from AlphaFold) for training. The implementation is analogous to run_pretrained_openfold.

@gahdritz (Collaborator)

This is a nice option. One quibble: it looks like the "help" field is wrong ("Path to JAX model parameters. If None, and openfold_checkpoint_path is also None, parameters are selected automatically according to the model name from openfold/resources/params"). It doesn't look like pretrained parameters are automatically loaded if no checkpoint is specified, and it should stay that way; by default, training should obviously commence from a random initialization. I think we should change the option to "--resume_from_jax_params" to match the corresponding OF flag and change the help message to "Path to an .npz JAX parameter file with which to initialize the model"

@l-bick (Author) commented Jan 31, 2023

You are right, thanks for pointing it out. The help message was carelessly copied from run_pretrained. I've adapted the conventions as suggested.
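As context for the discussion above: pretrained AlphaFold parameters are distributed as .npz archives mapping flat parameter names to arrays, and a flag like --resume_from_jax_params points the training script at such a file. The following is a minimal, illustrative sketch (not OpenFold's actual loader) showing what consuming an .npz parameter file looks like; the parameter names here are made up for the example.

```python
import numpy as np

# Build a toy .npz "JAX parameter" file for illustration.
# Real AlphaFold checkpoints are .npz archives of named arrays;
# these particular names are hypothetical.
params = {
    "evoformer/linear/weights": np.zeros((4, 8), dtype=np.float32),
    "evoformer/linear/bias": np.zeros((8,), dtype=np.float32),
}
np.savez("toy_params.npz", **params)

# Loading mirrors what an initialization flag such as
# --resume_from_jax_params would consume: a flat mapping from
# parameter names to numpy arrays, which the model then copies
# into its own parameter tensors.
loaded = np.load("toy_params.npz")
for name in loaded.files:
    print(name, loaded[name].shape)
```

Note that, per the review comment, this path is only taken when the flag is explicitly set; absent any checkpoint, training starts from a random initialization.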

@gahdritz gahdritz merged commit b2d6bff into aqlaboratory:main Feb 4, 2023
@gahdritz (Collaborator) commented Feb 4, 2023

Thanks for the PR!

@l-bick l-bick deleted the load_pretrained_jax_weights branch August 31, 2023 06:50