
New trainer compatibility fixes #630

Merged: 17 commits, Mar 5, 2024

Conversation

@emsunshine (Contributor) commented Feb 26, 2024

This PR is intended to address #629.

There were some oversights in switching to the new trainer. This PR implements the following fixes:

  • AtomsToGraphs now supplies the new target names (i.e., y is renamed to energy)
  • test_ase_datasets and test_atoms_to_graphs are updated with the new target names
  • The update_config function now handles configs for ASE datasets
  • The documentation now reflects the new location of the dataset format key in the config
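The first fix above amounts to renaming legacy target keys on each data sample. A minimal sketch of that idea, assuming hypothetical names (this is not the actual ocpmodels AtomsToGraphs or update_config API):

```python
# Hypothetical illustration of the target rename this PR describes: the old
# trainer read energies from a "y" key, while the new trainer expects
# "energy" (and "forces" rather than "force"). The mapping and function
# names below are made up for this sketch.
LEGACY_TO_NEW = {"y": "energy", "force": "forces"}

def rename_targets(sample: dict) -> dict:
    """Return a copy of `sample` with any legacy target keys renamed."""
    return {LEGACY_TO_NEW.get(key, key): value for key, value in sample.items()}

sample = {"y": -1.23, "force": [[0.0, 0.1, 0.0]], "pos": [[0.0, 0.0, 0.0]]}
print(rename_targets(sample))
# Non-target keys such as "pos" pass through unchanged.
```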

codecov bot commented Feb 26, 2024

Codecov Report

Attention: Patch coverage is 27.27273%, with 8 lines in your changes missing coverage. Please review.

Project coverage is 57.20%. Comparing base (5741907) to head (987ba9f).

Files Patch % Lines
ocpmodels/trainers/ocp_trainer.py 0.00% 8 Missing ⚠️
Additional details and impacted files
@@                 Coverage Diff                  @@
##           ase_data_updates     #630      +/-   ##
====================================================
- Coverage             57.23%   57.20%   -0.03%     
====================================================
  Files                   109      109              
  Lines                 10279    10284       +5     
====================================================
  Hits                   5883     5883              
- Misses                 4396     4401       +5     


@emsunshine emsunshine marked this pull request as draft February 26, 2024 14:43
@emsunshine emsunshine changed the title Update documentation for dataset config changes Updates for new trainer compatibility Feb 26, 2024
@emsunshine emsunshine changed the title Updates for new trainer compatibility New trainer compatibility fixes Feb 27, 2024
@abhshkdz (Collaborator)
LGTM, thanks @emsunshine! Let's also have @mshuaibii sign-off on this before merging.

@lbluque (Collaborator) commented Feb 27, 2024

@emsunshine this error mentioned in #629 still occurs for me using this branch when predicting forces.

I pushed a potential fix in #622, have a look please.

With this and #622 the tutorial should run fine, although these two PRs will unfortunately have minor merge conflicts once ready.

@mshuaibii (Collaborator)

@emsunshine can you open this PR into #622 instead? This will make the merge cleaner, and I can review it once you do.
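Retargeting an open PR at another PR's branch can be done from the GitHub web UI or with the GitHub CLI's `gh pr edit --base`. A sketch, echoing rather than executing the command since it requires an authenticated `gh` session (PR and branch names are the ones from this conversation):

```shell
# Hypothetical sketch: change PR #630's base branch from main to
# ase_data_updates (the branch for #622), so the later merge is clean.
# `echo` is used here so the command is shown without needing `gh` auth.
echo gh pr edit 630 --base ase_data_updates
```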

@emsunshine (Contributor, Author)

I did not personally encounter this error but the fix you proposed makes sense. I'm happy to merge into #622 instead. I think the conflicts will be quite minor since this PR doesn't change the ASE datasets at all.

@emsunshine emsunshine changed the base branch from main to ase_data_updates February 28, 2024 00:46
@lbluque (Collaborator) left a comment


Thanks @emsunshine! Everything LGTM; the finetuning tutorial runs for me using this branch.

I just pushed a last commit directly here to avoid another merge, and added #622 in the description for provenance's sake.

@lbluque lbluque merged commit 3b4ad43 into ase_data_updates Mar 5, 2024
5 checks passed
@lbluque lbluque mentioned this pull request Mar 6, 2024
@lbluque lbluque deleted the dataset-config-changes-documentation branch April 5, 2024 19:57