Releases: FAIR-Chem/fairchem

fairchem_core-1.0.0

16 May 17:04
bcb3cf5

New repository structure!

We have restructured the codebase so that all of FAIR Chemistry's code (data, models, demos, and application efforts) lives in one centralized repository.

This release also includes releases of these other namespace packages:

  • fairchem_data_oc-0.0.1
  • fairchem_demo_ocpapi-0.0.1
  • fairchem_applications_cattsunami-0.0.1

What's Changed

New Features

  • add unique key loader check to YAML to avoid problematic configs by @misko in #658
  • [BE] PyPi integration + local model cache + Github actions by @misko in #623
  • [BE] Add smoke test for [escn,gemnet,equiformer_v2] train+predict, Add optimization test for [escn,gemnet,equiformer_v2] by @misko in #640
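The unique-key check (#658) guards against YAML configs where the same key appears twice and the later value silently overrides the earlier one. A minimal sketch of the technique using PyYAML; the loader name and error message here are illustrative, not fairchem's actual implementation:

```python
import yaml

class UniqueKeyLoader(yaml.SafeLoader):
    """SafeLoader variant that rejects duplicate mapping keys."""

    def construct_mapping(self, node, deep=False):
        seen = set()
        for key_node, _value_node in node.value:
            key = self.construct_object(key_node, deep=deep)
            if key in seen:
                raise ValueError(f"Duplicate key in YAML config: {key!r}")
            seen.add(key)
        return super().construct_mapping(node, deep=deep)

# A config with a repeated key now fails loudly instead of silently
# keeping only the last value:
# yaml.load("lr: 0.001\nlr: 0.01\n", Loader=UniqueKeyLoader)  # raises ValueError
```

Plain `yaml.safe_load` accepts duplicate keys per the YAML spec's permissive behavior in PyYAML, which is exactly the failure mode this check closes off.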

Bug Fixes

Other Changes


fairchem_core-1.0.0b0

15 May 20:24
d0f61fa
Pre-release

What's Changed

New Features

  • add unique key loader check to YAML to avoid problematic configs by @misko in #658
  • [BE] PyPi integration + local model cache + Github actions by @misko in #623
  • [BE] Add smoke test for [escn,gemnet,equiformer_v2] train+predict, Add optimization test for [escn,gemnet,equiformer_v2] by @misko in #640

Bug Fixes

Other Changes


v0.1.0

01 Oct 03:00
f9778b1

Major features and improvements

  • Release of the OC22 dataset (#358) focused on oxide electrocatalysis (paper, blog) and pretrained model weights (#414)
  • GemNet-OC implementation and pretrained model weights (#363) (paper)
  • Spherical Channel Network implementation (#362) (paper)
  • OC20 Bader charge data (#360)
  • PaiNN implementation and pretrained model weights (#344) (paper)
  • OCP tutorial (#314, #265)
  • Script to render relaxation trajectory GIFs (#259)
  • Load balancing batches across GPUs (#267, #277)
  • Support for cutoff radius beyond 1 unit cell across all directions in radius_graph_pbc (#268, #394)
  • Support for evaluating relaxations during S2EF training (#299)
  • Up to 5x speedup in running relaxations (#309)
  • 25% reduction in memory usage for inference of direct-force models (#323)
  • Atomwise L2 loss (#343, #346)
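The atomwise L2 loss (#343, #346) replaces a componentwise error on forces with the Euclidean norm of each atom's force error, averaged over atoms. A pure-Python sketch of the idea (fairchem's implementation operates on torch tensors; the function name here is ours, for illustration only):

```python
import math

def atomwise_l2_loss(pred_forces, true_forces):
    """Mean over atoms of the Euclidean norm of the per-atom force error.

    pred_forces, true_forces: lists of (fx, fy, fz) tuples, one per atom.
    """
    errors = [
        math.sqrt(sum((p - t) ** 2 for p, t in zip(pa, ta)))
        for pa, ta in zip(pred_forces, true_forces)
    ]
    return sum(errors) / len(errors)
```

Compared with a plain MSE over force components, this weights each atom's error by its vector magnitude rather than squaring it, which changes how large per-atom errors are penalized.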

Breaking changes

  • DimeNet++ triplet calculation fix (#270)
    • This breaks backward compatibility to previously trained DimeNet++ models, though the change in performance is very small (< 0.5% relative).

Other changes


v0.0.3: GemNet-dT, SpinConv, new data: MD, Rattled, per-adsorbate trajectories, etc.

27 Aug 19:00
65c2d62

Breaking changes

  • Scheduler changed to step every iteration (#234). If your config specifies lr_milestones or warmup_epochs in epochs instead of steps, this will change how your scheduler behaves.
  • OCPCalculator no longer takes a Trainer class as input. Instead, a YAML config file and a checkpoint path must be provided.
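Because the scheduler now steps every iteration (#234), milestones written in epochs need converting to iteration counts. A hedged sketch of that conversion (the function and variable names are ours, not fairchem's):

```python
def epochs_to_steps(milestone_epochs, dataset_size, batch_size):
    """Convert epoch-based LR milestones to optimizer-step counts.

    steps_per_epoch is the number of optimizer steps in one pass over
    the dataset; ceil division counts the final partial batch.
    """
    steps_per_epoch = -(-dataset_size // batch_size)  # ceil division
    return [m * steps_per_epoch for m in milestone_epochs]
```

For example, milestones at epochs 10 and 15 with 1000 samples and batch size 32 become steps 320 and 480.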

Major features

Other changes and improvements

  • Support for Python 3.8, PyTorch 1.8.1 (#247)
  • Early stopping for ML relaxations
  • Support for gradient clipping and maintaining an exponential moving average of parameters
  • Preprocessing support for datasets that do not have fixed atoms specified (#189)
  • Jupyter notebooks for creating LMDBs on your own data, understanding the data preprocessing pipeline (#211)
  • Release cached CUDA memory after each relaxation batch (#190)
  • Security fix in loading EvalAI npz submissions (#194)
  • Dataloader bug fix for when #GPUs > #LMDBs (#248)
  • Support for custom optimizers (#218)
  • Support for custom schedulers (#226)
  • New attributes (miller_index, shift, adsorption site, etc.) in data mapping (#219)
  • Deterministic unit tests (#228)
  • Bug fixes in released data for all tasks / splits (#197)
  • Improved logs: using logging instead of print statements, and recording Slurm settings
  • Better handling of job resumption on preemption, particularly relevant for those using Slurm-based clusters. Model weights and training state are now saved to separate checkpoints, and all restarted jobs log to the same wandb plot instead of a new plot per restart.
  • Support for energy-only predictions in OCPCalculator (#243)

v0.0.2: DimeNet++, ForceNet, Torch relaxations, model zoo, etc.

01 Feb 23:46
a020f32

This release accompanies v2 of the OCP dataset paper.

Major features

  • DimeNet++ IS2RE and S2EF models (#143, #182, #184)
  • ForceNet S2EF model (#150)
  • Torch implementation of ML relaxations (#92)
  • Support for on-the-fly graph construction (#92)
  • Pretrained model zoo (#144)
  • Jupyter notebooks to explore the OCP dataset (#90) and train an S2EF SchNet (#123)
  • Consolidated data preprocessing (#91, #152)
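On-the-fly graph construction (#92) builds the atomic neighbor graph from positions at load time instead of reading precomputed edges. A minimal non-periodic sketch of the underlying radius-graph step (fairchem's version handles periodic boundaries and operates on torch tensors; the names here are illustrative):

```python
import math

def radius_graph(positions, cutoff):
    """Return directed edges (i, j) for all atom pairs within `cutoff`.

    positions: list of (x, y, z) coordinates, one per atom.
    """
    edges = []
    for i, pi in enumerate(positions):
        for j, pj in enumerate(positions):
            if i != j and math.dist(pi, pj) <= cutoff:
                edges.append((i, j))
    return edges
```

The quadratic pairwise scan above is only for clarity; production implementations use cell lists or k-d trees to keep neighbor search near-linear in the number of atoms.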

Other improvements

  • Trainer refactoring (#84, #135)
  • Support for running inference and relaxations from main.py (#92)
  • Support for saving predictions in EvalAI-compatible formats (#93)
  • Dataloader performance improvements (#154)
  • Bug fixes in metrics (#85, #75)
  • Bug fix in how angles are computed in DimeNet (#78)
  • Support for CircleCI (#98)
  • Sphinx documentation (#100)

0.0.1

04 Oct 23:54
421e0f3
Pre-release

This is the initial release used to train all baseline models in the paper.