High priority features

To make Grad DFT useful to more people, there are several features we should add to the code. These include:

(1) Nuclear gradients: Since we don't calculate all of the Fock matrix terms from scratch in JAX, nuclear gradients like forces and stresses are not available via autodiff. We can either implement these without autodiff or implement the calculation of all Fock matrix terms in JAX. Nuclear gradients could prove very useful in training functionals.
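
As a rough illustration of what the autodiff route would buy us: if the total energy were traced end-to-end in JAX, forces would be one `jax.grad` call away. The `total_energy` below is a toy stand-in (a harmonic bond), not the Grad DFT API; it marks exactly the piece that would need to be fully traced.

```python
import jax
import jax.numpy as jnp

def total_energy(positions, k):
    # Toy stand-in for a fully-JAX SCF energy. In Grad DFT today, parts of
    # the Fock matrix come from outside JAX, so this tracing is not possible.
    r = jnp.linalg.norm(positions[0] - positions[1])
    return 0.5 * k * (r - 1.4) ** 2

positions = jnp.array([[0.0, 0.0, 0.0], [0.0, 0.0, 1.6]])  # two nuclei, Bohr
forces = -jax.grad(total_energy)(positions, 0.3)  # forces = -dE/dR, shape (2, 3)
```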

(2) Greatly expand the number of XC energy densities available in Grad DFT: JaxXC could be the answer here. If not, to make functionals useful for solids, we should at least be able to build neural PBEsol- and SCAN-type functionals.
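
For a flavor of what a neural SCAN-type functional might look like, here is a minimal sketch: a small MLP predicts an exchange enhancement factor from meta-GGA-style ingredients (the reduced gradient s and a simple kinetic-energy-density ratio; SCAN's alpha additionally subtracts the von Weizsäcker term) on top of LDA exchange. All function and parameter names are illustrative, not the Grad DFT interface.

```python
import jax
import jax.numpy as jnp

def lda_exchange(rho):
    # Slater (LDA) exchange energy density per unit volume.
    return -(3.0 / 4.0) * (3.0 / jnp.pi) ** (1.0 / 3.0) * rho ** (4.0 / 3.0)

def mgga_features(rho, grad_rho, tau):
    # Dimensionless meta-GGA ingredients on each grid point.
    kf = (3.0 * jnp.pi**2 * rho) ** (1.0 / 3.0)
    s = jnp.linalg.norm(grad_rho, axis=-1) / (2.0 * kf * rho + 1e-12)
    tau_unif = (3.0 / 10.0) * kf**2 * rho  # uniform-gas kinetic energy density
    return jnp.stack([s, tau / (tau_unif + 1e-12)], axis=-1)

def neural_enhancement(params, feats):
    # Tiny MLP; exp keeps the learned enhancement factor positive.
    h = jnp.tanh(feats @ params["w1"] + params["b1"])
    return jnp.exp(h @ params["w2"] + params["b2"]).squeeze(-1)

def exc_density(params, rho, grad_rho, tau):
    feats = mgga_features(rho, grad_rho, tau)
    return lda_exchange(rho) * neural_enhancement(params, feats)

key1, key2 = jax.random.split(jax.random.PRNGKey(0))
params = {
    "w1": 0.1 * jax.random.normal(key1, (2, 16)), "b1": jnp.zeros(16),
    "w2": 0.1 * jax.random.normal(key2, (16, 1)), "b2": jnp.zeros(1),
}
```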

(3) Exporting models: Grad DFT is for training neural DFT functionals; it is not designed for large-scale simulations. As such, we should be able to export trained models for use in popular high-performance DFT codes. This may end up being a standalone package separate from Grad DFT itself, but it is a necessary step for neural functionals to be used in production science.
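
One plausible first step for export is dumping the trained parameter pytree to a framework-agnostic archive that a high-performance code (or a converter) could read. A minimal sketch, assuming a standard JAX pytree of weights like the `params` dict above; Grad DFT does not currently ship an exporter:

```python
import numpy as np
import jax

def export_params(params, path):
    # Flatten the trained pytree into named NumPy arrays and write an .npz
    # archive that a C++/Fortran DFT code (or a converter) could ingest.
    flat, _ = jax.tree_util.tree_flatten_with_path(params)
    arrays = {jax.tree_util.keystr(kpath): np.asarray(leaf) for kpath, leaf in flat}
    np.savez(path, **arrays)

# export_params(params, "neural_xc.npz")
```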
