
Refine HF solver to compute correct gradients #1780

Merged: 10 commits merged from qchem_hf_eigh into qchem_hf on Oct 28, 2021

Conversation


@soranjh soranjh commented Oct 20, 2021

Context:
The HF solver returns nan values when gradients are computed for any molecule larger than H2. The issue stems from Autograd failing to compute gradients of functions that diagonalize matrices with degenerate eigenvalues. See, for example:

import autograd
import autograd.numpy as anp

s = anp.array([[ 1.00000000, 0.38675779, 0.58564814, 0.00000000, 0.00000000, -0.47793911],
               [ 0.38675779, 1.00000000, 0.24113657, 0.00000000, 0.00000000,  0.00000000],
               [ 0.58564814, 0.24113657, 1.00000000, 0.00000000, 0.00000000,  0.00000000],
               [ 0.00000000, 0.00000000, 0.00000000, 1.00000000, 0.00000000,  0.00000000],
               [ 0.00000000, 0.00000000, 0.00000000, 0.00000000, 1.00000000,  0.00000000],
               [-0.47793911, 0.00000000, 0.00000000, 0.00000000, 0.00000000,  1.00000000]])

def grad_fn():
    return lambda s: anp.linalg.eigh(s)[0][0]

print(autograd.grad(grad_fn())(s))

which returns

[[nan nan nan nan nan nan]
 [nan nan nan nan nan nan]
 [nan nan nan nan nan nan]
 [nan nan nan nan nan nan]
 [nan nan nan nan nan nan]
 [nan nan nan nan nan nan]]

The issue can be fixed by lifting the degeneracy: a small random value (on the order of 1.0e-12) is added to each diagonal element of the matrix given above with

s = s + anp.diag(anp.random.rand(len(s)) * 1.0e-12)

With this perturbation, the gradient is computed correctly:

[[ 0.53101783 -0.15440008 -0.3480074   0.          0.          0.32262883]
 [-0.15440008  0.04489376  0.10118751  0.          0.         -0.09380837]
 [-0.3480074   0.10118751  0.22806983  0.          0.         -0.21143775]
 [ 0.          0.          0.          0.          0.          0.        ]
 [ 0.          0.          0.          0.          0.          0.        ]
 [ 0.32262883 -0.09380837 -0.21143775  0.          0.          0.19601858]]

Description of the Change:
In generate_scf, the overlap matrix is modified by adding a small random perturbation, on the order of 1.0e-12, to its diagonal elements.
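
A minimal sketch of the idea, assuming s is the overlap matrix built inside generate_scf (the helper name lift_degeneracy is illustrative; the PR applies the perturbation inline rather than through a helper):

import autograd.numpy as anp

def lift_degeneracy(s, scale=1.0e-12):
    # Add a tiny random value (~1e-12) to each diagonal element of the overlap
    # matrix. The diagonal entries are all 1 (normalised atomic orbitals), so
    # this only removes the exact degeneracies that break autograd's eigh
    # gradient, without meaningfully changing the HF results.
    return s + anp.diag(anp.random.rand(len(s)) * scale)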

Benefits:
This change allows gradients to be computed for any given molecule.

Possible Drawbacks:
The diagonal elements of the overlap matrix are all 1 since the atomic orbitals are normalised. We have assumed that adding a small random value to these elements does not affect the accuracy of the results returned by the HF solver. However, this is not a fundamental fix to the problem of autograd failing to compute the gradients when eigh is applied to a matrix with degenerate eigenvalues.

Related GitHub Issues:
No related issues.

@soranjh soranjh added WIP 🚧 Work-in-progress qchem ⚛️ Related to the QChem package labels Oct 20, 2021
@github-actions

Hello. You may have forgotten to update the changelog!
Please edit doc/releases/changelog-dev.md with:

  • A one-to-two sentence description of the change. You may include a small working example for new features.
  • A link back to this PR.
  • Your name (or GitHub username) in the contributors section.


codecov bot commented Oct 20, 2021

Codecov Report

Merging #1780 (afb5fbf) into qchem_hf (55a8d41) will increase coverage by 0.00%.
The diff coverage is 100.00%.


@@            Coverage Diff            @@
##           qchem_hf    #1780   +/-   ##
=========================================
  Coverage     98.95%   98.95%           
=========================================
  Files           215      215           
  Lines         16175    16181    +6     
=========================================
+ Hits          16006    16012    +6     
  Misses          169      169           
Impacted Files Coverage Δ
pennylane/hf/hamiltonian.py 100.00% <100.00%> (ø)
pennylane/hf/hartree_fock.py 100.00% <100.00%> (ø)

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data


josh146 commented Oct 21, 2021

@soranjh is there a deeper technical issue with computing gradients with degenerate eigenvalues? Or is it simply a matter of Autograd not supporting it (but other frameworks do?)


soranjh commented Oct 21, 2021

> @soranjh is there a deeper technical issue with computing gradients with degenerate eigenvalues? Or is it simply a matter of Autograd not supporting it (but other frameworks do?)

@josh146, I tried JAX for the same example given above and it worked. I believe it is just Autograd that has this issue.
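
A sketch of the equivalent JAX check, reusing the matrix s from the example in the PR description; the finite result is the behaviour reported here, not something this snippet guarantees:

import jax
import jax.numpy as jnp

jax.config.update("jax_enable_x64", True)  # keep the matrix in double precision

def lowest_eigval(s):
    # Same quantity as in the Autograd example: the smallest eigenvalue of s
    return jnp.linalg.eigh(s)[0][0]

# s is the (degenerate) overlap matrix defined in the PR description above
print(jax.grad(lowest_eigval)(jnp.asarray(s)))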


josh146 commented Oct 21, 2021

> @josh146, I tried JAX for the same example given above and it worked. I believe it is just Autograd that has this issue.

Oh, that's good to know! This means there is the option for us to simply patch/fix the autograd.numpy.linalg.eigh function to get the correct behaviour.
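
One lightweight way such a patch could be packaged (a sketch of the idea only, not what this PR implements; regularized_eigh is a hypothetical helper, and a true fix would change the gradient rule itself rather than perturb the input):

import autograd.numpy as anp

def regularized_eigh(a, scale=1.0e-12):
    # Hypothetical drop-in replacement for anp.linalg.eigh: lift exact
    # eigenvalue degeneracies with a tiny random diagonal shift so that the
    # gradient autograd derives for eigh stays finite.
    a = a + anp.diag(anp.random.rand(len(a)) * scale)
    return anp.linalg.eigh(a)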

@soranjh soranjh removed the WIP 🚧 Work-in-progress label Oct 26, 2021
@soranjh soranjh marked this pull request as ready for review October 26, 2021 13:40
@soranjh soranjh changed the title [WIP] refine HF solver to compute correct gradients Refine HF solver to compute correct gradients Oct 26, 2021
@soranjh soranjh requested a review from josh146 October 26, 2021 13:46
@rmoyard rmoyard (Contributor) left a comment
It looks like a good temporary fix for making it into the release, but, as @josh146 mentioned, we should probably rewrite the autograd function in qml.math afterwards :)

@josh146 josh146 (Member) left a comment
Thanks @soranjh, looks good from my end!

for n, t in enumerate(h_ferm[1]):

    if len(t) == 0:
        h = qml.Hamiltonian([h_ferm[0][n]], [qml.Identity(0)])
        h = h + qml.Hamiltonian([h_ferm[0][n]], [qml.Identity(0)])
Review comment (Member):
Minor question: does adding in a constant * I term cause any redundant quantum evaluations? 🤔 E.g., is PL evaluating <psi|I|psi> even though it doesn't need to?

Not an issue for this PR, but if this is the case, this is a bug we should fix in a separate PR!

Reply from @soranjh (Contributor Author):
I have modified the computation of the Hamiltonian so that all coefficients and ops are computed first, and the PL Hamiltonian is constructed once at the end using h = qml.Hamiltonian(coeffs, ops, simplify=True). This avoids multiple PL Hamiltonian additions, which makes the process much faster and also fixes the issue of redundant terms.
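
Schematically, and with illustrative placeholder terms rather than the actual qubit operators produced by the solver, the construction now looks like this:

import pennylane as qml

# Placeholder coefficient/operator pairs standing in for the terms computed
# by the fermionic-to-qubit mapping
computed_terms = [
    (0.5, qml.Identity(0)),
    (0.25, qml.PauliZ(0)),
    (0.25, qml.PauliZ(0) @ qml.PauliZ(1)),
]

coeffs = [c for c, _ in computed_terms]
ops = [op for _, op in computed_terms]

# Build the PennyLane Hamiltonian once at the end; simplify=True combines any
# duplicate terms, avoiding repeated `h = h + qml.Hamiltonian(...)` additions
# inside the loop that generates the terms.
h = qml.Hamiltonian(coeffs, ops, simplify=True)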

@ixfoduap ixfoduap (Contributor) left a comment
Looks great!
Indeed, let's revisit this in the future for a cleaner fix.

@agran2018 agran2018 (Contributor) left a comment
Looks good.

@soranjh soranjh merged commit abe6a03 into qchem_hf Oct 28, 2021
@soranjh soranjh deleted the qchem_hf_eigh branch October 28, 2021 13:39