Re-implement entropy_relative. #1553
Conversation
Thanks for both finding this and fixing it! I have a suspicion there was an old issue or mailing list message that was asking for some quantity derived from relative entropy as well, so this is definitely good to get working.
Oh, I just noticed the comment at the top about vectorisation of the calculation.
The intermittently failing test seems unrelated.
@jakelishman @Ericgig @Albantakis Ready for a second round of review, I think!
I didn't find an obvious notebook in qutip-notebooks to add an example to. Not sure how we usually proceed in such cases -- skip having an example in a notebook? Make a new notebook?
Looking good. For notebooks, we've historically not been good at updating them, I think. For a single, clearly defined function, just adding an Examples section in the docstring is probably more appropriate? I'm not entirely sure.
```python
# Avoid -inf from log(0) -- these terms will be multiplied by zero later
# anyway
svals[abs(svals) < tol] = 1
nzrvals = rvals[abs(rvals) >= tol]
```
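To see the guard in isolation, here is a standalone numpy sketch (my own illustration, not the PR's code; `tol`, `rvals`, and `svals` stand in for the eigenvalue arrays in the actual function): a zero eigenvalue only ever appears in terms that are multiplied by zero, so replacing it with 1 (whose log is 0) avoids `-inf` without changing the sum.

```python
import numpy as np

# Hypothetical standalone illustration of the log(0) guard discussed above.
tol = 1e-12
rvals = np.array([0.5, 0.5, 0.0])    # eigenvalues of rho, one exactly zero
svals = np.array([0.25, 0.75, 0.0])  # eigenvalues of sigma, one exactly zero

svals[abs(svals) < tol] = 1          # log(1) == 0, and the term vanishes anyway
nzrvals = rvals[abs(rvals) >= tol]   # drop zero eigenvalues of rho entirely

# tr(rho log rho) computed safely, with the 0 log 0 term treated as 0
trace_rho_log_rho = np.sum(nzrvals * np.log(nzrvals))
print(trace_rho_log_rho)  # -ln 2, no -inf or nan from the zero eigenvalue
```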
We can't use it without swapping how we handle the logarithms, but if it's useful in the future, scipy has `scipy.special.xlogy`, which specifically implements the convention 0 ln(0) := 0.
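For reference, a small demonstration of how `scipy.special.xlogy` behaves on a vanishing eigenvalue (illustration only, not code from this PR):

```python
import numpy as np
from scipy.special import xlogy

# xlogy(x, y) computes x * log(y) with the convention xlogy(0, 0) == 0,
# so zero eigenvalues would need no masking at all.
p = np.array([0.5, 0.5, 0.0])
print(xlogy(0.0, 0.0))    # 0.0, rather than nan or -inf
print(xlogy(p, p).sum())  # -ln 2; the 0 * log(0) term contributes exactly 0
```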
I'm okay with not involving `scipy.special.xlogy` for now -- it's another thing for readers of the code to think about, and I'm not sure where the bottlenecks in the current code are or how much anyone cares about performance (my guess is that finding the eigenvalues is now the slowest step).
Oh yeah, no need to put it in now. Now that everything's vectorised, I'd be surprised if the dominant time factor wasn't the eigenstate calculation.
@jakelishman I added an example to the docstring.
Nice, and it's all rendered correctly in the HTML documentation.
Description
Re-implement entropy_relative.
This function was originally implemented in March 2012, but removed a month later. After some digging, it appears the function was removed because it assumed the eigenvectors of the two density matrices were identical (and in the same order).
After some reading of Nielsen & Chuang, I have re-implemented it to address the issue.
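For readers following along, here is a minimal numpy sketch of the corrected calculation (my own illustration under that reading of Nielsen & Chuang, not the PR's actual code or the qutip API): expanding each state in its own eigenbasis gives S(ρ‖σ) = Σ_i r_i ln r_i − Σ_{i,j} r_i |⟨r_i|s_j⟩|² ln s_j, which reduces to the naive shared-eigenbasis formula only when the overlap matrix is the identity.

```python
import numpy as np

def relative_entropy(rho, sigma, tol=1e-12):
    """Sketch of S(rho||sigma) = tr(rho ln rho) - tr(rho ln sigma) that does
    NOT assume rho and sigma share an eigenbasis (the flaw in the 2012
    version). Plain numpy on Hermitian arrays; not the qutip function."""
    rvals, rvecs = np.linalg.eigh(rho)
    svals, svecs = np.linalg.eigh(sigma)
    # P[i, j] = |<r_i|s_j>|^2: overlaps between the two eigenbases.
    P = np.abs(rvecs.conj().T @ svecs) ** 2
    # If rho has weight where sigma vanishes, the relative entropy is +inf.
    if np.any((P @ (svals < tol)) * (rvals >= tol)):
        return np.inf
    nzrvals = rvals[rvals >= tol]
    term1 = np.sum(nzrvals * np.log(nzrvals))        # tr(rho ln rho)
    safe_svals = np.where(svals >= tol, svals, 1.0)  # ln(1) = 0 guard
    term2 = rvals @ (P @ np.log(safe_svals))         # tr(rho ln sigma)
    return term1 - term2

rho = np.diag([0.5, 0.5])
sigma = np.diag([0.25, 0.75])
print(relative_entropy(rho, rho))    # ~0: a state relative to itself
print(relative_entropy(rho, sigma))  # 0.5*ln(0.5/0.25) + 0.5*ln(0.5/0.75)
```

The overlap matrix `P` is what the removed 2012 implementation effectively assumed to be the identity.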
Still todo
Changelog
Re-implement `entropy_relative`, which returns the quantum relative entropy between two density matrices.