Added q_sum check #77
Conversation
The Phase Estimation and QFT benchmarks: the QPE distribution will never have … I think that always renormalizing by … Unfortunately, this was done to support both pre-normalized and "number of counts" distributions.
Josh ran into this specific case: in the maxcut benchmark, when num_shots is 100 and num_qubits is >= 10, the routine that computes the expected distribution produces an array of counts that are all zero. 2**10 is 1024 possible measurements, and with only 100 shots, each expected count is just 100/1024 ≈ 0.098, which is less than 0.5 and rounds down to 0. This results in the divide-by-zero error. This fix simply catches and reports that error, and permits the benchmark program to continue without crashing.
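The rounding-to-zero effect described above can be sketched in a few lines (variable names here are illustrative, not the benchmark's actual code):

```python
# With 100 shots spread uniformly over 2**10 = 1024 outcomes, each
# expected count is 100/1024 ≈ 0.098, which rounds down to 0, so the
# discretized expected distribution sums to zero.
num_shots = 100
num_qubits = 10
num_outcomes = 2 ** num_qubits

# Uniform expected probability per basis state.
prob_per_outcome = 1.0 / num_outcomes

# Discretize probabilities to "number of counts", as the benchmark does.
expected_counts = [round(num_shots * prob_per_outcome)] * num_outcomes

print(sum(expected_counts))  # 0 -> a later normalization divides by this sum
```

Any later step that normalizes by the sum of these counts then divides by zero, which is exactly the failure mode this PR guards against.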
Jason, your comments are relevant, but this is a different case from the one Josh ran into. What you describe should be discussed in a separate thread.
Thanks for the clarification, Tom! This does somewhat make me rethink the discretization step, i.e., rounding to the number of expected results at a given shot count instead of using exact probabilities. Basically, should we ever allow this error to occur at all? I agree this error message looks great.
When running some benchmarks, if the expected distribution is too small, q_sum will be zero and cause a ZeroDivisionError when calculating hellinger_fidelity_with_expected. I wasn't sure what the printed error message should say. Please let me know if you have any suggestions.
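A minimal sketch of the kind of guard this PR adds, assuming a Hellinger fidelity computed from two count dictionaries; the function name matches the one mentioned in the PR, but the body is an illustrative reimplementation, not the project's actual code:

```python
import math

def hellinger_fidelity_with_expected(counts, expected_counts):
    """Sketch: compute Hellinger fidelity, guarding against a zero q_sum.

    `counts` and `expected_counts` map bitstrings to (integer) counts.
    """
    q_sum = sum(expected_counts.values())
    if q_sum == 0:
        # The situation fixed by this PR: report the error and return a
        # sentinel fidelity instead of raising ZeroDivisionError.
        print("ERROR: expected distribution sums to zero; "
              "cannot compute Hellinger fidelity")
        return 0.0

    p_sum = sum(counts.values())
    # Bhattacharyya coefficient between the two normalized distributions.
    bc = sum(math.sqrt((counts.get(k, 0) / p_sum) * (v / q_sum))
             for k, v in expected_counts.items())
    # One common definition of Hellinger fidelity is the squared
    # Bhattacharyya coefficient (as in Qiskit's hellinger_fidelity).
    return bc ** 2
```

With identical distributions the fidelity is 1.0; with an all-zero expected distribution the guard fires and the benchmark can continue instead of crashing.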