
Model reparameterization changing outcome probabilities #425

Open
juangmendoza19 opened this issue Apr 23, 2024 · 1 comment
juangmendoza19 commented Apr 23, 2024

Calling `set_all_parameterizations` to convert a full-TP model to a GLND model changes some outcome probabilities non-trivially.

To reproduce:

```python
from pygsti.modelpacks import smq1Q_XY as std

datagen_model = std.target_model("GLND")

# Arbitrary error vector where I observed the problem
error_vec = [0] * 48
error_vec[0] = 0.01
datagen_model.from_vector(error_vec)

design = std.create_gst_experiment_design(16)

# Circuit with the maximum probability difference
bad_circuit = design.all_circuits_needing_data[394]

# Round-trip the parameterization: GLND -> full TP -> GLND
datagen_model_copy = datagen_model.copy()
datagen_model_copy.set_all_parameterizations("full TP")
datagen_model_copy.set_all_parameterizations("GLND", ideal_model=std.target_model("GLND"))

datagen_model.probabilities(bad_circuit)['0'] - datagen_model_copy.probabilities(bad_circuit)['0']
```
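(For reference, the index 394 above can be recovered by scanning the whole design for the largest discrepancy. A quick sketch, not part of the original report:)

```python
# Scan every circuit in the design and report the largest probability gap.
diffs = [abs(datagen_model.probabilities(c)['0']
             - datagen_model_copy.probabilities(c)['0'])
         for c in design.all_circuits_needing_data]
worst = max(range(len(diffs)), key=diffs.__getitem__)
print(worst, diffs[worst])  # expect worst == 394 for this error vector
```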

Expected behavior
The reparameterization should leave all outcome probabilities unchanged (up to machine precision), but the code above outputs a probability difference of -1.406968064276981e-08. This is a substantial deviation that is causing issues in my current project, which requires comparing gauge-equivalent models.

Environment:

  • pyGSTi version 0.9.12.1
  • Python version 3.10.14
  • macOS Sonoma 14.4.1

Additional context
After an email exchange with Riley and Corey, Riley identified the problem in the state preparation: one of the state-preparation vector entries deviates by 2.7057608985464707e-08 after conversion. This makes sense, since the model only has errors in the state preparation.
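One way to confirm the deviation directly (assuming the modelpack's standard 'rho0' prep label) is to compare the dense state-preparation vectors of the two models:

```python
import numpy as np

# Compare the dense state-prep vectors before and after the round-trip;
# the maximum entry-wise deviation should be ~2.7e-08 per the numbers above.
rho_orig = datagen_model.preps['rho0'].to_dense()
rho_conv = datagen_model_copy.preps['rho0'].to_dense()
print(np.max(np.abs(rho_orig - rho_conv)))
```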

I believe I have identified the issue as being in pygsti/modelmembers/states/__init__.py, line 269. The scipy optimization there returns exactly the number above, 2.7057608985464707e-08, as its residual error. I tried changing the tolerance of the optimization, but this did not seem to change its behavior.
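To illustrate why a loose convergence tolerance could leave a ~1e-8 residual, here is a minimal sketch of the kind of fit such a conversion performs. This is an illustration of the general technique, not pyGSTi's actual code; the function name `fit_errorgen` and its signature are mine:

```python
# Illustrative sketch only -- NOT pyGSTi's actual implementation.
# Find error-generator coefficients x so that expm(sum_i x_i * G_i) applied
# to the ideal state vector reproduces the target (dense) state vector.
import numpy as np
from scipy.linalg import expm
from scipy.optimize import minimize

def fit_errorgen(rho_target, rho_ideal, gens, tol=1e-10):
    """gens: list of error-generator matrices; returns (coefficients, residual)."""
    def objective(x):
        L = sum(xi * Gi for xi, Gi in zip(x, gens))
        return np.linalg.norm(expm(L) @ rho_ideal - rho_target)

    x0 = np.zeros(len(gens))
    # Local optimizers can stall at ~1e-8 residuals if their internal stopping
    # criteria trigger before the objective reaches machine precision.
    result = minimize(objective, x0, tol=tol)
    return result.x, result.fun  # result.fun is the leftover deviation
```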

@juangmendoza19 juangmendoza19 added the bug A bug or regression label Apr 23, 2024
@sserita sserita added this to the 0.9.13 milestone Apr 23, 2024
@coreyostrove coreyostrove modified the milestones: 0.9.13, 0.9.14 Jul 10, 2024
juangmendoza19 (Author) commented Jul 16, 2024

Update on this issue: I identified a second bug in this function. `set_all_parameterizations` also does not work properly when there are errors on measurements. The error channels and the provided ideal model are not used correctly: the returned model never has measurement error, and its ideal measurement is instead the noisy measurement from the original model. @coreyostrove and I are currently working on fixing these.
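To make the measurement-error bug concrete, here is a hypothetical round-trip check in the same style as the original reproduction. The choice of parameter index is illustrative only; I am assuming, for the sketch, that the last GLND parameter affects the POVM:

```python
# Hypothetical check for the measurement-error bug (not a verified reproduction).
meas_err_model = std.target_model("GLND")
v = [0.0] * meas_err_model.num_params
v[-1] = 0.01  # illustrative: assumes this parameter touches the POVM
meas_err_model.from_vector(v)

roundtrip = meas_err_model.copy()
roundtrip.set_all_parameterizations("full TP")
roundtrip.set_all_parameterizations("GLND", ideal_model=std.target_model("GLND"))

# Per the description above, the round-tripped model loses the measurement
# error, so these probabilities differ:
print(meas_err_model.probabilities(bad_circuit)['0']
      - roundtrip.probabilities(bad_circuit)['0'])
```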

Going back to the original state-preparation problem: changing the optimization's "method" parameter to "Nelder-Mead" appears to solve it, although I don't know whether using a different optimization algorithm will cause other problems.
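For concreteness, in the illustrative sketch above the swap would be a one-line change (again, this mirrors but is not pyGSTi's actual code; the tolerance values are illustrative):

```python
# Nelder-Mead with tight simplex/function tolerances.
result = minimize(objective, x0, method="Nelder-Mead",
                  options={"xatol": 1e-12, "fatol": 1e-12})
```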
