
Get Relative Tolerances for Pixel Datapoints from Production #60

Closed · ric-evans opened this issue Jul 21, 2022 · 2 comments
Labels: bug (Something isn't working), production test run (A realistic production-like test run)

Comments

ric-evans (Member) commented Jul 21, 2022
Let's run the production scanner at least twice and see how the npz results differ pixel-to-pixel.

Then, for ScanResult.is_close(), we can decide whether to disqualify pixels with zero energies (as we currently do) or to increase the rtol.
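For reference, a minimal sketch of the two options, assuming the npz outputs hold per-pixel arrays; the file names are placeholders and this is not the actual ScanResult.is_close() implementation:

```python
import numpy as np

# Hypothetical sketch (not the real ScanResult.is_close()):
# compare two production runs pixel-to-pixel, either disqualifying
# zero-energy pixels or widening rtol. File names are placeholders.
ref = np.load("run_ref.npz")
alt = np.load("run_alt.npz")

for key in ref.files:
    a, b = ref[key], alt[key]
    # Option 1: disqualify pixels with zero energies before comparing.
    mask = (a != 0) & (b != 0)
    ok_masked = np.allclose(a[mask], b[mask], rtol=1e-5)
    # Option 2: compare every pixel but with a larger relative tolerance.
    ok_loose = np.allclose(a, b, rtol=0.05)
    print(f"{key}: masked={ok_masked}, loose-rtol={ok_loose}")
```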

ric-evans added the bug and production test run labels on Jul 21, 2022
ric-evans changed the title from "Get Relative Tolerances for Pixel Datapoints" to "Get Relative Tolerances for Pixel Datapoints from Production" on Jul 21, 2022
mlincett (Collaborator) commented Nov 7, 2022

PR #62 was meant to add alternative test data, namely the result of a second scanner iteration using the version of the scanner currently deployed in realtime.

Checking the differences between the two iterations (reference and alternative), the llh values always match, while the reconstructed losses differ across the scanned pixels.

The RMS of the relative difference, defined as (ref - alt) / ref, can be as large as 0.05, though I'm not sure this is the best metric.
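For concreteness, a numpy sketch of that metric (the function name is illustrative, not from the scanner codebase):

```python
import numpy as np

def rms_relative_difference(ref: np.ndarray, alt: np.ndarray) -> float:
    """RMS of (ref - alt) / ref over pixels with a nonzero reference."""
    nonzero = ref != 0  # skip zero-energy pixels to avoid division by zero
    rel = (ref[nonzero] - alt[nonzero]) / ref[nonzero]
    return float(np.sqrt(np.mean(rel**2)))
```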

Note that we do not use the reconstructed losses for anything right now, but we should try to pinpoint the origin of any non-deterministic behaviour.

ric-evans (Member, Author)

There are other issues (#200 and #242) that are more relevant. Closing this issue.
