
Test coverage for calculate_upper_limits() in core/unblinding.py #243

Open
mlincett opened this issue Jan 10, 2023 · 1 comment

Comments

@mlincett (Collaborator)

It seems core.unblinding is not properly covered by tests.

In fact, PR #242 should, at the time of writing, break unblinding, yet automated testing passes.

It would be unfortunate to have an analysis approved for unblinding, only to discover at the very end that the tagged flarestack version is broken.

@mlincett (Collaborator, Author)

I may be partially wrong: unblinding itself should be exercised by test_full_analysis_chain.py. It is only calculate_upper_limits() that lacks test coverage.
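A smoke test along the following lines would have caught the breakage described above. This is only an illustrative sketch: the stub `calculate_upper_limits()` below, its signature, and its return value are hypothetical stand-ins, not flarestack's actual API in core/unblinding.py, which would need to be consulted for a real test.

```python
# Illustrative sketch only: the function below is a hypothetical stand-in
# for core/unblinding.py's calculate_upper_limits(); flarestack's real
# signature and semantics may differ.
import random


def calculate_upper_limits(ts_value, background_ts):
    """Stand-in: fraction of background trials whose test statistic
    exceeds the observed value (a p-value-like quantity in [0, 1])."""
    above = sum(1 for ts in background_ts if ts > ts_value)
    return above / len(background_ts)


def test_calculate_upper_limits_smoke():
    # Deterministic pseudo-background distribution for reproducibility.
    rng = random.Random(0)
    background = [rng.expovariate(1.0) for _ in range(1000)]

    # Basic sanity checks that a breaking change would likely violate:
    result = calculate_upper_limits(2.0, background)
    assert 0.0 <= result <= 1.0
    # An observed TS above every background trial must give zero.
    assert calculate_upper_limits(max(background) + 1.0, background) == 0.0


test_calculate_upper_limits_smoke()
print("smoke test passed")
```

Even a coarse check like this, run as part of the existing automated test suite, would flag a release in which the upper-limit calculation stops working.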

@mlincett mlincett changed the title Test coverage for core/unblinding.py Test coverage for calculate_upper_limits() in core/unblinding.py Jan 11, 2023