
TST: 5 test failures in test_wfc_destripe.py in devdeps #184

Closed
pllim opened this issue Oct 31, 2023 · 6 comments · Fixed by #192

Comments

@pllim
Collaborator

pllim commented Oct 31, 2023

Example log: https://github.com/spacetelescope/acstools/actions/runs/6627739837/job/18206557160

Package versions: 
Numpy: 2.0.0.dev0+git20231027.cdfbdf4
Scipy: 1.12.0.dev0+1899.e35850d
Matplotlib: 3.9.0.dev355+g21ce59212d
Astropy: 6.1.dev0
beautifulsoup4: not available
requests: 2.31.0
stsci.tools: 4.1.0
...
../../acstools/tests/test_wfc_destripe.py::TestDestripe::test_generic[jb5g05ubq] FAILED [ 50%]
../../acstools/tests/test_wfc_destripe.py::TestDestripe::test_generic[ja0x03ojq] FAILED [ 58%]
../../acstools/tests/test_wfc_destripe.py::TestDestripe::test_generic[jc5001soq] FAILED [ 66%]
../../acstools/tests/test_wfc_destripe.py::TestDestripe::test_fullframe_flashed FAILED [ 75%]
../../acstools/tests/test_wfc_destripe.py::TestDestripe::test_fullframe_masked FAILED [ 83%]

These appear to be floating-point differences, so someone who is familiar with the destriping algorithm should double-check that the differences are acceptable and, if not, determine what fixes are needed to make destriping compatible with unreleased versions of numpy/scipy/astropy/etc.

E           Extension HDU 1 (SCI, 1):
E           
E              Data contains differences:
E                Data differs at [np.int64(727), np.int64(6)]:
E                   a> np.float32(0.9507208)
E                    ?                   ^^
E                   b> np.float32(0.9507245)
E                    ?    
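For scale, the single pixel difference reported above works out to a few parts per million. A quick check, plugging in the two values from the diff:

```python
import numpy as np

a = np.float32(0.9507208)  # value from the reference (NumPy 1.x) run
b = np.float32(0.9507245)  # value from the NumPy 2.0.dev run

diff = abs(float(a) - float(b))
print(diff)             # absolute difference, on the order of 4e-6
print(diff / float(a))  # relative difference, also on the order of 4e-6
```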
@jryon
Contributor

jryon commented Nov 2, 2023

I took a look at the discrepancies and consulted with Norman. Given that the differences are all in the 5th or 6th significant figure and affect <2.5% of pixels in the detector, they are minor enough to ignore. Let me know if there's anything else I can help with.


@pllim
Collaborator Author

pllim commented Nov 7, 2023

I have #185, which would make the CI green, but I did find some differences that required me to set the tolerance very high (atol=0.1). Is this something you would accept as a solution? If not, what new tolerance did you have in mind (how relaxed are you willing to go)?

@pllim
Collaborator Author

pllim commented Nov 13, 2023

I confirmed that the failures are caused by the numpy 2.0.dev version, not by the scipy or astropy dev versions. See #187 (comment)

@jryon
Contributor

jryon commented Dec 7, 2023

I ran some tests in a numpy 2.0 dev environment and in an environment with the current numpy (1.26.2) to track down the cause of the failures. It looks like numpy 2.0 keeps the results of some arithmetic in the original dtypes, whereas the current numpy promotes them to np.float64. These dtype differences compound floating-point errors during subsequent calculations, producing the discrepancies we see as test failures. This may explain some of the behavior we're seeing: https://numpy.org/neps/nep-0050-scalar-promotion.html#table-comparing-new-and-old-behaviour
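The promotion change can be seen directly. A minimal sketch (the values here are illustrative, not taken from the destripe code):

```python
import numpy as np

x = np.float32(0.9507208)

# Mixing a float32 value with a Python float scalar:
#   NumPy 1.x: the result is promoted to float64
#   NumPy 2.0 (NEP 50): the result stays float32
y = x + 0.1
print(y.dtype)

# Repeated arithmetic at float32 precision accumulates more rounding
# error than the same arithmetic carried out in float64, which is how
# tiny per-operation differences can compound into the ~1e-6
# discrepancies reported in the SCI extension.
acc32 = np.float32(0.0)
acc64 = np.float64(0.0)
step = np.float32(0.1)
for _ in range(1000):
    acc32 = np.float32(acc32 + step)  # force float32 accumulation
    acc64 = acc64 + np.float64(step)  # float64 accumulation
print(abs(float(acc32) - float(acc64)))  # small but nonzero difference
```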

For the failures, I think setting atol = 0.01 should allow the tests to pass. That level of error is acceptable to us given the stripe uncertainties (0.9 e- RMS). If that level of absolute tolerance doesn't work, I'll have to revisit this.
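A minimal sketch of what that tolerance check looks like with `np.testing.assert_allclose`, using made-up stand-in arrays rather than the real FITS data:

```python
import numpy as np

# Stand-in arrays: "expected" plays the role of the reference files,
# "actual" a NumPy 2.0.dev run (values here are illustrative).
expected = np.float32([0.9507208, 1.25, 42.0])
actual = np.float32([0.9507245, 1.25, 42.0])

# atol=0.01 is well above the observed ~4e-6 discrepancies but still
# below the stated stripe uncertainty (0.9 e- RMS), so genuine
# regressions would still trip the comparison.
np.testing.assert_allclose(actual, expected, rtol=0, atol=0.01)
```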

pllim added a commit to pllim/acstools that referenced this issue Dec 8, 2023
@pllim
Collaborator Author

pllim commented Dec 8, 2023

Thanks for investigating, @jryon ! I think atol=0.01 worked. See #192

@pllim pllim closed this as completed in #192 Dec 8, 2023