
updates and fixes to template-generating code and failing unit tests and errors #559

Merged
merged 41 commits into master from fix-failing-tests on Jan 12, 2022

Conversation

moustakas
Member

[WIP]

In this mishmash PR I'm planning to address #507, #537, #549, and #558. I'll provide details of the changes when I'm done.

@moustakas
Member Author

Updates in this PR (still WIP, unit tests still not totally fixed):

  • A long-standing issue with the current template-generating code in desisim.templates is that if a simulated template did not satisfy the target-specific color cuts, the code would issue a non-catastrophic warning and return a spectrum of zeros. I think that's a defensible choice, because it would be misleading to return a mix of spectra that do and do not pass the color cuts; however, it has also caused a myriad of downstream problems (including Suspiciously low QSO flux #507 and suspiciously low ELG/LRG/QSO flux #549). So, to address this issue, I've done two things here:

  • We can revisit this if the changes above do not resolve the underlying problem, but I've generated hundreds of templates of each flavor and haven't seen a single failure yet. Note that the iterative piece (sketched after this list) does add a bit of extra computing time, but I think it's worth it.

  • The second long-standing issue I decided to fix is how we handle velocity-dispersion broadening, which made bitwise reproducibility hard to ensure. In this branch I've switched from desisim.pixelsplines to scipy.ndimage.gaussian_filter1d. The upside is that the code is cleaner, faster, and bitwise reproducible (to be demonstrated with some forthcoming unit tests); the downside is that, since I don't do any constant-velocity resampling of the basis templates, the velocity broadening is only approximately correct in the optical range (a second sketch follows this list). I don't think this shortcoming is a blocker, however.

  • I'll document some additional fixes that are coming along in this PR next.
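For reference, here is a minimal sketch of the iterative idea in the first bullet: keep redrawing a template until it passes the color cuts instead of returning zeros. It is purely illustrative, not the actual desisim.templates implementation; make_candidate_template and passes_color_cuts are hypothetical placeholders.

```python
import numpy as np

def make_candidate_template(rng):
    """Hypothetical stand-in for drawing one simulated spectrum."""
    return rng.normal(loc=1.0, scale=0.1, size=1000)

def passes_color_cuts(flux):
    """Hypothetical stand-in for the target-specific color cuts."""
    return np.mean(flux) > 0.95

def make_template(rng, maxiter=10):
    """Redraw until the color cuts pass, rather than returning zeros."""
    for _ in range(maxiter):
        flux = make_candidate_template(rng)
        if passes_color_cuts(flux):
            return flux
    # Only warn / give up after the retry budget is exhausted.
    raise RuntimeError(f'no template passed the color cuts after {maxiter} draws')

rng = np.random.default_rng(seed=1)
flux = make_template(rng)
```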
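Similarly, a minimal sketch of velocity-dispersion broadening via scipy.ndimage.gaussian_filter1d, as mentioned in the third bullet. It assumes, purely for illustration, that the template is tabulated on a uniform log10-wavelength grid so that one pixel corresponds to a fixed velocity step; the branch skips the constant-velocity resampling step, which is why the broadening there is only approximate.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

C_KMS = 2.99792458e5  # speed of light [km/s]

def vdisp_blur(wave, flux, vdisp):
    """Broaden flux by a Gaussian velocity dispersion vdisp [km/s],
    assuming wave is uniformly spaced in log10(wavelength)."""
    dlog10 = np.diff(np.log10(wave)).mean()      # pixel size in log10(lambda)
    pixsize_kms = dlog10 * np.log(10.0) * C_KMS  # pixel size in km/s
    sigma_pix = vdisp / pixsize_kms              # dispersion in pixel units
    return gaussian_filter1d(flux, sigma=sigma_pix)

# toy example: broaden a narrow emission "line" by 100 km/s
wave = np.logspace(np.log10(3600.0), np.log10(9800.0), 5000)
flux = np.ones_like(wave)
flux[2500] = 10.0
blurred = vdisp_blur(wave, flux, vdisp=100.0)
```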

@moustakas
Member Author

@sbailey can I bring the trapezoidal rebinning code from redrock into desisim as a dependency (faster), or should I stick with desispec.interpolation.resample_flux (slower)?

@sbailey
Contributor

sbailey commented Jan 11, 2022

I'd prefer not to bring in a new desisim dependency on redrock, but if it really makes a substantial difference in practical runtime, OK. If that dependency becomes problematic we can copy the code into desisim (or otherwise augment desispec.interpolation with the faster, simpler version from redrock; desispec.interpolation.resample_flux is slower because it does more, with ivar weighting, masking, etc.).
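For readers following along, here is a self-contained numpy sketch of what flux-conserving trapezoidal rebinning does, i.e., averaging the piecewise-linear input spectrum over each output bin. It is illustrative only, not the redrock or desispec implementation; trapz_rebin_sketch is a made-up name.

```python
import numpy as np

def trapz_rebin_sketch(x, y, edges):
    """Average the piecewise-linear y(x) over each output bin defined by
    consecutive entries of edges. Assumes x is increasing and fully
    covers [edges[0], edges[-1]]."""
    out = np.empty(len(edges) - 1)
    for i, (lo, hi) in enumerate(zip(edges[:-1], edges[1:])):
        # integration nodes: the bin edges plus any input points inside the bin
        inside = x[(x > lo) & (x < hi)]
        xs = np.concatenate(([lo], inside, [hi]))
        ys = np.interp(xs, x, y)
        out[i] = np.trapz(ys, xs) / (hi - lo)
    return out

# toy usage: rebin a fine spectrum onto a coarser grid
x = np.linspace(3600.0, 9800.0, 10000)
y = 1.0 + 0.1 * np.sin(x / 100.0)
edges = np.linspace(3600.0, 9800.0, 501)
coarse = trapz_rebin_sketch(x, y, edges)
```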

@coveralls

coveralls commented Jan 11, 2022

Coverage Status

Coverage increased (+0.1%) to 45.64% when pulling a527625 on fix-failing-tests into 296bf7a on master.

@moustakas
Member Author

@weaverba137 would you be able to help me debug these failing tests? Everything passes on my laptop but I'm worried that I've messed up the test configuration somehow, e.g., import desimodel.fastfiberacceptance doesn't even work.

@moustakas
Member Author

Any thoughts on why the coverage tests fail and how I can fix them? Otherwise, this PR is ready for review (although I may have some additional small updates and bug fixes, since I'm now also looking at the mock simulations in desitarget with this branch).

@weaverba137
Member

The test coverage has decreased dramatically! It's rare to see a PR that changes the coverage by that much. Did you add a lot of code that is not tested or did you disable tests of existing code?

@weaverba137
Member

Another discrepancy: You're using desimodel/0.17.0 code but desimodel/0.12.0 data. I've gone ahead and updated the CI test so that desimodel data is used consistently.

@moustakas
Member Author

Thanks. I added many more unit tests and not a lot of new code, so I would have expected the coverage to increase, not decrease.

@weaverba137
Member

OK, so we'll have to compare this branch to master.

@weaverba137
Member

If you look at https://coveralls.io/builds/45574713, coverage of desisim/templates.py has decreased by ~11%. That's a lot, and it accounts for the bulk of the overall decrease.

@weaverba137
Member

And test_templates.py is mostly commented out. So there isn't much of a mystery here. You must have thought you uncommented the tests, but didn't actually check in the change.

@weaverba137
Member

weaverba137 left a comment

Still need to actually run the tests on desisim/templates.py. Other comments are relatively minor.

Review comments (resolved):

py/desisim/io.py
py/desisim/test/test_quickgen.py (outdated)
py/desisim/test/test_templates.py (outdated)
py/desisim/test/test_top_level.py (outdated)
@weaverba137
Member

Coverage is back to pre-PR levels, even with a slight increase, and all tests are passing, so merge when ready.
