
Add zero clipping in `MapEvaluator.apply_psf` #2342

Merged: 2 commits merged into gammapy:master from luca-giunti:Fix-PSF-convolution, Sep 10, 2019

Conversation

@luca-giunti (Contributor) commented Sep 5, 2019

This minimal PR fixes a bug in the PSF convolution.

Without this fix, calling `dataset.npred()` could predict negative counts in some bins, as shown in the attached figure (which was produced by calling `dataset.npred().slice_by_idx({'energy': 7}).plot(vmax=0, add_cbar=True)`).
[Attached figure psf_wrong_map: energy slice of the predicted counts map showing negative bins]

@luca-giunti luca-giunti changed the title Forced negative values to be zero Fix PSF convolution Sep 5, 2019
@luca-giunti luca-giunti requested a review from adonath Sep 5, 2019
@cdeil (Member) commented Sep 5, 2019

Presumably this is FFT noise and wouldn't be present if we used normal convolve?

Is it a problem in practice? No tests are adapted here, so does it affect any results?

My concern is performance: `tmp.data[tmp.data < 0.0] = 0` processes the array twice, so FFT convolve becomes slower (see #1695).

If you merge this, you could check whether there's a faster way to clip to >= 0 with some NumPy function or expression that loops over the array values only once, not twice. I think we have similar clipping in some IRF `evaluate` methods; we should check and use the same coding pattern.
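[Editor's note] A minimal NumPy sketch of the alternatives discussed above; the array values are invented for illustration. The boolean-mask assignment builds a mask in one pass and assigns in a second, while `np.maximum` with `out=` clips in place in a single pass:

```python
import numpy as np

# Hypothetical stand-in for the convolved npred map data.
data = np.array([1.2, -1e-12, 0.0, 3.4, -2e-13])

# Boolean-mask assignment: one pass to build the mask, one to assign.
masked = data.copy()
masked[masked < 0.0] = 0.0

# np.maximum with out= clips in place in a single pass over the array.
clipped = data.copy()
np.maximum(clipped, 0.0, out=clipped)
```

Both produce the same result; whether the single-pass form is measurably faster depends on array size and memory bandwidth.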

@cdeil cdeil added this to the 0.14 milestone Sep 5, 2019
@luca-giunti (Contributor, Author) commented Sep 5, 2019

@cdeil The problem is that predicted counts should never go below zero; otherwise e.g. the `dataset.fake()` method breaks, since Poisson sampling is undefined for negative expected counts. Therefore, some fix is needed.

See my notebook:
https://github.com/luca-giunti/share/blob/master/Test_notebook.ipynb

@adonath (Member) commented Sep 5, 2019

@cdeil Yes, I'm sure it is FFT noise, but as long as the background is > 0 this shouldn't be a problem. So in this case I would rather argue that the actual underlying problem is the background model, not the PSF convolution. For some reason the background drops to exactly zero (probably at high energies), which does not really make sense. The predicted background rate should always be > 0 and is in reality certainly above the FFT noise level.

In the likelihood evaluation the case of negative npred is already handled correctly:
https://github.com/gammapy/gammapy/blob/master/gammapy/stats/fit_statistics_cython.pyx#L27
https://github.com/gammapy/gammapy/blob/master/gammapy/stats/fit_statistics.py#L51

If there is the use-case of background-free simulations, I think we should include the fix. It's probably needed for event-sampling anyway. But in this case the npred could also be clipped before sampling from it...
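[Editor's note] A self-contained NumPy sketch of the effect described above, with invented shapes and a toy flat kernel standing in for the PSF: FFT-based convolution of a non-negative image can leave tiny negative values where the true result is zero, and clipping before Poisson sampling restores validity:

```python
import numpy as np

rng = np.random.default_rng(42)

# Non-negative toy "npred" image; one half is exactly zero, mimicking
# a background model that drops to zero at high energies.
npred = rng.exponential(size=(64, 64))
npred[32:, :] = 0.0

# FFT-based (circular) convolution with a flat 5x5 kernel as a PSF stand-in.
kernel = np.zeros((64, 64))
kernel[:5, :5] = 1.0 / 25.0
convolved = np.fft.irfft2(
    np.fft.rfft2(npred) * np.fft.rfft2(kernel), s=npred.shape
)

# FFT round-off typically leaves tiny negative values where the true
# result is zero; clip before sampling, since Poisson needs mu >= 0.
mu = np.clip(convolved, 0, None)
counts = rng.poisson(mu)
```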

@luca-giunti Could you make a mini-benchmark and time `dataset.npred()` in your notebook using `%timeit`, once with and once without the fix? Just so we get a rough number...

@luca-giunti (Contributor, Author) commented Sep 5, 2019

@adonath sure! Indeed, I am using a Crab run from the HESS DR1, and I was surprised to find out that the background is zero above ~20 TeV.

Anyway, without this fix, `%timeit dataset.npred()` yields:
12.3 ms ± 492 µs per loop (mean ± std. dev. of 7 runs, 100 loops each)

With this fix, instead:
13.7 ms ± 1.22 ms per loop (mean ± std. dev. of 7 runs, 100 loops each)
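[Editor's note] Outside a notebook, the same kind of mini-benchmark can be sketched with the standard-library `timeit` module; the array shape below is an invented stand-in for an npred cube, not the actual dataset:

```python
import numpy as np
from timeit import timeit

rng = np.random.default_rng(0)
# Invented cube shape, standing in for a typical npred map.
data = rng.normal(size=(10, 200, 200))

def mask_clip():
    # Boolean-mask assignment, as in the PR.
    out = data.copy()
    out[out < 0.0] = 0.0
    return out

def maximum_clip():
    # Single-expression alternative.
    return np.maximum(data, 0.0)

t_mask = timeit(mask_clip, number=100)
t_max = timeit(maximum_clip, number=100)
print(f"mask: {t_mask:.3f}s  maximum: {t_max:.3f}s for 100 calls")
```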

@adonath (Member) left a comment

Thanks @luca-giunti! I guess on Friday we agreed to include this fix. Can you please add one regression test or at least an additional assert and comment to https://github.com/gammapy/gammapy/blob/master/gammapy/modeling/models/cube/tests/test_core.py#L410?

@luca-giunti (Contributor, Author) commented Sep 9, 2019

@adonath Sure. The link is broken, I guess because things are changing fast. You meant here, right?

@luca-giunti luca-giunti force-pushed the luca-giunti:Fix-PSF-convolution branch from 7e50969 to 98d5487 Sep 9, 2019
@luca-giunti luca-giunti force-pushed the luca-giunti:Fix-PSF-convolution branch from 98d5487 to 8a90a4e Sep 9, 2019
@cdeil (Member) commented on gammapy/cube/tests/test_fit.py in 8a90a4e, Sep 9, 2019

Simpler:

assert np.all(npred.data >= 0)
@adonath adonath merged commit 0918d20 into gammapy:master Sep 10, 2019
8 of 9 checks passed:
- Scrutinizer: Errored
- Codacy/PR Quality Review: Up to standards. A positive pull request.
- continuous-integration/travis-ci/pr: The Travis CI build passed
- gammapy.gammapy: Build #20190909.31 succeeded
- gammapy.gammapy (DevDocs): DevDocs succeeded
- gammapy.gammapy (Lint): Lint succeeded
- gammapy.gammapy (Test Python36): Test Python36 succeeded
- gammapy.gammapy (Test Windows36): Test Windows36 succeeded
- gammapy.gammapy (Test Windows37): Test Windows37 succeeded
@adonath (Member) commented Sep 10, 2019

Thanks @luca-giunti!

@adonath adonath changed the title Fix PSF convolution Add zero clipping in `MapEvaluator.apply_psf` Sep 27, 2019