Add zero clipping in `MapEvaluator.apply_psf` #2342
Presumably this is FFT noise and wouldn't be present if we used normal convolve?
Is it a problem? There's zero tests adapted here, so it doesn't affect any results?
My concern is performance.
If you merge this, you could check whether there's a faster way to clip to >= 0 with some NumPy function or expression that loops over the array values only once, not twice. I think we have similar clipping in some IRF evaluate methods; we should check and use the same coding pattern.
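For reference, a single-pass, in-place clipping pattern could look like this (the array name is hypothetical; plain NumPy, no extra temporary copy):

```python
import numpy as np

# Hypothetical predicted-counts array with tiny negative FFT artifacts
npred = np.array([1.2, -1e-9, 3.4, -2e-12, 0.5])

# np.clip with out= writes the result in place, so the data is
# traversed once and no temporary array is allocated
np.clip(npred, 0, None, out=npred)

# An equivalent single-pass alternative:
# np.maximum(npred, 0, out=npred)
```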
@cdeil The problem is that predicted counts should never go below zero, otherwise e.g. the dataset.fake() method fails when sampling counts. Therefore, some fix is needed.
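To illustrate the failure mode (toy values, and the Poisson sampling step stands in for what dataset.fake() does internally):

```python
import numpy as np

npred = np.array([2.0, -1e-9, 5.0])  # one tiny negative value from FFT noise

try:
    np.random.poisson(npred)  # sampling counts, as dataset.fake() does
    raised = False
except ValueError:
    raised = True  # NumPy rejects negative expectation values ("lam < 0")

# After clipping to >= 0, the sampling works
counts = np.random.poisson(np.clip(npred, 0, None))
```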
See my notebook:
@cdeil Yes, I'm sure it is FFT noise, but as long as the background is > 0 this shouldn't be a problem. So in this case I would rather argue that the actual underlying problem is the background model and not the PSF convolution. For some reason the background drops to exactly zero (probably at high energies), which does not really make sense. The predicted background rate should always be > 0 and in reality is above the FFT noise for sure.
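A toy demonstration of the FFT noise (made-up image and kernel): FFT-based convolution of strictly non-negative data can dip slightly below zero from floating-point round-off, while direct convolution stays exactly non-negative:

```python
import numpy as np
from scipy.signal import convolve, fftconvolve

# Toy image and kernel, both strictly non-negative
image = np.zeros((32, 32))
image[16, 16] = 1.0
kernel = np.ones((5, 5)) / 25.0

fft_result = fftconvolve(image, kernel, mode="same")
direct_result = convolve(image, kernel, mode="same", method="direct")

# fft_result may contain tiny negative values (order ~1e-17) from
# round-off; direct_result is exactly zero where the input is zero
```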
In the likelihood evaluation the case of negative predicted counts leads to NaN values, because the Cash statistic takes the logarithm of the predicted counts.
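A toy illustration with made-up values (this sketches the standard unweighted Cash statistic, not necessarily gammapy's exact implementation):

```python
import numpy as np

counts = np.array([3.0, 0.0, 7.0])
npred = np.array([2.5, -1e-9, 6.8])  # one slightly negative predicted value

# Cash fit statistic per bin: 2 * (npred - counts * ln(npred));
# ln of a negative number is NaN, which propagates into the total
with np.errstate(invalid="ignore"):
    stat = 2 * (npred - counts * np.log(npred))
```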
If there is the use-case of background-free simulations, I think we should include the fix. It's probably needed for event-sampling anyway. But in this case the
@luca-giunti Could you make a mini-benchmark and time
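A mini-benchmark along these lines could be done with `timeit` (the array shape is made up, standing in for a typical npred cube):

```python
import timeit

import numpy as np

arr = np.random.normal(1.0, 0.1, size=(50, 200, 200))  # hypothetical npred cube

# Clip returning a new array vs. in-place clip vs. in-place maximum.
# Note the in-place variants mutate arr, which is fine for timing here.
t_clip = timeit.timeit(lambda: np.clip(arr, 0, None), number=20)
t_clip_inplace = timeit.timeit(lambda: np.clip(arr, 0, None, out=arr), number=20)
t_maximum = timeit.timeit(lambda: np.maximum(arr, 0, out=arr), number=20)

print(t_clip, t_clip_inplace, t_maximum)
```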
@adonath sure! Indeed, I am using a Crab run from the HESS DR1, and I was surprised to find out that the background is zero above ~20 TeV.
Anyway, without this fix
With this fix, instead:
adonath left a comment
Thanks @luca-giunti! I guess on Friday we agreed to include this fix. Can you please add one regression test or at least an additional assert and comment to https://github.com/gammapy/gammapy/blob/master/gammapy/modeling/models/cube/tests/test_core.py#L410?
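The requested regression check could look roughly like this (the `apply_psf` helper below is a toy stand-in for `MapEvaluator.apply_psf`, not the real method; the assertion is the part that matters):

```python
import numpy as np
from scipy.signal import fftconvolve


def apply_psf(npred, psf_kernel):
    """Toy stand-in: FFT-convolve, then clip to >= 0 as in the fix."""
    result = fftconvolve(npred, psf_kernel, mode="same")
    return np.clip(result, 0, None)


def test_apply_psf_non_negative():
    # Delta-function image, so FFT round-off noise is maximally visible
    npred = np.zeros((16, 16))
    npred[8, 8] = 1.0
    psf_kernel = np.ones((3, 3)) / 9.0

    convolved = apply_psf(npred, psf_kernel)
    assert (convolved >= 0).all()
```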