Avoid copy in background model evaluation #3592
Conversation
Codecov Report
```
@@            Coverage Diff             @@
##           master    #3592      +/-   ##
==========================================
- Coverage   93.85%   93.69%   -0.16%
==========================================
  Files         162      162
  Lines       19345    19406     +61
==========================================
+ Hits        18156    18183     +27
- Misses       1189     1223     +34
```
Continue to review full report at Codecov.
gammapy/datasets/map.py
Outdated
```diff
 if self.background_model and background:
     values = self.background_model.evaluate_geom(geom=self.background.geom)
-    background = background * values
+    if self._background_cached is None:
+        self._background_cached = background * values
```
This caching does not make sense to me. Don't you want to check on the model parameters and avoid the recomputation if not needed?
Yes, that could be added. For now I just did this because most of the time is spent in the copy made by `background = background * values`; avoiding it alone reduces the `npred_background` computation time by about 40%.
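For context, `background * values` allocates a fresh array on every evaluation. A minimal NumPy sketch of the difference (array shapes here are made up, not the actual dataset geometry):

```python
import numpy as np

# hypothetical cube-shaped background and model values
background = np.ones((10, 100, 100))
values = np.full_like(background, 1.1)

out = background * values  # allocates a new array on every call

buf = np.empty_like(background)           # allocated once
np.multiply(background, values, out=buf)  # reuses the buffer, no copy
assert np.allclose(out, buf)
```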
added
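The parameter check suggested above can be sketched as follows (hypothetical class and names, not the actual Gammapy implementation):

```python
import numpy as np

class CachedBackground:
    """Hypothetical sketch: recompute only when model parameters change."""

    def __init__(self, norm_model):
        self.norm_model = norm_model  # stand-in for evaluate_geom
        self._cached = None
        self._cached_pars = None

    def npred_background(self, background, pars):
        key = tuple(pars)
        if self._cached is None or key != self._cached_pars:
            # parameters changed (or first call): recompute and cache
            self._cached = background * self.norm_model(*pars)
            self._cached_pars = key
        return self._cached

model = CachedBackground(lambda norm: norm)
a = model.npred_background(np.ones(4), [2.0])
b = model.npred_background(np.ones(4), [2.0])
assert a is b       # cache hit: same array returned, no recompute
c = model.npred_background(np.ones(4), [3.0])
assert c is not a   # parameter change triggers recomputation
```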
gammapy/modeling/models/cube.py
Outdated
```diff
@@ -767,8 +768,9 @@ def evaluate(self):
         Background evaluated on the Map
         """
         value = self.spectral_model(self.energy_center).value
-        back_values = self.map.data * value
-        return self.map.copy(data=back_values)
+        self._evaluation_cache.data = self.map.data * value
```
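The intent of writing into a pre-allocated cache rather than calling `Map.copy` on every evaluation can be sketched with plain NumPy arrays (hypothetical class, not the Gammapy implementation):

```python
import numpy as np

class TemplateSketch:
    """Hypothetical stand-in for a template model with a cached output."""

    def __init__(self, data):
        self.data = data
        # allocated once, reused on every evaluate() call
        self._evaluation_cache = np.empty_like(data)

    def evaluate(self, value):
        # write in place instead of allocating a fresh copy
        np.multiply(self.data, value, out=self._evaluation_cache)
        return self._evaluation_cache

model = TemplateSketch(np.ones((2, 3)))
out1 = model.evaluate(2.0)
out2 = model.evaluate(3.0)
assert out1 is out2  # same buffer reused across evaluations
```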
I'd say we don't need the caching at the `.evaluate()` level; it is only used for independent evaluation and not during fitting.
For the `TemplateNPredModel`, the evaluator calls `.evaluate()` — see gammapy/datasets/evaluator.py, line 361 in 19da784: `npred = self.model.evaluate()`
Yes, but we have caching on the `MapEvaluator`, so the effect on performance should be completely negligible.
True
removed
Thanks @QRemy, do you have a rough estimate of how much this improves the performance in our benchmarks?
I tested it only on the …
Thanks @QRemy, no further comments from my side.
Avoid copy in background model evaluation in order to slightly improve performance.