DM-9795: fix prior/likelihood weight bug #44
Conversation
In Bayesian fitting, we need to do some kind of variance weighting in the likelihood even when we want a constant weight across all pixels - otherwise the likelihood and prior will not be weighted correctly relative to each other. Note that this makes it impossible to run CModel on an image with exactly zero variance.
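The relative-weighting point can be illustrated with a minimal sketch (the names below are illustrative, not taken from the CModel code): with per-pixel weights w = 1/sigma, the chi-squared term of the log-likelihood is sum((w * (data - model))**2), which is the usual inverse-variance weighting once the weights are squared. Scaling the variance rescales the likelihood term relative to the prior, and a zero variance makes the weights undefined.

```python
import numpy as np

def log_posterior(data, model, variance, log_prior):
    # Weights are 1/sigma; squaring them inside chi-squared gives
    # inverse-variance weighting. Undefined if variance == 0.
    weights = 1.0 / np.sqrt(variance)
    chi2 = np.sum((weights * (data - model)) ** 2)
    return -0.5 * chi2 + log_prior

data = np.array([1.0, 2.0, 3.0])
model = np.array([1.1, 1.9, 3.2])
variance = np.full(3, 0.04)  # constant across pixels, but must be nonzero

# Doubling the variance halves the weight of the likelihood relative to
# the prior, shifting the posterior even though residuals are unchanged.
lp1 = log_posterior(data, model, variance, log_prior=-1.0)
lp2 = log_posterior(data, model, 2 * variance, log_prior=-1.0)
```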
Commits 9ad06d3 to 14a3a79
src/UnitTransformedLikelihood.cc
unweightedData.deep() = data;
// Convert from variance to weights (1/sigma); this is actually the usual inverse-variance
// weighting, because we implicitly square it later.
weights.asEigen<Eigen::ArrayXpr>() = variance.asEigen<Eigen::ArrayXpr>().sqrt().inverse();
You're taking a sqrt here just to square it later (or take a log); isn't that a bit wasteful? Maybe it's not important because the model evaluations consume far more time?
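The reviewer's observation can be checked directly: computing 1/sqrt(variance) and squaring it later is algebraically equivalent to computing 1/variance up front, so the square root is redundant work (though likely negligible next to the model evaluations). A small sketch in NumPy, not the Eigen code itself:

```python
import numpy as np

variance = np.array([0.04, 0.25, 1.0])

# What the patch computes: weights = 1/sigma ...
weights = 1.0 / np.sqrt(variance)

# ... which, once squared in the chi-squared sum, recovers
# plain inverse-variance weighting.
recovered = weights ** 2
direct = 1.0 / variance
```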
@@ -98,29 +98,36 @@ def tearDown(self):

    def testNoNoise(self):
        """Test that CModelAlgorithm.apply() works when applied to a postage-stamp
        containing only a point source with no noise.

        We still have to pretend there is noise (i.e. have nonzero values in
        the variance plane) to allow it to compute a likelihood, though.
And that's why the rtol is non-zero now?
Indirectly, yes. Before this change CModel_fluxSigma was exactly zero in this test, which was easy to verify. Now it's some nonzero number that's hard to estimate in the unit test in a non-circular way, so we have a fuzzy comparison instead.
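A fuzzy comparison of this kind is typically written with a relative tolerance; the sketch below uses hypothetical values, not the actual numbers from the test:

```python
import numpy as np

measured = 0.012345  # hypothetical nonzero flux uncertainty from the fit
expected = 0.0124    # rough estimate; cannot be pinned down non-circularly

# Passes as long as the values agree to within 1% relative tolerance,
# instead of demanding exact equality as the old zero-noise test could.
np.testing.assert_allclose(measured, expected, rtol=0.01)
```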