
DM-9795: fix prior/likelihood weight bug #44

Merged: 4 commits into master from tickets/DM-9795 on Mar 29, 2017
Conversation

TallJimbo (Member):

No description provided.

In Bayesian fitting, we need to do some kind of variance weighting in
the likelihood even when we want a constant weight across all pixels;
otherwise the likelihood and prior will not be weighted correctly
relative to each other.

Note that this makes it impossible to run CModel on an image with
exactly zero variance.
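To see why a constant weight is not enough, here is a toy NumPy sketch (illustrative only, not code from this PR): with proper 1/sigma weights the likelihood and the prior trade off correctly, while unit weights silently weaken the likelihood relative to the prior and pull the fit toward the prior mean.

import numpy as np

# Toy one-parameter fit: the model is a constant amplitude a, with a N(0, 1) prior.
data = np.array([1.2, 0.9, 1.1, 1.0])
sigma = np.full(4, 0.5)                      # per-pixel noise level

def log_posterior(a, weights):
    log_like = -0.5 * np.sum((weights * (data - a)) ** 2)   # weights squared here
    log_prior = -0.5 * a ** 2                               # standard-normal prior on a
    return log_like + log_prior

grid = np.linspace(-1.0, 3.0, 4001)
a_correct = grid[np.argmax([log_posterior(a, 1.0 / sigma) for a in grid])]
a_unit = grid[np.argmax([log_posterior(a, np.ones(4)) for a in grid])]
print(a_correct)  # ~0.99: inverse-variance weights balance likelihood and prior
print(a_unit)     # ~0.84: unit weights weaken the likelihood; fit drifts to the prior mean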
unweightedData.deep() = data;
// Convert from variance to weights (1/sigma); this is actually the usual inverse-variance
// weighting, because we implicitly square it later.
weights.asEigen<Eigen::ArrayXpr>() = variance.asEigen<Eigen::ArrayXpr>().sqrt().inverse();
Contributor:

You're taking a sqrt here just to square it later (or take a log); isn't that a bit wasteful? Maybe it's not important because the model evaluations consume far more time?
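For what it's worth, the equivalence the inline comment leans on is easy to check: weights of 1/sigma, squared inside the chi-squared sum, reproduce the textbook 1/variance weighting. A small NumPy sketch (values made up for illustration):

import numpy as np

variance = np.array([0.25, 1.0, 4.0])
residual = np.array([0.3, -0.1, 0.2])                   # data - model, made up

weights = 1.0 / np.sqrt(variance)                       # what the diff computes: 1/sigma
chi2_via_weights = np.sum((weights * residual) ** 2)    # the "implicit" squaring
chi2_direct = np.sum(residual ** 2 / variance)          # direct inverse-variance form
assert np.isclose(chi2_via_weights, chi2_direct)

So the sqrt is redundant work in principle, but it is a single elementwise pass over the stamp, presumably cheap next to the repeated model evaluations, as the comment above suggests.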

@@ -98,29 +98,36 @@ def tearDown(self):
def testNoNoise(self):
"""Test that CModelAlgorithm.apply() works when applied to a postage-stamp
containing only a point source with no noise.

We still have to pretend there is noise (i.e. have nonzero values in
the variance plane) to allow it to compute a likelihood, though.
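A minimal sketch of what the docstring means, with plain NumPy arrays standing in for the image and variance planes (names and values are illustrative):

import numpy as np

image = np.zeros((25, 25))           # noiseless postage stamp
image[12, 12] = 1000.0               # a single point source, no noise added

# With exactly zero variance the weights 1/sqrt(variance) blow up, so the
# test "pretends" there is noise by filling the variance plane with a
# small positive constant.
variance = np.full_like(image, 0.01)
weights = 1.0 / np.sqrt(variance)
assert np.all(np.isfinite(weights))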
Contributor:
And that's why the rtol is non-zero now?

TallJimbo (Member Author):

Indirectly, yes. Before this change, CModel_fluxSigma was exactly zero in this test, which was easy to verify. Now it's some nonzero number that's hard to estimate in the unit test in a non-circular way, so we use a fuzzy comparison instead.
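A sketch of what such a fuzzy comparison looks like (the values and tolerance here are placeholders, not the test's actual numbers):

import numpy as np

measured_flux_sigma = 0.0123     # placeholder for what the fit reports
expected_flux_sigma = 0.0125     # placeholder for an independent estimate
# Passes when |measured - expected| <= rtol * |expected| (plus any atol).
assert np.isclose(measured_flux_sigma, expected_flux_sigma, rtol=0.05, atol=0.0)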

@TallJimbo TallJimbo merged commit 437bf3e into master Mar 29, 2017
@ktlim ktlim deleted the tickets/DM-9795 branch August 25, 2018 05:50