
log_likelihood loss incorrect and other loss names misleading #237

Closed
iancze opened this issue Dec 21, 2023 · 0 comments · Fixed by #248
iancze commented Dec 21, 2023

Describe the bug

The mpol.losses.log_likelihood routine calculates the likelihood as

[screenshot of the current (incorrect) expression]

when it should actually be closer to

[screenshot of the intended expression]

(e.g., Deep Learning: Foundations and Concepts, Eqn 2.66).

There is at least one error: the
$$\sum_i \ln \sigma_i^2$$
term is missing (I think there may also be factors of 2 that differ slightly between the complex-valued calculations in MPoL and the text, which assumes real-valued data), and the source code is missing the overall negative sign.

This must have been a case of me being more tired than normal, since I believe I've implemented this correctly in other codebases. This also implies that mpol.losses.log_likelihood_gridded is incorrect, since it calls mpol.losses.log_likelihood.
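For concreteness, here is a minimal real-valued sketch of the corrected quantity, assuming the radio-interferometry convention weight = 1/σ². This is an illustrative stand-in, not the actual MPoL implementation (which works with complex visibilities, so the normalization terms differ by factors of 2):

```python
import numpy as np

def log_likelihood(model, data, weight):
    """Sketch of the full Gaussian log-likelihood for real-valued data.

    weight = 1 / sigma**2. Includes the sum over ln(sigma_i^2) and the
    ln(2*pi) normalization, and carries the overall negative sign -- the
    terms the issue reports as missing from mpol.losses.log_likelihood.
    """
    sigma2 = 1.0 / weight
    n = data.size
    return -0.5 * (
        np.sum(weight * (data - model) ** 2)  # chi-squared term
        + np.sum(np.log(sigma2))              # penalty on the weights
        + n * np.log(2.0 * np.pi)             # normalization constant
    )
```

Note that the weight-penalty term is constant when the weights are fixed, which is why dropping it is harmless for pure optimization but wrong for inference.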

Moreover, other loss functions in mpol.losses are misleadingly named for the quantity they actually calculate.

  • mpol.losses.nll does not actually calculate a negative log likelihood; it calculates a 'reduced' $\chi^2$, since it does not include the penalty term for the weight values
  • same for mpol.losses.nll_gridded

Suggested fix

  • correct mpol.losses.log_likelihood to calculate the correct quantity
  • rename mpol.losses.nll and mpol.losses.nll_gridded to mpol.losses.reduced_chi_squared and mpol.losses.reduced_chi_squared_gridded, respectively
  • add a mpol.losses.log_likelihood_avg routine that is the average of mpol.losses.log_likelihood. This is useful for cases where the weights may be adjusted (and thus the penalty factor is needed) and we are working with batches of different data sizes.
  • recommend in the documentation that mpol.losses.reduced_chi_squared and mpol.losses.reduced_chi_squared_gridded are the default loss functions for RML imaging, and that the corrected mpol.losses.log_likelihood is the proper loss function for inference (e.g., MCMC)
  • document changes in changelog
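The proposed log_likelihood_avg could look something like the following self-contained sketch (hypothetical name and signature from the list above; real-valued data, weight = 1/σ²):

```python
import numpy as np

def log_likelihood_avg(model, data, weight):
    """Sketch of the proposed log_likelihood_avg: the full Gaussian
    log-likelihood divided by the number of data points. Keeping the
    ln(sigma^2) penalty makes it valid when weights are being adjusted;
    dividing by N makes values comparable across batches of different sizes.
    """
    sigma2 = 1.0 / weight
    ll = -0.5 * np.sum(
        weight * (data - model) ** 2 + np.log(2.0 * np.pi * sigma2)
    )
    return ll / data.size
```

The per-point averaging is what allows intercomparison of runs on differently sized datasets, which the plain summed log-likelihood does not permit.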

Additional context
Recommend that we stay away from the nll name entirely, since it appears to be inconsistently defined in the broader ML context. Sometimes it is the negative log likelihood (i.e., the negative of Eqn 2.66), but more often than not it is some averaged or normalized quantity that does not include the contribution from the weights. These factors matter when building RML workflows and can make it very tricky to intercompare results from differently sized datasets.

Downstream updates
When fixed, @briannazawadzki @jeffjennings will need to update their calls to mpol.losses.nll_gridded -> mpol.losses.reduced_chi_squared_gridded.
