SNGP | Laplace RF Precision update inconsistent with the likelihood #258

Closed
nikhil-dce opened this issue Dec 14, 2020 · 4 comments

nikhil-dce commented Dec 14, 2020

For the CIFAR-10 example, the precision matrix update (in the LaplaceRandomFeatureCovariance class in Edward) is based on a Gaussian likelihood, whereas the loss function uses cross-entropy.

The batch precision update also seems different from the one in the SNGP paper: it should be based on equation 9 of the paper. Is this observation correct, or am I missing something?
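
For reference, here is my transcription of the eq. (9)-style update (my reading of the paper, so treat the exact form as an assumption): with a logistic likelihood, the Laplace approximation weights each example's contribution to the posterior precision by the Hessian term $p_i (1 - p_i)$,

$$\hat{\Sigma}^{-1} = I + \sum_{i=1}^{N} p_i \, (1 - p_i) \, \Phi_i \Phi_i^\top,$$

whereas a Gaussian likelihood drops the per-example weight and simply accumulates $\Phi_i \Phi_i^\top$.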

#question

dustinvtran (Member) commented

Hey @nikhil-dce! Were you able to get this resolved? cc'ing @jereliu

jereliu (Collaborator) commented Jan 9, 2021

Hi @nikhil-dce!

Yes, you are correct: the released implementation in UB uses a Gaussian likelihood. The main reason is to keep the code simple, and performance-wise we did not observe a significant difference in ECE (though the simplification does seem to hurt OOD performance a bit).
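
For anyone landing here later, a minimal NumPy sketch of the difference (illustrative names only, not the UB/Edward2 API):

```python
# Minimal sketch contrasting the Gaussian-likelihood precision update
# (as in the released code) with an eq. (9)-style logistic-likelihood
# update. All names here are hypothetical.
import numpy as np

def precision_update(precision, phi, logits=None, likelihood="gaussian"):
    """One minibatch update of the Laplace random-feature precision.

    precision: (D, D) running posterior precision matrix.
    phi:       (B, D) random-feature vectors Phi_i for the minibatch.
    logits:    (B,)   logits, needed only for the logistic weight.
    """
    if likelihood == "gaussian":
        # Gaussian likelihood: unit weight per example, so the update
        # reduces to sum_i Phi_i Phi_i^T.
        weights = np.ones(phi.shape[0])
    elif likelihood == "binary_logistic":
        # Logistic likelihood: the Laplace Hessian contributes
        # p_i * (1 - p_i) per example, matching the paper's eq. (9).
        p = 1.0 / (1.0 + np.exp(-logits))
        weights = p * (1.0 - p)
    else:
        raise ValueError(f"unknown likelihood: {likelihood}")
    # precision += sum_i weights_i * Phi_i Phi_i^T
    return precision + (phi * weights[:, None]).T @ phi

# Usage: start from the identity prior precision and accumulate batches.
rng = np.random.default_rng(0)
phi = rng.normal(size=(8, 4))    # B=8 examples, D=4 random features
logits = rng.normal(size=8)
prec = np.eye(4)
prec = precision_update(prec, phi, logits, likelihood="binary_logistic")
```

The only difference between the two variants is the per-example weight on Phi_i Phi_i^T; the accumulation itself is identical.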

nikhil-dce (Author) commented

Hi @jereliu - thank you for getting back to me and answering my question. It might be nice to add a note in the implementation (or the paper) about this deviation. Thanks again!

jereliu (Collaborator) commented Jan 12, 2021

Thanks Nikhil! Yes, we will add a note to the implementation in an upcoming code update. I will report here when the pull request is merged.
