Implement Kumaraswamy Distribution #48285
Conversation
Hi @vishwakftw! Thank you for your pull request. We require contributors to sign our Contributor License Agreement, and yours needs attention. You currently have a record in our system, but we do not have a signature on file. In order for us to review and merge your code, please sign at https://code.facebook.com/cla. If you are contributing on behalf of someone else (e.g. your employer), the individual CLA may not be sufficient and your employer may need to sign the corporate CLA. If you have received this in error or have any questions, please contact us at cla@fb.com. Thanks!
@property
def variance(self):
    return _moments(self.concentration1, self.concentration0, 2) - torch.pow(self.mean, 2)
Why not compute log moments and use logsumexp for these calculations?
I don't see an immediate benefit, but let me try benchmarking.
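For context, the raw moments of Kumaraswamy(a, b) have a closed form, E[X^n] = b · B(1 + n/a, b), which is naturally evaluated in log space via log-gamma. A minimal plain-Python sketch of that computation (the helper names here are hypothetical, not the PR's actual `_moments` implementation):

```python
import math

def kumaraswamy_moment(a, b, n):
    # E[X^n] = b * B(1 + n/a, b); the Beta function is evaluated
    # through lgamma for numerical stability at large a, b.
    log_m = (math.log(b)
             + math.lgamma(1 + n / a) + math.lgamma(b)
             - math.lgamma(1 + n / a + b))
    return math.exp(log_m)

def kumaraswamy_variance(a, b):
    # variance = E[X^2] - E[X]^2, matching the property under review
    mean = kumaraswamy_moment(a, b, 1)
    return kumaraswamy_moment(a, b, 2) - mean ** 2
```

As a sanity check, Kumaraswamy(1, 1) is Uniform(0, 1), so the mean is 1/2 and the variance 1/12.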
Thanks for putting this up!
Codecov Report
@@           Coverage Diff            @@
##           master    #48285   +/-   ##
========================================
  Coverage   81.04%    81.04%
========================================
  Files        1842      1843       +1
  Lines      199238    199274      +36
========================================
+ Hits       161463    161507      +44
+ Misses      37775     37767       -8
💊 CI failures summary and remediations
As of commit fcf8b75 (more details on the Dr. CI page):

🕵️ 3 new failures recognized by patterns
The following CI failures do not appear to be due to upstream breakages:
pytorch_xla_linux_bionic_py3_6_clang9_build (1/3)
Step: "(Optional) Merge target branch" (full log | diagnosis details | 🔁 rerun)
Tests look good to me. I haven't verified the math.
LGTM after fixing lint error.
@@ -5,6 +5,10 @@
 from typing import Dict, Any


 euler_constant = 0.57721566490153286060  # Euler Mascheroni Constant


I think flake8 will complain about the three consecutive empty lines; you can just remove one.
@ezyang has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
Signed-off-by: Edward Z. Yang <ezyang@fb.com>
Summary: This PR implements the Kumaraswamy distribution.
cc: fritzo alicanb sdaulton
Pull Request resolved: pytorch#48285
Reviewed By: ejguan
Differential Revision: D25221015
Pulled By: ezyang
fbshipit-source-id: e621b25a9c75671bdfc94af145a4d9de2f07231e
This PR implements the Kumaraswamy distribution.
cc: @fritzo @alicanb @sdaulton
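Once merged, the distribution is used like any other `torch.distributions` member. A minimal usage sketch, assuming a PyTorch build that includes this PR:

```python
import torch
from torch.distributions import Kumaraswamy

# Kumaraswamy(concentration1=a, concentration0=b) has support (0, 1);
# with a = b = 1 it reduces to Uniform(0, 1).
d = Kumaraswamy(torch.tensor([2.0]), torch.tensor([3.0]))

samples = d.rsample((1000,))            # reparameterized samples, shape (1000, 1)
print(d.mean, d.variance)               # closed-form moments
print(d.log_prob(torch.tensor([0.5])))  # log density at x = 0.5
```

Because sampling is reparameterized (`rsample`), gradients flow through the concentration parameters, which is what makes the distribution usable in stochastic variational inference.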