
Adding entropy function analogous to SciPy #43255

Open
xiaomengy opened this issue Aug 19, 2020 · 2 comments
Labels
feature A request for a proper, new feature. module: numpy Related to numpy support, and also numpy compatibility of our operators. triaged This issue has been looked at by a team member, and triaged and prioritized into an appropriate module

Comments

@xiaomengy
Contributor

xiaomengy commented Aug 19, 2020

🚀 Feature

Add an entropy function in PyTorch to compute entropy = -sum(p * log(p), axis=axis), which is analogous to scipy.stats.entropy.

Motivation

Computing the entropy of a distribution p is a common need in many use cases. Currently there are several ways to compute entropy in PyTorch, listed below.

  1. Direct implementation: entropy = -((p * p.log()).sum(dim=-1)). This works in many cases, but it is neither elegant nor efficient enough.
  2. Using torch.bmm: entropy = -(torch.bmm(p.view(M, 1, N), p.log().view(M, N, 1))).squeeze(). This has better forward-pass performance than the first approach on CPU when MKL is enabled; however, the backward pass is slower because of the backward computation of bmm.
  3. Using torch.distributions.categorical.Categorical.entropy: entropy = torch.distributions.categorical.Categorical(p).entropy(). This looks the most direct, but it is much slower than both of the previous approaches in both the forward and backward pass, even when the logits are not needed at all.

The 1st and 2nd approaches do not provide a direct entropy function the way SciPy does. We could therefore either add a SciPy-like entropy function or optimize the implementation of torch.distributions.categorical.Categorical.entropy.
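For reference, the three approaches above can be written out as runnable code (the tensor shapes and helper names here are illustrative, not part of the proposal):

```python
import torch

# Approach 1: direct elementwise implementation.
def entropy_direct(p):
    return -(p * p.log()).sum(dim=-1)

# Approach 2: batched matrix multiply over M distributions of size N.
def entropy_bmm(p):
    M, N = p.shape
    return -torch.bmm(p.view(M, 1, N), p.log().view(M, N, 1)).squeeze()

# Approach 3: via torch.distributions (also materializes logits internally).
def entropy_categorical(p):
    return torch.distributions.categorical.Categorical(p).entropy()

p = torch.softmax(torch.randn(4, 10), dim=-1)
e1, e2, e3 = entropy_direct(p), entropy_bmm(p), entropy_categorical(p)
assert torch.allclose(e1, e2, atol=1e-6) and torch.allclose(e1, e3, atol=1e-6)
```

All three agree numerically on strictly positive probabilities; they differ only in performance characteristics, as described above.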

cc @mruberry @rgommers

@xiaomengy
Contributor Author

cc @mruberry @ngimel

@mruberry mruberry added the labels feature, module: numpy, and triaged on Aug 19, 2020
@vadimkantorov
Contributor

vadimkantorov commented Aug 19, 2020

Quite related: #9993 #22656 #40336 (comment). One UX decision needed: whether or not to require the user to pre-normalize the tensor themselves.
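As one sketch of how that UX decision could go, here is a hypothetical entropy function that accepts unnormalized weights and normalizes internally, mirroring scipy.stats.entropy's behavior (the name and signature are assumptions, not an agreed API):

```python
import torch

def entropy(p, dim=-1):
    # Hypothetical sketch: normalize internally, as scipy.stats.entropy does,
    # so callers may pass unnormalized non-negative weights.
    p = p / p.sum(dim=dim, keepdim=True)
    # torch.xlogy returns 0 where p == 0, avoiding 0 * log(0) = nan.
    return -torch.xlogy(p, p).sum(dim=dim)

w = torch.tensor([1.0, 1.0, 2.0, 0.0])  # unnormalized weights, with a zero
print(entropy(w))
```

The alternative design would skip the normalization step and document that the input must already sum to 1 along dim, which is cheaper but pushes the responsibility onto the user.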
