Logsigmoid as chaining operation #42597

Open
Bartolo1024 opened this issue Aug 5, 2020 · 6 comments
Labels
function request: A request for a new function or the addition of new arguments/modes to an existing function.
triaged: This issue has been looked at by a team member, and triaged and prioritized into an appropriate module.

Comments

@Bartolo1024

🚀 Feature

I would like to have a chainable logsigmoid operation on torch.Tensor objects.

Motivation

I prefer to use the chained versions of operations, like x.sum(), x.mean(), etc., instead of torch.sum(), torch.mean(), etc.
This helps me avoid long, hard-to-read expressions.
For example:
x.pow(2).sum(dim=1).mean()
is much cleaner than:
torch.mean(torch.sum(torch.pow(x, 2), dim=1))

Currently, the operation is available only in torch.nn or as a torch module function, or it can be implemented as a combination of sigmoid and logarithm, which is less numerically stable.
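To make the stability point concrete, a minimal comparison (a sketch, assuming torch.nn.functional.logsigmoid as the stable reference implementation):

import torch
import torch.nn.functional as F

x = torch.tensor([-200.0])
# Naive composition: sigmoid(-200) underflows to 0 in float32, so log gives -inf.
naive = torch.log(torch.sigmoid(x))  # tensor([-inf])
# The dedicated op stays finite: log(sigmoid(x)) ~= x for large negative x.
stable = F.logsigmoid(x)             # tensor([-200.])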

Pitch

Add logsigmoid to the torch.Tensor methods.

import torch
x = torch.rand(2, 3, 10, 10)
log_sigmoid_from_x = x.logsigmoid()
@malfet added the enhancement (Not as big of a feature, but technically not a bug. Should be easy to fix), topic: operator, and triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module) labels on Aug 5, 2020
@malfet (Contributor) commented Aug 5, 2020

Hmm, sigmoid is a method of Tensor, so there's no reason logsigmoid couldn't be one.
@ezyang, what do you think?

@ezyang (Contributor) commented Aug 6, 2020

...suuuuuure? it's one darn slippery slope, but logsigmoid might deserve it. cc @gchanan

@vadimkantorov (Contributor) commented Aug 6, 2020

softmax/logsumexp/log_softmax are also very common and may deserve migration from torch.nn.functional to torch, but it's a slippery slope indeed...

@Bartolo1024 (Author)

@vadimkantorov softmax/logsumexp/log_softmax are already available as torch module functions and torch.Tensor methods, aren't they?
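For reference, a quick check (a minimal sketch, assuming a 2020-era or newer PyTorch build, where all three resolve as Tensor methods):

import torch

x = torch.rand(2, 3)
# All three chain directly off the tensor, mirroring the functional forms.
a = x.softmax(dim=1)
b = x.log_softmax(dim=1)
c = x.logsumexp(dim=1)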

@vadimkantorov (Contributor)

@Bartolo1024 I think this is currently unintentional...

@firstprayer

The only place I've found so far related to how Tensor.xxx() is implemented is this:

def sigmoid(self):
    result = torch.sigmoid(self)
    def backward(grad_output):
        # d/dx sigmoid(x) = sigmoid(x) * (1 - sigmoid(x))
        return (1 - result) * result * grad_output
    return result, backward

Looks like it wraps the op in a helper function but still calls torch.xxx() under the hood.
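If that is indeed the pattern, a hypothetical logsigmoid analogue could follow it directly (a sketch only: the helper convention is assumed from the snippet above, and the gradient uses the identity d/dx log(sigmoid(x)) = sigmoid(-x)):

import torch

def logsigmoid(self):
    result = torch.nn.functional.logsigmoid(self)
    def backward(grad_output):
        # d/dx log(sigmoid(x)) = 1 - sigmoid(x) = sigmoid(-x)
        return torch.sigmoid(-self) * grad_output
    return result, backward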

@mruberry added the function request (A request for a new function or the addition of new arguments/modes to an existing function.) label and removed the enhancement (Not as big of a feature, but technically not a bug. Should be easy to fix) and topic: operator labels on Oct 10, 2020