Logsigmoid as chaining operation #42597
Comments
Hmm,
...suuuuuure? It's one darn slippery slope, but logsigmoid might deserve it. cc @gchanan
softmax/logsumexp/log_softmax are also very, very common and may deserve migration from torch.nn.functional to torch, but it's a slippery slope indeed...
@vadimkantorov softmax/logsumexp/log_softmax are already available as torch module and torch.Tensor methods?
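For reference, these reductions are indeed already callable in chained form as Tensor methods (checked against recent PyTorch; availability may differ by version):

```python
import torch

x = torch.randn(2, 5)

# Chainable Tensor-method forms of the ops mentioned above.
a = x.softmax(dim=1)
b = x.log_softmax(dim=1)
c = x.logsumexp(dim=1)

print(a.shape, b.shape, c.shape)
```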
@Bartolo1024 I think this is currently unintentional...
The only place I found so far related to how Tensor.xxx() is implemented is pytorch/torch/csrc/jit/runtime/symbolic_script.cpp, lines 783 to 788 in 6162343.
It looks like it wraps it in a helper function but still calls torch.xxx() under the hood.
🚀 Feature
I would like to have a chained log sigmoid operation on the `torch.Tensor` object.
Motivation
I prefer to use chained versions of operations like `x.sum()`, `x.mean()`, etc. instead of `torch.sum()`, `torch.mean()`, etc. This keeps long expressions readable.
For example:
`x.pow(2).sum(dim=1).mean()`
is much cleaner than:
`torch.mean(torch.sum(torch.pow(x, 2), dim=1))`
Currently, the operation is available only through torch.nn (e.g. `torch.nn.functional.logsigmoid`), or it can be implemented as a composition of sigmoid and logarithm, which is less numerically stable.
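The stability gap is easy to reproduce in float32 (the exact underflow threshold depends on dtype): for large negative inputs, `sigmoid(x)` underflows to 0, so taking the log afterwards produces `-inf`, while the fused implementation stays finite.

```python
import torch

x = torch.tensor([-200.0, 0.0, 200.0])

# Naive composition: sigmoid(-200) underflows to 0 in float32, so log gives -inf.
naive = torch.log(torch.sigmoid(x))

# Fused implementation (effectively -softplus(-x)) remains finite.
stable = torch.nn.functional.logsigmoid(x)

print(naive)
print(stable)
```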
Pitch
Add log sigmoid to tensor object methods.