Any easy approach to apply nn.Softmax() along each dimension? #1260

Closed
yuandong-tian opened this issue Apr 14, 2017 · 1 comment

Comments

@yuandong-tian

Currently nn.Softmax() can only be applied to a 2D tensor. Would it be possible to specify the dimension? E.g.,

import torch
import torch.nn as nn
from torch.autograd import Variable

x = torch.FloatTensor(128, 5, 27)
softmax = nn.Softmax(axis=1)  # proposed API: softmax along dimension 1
y = softmax(Variable(x))
y.size()  # torch.Size([128, 5, 27])

y.size() == x.size(), but the softmax operation has been applied along y's second dimension (of size 5).

@apaszke (Contributor) commented Apr 14, 2017

Right now there's no way other than transposing so that the softmax dimension comes last, flattening all the other dimensions into a single one at the front, and reversing that after the softmax. It's on the roadmap and is already tracked in #1020.
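
A minimal sketch of that workaround, assuming the pre-0.2 Variable API; softmax_along is a hypothetical helper name, not part of the library:

import torch
import torch.nn as nn
from torch.autograd import Variable

def softmax_along(x, dim):
    # Hypothetical helper: move `dim` to the last position, flatten every
    # other dimension into the front, apply the 2D softmax, then undo both.
    last = x.dim() - 1
    x_t = x.transpose(dim, last).contiguous()
    shape = x_t.size()
    flat = x_t.view(-1, shape[-1])  # (product of other dims, size of `dim`)
    out = nn.Softmax()(flat)        # old API: softmax over dim 1 of a 2D input
    return out.view(shape).transpose(dim, last)

x = Variable(torch.randn(128, 5, 27))
y = softmax_along(x, 1)  # softmax along the second dimension
print(y.size())          # torch.Size([128, 5, 27])

In later PyTorch releases nn.Softmax accepts a dim argument directly, so this transpose-and-flatten dance is no longer needed.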

@apaszke apaszke closed this as completed Apr 14, 2017
jjsjann123 pushed a commit to jjsjann123/pytorch that referenced this issue Dec 5, 2021
Change grid synchronization code to expand for cooperative groups, but also to allow multi grid reduction code.