Softmax across all keypoints? #19

Closed
Yishun99 opened this issue Dec 26, 2018 · 1 comment
Comments


Yishun99 commented Dec 26, 2018

import torch.nn as nn

class Flatten(nn.Module):
    def forward(self, input):
        # Collapse every dim except the batch dim: [N, H, W, C] -> [N, H*W*C]
        return input.view(input.size(0), -1)

class PRN(nn.Module):
    def __init__(self, node_count, coeff):
        ...
        self.softmax = nn.Softmax(dim=1)

    def forward(self, x):
        res = self.flatten(x)
        ...
        out = self.add(out, res)  # residual addition, shape [N, H*W*C]
        # Softmax over dim=1 of the flattened tensor, i.e. jointly over all
        # spatial locations and all 17 keypoint channels
        out = self.softmax(out)
        out = out.view(out.size(0), self.height, self.width, 17)
        return out
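
Just to make the question concrete: because the softmax runs on the flattened tensor with dim=1, the probabilities sum to 1 over all locations and keypoints jointly, not per channel. A quick standalone check (the shapes here are made up for illustration, not taken from the repo):

import torch
import torch.nn as nn

x = torch.randn(2, 56 * 36 * 17)  # hypothetical [N, H*W*C] logits
probs = nn.Softmax(dim=1)(x)
print(probs.sum(dim=1))           # tensor([1., 1.]) -> one distribution per sample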
@salihkaragoz
Owner

Hi,
We tried both softmax per keypoint and softmax across all keypoints. Computing the softmax activations per keypoint channel is the more intuitive choice, but in our experiments both gave similar results, so we kept the simpler one.
I am closing this issue; feel free to ask if you have further questions.
Best
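
For anyone comparing the two variants mentioned above, a minimal sketch of the difference (shapes and variable names are illustrative, not the repo's actual code):

import torch
import torch.nn as nn

N, H, W, C = 2, 56, 36, 17   # hypothetical batch/grid/keypoint sizes
out = torch.randn(N, H * W * C)

# Across all keypoints (as in the snippet above): one distribution per sample
# over the entire H*W*C vector.
joint = nn.Softmax(dim=1)(out).view(N, H, W, C)

# Per keypoint channel: reshape so dim=1 is H*W, then each of the C channels
# is normalized independently over the spatial grid.
per_kp = nn.Softmax(dim=1)(out.view(N, H * W, C)).view(N, H, W, C)

print(joint.sum(dim=(1, 2, 3)))  # ~1 per sample
print(per_kp.sum(dim=(1, 2)))    # ~1 per sample per keypoint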
