
why normalize_weight need div torch.mean(x)? #15

Open
alpc91 opened this issue Jun 2, 2021 · 1 comment
alpc91 commented Jun 2, 2021

```python
def normalize_weight(x):
    min_val = x.min()
    max_val = x.max()
    x = (x - min_val) / (max_val - min_val)
    x = x / torch.mean(x)
    return x.detach()
```

According to the paper, x should lie in (0, 1), so why does normalize_weight divide by torch.mean(x)?
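One plausible reading (not confirmed by the authors): dividing by the mean rescales the min-max-normalized weights so they average to 1, which keeps the magnitude of the weighted loss comparable to the unweighted loss regardless of how the raw weights are distributed. A quick check with an illustrative tensor (the input values below are made up):

```python
import torch

def normalize_weight(x):
    # Min-max scale into [0, 1], then divide by the mean so the
    # weights average to exactly 1. Individual weights can now
    # exceed 1, but the expected per-sample loss scale is preserved.
    min_val = x.min()
    max_val = x.max()
    x = (x - min_val) / (max_val - min_val)
    x = x / torch.mean(x)
    return x.detach()

w = normalize_weight(torch.tensor([0.2, 0.5, 0.9, 1.4]))
print(w)         # no longer confined to (0, 1)
print(w.mean())  # mean is 1 (up to float error)
```

Under this reading, the (0, 1) range in the paper describes the weights before the rescaling step, and the division is a practical normalization so the weighting does not shrink the overall loss.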

alpc91 (Author) commented Jun 2, 2021

```python
ce = nn.CrossEntropyLoss(reduction='none')(predict_prob_source, label_source)
```

And why feed after_softmax(predict_prob_source) to nn.CrossEntropyLoss? This criterion already combines LogSoftmax and NLLLoss.
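As the comment notes, nn.CrossEntropyLoss applies LogSoftmax internally, so passing post-softmax probabilities runs softmax twice and distorts the loss. A minimal reproduction (the tensor shapes and values here are illustrative, not taken from the repo):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)
logits = torch.randn(4, 3)           # raw, unnormalized model outputs
labels = torch.tensor([0, 2, 1, 2])

ce = nn.CrossEntropyLoss(reduction='none')

# Intended usage: CrossEntropyLoss expects raw logits.
loss_once = ce(logits, labels)

# Feeding probabilities applies softmax a second time inside the loss.
loss_twice = ce(F.softmax(logits, dim=1), labels)

print(loss_once)
print(loss_twice)  # different values: the double softmax flattens the signal
```

Because softmax outputs lie in (0, 1), the second softmax compresses the gaps between classes, so the loss (and its gradients) is weaker than intended; whether the repo does this deliberately is a question only the authors can answer.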
