
Confusion about Lambda #4

Closed
ForgottenOneNyx opened this issue Feb 14, 2019 · 1 comment

@ForgottenOneNyx

Hello, firstly thank you for the awesome work!
I have a question about Pytorch_Wasserstein.ipynb:

In `WassersteinLossVanilla`, why is it
`self.K = torch.exp(-self.cost/self.lam)`?
Shouldn't it be
`self.K = torch.exp(-self.cost*self.lam)`?

Mocha.jl also uses the latter form: https://github.com/pluskid/Mocha.jl/blob/5e15b882d7dd615b0c5159bb6fde2cc040b2d8ee/src/layers/wasserstein-loss.jl#L33

Did you change it because of the note "Note that we use a different convention for $\lambda$ (i.e. we use $\lambda$ as the weight for the regularisation, later versions of the above use $\lambda^{-1}$ as the weight)."?

Also, what is the reason for that choice?
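
For concreteness, here is how I understand the two conventions (my notation, not necessarily yours): writing $h(T)$ for the entropy of the transport plan $T$ and $C$ for the cost matrix,

$$\min_T \langle T, C \rangle - \lambda\, h(T) \;\Rightarrow\; K = \exp(-C/\lambda), \qquad \min_T \langle T, C \rangle - \tfrac{1}{\lambda}\, h(T) \;\Rightarrow\; K = \exp(-\lambda C).$$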

@t-vi
Owner

t-vi commented Jul 17, 2019

Hi,
They're mostly equivalent; to me the more natural view seemed to be $\lambda$ having the same units as the cost.
In the meantime I have revised the notebook to feature a kernel and added a write-up of my maths.
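
A minimal sketch of the equivalence (toy tensors, not the notebook's actual code): the `/lam` and `*lam` kernels coincide once $\lambda$ is relabelled as its reciprocal, so only the interpretation of the parameter changes.

```python
import torch

# Toy check, not the notebook's code: the two kernel conventions agree
# once lambda is relabelled as its reciprocal.
cost = torch.rand(4, 4)   # stand-in cost matrix
lam = 0.1                 # regularisation strength, same units as the cost

K_div = torch.exp(-cost / lam)          # convention used in the notebook
K_mul = torch.exp(-cost * (1.0 / lam))  # Mocha-style kernel with lambda -> 1/lambda

print(torch.allclose(K_div, K_mul))     # True
```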

Best regards

Thomas

t-vi closed this as completed Jul 17, 2019