
Cannot Converge with L2 Loss #35

Closed
kingnobro opened this issue Jan 6, 2023 · 4 comments

Comments

kingnobro commented Jan 6, 2023

I am trying to quantize the latent vector. Specifically, I use an encoder to get the latent representation z of the input, quantize z, and then feed the quantized z into the decoder.

However, during my experiments I found that the reconstruction loss does not decrease when using the L2 distance, i.e. the EuclideanCodebook. The model does converge with cosine similarity. Do you have any idea what causes this?

I think cosine similarity only considers the direction of the vector, not its scale, so I would still like to use the EuclideanCodebook.
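To make the distinction concrete, here is a minimal NumPy sketch (a toy illustration, not the library's actual implementation) of the two codebook-lookup rules being contrasted: nearest code by L2 distance, which depends on both direction and magnitude, versus cosine similarity, which is invariant to the scale of z.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: a codebook of 8 vectors of dimension 4, and a batch of latents z.
codebook = rng.normal(size=(8, 4))
z = rng.normal(size=(16, 4))

# Euclidean-style lookup: nearest code by squared L2 distance.
# Sensitive to both the direction and the scale of z.
d2 = ((z[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=-1)
idx_l2 = d2.argmin(axis=1)

# Cosine-similarity lookup: normalize both sides first, so only
# the direction of z matters, not its magnitude.
zn = z / np.linalg.norm(z, axis=1, keepdims=True)
cn = codebook / np.linalg.norm(codebook, axis=1, keepdims=True)
idx_cos = (zn @ cn.T).argmax(axis=1)

# Rescaling z leaves the cosine assignments unchanged,
# while the L2 assignments can shift.
idx_cos_scaled = ((10.0 * z) @ cn.T).argmax(axis=1)
assert (idx_cos == idx_cos_scaled).all()
```

This scale sensitivity is exactly why an L2 codebook can behave differently in training: if the encoder outputs drift in magnitude away from the codebook vectors, the nearest-neighbor assignments (and hence the gradients through the commitment loss) become unstable, whereas cosine lookup ignores magnitude entirely.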

@kingnobro kingnobro changed the title Cannot Converge with L1 Loss Cannot Converge with L2 Loss Jan 6, 2023
lucidrains (Owner) commented:
@kingnobro are you on the latest version of the library?

kingnobro (Author) commented:

> @kingnobro are you on the latest version of the library?

Oh, I found that the code I was using had been downloaded about three months ago. After upgrading to the latest version, the problem is solved.

lucidrains (Owner) commented:

@kingnobro ok great!

IISCAditayTripathi commented:
@lucidrains @kingnobro, what was the problem? What changes in the code in the newer version made it work?
