Cannot Converge with L2 Loss #35
Comments
@kingnobro are you on the latest version of the library?

It turns out the code I was using had been downloaded about three months ago. After upgrading to the latest version, the problem is solved.

@kingnobro ok great!

@lucidrains @kingnobro, what was the problem? What changes in the newer version of the code made it work?
Original issue (@kingnobro):

I am trying to quantize a latent vector. To be specific, I use an Encoder to get the latent representation z of the input, quantize z, then feed it into the Decoder. However, in my experiments I found that the reconstruction loss does not decrease when using the L2 distance, i.e. the EuclideanCodebook. The model does converge with cosine similarity. Any idea what causes this? I believe cosine similarity only considers the direction of a vector, not its scale, so I would still prefer to use the EuclideanCodebook.
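To make the distinction concrete, here is a minimal NumPy sketch of the two codebook lookups being discussed. The helper names `euclidean_quantize` and `cosine_quantize` are hypothetical, and this is an illustration of the nearest-neighbour assignment only, not the library's actual implementation (which also handles the straight-through gradient, EMA codebook updates, etc.):

```python
import numpy as np

def euclidean_quantize(z, codebook):
    # z: (n, d) latents, codebook: (k, d) code vectors.
    # Assign each latent to the code with the smallest squared L2
    # distance -- sensitive to both direction and scale.
    dists = ((z[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)  # (n, k)
    indices = dists.argmin(axis=1)
    return codebook[indices], indices

def cosine_quantize(z, codebook):
    # L2-normalize both latents and codes, then pick the code with the
    # highest dot product -- only the direction of z matters here.
    z_n = z / np.linalg.norm(z, axis=1, keepdims=True)
    c_n = codebook / np.linalg.norm(codebook, axis=1, keepdims=True)
    indices = (z_n @ c_n.T).argmax(axis=1)
    return codebook[indices], indices

rng = np.random.default_rng(0)
z = rng.normal(size=(8, 4))
codebook = rng.normal(size=(16, 4))

q_e, idx_e = euclidean_quantize(z, codebook)
q_c, idx_c = cosine_quantize(z, codebook)
```

Note that rescaling `z` leaves the cosine assignments unchanged but can change the Euclidean ones, which is exactly the scale-insensitivity the issue author is worried about.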