
Sth about the Loss Function #9

Open
ZhouYiiFeng opened this issue Jul 1, 2021 · 3 comments

Comments

@ZhouYiiFeng
Hi,

I notice that this model is based on a flow-based generative model; however, the loss does not include a logdet (log-determinant of the Jacobian) term, and cal_jacobian defaults to False. Is this because you replaced some features of the forward output with samples from a normal distribution?

I'm new to flow-based generative models, so maybe I've misunderstood something. Could you help me? Thanks!

@Yang-Liu1082
Owner

It's simply because there is no need to calculate the Jacobian here. Calculating the Jacobian is what gives you the exact log-likelihood, but in our task we don't care about the log-likelihood of z. Setting cal_jacobian to False speeds up training.
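To make the distinction concrete, here is a minimal sketch of how the logdet term enters the exact log-likelihood via the change-of-variables formula, and why skipping it still gives you a usable z. This is a toy 1-D affine map with illustrative names (forward, cal_jacobian), not the repository's actual code:

```python
import math

# Toy 1-D "flow": z = s * x + b, an invertible map standing in for the
# model's invertible blocks. The exact log-likelihood of x under a
# standard-normal prior on z requires log|dz/dx| = log|s| (the logdet).

def forward(x, s=2.0, b=0.5, cal_jacobian=False):
    z = s * x + b
    if cal_jacobian:
        logdet = math.log(abs(s))  # log|dz/dx| for this affine map
        return z, logdet
    return z  # cheaper: z alone, no Jacobian bookkeeping

def log_likelihood(x, s=2.0, b=0.5):
    z, logdet = forward(x, s, b, cal_jacobian=True)
    log_pz = -0.5 * (z ** 2 + math.log(2 * math.pi))  # N(0,1) log-density
    return log_pz + logdet  # change-of-variables formula

# With cal_jacobian=False, forward() still produces z, which is enough
# for a loss that matches z to a target directly; it is only the exact
# log-likelihood that becomes unavailable.
z = forward(1.0)
ll = log_likelihood(1.0)
```

So dropping the logdet is safe exactly when the training objective is defined on z itself rather than on log p(x).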

@ZhouYiiFeng

(comment minimized)

@Yang-Liu1082
Owner

Yang-Liu1082 commented Jul 9, 2021 via email
