I noticed that this model is based on a flow-based generative model; however, the loss does not include a logdet term, and cal_jacobian defaults to False. Is this because you replaced some features of the forward output with samples from a normal distribution?
I'm new to flow-based generative models, so maybe I have misunderstood something. Could you help me? Thanks!
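For context, here is a minimal, hypothetical sketch (PyTorch, not the repository's actual code) of what a cal_jacobian flag typically controls in an invertible coupling layer: the flag only decides whether the layer also returns the log-determinant of its Jacobian, which a flow model would need to evaluate an exact log-likelihood.

```python
# Hypothetical affine coupling layer, only to illustrate the role of cal_jacobian.
import torch
import torch.nn as nn

class AffineCoupling(nn.Module):
    def __init__(self, channels, hidden=32):
        super().__init__()
        # Small network that predicts per-channel log-scale and shift from x1.
        self.net = nn.Sequential(
            nn.Conv2d(channels // 2, hidden, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(hidden, channels, 3, padding=1),
        )

    def forward(self, x, cal_jacobian=False):
        x1, x2 = x.chunk(2, dim=1)
        log_s, t = self.net(x1).chunk(2, dim=1)
        log_s = torch.tanh(log_s)          # keep the scale well-behaved
        y2 = x2 * torch.exp(log_s) + t     # invertible element-wise affine transform of x2
        y = torch.cat([x1, y2], dim=1)
        if cal_jacobian:
            # log|det J| of an element-wise affine map is the sum of the log-scales.
            logdet = log_s.flatten(1).sum(dim=1)
            return y, logdet
        return y                            # skip the extra reduction when it is not used
```

Example usage: `y, logdet = AffineCoupling(4)(torch.randn(2, 4, 8, 8), cal_jacobian=True)`; with `cal_jacobian=False` only `y` is returned.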
It is just because there is no need to calculate the Jacobian. Calculating the Jacobian is only needed to obtain the exact log-likelihood; however, in our task we don't care about the log-likelihood of z, so setting it to False speeds up training.
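To make the distinction concrete, here is a hedged sketch (assumed function names and shapes, not the repository's exact loss code): a flow trained by maximum likelihood needs the log-determinant term, whereas a loss that supervises the network outputs directly never touches it.

```python
import math
import torch

def flow_nll(z, logdet):
    # Change-of-variables objective: -log p(x) = -log p(z) - log|det J|,
    # with a standard-normal prior on z. This is where logdet would be needed.
    log_pz = (-0.5 * (z ** 2 + math.log(2 * math.pi))).flatten(1).sum(dim=1)
    return -(log_pz + logdet).mean()

def invdn_style_loss(lr_pred, lr_gt, restored, clean):
    # Direct pixel-space supervision on the forward (downscaled) output and the
    # inverse (restored) output; no term depends on log|det J|.
    return torch.mean(torch.abs(lr_pred - lr_gt)) + torch.mean(torch.abs(restored - clean))
```

The actual InvDN losses may differ in detail; the point is only that with losses of the second kind, computing logdet would add overhead without contributing a gradient, which is why cal_jacobian can safely stay False.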
________________________________
From: JoeyF Zhou
Subject: Re: [Yang-Liu1082/InvDN] Sth about the Loss Function (#9)
Thanks for your reply. I still have some other questions; would you like to add my WeChat? My WeChat ID is: wx_joeyf