
Normalizing Flow vs Normalizing Flow VAE behavior #17

Closed
maulberto3 opened this issue Nov 20, 2022 · 2 comments
Labels
bug Something isn't working question Further information is requested

Comments

@maulberto3

I can't help but wonder why the NormalizingFlow class uses the flows' inverse method when computing forward_kl, whereas NormalizingFlowVAE uses the flows' forward method.

Accordingly, when trying to fit MNIST with NormalizingFlow and passing a training batch of, say, (64, 784) images, I get the following error:

     34 for i in range(len(self.flows) - 1, -1, -1):
     35     z, log_det = self.flows[i].inverse(z)
---> 36     log_q += log_det
     37 log_q += self.q0.log_prob(z)
     38 return -torch.mean(log_q)

RuntimeError: output with shape [64] doesn't match the broadcast shape [1, 64]

Any help/suggestion?

@VincentStimper VincentStimper added the question Further information is requested label Jan 20, 2023
@VincentStimper
Owner

Hi @maulberto3,

in this package, flows are defined as maps from the latent space to the observation space. To compute the forward KL divergence, you have to map the observations to the latent space, i.e. apply the inverse.
In a variational autoencoder, on the other hand, you sample from the latent space (prior) and transform the samples with the flow layers, i.e. use the forward direction.
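The two directions can be illustrated with a minimal self-contained sketch (this is illustrative code, not the package's actual implementation; AffineFlow, forward_kld, and sample are hypothetical names chosen here for clarity):

```python
import torch

# Toy affine flow following the package's convention:
# forward maps latent z -> observation x, inverse maps x -> z.
class AffineFlow:
    def __init__(self, scale, shift):
        self.scale, self.shift = scale, shift

    def forward(self, z):
        x = z * self.scale + self.shift
        log_det = torch.log(torch.abs(self.scale)).expand(z.shape[0])
        return x, log_det

    def inverse(self, x):
        z = (x - self.shift) / self.scale
        log_det = -torch.log(torch.abs(self.scale)).expand(x.shape[0])
        return z, log_det

# Forward KL (maximum likelihood): map data x BACK to latent space
# with inverse(), accumulating log-determinants along the way.
def forward_kld(flows, q0, x):
    log_q = torch.zeros(x.shape[0])
    z = x
    for flow in reversed(flows):
        z, log_det = flow.inverse(z)
        log_q += log_det
    log_q += q0.log_prob(z)
    return -torch.mean(log_q)

# VAE-style generation: sample z from the base distribution and
# push it FORWARD through the flow layers.
def sample(flows, q0, num_samples):
    z = q0.sample((num_samples,))
    for flow in flows:
        z, _ = flow.forward(z)
    return z
```

The per-sample quantities (log_q, log_det, the base log_prob) are all one-dimensional with shape [batch], which is what the accumulation in the traceback above expects.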

In any case, the error is not related to whether you use the forward or the inverse map. It is probably due to a misspecification in a flow layer, e.g. some parameters that you use to initialize the layer do not have the correct shape.
I'll close the issue for now, but if this does not resolve your problem, feel free to reopen it and add more details about what you are doing and in which context the error occurs.
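The shape mismatch can be reproduced in isolation (a minimal sketch, assuming one flow layer returns a log-determinant with a spurious leading dimension):

```python
import torch

log_q = torch.zeros(64)       # accumulated log-density, shape [64]
log_det = torch.zeros(1, 64)  # mis-shaped log-determinant, shape [1, 64]

try:
    # The in-place add cannot expand log_q from [64] to the broadcast
    # shape [1, 64], which raises the same RuntimeError reported above.
    log_q += log_det
except RuntimeError as e:
    print(e)

# Fix: keep every per-sample quantity one-dimensional, shape [batch].
log_q += log_det.reshape(-1)
```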

Best regards,
Vincent

@VincentStimper VincentStimper added the bug Something isn't working label Jan 21, 2023
@VincentStimper
Owner

The bug has been fixed, see this issue.
