Can I use a GAN-based network to replace the flow-based prior P(Z|X)? #9
Comments
Hi @seekerzz! The issue is that you need the prior both to sample z's and to infer the probabilities of z's efficiently. While a GAN can do the sampling, it cannot do the inference.
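To make the sampling-vs-inference distinction concrete, here is a minimal NumPy sketch (not from the thread, just an illustration with a made-up 1-D affine flow): an invertible flow gives an exact `log_prob` via the change-of-variables formula, while a GAN-style generator is generally non-invertible, so it can only sample.

```python
import numpy as np

rng = np.random.default_rng(0)

# A toy 1-D affine "flow": z = a * eps + b with eps ~ N(0, 1).
# Invertible, so the change-of-variables formula gives an exact density.
a, b = 2.0, 1.0

def flow_sample(n):
    return a * rng.standard_normal(n) + b

def flow_log_prob(z):
    eps = (z - b) / a                        # invert the flow
    log_base = -0.5 * (eps**2 + np.log(2 * np.pi))
    return log_base - np.log(abs(a))         # minus log|det Jacobian|

# A GAN generator is also just a map eps -> G(eps), but it is
# generally non-invertible, so there is no tractable log_prob:
def gan_sample(n):
    eps = rng.standard_normal(n)
    return np.maximum(a * eps + b, 0.0)      # e.g. a ReLU destroys invertibility

z = flow_sample(5)
print(flow_log_prob(z))  # exact log-densities under the flow prior
```

Both objects can produce samples, but only the flow supports evaluating `log_prob(z)` on arbitrary points, which is what likelihood-based training of the prior relies on.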
Thank you so much for the quick reply!😁
Hmm, I was too focused on the KL computation at first glance of your question. I think your idea is doable. GANs are good at producing high-quality samples. However, if you sample from the GAN and minimize the distance to P(Z|X, Y), it would be a reverse-KL computation. That doesn't mean it's a bad choice, though, since for TTS we care more about generative quality than about NLL or ELBO. Anyway, I think you can give it a shot. Good luck!
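To illustrate the reverse-KL direction mentioned above, here is a hedged NumPy sketch with two made-up 1-D Gaussians standing in for the sampler q and the target P(Z|X, Y). (Note the caveat: with a real GAN, log q at its own samples is intractable and the divergence is handled adversarially; Gaussians are used here only so both densities are available and the Monte Carlo estimate can be checked against the closed form.)

```python
import numpy as np

rng = np.random.default_rng(0)

# q stands in for the sampler (e.g. a GAN-based prior),
# p stands in for the target posterior P(Z|X, Y).
m_q, s_q = 0.0, 1.0
m_p, s_p = 1.0, 2.0

def log_normal(z, m, s):
    return -0.5 * ((z - m) / s) ** 2 - np.log(s) - 0.5 * np.log(2 * np.pi)

# Reverse KL, KL(q || p): the expectation is taken under samples FROM q.
# This is the (mode-seeking) direction you get when you draw from the
# sampler and push it toward the target.
z = m_q + s_q * rng.standard_normal(200_000)
kl_rev_mc = np.mean(log_normal(z, m_q, s_q) - log_normal(z, m_p, s_p))

# Closed form between two Gaussians, for comparison.
kl_rev = np.log(s_p / s_q) + (s_q**2 + (m_q - m_p) ** 2) / (2 * s_p**2) - 0.5
print(kl_rev_mc, kl_rev)
```

Forward KL, KL(p || q), would instead average over samples from the target p, which is the direction likelihood training (NLL/ELBO) corresponds to; the two generally give different optima.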
Many thanks to you😁😊
If I understand this paper and FlowSeq correctly, the normalizing flow is used to model the prior conditioned on the text X (trained to match the posterior P(Z|X, Y)).
Since a GAN can also model a distribution, can I use a GAN-based network to replace the flow-based prior P(Z|X)?