
Requesting Advice on NF Methods #70

Closed
kayuksel opened this issue Feb 22, 2021 · 1 comment
Labels
question Further information is requested

Comments

@kayuksel

I am working on a project where I sample a set of n-dimensional points from a Gaussian distribution (with learnt parameters) as follows, and then evaluate those points with a loss function to update the model parameters via gradient descent.

mu, std = self.lin_1(z), self.lin_2(z)     # learnt Gaussian parameters
eps = torch.randn(*img_shape).cuda()       # standard-normal noise, eps ~ N(0, I)
return self.act((eps * std) + mu)          # reparameterized sample

I would like to transform the Gaussian distribution so that those points are effectively sampled from a more complex learnt distribution. In other words, the model needs to learn how best to transform points drawn from the Gaussian.
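The setup described above can be sketched with a single planar-flow layer (Rezende & Mohamed, 2015) standing in for the learnt transform. This is only an illustrative sketch, not code from the project; the planar layer is just one of the simplest invertible transforms one could plug in here:

```python
import torch
import torch.nn as nn

class PlanarFlow(nn.Module):
    """One planar-flow layer: f(z) = z + u * tanh(w . z + b)."""
    def __init__(self, dim):
        super().__init__()
        self.u = nn.Parameter(torch.randn(dim) * 0.01)
        self.w = nn.Parameter(torch.randn(dim) * 0.01)
        self.b = nn.Parameter(torch.zeros(1))

    def forward(self, z):
        # z: (batch, dim) -> transformed samples of the same shape
        activation = torch.tanh(z @ self.w + self.b)      # (batch,)
        return z + activation.unsqueeze(-1) * self.u

# Draw reparameterized Gaussian samples, then push them through
# the learnt transform before evaluating the loss.
dim, batch = 8, 5
mu, std = torch.zeros(dim), torch.ones(dim)
eps = torch.randn(batch, dim)
flow = PlanarFlow(dim)
x = flow(mu + eps * std)
print(x.shape)  # torch.Size([5, 8])
```

In practice several such layers would be stacked, and their parameters trained jointly with the rest of the model by the same gradient-descent loop.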

I would be glad if you could suggest the best normalizing-flow method (transform) to employ given the following scalability requirements, whether or not it is available in this repo. Thank you very much in advance for your suggestion.

  • I am sampling 100K-dimensional points with a batch size of 5K; hence, scalability is crucial.
  • The method should be memory-efficient and fast to train on an RTX-series desktop Nvidia GPU.
  • Ideally, the method should not add a regularization term to my current loss function.
  • Expressiveness is less important than scalability and robustness during training.
@kayuksel kayuksel added the question Further information is requested label Feb 22, 2021
@Zymrael
Member

Zymrael commented Jun 3, 2021

Hi @kayuksel. I would suggest the Neural Spline Flow line of work for a good trade-off between scalability and expressivity. I'm not sure whether scaling to 100k dimensions is possible out of the box. This is not really related to torchdyn, but you might want to check out Pyro's implementation. Pyro is a robust and well-tested library for density estimation, generative modeling, and probabilistic programming in general.
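To make the suggestion concrete: Neural Spline Flows belong to the coupling-layer family, where only half the dimensions are transformed at a time, conditioned on the other half, so the Jacobian determinant stays cheap even in high dimensions. The sketch below is not Pyro's spline implementation; it is a minimal RealNVP-style affine coupling layer in plain PyTorch, shown only to illustrate the shared structure (all names here are hypothetical):

```python
import torch
import torch.nn as nn

class AffineCoupling(nn.Module):
    """RealNVP-style coupling: transform half the dims conditioned on the rest."""
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.half = dim // 2
        # Small conditioner net producing a per-dimension scale and shift.
        self.net = nn.Sequential(
            nn.Linear(self.half, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * (dim - self.half)),
        )

    def forward(self, z):
        z1, z2 = z[:, :self.half], z[:, self.half:]
        log_s, t = self.net(z1).chunk(2, dim=-1)
        log_s = torch.tanh(log_s)          # bound the scales for training stability
        x2 = z2 * torch.exp(log_s) + t
        # log|det J| is just the sum of log-scales: O(dim) per sample.
        log_det = log_s.sum(dim=-1)
        return torch.cat([z1, x2], dim=-1), log_det

z = torch.randn(4, 10)
x, log_det = AffineCoupling(10)(z)
print(x.shape, log_det.shape)  # torch.Size([4, 10]) torch.Size([4])
```

A spline flow replaces the affine scale-and-shift with a monotonic rational-quadratic spline for more expressivity at similar cost; either way, the triangular Jacobian is what keeps the method tractable at the scales asked about above.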

@Zymrael Zymrael closed this as completed Jul 20, 2021