MDN Implementation using Edward2 #44

Open
Niknafs opened this issue Oct 24, 2019 · 2 comments
Labels
good first issue Good for newcomers

Comments

@Niknafs

Niknafs commented Oct 24, 2019

The current mixture density network (MDN) example from the Edward tutorials needs small modifications to run on edward2. Documentation covering these modifications would be appreciated.

@dustinvtran dustinvtran added the good first issue Good for newcomers label Oct 26, 2019
@dustinvtran
Member

This would be hugely useful if anyone's interested in contributing!

For the record, the following model works:

import tensorflow as tf

def build_neural_network():
  """Maps features to mixture component locations, scales, and logits."""
  inputs = tf.keras.layers.Input(...)
  net = tf.keras.layers.Dense(15, activation='relu')(inputs)
  net = tf.keras.layers.Dense(15, activation='relu')(net)
  locs = tf.keras.layers.Dense(K, activation=None)(net)
  # Scales must be positive, hence the exponential activation.
  scales = tf.keras.layers.Dense(K, activation=tf.exp)(net)
  logits = tf.keras.layers.Dense(K, activation=None)(net)
  model = tf.keras.Model(inputs=inputs, outputs=[locs, scales, logits])
  return model

K = 20  # number of mixture components
features = ...  # data features

neural_network = build_neural_network()
locs, scales, logits = neural_network(features)  # each of shape [batch, K]
cat = Categorical(logits=logits)
components = [Normal(loc=loc, scale=scale) for loc, scale
              in zip(tf.unstack(tf.transpose(locs)),
                     tf.unstack(tf.transpose(scales)))]
y = Mixture(cat=cat, components=components, value=tf.zeros_like(features))

You can then train it using gradient descent following any TF 2.0 tutorial.
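For example, a training step could look roughly like this (just an untested sketch: it assumes the Categorical/Normal/Mixture construction above works as written, that labels is a [batch]-shaped tensor of observed targets, and it minimizes the negative log-likelihood through the random variable's underlying distribution):

import tensorflow as tf

optimizer = tf.keras.optimizers.Adam(learning_rate=1e-3)

@tf.function
def train_step(features, labels):
  with tf.GradientTape() as tape:
    locs, scales, logits = neural_network(features)
    cat = Categorical(logits=logits)
    components = [Normal(loc=loc, scale=scale) for loc, scale
                  in zip(tf.unstack(tf.transpose(locs)),
                         tf.unstack(tf.transpose(scales)))]
    y = Mixture(cat=cat, components=components, value=labels)
    # Negative log-likelihood of the observed targets under the mixture.
    loss = -tf.reduce_mean(y.distribution.log_prob(labels))
  grads = tape.gradient(loss, neural_network.trainable_variables)
  optimizer.apply_gradients(zip(grads, neural_network.trainable_variables))
  return loss

Calling train_step repeatedly over batches of (features, labels) then fits the network parameters.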

@Niknafs
Author

Niknafs commented Oct 29, 2019

Thanks, Dustin! Can you please verify that the references to Categorical and Normal are from edward2, and not tfp.distributions?

When running the above using Categorical and Normal from edward2, I get the following error:

TypeError: cat must be a Categorical distribution, but saw: RandomVariable("Categorical_1/", shape=(?,), dtype=int32)
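If I read the error correctly, Mixture's type check wants a tfp.distributions.Categorical rather than the edward2 RandomVariable wrapper. For illustration only, a rough (untested) sketch of the same mixture built directly from tfp.distributions, with y_true as a placeholder for the observed targets, would be:

import tensorflow as tf
import tensorflow_probability as tfp
tfd = tfp.distributions

locs, scales, logits = neural_network(features)  # each of shape [batch, K]
mixture = tfd.MixtureSameFamily(
    mixture_distribution=tfd.Categorical(logits=logits),
    components_distribution=tfd.Normal(loc=locs, scale=scales))
# Negative log-likelihood of observed targets y_true with shape [batch].
nll = -tf.reduce_mean(mixture.log_prob(y_true))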

Also, do you mind sharing a pointer to one such TF 2.0 tutorial? I am running TF 1.14.0 and TFP 0.7.0.
