
Clarifying the * N in log_joint? #105

Closed
jilljenn opened this issue May 18, 2018 · 4 comments

Comments

@jilljenn commented May 18, 2018

Hi!

Here:

N, n_x = x_train.shape

...

def log_joint(observed):
    model, _ = bayesianNN(observed, x, n_x, layer_sizes, n_particles)
    log_pws = model.local_log_prob(w_names)
    log_py_xw = model.local_log_prob('y')
    return tf.add_n(log_pws) + log_py_xw * N

Source: https://github.com/thu-ml/zhusuan/blob/master/examples/bayesian_neural_nets/bayesian_nn.py#L96

Shouldn't log_py_xw be multiplied by B, the batch size corresponding to tf.shape(x)[0], instead of by the size of the training set?

@meta-inf (Member) commented May 18, 2018

The training set has size N, not batch_size, so an estimate of the log-likelihood of the whole training set should be scaled by N.
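To make the scaling concrete, here is a small numpy sketch (not ZhuSuan code; the per-example log-likelihoods are made up) showing that N times the minibatch mean is an unbiased estimate of the log-likelihood summed over the full training set:

```python
import numpy as np

rng = np.random.default_rng(0)
N, B = 1000, 50  # training-set size and batch size

# Hypothetical per-example log-likelihoods log p(y_i | x_i, w).
log_lik = rng.normal(loc=-2.0, scale=0.5, size=N)
full_sum = log_lik.sum()  # the quantity the joint needs: a sum over all N

# N * (mean over a random batch) averages out to full_sum,
# i.e. the minibatch estimator is unbiased.
estimates = [N * rng.choice(log_lik, size=B, replace=False).mean()
             for _ in range(2000)]
rel_err = abs(np.mean(estimates) - full_sum) / abs(full_sum)
print(rel_err)
```

With enough batches the relative error is well under 1%, which is why the code multiplies by N rather than by B.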

@jilljenn (Author)

OK thanks :) I'm still surprised there isn't an N / B somewhere.
If x is a batch, does log_py_xw already return a per-sample mean? Should that be mentioned in the docs?

@meta-inf (Member) commented May 18, 2018

Since y in the model isn't defined with non-zero group_ndims, its local log-prob has shape [n_particles, batch_size], and the division by B is performed by the reduce_mean at L103/104.

The code does seem a bit ambiguous. Maybe we should add some comments, but they are being rewritten in the v4 branch anyway...
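The shape bookkeeping above can be sketched in numpy (a stand-in for the TF code; the values are random placeholders, only the shapes mirror the example):

```python
import numpy as np

# y has group_ndims == 0, so its local log-prob keeps one entry
# per sample: shape [n_particles, batch_size].
n_particles, batch_size, N = 10, 32, 1000
rng = np.random.default_rng(1)
log_py_xw = rng.normal(-2.0, 0.5, size=(n_particles, batch_size))

# log_joint scales by N ...
scaled = log_py_xw * N
# ... and the later mean over the batch axis supplies the implicit
# division by B, leaving an (N/B)-weighted sum per particle.
per_particle = scaled.mean(axis=1)
print(per_particle.shape)  # (10,)
```

So the N / B factor is there, just split across two places: `* N` inside log_joint and the `/ B` inside the reduce_mean.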

@jilljenn (Author)

Thanks, I'll have a look.
