There seems to be an error when using manual model vectorization for transformed variables when the event shape of the variable is larger than 1. A minimal example would be:
```python
import numpy as np
import pymc4 as pm
import tensorflow as tf

means = np.random.random((3, 1)) * 5 + 5
noise = np.random.random((3, 10))
data = means + noise
data = data.astype('float32')

# We want to infer the means of the data:
@pm.model
def model():
    means = yield pm.HalfNormal(name='means', loc=0, scale=10, event_stack=3)
    means = tf.repeat(tf.expand_dims(means, axis=-1), axis=-1, repeats=10)
    likelihood = yield pm.Normal(name='likeli', loc=means, scale=5, observed=data,
                                 reinterpreted_batch_ndims=2)

trace = pm.sample(model(), num_samples=50, burn_in=200, use_auto_batching=False, num_chains=2)
print(means)
print(np.median(trace.posterior['model/means'], axis=(0, 1)))
```
When the `pm.HalfNormal` is replaced by `pm.Normal`, it works without problems.
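This matters because a `HalfNormal` variable is constrained to be positive, so the sampler works on an unconstrained space via a transform and must apply a Jacobian correction to the log-probability, while a `Normal` needs no transform at all. The following is a minimal NumPy sketch of that correction for a log transform (an illustration of the general idea, not the actual PyMC4/TFP code):

```python
import numpy as np

# Half-normal density on x > 0: sqrt(2/pi)/scale * exp(-x^2 / (2*scale^2))
def halfnormal_logpdf(x, scale):
    return np.log(np.sqrt(2.0 / np.pi) / scale) - x**2 / (2 * scale**2)

# Sampling happens on the unconstrained variable y = log(x).
# The density in y-space needs the log-Jacobian term log|dx/dy| = y:
def transformed_logpdf(y, scale):
    x = np.exp(y)
    return halfnormal_logpdf(x, scale) + y

# Sanity check: with the Jacobian term, the y-space density
# still integrates to ~1 (simple Riemann sum):
ys = np.linspace(-10, 5, 100000)
total = (np.exp(transformed_logpdf(ys, scale=1.0)) * (ys[1] - ys[0])).sum()
print(total)  # close to 1.0
```

If the Jacobian term is summed over the wrong number of dimensions, this correction ends up with the wrong shape, which is consistent with the error appearing only for the transformed (`HalfNormal`) variable.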
If I understood the organization of the source code correctly, this error is due to the fact that the correct number of event-shape dimensions is not passed to `inverse_log_det_jacobian` and `forward_log_det_jacobian` of TensorFlow Probability, for example exactly here: https://github.com/pymc-devs/pymc4/blob/master/pymc4/distributions/transforms.py#L131. The `Transform` class should probably also have the number of event-shape dimensions as an attribute, to be able to calculate the log-determinant of the Jacobian correctly.
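To illustrate why the number of event dimensions matters here: for an elementwise bijector, the log-det-Jacobian is computed per element and then summed over the last `event_ndims` dimensions. A NumPy sketch of this behavior for `y = exp(x)` (mirroring TFP's `event_ndims` semantics, not TFP code itself):

```python
import numpy as np

def exp_forward_log_det_jacobian(x, event_ndims):
    # For the elementwise bijector y = exp(x), the per-element
    # log|dy/dx| is simply x. With event_ndims = k, the Jacobian
    # log-determinant sums this over the last k dimensions.
    per_element = x
    if event_ndims == 0:
        return per_element
    return per_element.sum(axis=tuple(range(-event_ndims, 0)))

x = np.random.random((2, 3))  # batch of 2, event shape (3,)
print(exp_forward_log_det_jacobian(x, event_ndims=0).shape)  # (2, 3)
print(exp_forward_log_det_jacobian(x, event_ndims=1).shape)  # (2,)
```

Passing the wrong `event_ndims` therefore changes the shape of the log-det-Jacobian, which would explain a shape mismatch in the gradient computation when `event_stack` makes the event shape non-scalar.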
But possibly I am just using PyMC4 wrongly and there is another way to specify the model...
Thanks for reporting this @jdehning. I can reproduce the exception you are running into and can confirm that it's coming from the gradient computation. I still have to investigate more before confirming that the culprits are `inverse_log_det_jacobian` and `forward_log_det_jacobian`, but what you posted was very helpful for pinpointing the cause thus far. I'm not sure when I'll get a chance to fix this. Maybe @ferrine has some time to look into the problem a bit more.
Looks like I failed to fix the error quickly. I've tried to figure out whether it is a wrong usage of `reinterpreted_batch_ndims=2` and similar arguments, but without success.