
Metropolis broadcasting issue #1083

Closed

morepoch opened this issue May 5, 2016 · 1 comment

@morepoch commented May 5, 2016
I'm trying to use the Metropolis sampling method for a simple hierarchical model; the following is the code:

import pymc3 as pm
import numpy as np
#%% generate synthetic data
MU_PHI = -10
SD_PHI = 2
MU_THETA = 10.0
SD_THETA = 2
NUM_SUBJ = 20
NUM_OBS = 60
t_x = np.random.normal(size=(NUM_SUBJ, NUM_OBS))
t_z = np.random.normal(size=(NUM_SUBJ, NUM_OBS))
c_y = np.random.normal(size=(NUM_SUBJ, NUM_OBS))
#%% Model
basic_model = pm.Model()
with basic_model:
    # hyper-parameter
    mu_phi = pm.Normal('mu_phi', mu=MU_PHI, sd=SD_PHI)
    mu_theta = pm.Normal('mu_theta', mu=MU_THETA, sd=SD_THETA)
    # parameter
    phi = pm.Normal('phi', mu=mu_phi, sd=SD_PHI, shape=(NUM_SUBJ, 1))
    theta = pm.Normal('theta', mu=mu_theta, sd=SD_THETA, shape=(NUM_SUBJ, 1))
    # deterministic variable
    mu_c_y = theta - phi
    # observation
    var_c_y = pm.Normal('c_y', mu=mu_c_y, sd=1.0, shape=(NUM_SUBJ, NUM_OBS), observed=c_y)
    # find a MAP starting point, then draw samples with Metropolis
    start = pm.find_MAP()
    trace = pm.sample(10000, start=start, step=pm.Metropolis())
  • [Case 1] When I run the above code directly, I get the following ValueError:
ValueError: Input dimension mis-match. (input[1].shape[1] = 60, input[3].shape[1] = 1)
Apply node that caused the error: Elemwise{Composite{(i0 + (-sqr((i1 - (i2 - i3)))))}}(TensorConstant{(1L, 1L) o..3787715435}, TensorConstant{[[-0.55792..13113699]]}, Reshape{2}.0, phi_shared)
Toposort index: 7
Inputs types: [TensorType(float64, (True, True)), TensorType(float64, matrix), TensorType(float64, col), TensorType(float64, matrix)]
Inputs shapes: [(1L, 1L), (20L, 60L), (20L, 1L), (20L, 1L)]
Inputs strides: [(8L, 8L), (480L, 8L), (8L, 8L), (8L, 8L)]
Inputs values: [array([[-1.83787715]]), 'not shown', 'not shown', 'not shown']
Outputs clients: [[Sum{acc_dtype=float64}(Elemwise{Composite{(i0 + (-sqr((i1 - (i2 - i3)))))}}.0)]]

I guess it's because Theano requires broadcastable dimensions to be declared in the graph before compilation, while NumPy can do broadcasting at run time (see the sketch after this list).

  • [Case 2] When I modify the "deterministic variable" from
mu_c_y = theta - phi

to

mu_c_y = phi

or

mu_c_y = theta

Then there is no error any more and the sampling goes well. The question is why there is a broadcasting issue when the deterministic variable involves two different stochastic variables, but no issue when it involves only one (see the workaround sketch after this list).

  • [Case 3] With the original code, if I switch to the NUTS sampler, the sampling goes well without any issues.
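For reference, here is a minimal sketch (mine, not from the original report) of the difference guessed at in Case 1: NumPy decides broadcasting from the runtime shapes, while Theano fixes the broadcastable pattern of each variable when the graph is built, so a (20, 1) array fed through a plain tt.matrix input hits the same "Input dimension mis-match" error.

import numpy as np
import theano
import theano.tensor as tt

a = np.zeros((20, 60))
b = np.zeros((20, 1))

# NumPy broadcasts (20, 1) against (20, 60) at run time
print((a - b).shape)                      # (20, 60)

# tt.matrix has broadcastable pattern (False, False), so the compiled graph
# does not know that the second dimension of y has length 1
x = tt.matrix('x')
y = tt.matrix('y')
f = theano.function([x, y], x - y)
# f(a, b)  # raises ValueError: Input dimension mis-match

# declaring the column shape up front (pattern (False, True)) lets it broadcast
y_col = tt.col('y_col')
g = theano.function([x, y_col], x - y_col)
print(g(a, b).shape)                      # (20, 60)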
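One possible workaround for Cases 1 and 2 (an untested sketch, not a confirmed fix) is to make the broadcast explicit with tt.tile, so the likelihood of c_y no longer depends on the broadcastable flags of phi and theta. With import theano.tensor as tt added next to the other imports, the deterministic-variable and observation lines in the model above would become:

    # deterministic variable, broadcast explicitly to the observation shape;
    # tt.tile repeats the (NUM_SUBJ, 1) column NUM_OBS times and does not rely
    # on the broadcastable flags of phi and theta
    mu_c_y = tt.tile(theta - phi, (1, NUM_OBS))
    # observation
    var_c_y = pm.Normal('c_y', mu=mu_c_y, sd=1.0, shape=(NUM_SUBJ, NUM_OBS), observed=c_y)

Since Case 3 already works, switching to NUTS (or calling pm.sample without a step argument and letting PyMC3 choose one) is also an option.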
@ColCarroll (Member) commented Aug 16, 2019

Closing for inactivity.

ColCarroll closed this Aug 16, 2019
