
TypeError: Unknown parameter type: <class 'theano.tensor.var.TensorVariable'> #4468

Closed
Erikmeier18 opened this issue Feb 9, 2021 · 2 comments

Comments


Erikmeier18 commented Feb 9, 2021


Description of your problem

I'm new to pymc3 and wanted to learn how to use it with a simple example [https://docs.pymc.io/notebooks/GLM-linear.html]. When building and running the model, I get an error related to theano.

Please provide a minimal, self-contained, and reproducible example.

import arviz as az
import matplotlib.pyplot as plt
import numpy as np
import pymc3 as pm

%config InlineBackend.figure_format = 'retina'
az.style.use("arviz-darkgrid")

size = 200
true_intercept = 1
true_slope = 2

x = np.linspace(0, 1, size)
true_regression_line = true_intercept + true_slope * x
y = true_regression_line + np.random.normal(scale=0.5, size=size)
data = dict(x=x, y=y)

fig = plt.figure(figsize=(7, 7))
ax = fig.add_subplot(111, xlabel="x", ylabel="y", title="Generated data and underlying model")
ax.plot(x, y, "x", label="sampled data")
ax.plot(x, true_regression_line, label="true regression line", lw=2.0)
plt.legend(loc=0);

with pm.Model() as model:
    sigma = pm.HalfCauchy("sigma", beta=10, testval=1.0)
    intercept = pm.Normal("Intercept", 0, sigma=20)
    x_coeff = pm.Normal("x", 0, sigma=20)
    likelihood = pm.Normal("y", mu=intercept + x_coeff * x, sigma=sigma, observed=y)
    trace = pm.sample(3000, cores=2)
    
az.plot_trace(trace)

Please provide the full traceback.

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-1-807f8acca5be> in <module>
     27     x_coeff = pm.Normal("x", 0, sigma=20)
     28     likelihood = pm.Normal("y", mu=intercept + x_coeff * x, sigma=sigma, observed=y)
---> 29     trace = pm.sample(3000, cores=2)
     30 
     31 az.plot_trace(trace)

/opt/anaconda3/lib/python3.8/site-packages/pymc3/sampling.py in sample(draws, step, init, n_init, start, trace, chain_idx, chains, cores, tune, progressbar, model, random_seed, discard_tuned_samples, compute_convergence_checks, callback, return_inferencedata, idata_kwargs, mp_ctx, pickle_backend, **kwargs)
    479             # By default, try to use NUTS
    480             _log.info("Auto-assigning NUTS sampler...")
--> 481             start_, step = init_nuts(
    482                 init=init,
    483                 chains=chains,

/opt/anaconda3/lib/python3.8/site-packages/pymc3/sampling.py in init_nuts(init, chains, n_init, model, random_seed, progressbar, **kwargs)
   2168         raise ValueError("Unknown initializer: {}.".format(init))
   2169 
-> 2170     step = pm.NUTS(potential=potential, model=model, **kwargs)
   2171 
   2172     return start, step

/opt/anaconda3/lib/python3.8/site-packages/pymc3/step_methods/hmc/nuts.py in __init__(self, vars, max_treedepth, early_max_treedepth, **kwargs)
    166         `pm.sample` to the desired number of tuning steps.
    167         """
--> 168         super().__init__(vars, **kwargs)
    169 
    170         self.max_treedepth = max_treedepth

/opt/anaconda3/lib/python3.8/site-packages/pymc3/step_methods/hmc/base_hmc.py in __init__(self, vars, scaling, step_scale, is_cov, model, blocked, potential, dtype, Emax, target_accept, gamma, k, t0, adapt_step_size, step_rand, **theano_kwargs)
     91         vars = inputvars(vars)
     92 
---> 93         super().__init__(vars, blocked=blocked, model=model, dtype=dtype, **theano_kwargs)
     94 
     95         self.adapt_step_size = adapt_step_size

/opt/anaconda3/lib/python3.8/site-packages/pymc3/step_methods/arraystep.py in __init__(self, vars, model, blocked, dtype, **theano_kwargs)
    241         self.blocked = blocked
    242 
--> 243         func = model.logp_dlogp_function(
    244             vars, dtype=dtype, **theano_kwargs)
    245 

/opt/anaconda3/lib/python3.8/site-packages/pymc3/model.py in logp_dlogp_function(self, grad_vars, **kwargs)
    933         varnames = [var.name for var in grad_vars]
    934         extra_vars = [var for var in self.free_RVs if var.name not in varnames]
--> 935         return ValueGradFunction(self.logpt, grad_vars, extra_vars, **kwargs)
    936 
    937     @property

/opt/anaconda3/lib/python3.8/site-packages/pymc3/model.py in __init__(self, cost, grad_vars, extra_vars, dtype, casting, **kwargs)
    652         inputs = [self._vars_joined]
    653 
--> 654         self._theano_function = theano.function(
    655             inputs, [self._cost_joined, grad], givens=givens, **kwargs
    656         )

/opt/anaconda3/lib/python3.8/site-packages/theano/compile/function/__init__.py in function(inputs, outputs, mode, updates, givens, no_default_updates, accept_inplace, name, rebuild_strict, allow_input_downcast, profile, on_unused_input)
    335         # note: pfunc will also call orig_function -- orig_function is
    336         #      a choke point that all compilation must pass through
--> 337         fn = pfunc(
    338             params=inputs,
    339             outputs=outputs,

/opt/anaconda3/lib/python3.8/site-packages/theano/compile/function/pfunc.py in pfunc(params, outputs, mode, updates, givens, no_default_updates, accept_inplace, name, rebuild_strict, allow_input_downcast, profile, on_unused_input, output_keys)
    424 
    425     # transform params into theano.compile.In objects.
--> 426     inputs = [
    427         _pfunc_param_to_in(p, allow_downcast=allow_input_downcast) for p in params
    428     ]

/opt/anaconda3/lib/python3.8/site-packages/theano/compile/function/pfunc.py in <listcomp>(.0)
    425     # transform params into theano.compile.In objects.
    426     inputs = [
--> 427         _pfunc_param_to_in(p, allow_downcast=allow_input_downcast) for p in params
    428     ]
    429 

/opt/anaconda3/lib/python3.8/site-packages/theano/compile/function/pfunc.py in _pfunc_param_to_in(param, strict, allow_downcast)
    541     elif isinstance(param, In):
    542         return param
--> 543     raise TypeError(f"Unknown parameter type: {type(param)}")
    544 
    545 

TypeError: Unknown parameter type: <class 'theano.tensor.var.TensorVariable'>

Please provide any additional information below.
Earlier I was able to get the trace but ran into problems with az.plot_trace(), which I tried to solve. I updated the macOS Command Line Tools (CLT), uninstalled pymc3 and theano, and tried installing from git; version 3.11 gave a different theano error related to gcc, so I uninstalled again and reinstalled with conda (which gave version 3.9.3). I'm running Big Sur, and the problem could be related to my machine, since a colleague of mine on an older OS was able to run the complete code correctly. I would appreciate any help in getting this solved.

Versions and main components

  • PyMC3 Version: 3.9.3
  • Theano Version: 1.0.5
  • Python Version: 3.8.3 (Clang 10.0.0)
  • Operating system: macOS Big Sur 11.2
  • How did you install PyMC3: (conda/pip) conda install pymc3 / conda install -c conda-forge mkl pymc3
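Version mismatches like the ones in this thread are easier to track down with a short report of what is actually importable. A minimal sketch (the package names are the ones from this issue; this is not an official PyMC3 utility):

```python
import importlib

def report_version(name):
    """Return a package's __version__ string, or None if it can't be read."""
    try:
        return importlib.import_module(name).__version__
    except (ImportError, AttributeError):
        return None

# Print the same version info requested by the issue template.
for name in ("pymc3", "theano", "arviz"):
    version = report_version(name)
    print(f"{name}: {version if version else 'not installed'}")
```

Running this in the same environment (or notebook kernel) that raises the error confirms which pymc3/theano pair is actually in use.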
@michaelosthege
Member

Closing this issue, after the problem was resolved via discourse: https://discourse.pymc.io/t/typeerror-unknown-parameter-type-class-theano-tensor-var-tensorvariable/6745/3

@philongvn99

[Google Colab] This solution may come too late, but for me the cause was that the theano package was too old, which led to a mismatch between theano.Variable and theano.tensor.var.TensorVariable. Try removing the theano folder (or at least all of its files) and running again. Thanks to mqin for the solution.
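The stale-install mismatch described above can often be spotted by checking where Python would actually import each package from. A hedged diagnostic sketch (not from the thread; the package names are just the ones involved here):

```python
import importlib.util

def import_location(name):
    """Return the file a package would be imported from, or None if absent."""
    spec = importlib.util.find_spec(name)
    return spec.origin if spec else None

# A leftover theano folder shadowing the fresh install shows up here as an
# unexpected path (e.g. a stale site-packages or working-directory copy).
for name in ("theano", "pymc3"):
    print(f"{name}: {import_location(name) or 'not found'}")
```

If the reported path points somewhere other than the environment you just installed into, deleting that leftover folder and reinstalling is the fix suggested in this comment.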
