1 change: 1 addition & 0 deletions RELEASE-NOTES.md
@@ -6,6 +6,7 @@

### Maintenance

- All occurrences of `sd` as a parameter name have been renamed to `sigma`. `sd` will continue to function for backwards compatibility (see the short sketch below this list).
- Made `BrokenPipeError` for parallel sampling more verbose on Windows.
- Added the `broadcast_distribution_samples` function, which helps broadcast arrays of drawn samples while taking into account the requested `size` and the inferred distribution shape. This is sometimes needed by distributions that call several `rvs` separately within their `random` method, such as `ZeroInflatedPoisson` (fixes issue #3310).
- The `Wald`, `Kumaraswamy`, `LogNormal`, `Pareto`, `Cauchy`, `HalfCauchy`, `Weibull` and `ExGaussian` distributions' `random` method used a hidden `_random` function that was written with scalars in mind. This could potentially lead to artificial correlations between random draws. Added shape guards and broadcasting of the distribution samples to prevent this (similar to issue #3310).
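
A minimal sketch of the rename (illustrative only, not from the release notes; it assumes `pymc3` is imported as `pm`):

```python
import pymc3 as pm

with pm.Model():
    a = pm.Normal('a', mu=0, sigma=1)  # new, preferred spelling
    b = pm.Normal('b', mu=0, sd=1)     # old spelling, still accepted for backwards compatibility
```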
8 changes: 4 additions & 4 deletions docs/source/Advanced_usage_of_Theano_in_PyMC3.rst
@@ -32,7 +32,7 @@ be time consuming if the number of datasets is large)::
    data = theano.shared(observed_data[0])
    with pm.Model() as model:
        mu = pm.Normal('mu', 0, 10)
-       pm.Normal('y', mu=mu, sd=1, observed=data)
+       pm.Normal('y', mu=mu, sigma=1, observed=data)

    # Generate one trace for each dataset
    traces = []
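
The loop body is elided by the diff view; a minimal sketch of the usual pattern (assuming ``observed_data`` is a list of arrays)::

    for data_vals in observed_data:
        # Swap the next dataset into the shared container and refit
        data.set_value(data_vals)
        with model:
            traces.append(pm.sample())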
@@ -53,7 +53,7 @@ variable for our observations::
    x_shared = theano.shared(x)

    with pm.Model() as model:
-       coeff = pm.Normal('x', mu=0, sd=1)
+       coeff = pm.Normal('x', mu=0, sigma=1)
        logistic = pm.math.sigmoid(coeff * x_shared)
        pm.Bernoulli('obs', p=logistic, observed=y)

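The point of the shared ``x_shared`` is out-of-sample prediction; a hedged sketch of the typical usage (the new predictor values are invented for illustration)::

    with model:
        trace = pm.sample()

    # Swap in new predictor values, then draw posterior predictive samples
    x_shared.set_value([-1., 0., 1.])
    with model:
        post_pred = pm.sample_posterior_predictive(trace, samples=500)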
@@ -210,8 +210,8 @@ We can now define our model using this new op::
    tt_mu_from_theta = MuFromTheta()

    with pm.Model() as model:
-       theta = pm.HalfNormal('theta', sd=1)
+       theta = pm.HalfNormal('theta', sigma=1)
        mu = pm.Deterministic('mu', tt_mu_from_theta(theta))
-       pm.Normal('y', mu=mu, sd=0.1, observed=[0.2, 0.21, 0.3])
+       pm.Normal('y', mu=mu, sigma=0.1, observed=[0.2, 0.21, 0.3])

        trace = pm.sample()
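
A usage note (mine, not the page's): because ``mu`` is wrapped in ``pm.Deterministic``, its values are recorded in the trace alongside ``theta``, so ``trace['mu']`` can be inspected directly after sampling.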
2 changes: 1 addition & 1 deletion docs/source/Probability_Distributions.rst
@@ -12,7 +12,7 @@ For example, if we wish to define a particular variable as having a normal prior

    with pm.Model():

-       x = pm.Normal('x', mu=0, sd=1)
+       x = pm.Normal('x', mu=0, sigma=1)

A variable requires at least a ``name`` argument, and zero or more model parameters, depending on the distribution. Parameter names vary by distribution, using conventional names wherever possible. The example above defines a scalar variable. To make a vector-valued variable, a ``shape`` argument should be provided; for example, a 3x3 matrix of beta random variables could be defined with:

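The 3x3 example the sentence refers to is cut off by the diff view; a sketch of what it would look like (parameter values are illustrative)::

    with pm.Model():
        # shape=(3, 3) gives a 3x3 array of Beta random variables
        p = pm.Beta('p', alpha=1, beta=1, shape=(3, 3))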
8 changes: 4 additions & 4 deletions docs/source/PyMC3_and_Theano.rst
@@ -134,8 +134,8 @@ happens if we define a PyMC3 model. Let's look at a simple example::
    data = true_mu + np.random.randn(50)

    with pm.Model() as model:
-       mu = pm.Normal('mu', mu=0, sd=1)
-       y = pm.Normal('y', mu=mu, sd=1, observed=data)
+       mu = pm.Normal('mu', mu=0, sigma=1)
+       y = pm.Normal('y', mu=mu, sigma=1, observed=data)

In this model we define two variables: `mu` and `y`. The first is
a free variable that we want to infer, the second is an observed
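
As a quick check of that free/observed distinction (a hedged sketch, assuming the model above has been built)::

    model.free_RVs      # ==> [mu]
    model.observed_RVs  # ==> [y]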
@@ -184,7 +184,7 @@ example::
    with pm.Model() as model:
        mu = pm.Normal('mu', 0, 1)
        sd = pm.HalfNormal('sd', 1)
-       y = pm.Normal('y', mu=mu, sd=sd, observed=data)
+       y = pm.Normal('y', mu=mu, sigma=sd, observed=data)

is roughly equivalent to this::

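The equivalent code is elided by the diff view; it can be sketched as a plain log-density over the transformed free variables (a hedged reconstruction, not the page's exact code; ``data`` is the array defined above)::

    import numpy as np
    import scipy.stats as stats

    def model_logp(mu, log_sd):
        sd = np.exp(log_sd)  # PyMC3 samples sd on the log scale
        logp = stats.norm.logpdf(mu, 0, 1)                    # prior on mu
        logp += stats.halfnorm.logpdf(sd, scale=1) + log_sd   # prior on sd plus log-Jacobian
        logp += stats.norm.logpdf(data, mu, sd).sum()         # likelihood
        return logp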
@@ -213,4 +213,4 @@ theano operation on them::
        beta = pm.Normal('beta', 0, 1, shape=len(design_matrix))
        predict = tt.dot(design_matrix, beta)
        sd = pm.HalfCauchy('sd', beta=2.5)
-       pm.Normal('y', mu=predict, sd=sd, observed=data)
+       pm.Normal('y', mu=predict, sigma=sd, observed=data)
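
A hedged usage sketch for the snippet above (it assumes the enclosing ``with pm.Model() as model:`` block from the elided context, and that ``design_matrix`` and ``data`` are NumPy arrays)::

    with model:
        trace = pm.sample(1000, tune=1000)
    pm.summary(trace)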
10 changes: 5 additions & 5 deletions docs/source/api/bounds.rst
@@ -39,27 +39,27 @@ specification of a bounded distribution should go within the model block::

    with pm.Model() as model:
        BoundedNormal = pm.Bound(pm.Normal, lower=0.0)
-       x = BoundedNormal('x', mu=1.0, sd=3.0)
+       x = BoundedNormal('x', mu=1.0, sigma=3.0)

If the bound will be applied to a single variable in the model, it may be
cleaner notationally to define both the bound and variable together. ::

    with model:
-       x = pm.Bound(pm.Normal, lower=0.0)('x', mu=1.0, sd=3.0)
+       x = pm.Bound(pm.Normal, lower=0.0)('x', mu=1.0, sigma=3.0)

However, it is possible to create multiple different random variables
that have the same bound applied to them::

    with model:
        BoundNormal = pm.Bound(pm.Normal, lower=0.0)
-       hyper_mu = BoundNormal("hyper_mu", mu=1, sd=0.5)
-       mu = BoundNormal("mu", mu=hyper_mu, sd=1)
+       hyper_mu = BoundNormal("hyper_mu", mu=1, sigma=0.5)
+       mu = BoundNormal("mu", mu=hyper_mu, sigma=1)

Bounds can also be applied to a vector of random variables. With the same
``BoundedNormal`` object we created previously we can write::

    with model:
-       x_vector = BoundedNormal('x_vector', mu=1.0, sd=3.0, shape=3)
+       x_vector = BoundedNormal('x_vector', mu=1.0, sigma=3.0, shape=3)

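A quick sanity-check sketch (mine, not the page's): every posterior draw should respect the bound::

    with model:
        trace = pm.sample(500)
    assert (trace['x_vector'] > 0).all()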
Caveats
#######
14 changes: 7 additions & 7 deletions docs/source/developer_guide.rst
@@ -147,8 +147,8 @@ explicit about the conversion. For example:
.. code:: python

    with pm.Model() as model:
-       z = pm.Normal('z', mu=0., sd=5.)                 # ==> pymc3.model.FreeRV, or theano.tensor with logp
-       x = pm.Normal('x', mu=z, sd=1., observed=5.)     # ==> pymc3.model.ObservedRV, also has logp properties
+       z = pm.Normal('z', mu=0., sigma=5.)              # ==> pymc3.model.FreeRV, or theano.tensor with logp
+       x = pm.Normal('x', mu=z, sigma=1., observed=5.)  # ==> pymc3.model.ObservedRV, also has logp properties
    x.logp({'z': 2.5})      # ==> -4.0439386
    model.logp({'z': 2.5})  # ==> -6.6973152

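A hand check of those numbers (my verification, not part of the guide): ``x.logp`` is the Gaussian log-density ``-0.5*log(2*pi) - 0.5*((5 - 2.5)/1)**2 ≈ -4.0439``, and ``model.logp`` adds the prior term ``-0.5*log(2*pi) - log(5) - 0.5*(2.5/5)**2 ≈ -2.6534``, giving ``-6.6973``.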
@@ -308,7 +308,7 @@ a model:
.. code:: python

    with pm.Model() as m:
-       x = pm.Normal('x', mu=0., sd=1.)
+       x = pm.Normal('x', mu=0., sigma=1.)


Which is the same as doing:
@@ -317,7 +317,7 @@
.. code:: python

    m = pm.Model()
-   x = m.Var('x', pm.Normal.dist(mu=0., sd=1.))
+   x = m.Var('x', pm.Normal.dist(mu=0., sigma=1.))


Both with the same output:
@@ -457,7 +457,7 @@ transformation <https://docs.pymc.io/notebooks/api_quickstart.html?highlight=cha

.. code:: python

-   z = pm.Lognormal.dist(mu=0., sd=1., transform=tr.Log)
+   z = pm.Lognormal.dist(mu=0., sigma=1., transform=tr.Log)
    z.transform  # ==> pymc3.distributions.transforms.Log


@@ -1051,14 +1051,14 @@ we get an error (or worse, a wrong answer with a silent error):
    with pm.Model() as m:
        mu = pm.Normal('mu', 0., 1., shape=(5, 1))
        sd = pm.HalfNormal('sd', 5., shape=(1, 10))
-       pm.Normal('x', mu=mu, sd=sd, observed=np.random.randn(2, 5, 10))
+       pm.Normal('x', mu=mu, sigma=sd, observed=np.random.randn(2, 5, 10))
        trace = pm.sample_prior_predictive(100)

    trace['x'].shape  # ==> should be (100, 2, 5, 10), but get (100, 5, 10)

.. code:: python

-   pm.Normal.dist(mu=np.zeros(2), sd=1).random(size=(10, 4))  # ==> ERROR
+   pm.Normal.dist(mu=np.zeros(2), sigma=1).random(size=(10, 4))  # ==> ERROR

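For contrast, a ``size`` that is compatible with the inferred distribution shape does work (a quick illustrative check of mine, not the guide's):

.. code:: python

    pm.Normal.dist(mu=np.zeros(2), sigma=1).random(size=10).shape  # ==> (10, 2)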
There are also other errors related to random sample generation (e.g.,
`Mixture is currently
Expand Down
4 changes: 2 additions & 2 deletions docs/source/history.rst
@@ -109,8 +109,8 @@ Models are defined using a context manager (``with`` statement). The model is sp
    with Model() as bioassay_model:

        # Prior distributions for latent variables
-       alpha = Normal('alpha', 0, sd=100)
-       beta = Normal('beta', 0, sd=100)
+       alpha = Normal('alpha', 0, sigma=100)
+       beta = Normal('beta', 0, sigma=100)

        # Linear combinations of parameters
        theta = invlogit(alpha + beta*dose)
4 changes: 2 additions & 2 deletions docs/source/index.rst
@@ -23,11 +23,11 @@

    X, y = linear_training_data()
    with pm.Model() as linear_model:
-       weights = pm.Normal('weights', mu=0, sd=1)
+       weights = pm.Normal('weights', mu=0, sigma=1)
        noise = pm.Gamma('noise', alpha=2, beta=1)
        y_observed = pm.Normal('y_observed',
                               mu=X.dot(weights),
-                              sd=noise,
+                              sigma=noise,
                               observed=y)

        prior = pm.sample_prior_predictive()
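
The snippet is truncated by the diff view; the natural continuation of the usual PyMC3 workflow (a sketch, still inside the model context) would be::

        posterior = pm.sample()
        posterior_pred = pm.sample_posterior_predictive(posterior)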
12 changes: 6 additions & 6 deletions docs/source/notebooks/AR.ipynb
@@ -177,8 +177,8 @@
"source": [
"tau = 1.0\n",
"with pm.Model() as ar1:\n",
" beta = pm.Normal('beta', mu=0, sd=tau)\n",
" data = pm.AR('y', beta, sd=1.0, observed=y)\n",
" beta = pm.Normal('beta', mu=0, sigma=tau)\n",
" data = pm.AR('y', beta, sigma=1.0, observed=y)\n",
" trace = pm.sample(1000, cores=4)\n",
" \n",
"pm.traceplot(trace);"
@@ -303,8 +303,8 @@
],
"source": [
"with pm.Model() as ar2:\n",
" beta = pm.Normal('beta', mu=0, sd=tau, shape=2)\n",
" data = pm.AR('y', beta, sd=1.0, observed=y)\n",
" beta = pm.Normal('beta', mu=0, sigma=tau, shape=2)\n",
" data = pm.AR('y', beta, sigma=1.0, observed=y)\n",
" trace = pm.sample(1000, cores=4)\n",
" \n",
"pm.traceplot(trace);"
@@ -362,9 +362,9 @@
],
"source": [
"with pm.Model() as ar2:\n",
" beta = pm.Normal('beta', mu=0, sd=tau)\n",
" beta = pm.Normal('beta', mu=0, sigma=tau)\n",
" beta2 = pm.Uniform('beta2')\n",
" data = pm.AR('y', [beta, beta2], sd=1.0, observed=y)\n",
" data = pm.AR('y', [beta, beta2], sigma=1.0, observed=y)\n",
" trace = pm.sample(1000, tune=1000, cores=4)\n",
"\n",
"pm.traceplot(trace);"
4 changes: 2 additions & 2 deletions docs/source/notebooks/BEST.ipynb
@@ -128,8 +128,8 @@
"μ_s = y.value.std() * 2\n",
"\n",
"with pm.Model() as model:\n",
" group1_mean = pm.Normal('group1_mean', μ_m, sd=μ_s)\n",
" group2_mean = pm.Normal('group2_mean', μ_m, sd=μ_s)"
" group1_mean = pm.Normal('group1_mean', μ_m, sigma=μ_s)\n",
" group2_mean = pm.Normal('group2_mean', μ_m, sigma=μ_s)"
]
},
{
@@ -147,10 +147,10 @@
"outputs": [],
"source": [
"with pm.Model() as Centered_eight:\n",
" mu = pm.Normal('mu', mu=0, sd=5)\n",
" mu = pm.Normal('mu', mu=0, sigma=5)\n",
" tau = pm.HalfCauchy('tau', beta=5)\n",
" theta = pm.Normal('theta', mu=mu, sd=tau, shape=J)\n",
" obs = pm.Normal('obs', mu=theta, sd=sigma, observed=y)"
" theta = pm.Normal('theta', mu=mu, sigma=tau, shape=J)\n",
" obs = pm.Normal('obs', mu=theta, sigma=sigma, observed=y)"
]
},
{
@@ -1321,11 +1321,11 @@
"outputs": [],
"source": [
"with pm.Model() as NonCentered_eight:\n",
" mu = pm.Normal('mu', mu=0, sd=5)\n",
" mu = pm.Normal('mu', mu=0, sigma=5)\n",
" tau = pm.HalfCauchy('tau', beta=5)\n",
" theta_tilde = pm.Normal('theta_t', mu=0, sd=1, shape=J)\n",
" theta_tilde = pm.Normal('theta_t', mu=0, sigma=1, shape=J)\n",
" theta = pm.Deterministic('theta', mu + tau * theta_tilde)\n",
" obs = pm.Normal('obs', mu=theta, sd=sigma, observed=y)"
" obs = pm.Normal('obs', mu=theta, sigma=sigma, observed=y)"
]
},
{
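
A note on the reparameterization above (mine, not the notebook's): the two models define the same joint distribution, since if theta_tilde ~ Normal(0, 1) then mu + tau * theta_tilde ~ Normal(mu, tau). The non-centered form simply moves the hierarchy out of the prior geometry the sampler has to navigate, which is what tames the funnel.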
4 changes: 2 additions & 2 deletions docs/source/notebooks/Euler-Maruyama_and_SDEs.ipynb
@@ -249,7 +249,7 @@
" xh = EulerMaruyama('xh', dt, lin_sde, (lam, ), shape=N, testval=x_t)\n",
" \n",
" # predicted observation\n",
" zh = pm.Normal('zh', mu=xh, sd=5e-3, observed=z_t)"
" zh = pm.Normal('zh', mu=xh, sigma=5e-3, observed=z_t)"
]
},
{
@@ -629,7 +629,7 @@
" ah = pm.Uniform('ah', lower=0.5, upper=1.5)\n",
" mh = pm.Uniform('mh', lower=0.0, upper=1.0)\n",
" xyh = EulerMaruyama('xyh', dt, osc_sde, (τh, ah), shape=xys.shape, testval=xys)\n",
" zh = pm.Normal('zh', mu=mh * xyh[:, 0] + (1 - mh) * xyh[:, 1], sd=0.1, observed=zs)"
" zh = pm.Normal('zh', mu=mh * xyh[:, 0] + (1 - mh) * xyh[:, 1], sigma=0.1, observed=zs)"
]
},
{
20 changes: 10 additions & 10 deletions docs/source/notebooks/GLM-hierarchical-advi-minibatch.ipynb
@@ -70,9 +70,9 @@
"source": [
"with pm.Model() as hierarchical_model:\n",
" # Hyperpriors for group nodes\n",
" mu_a = pm.Normal('mu_alpha', mu=0., sd=100**2)\n",
" mu_a = pm.Normal('mu_alpha', mu=0., sigma=100**2)\n",
" sigma_a = pm.Uniform('sigma_alpha', lower=0, upper=100)\n",
" mu_b = pm.Normal('mu_beta', mu=0., sd=100**2)\n",
" mu_b = pm.Normal('mu_beta', mu=0., sigma=100**2)\n",
" sigma_b = pm.Uniform('sigma_beta', lower=0, upper=100)"
]
},
@@ -93,9 +93,9 @@
"source": [
"with hierarchical_model:\n",
" \n",
" a = pm.Normal('alpha', mu=mu_a, sd=sigma_a, shape=n_counties)\n",
" a = pm.Normal('alpha', mu=mu_a, sigma=sigma_a, shape=n_counties)\n",
" # Intercept for each county, distributed around group mean mu_a\n",
" b = pm.Normal('beta', mu=mu_b, sd=sigma_b, shape=n_counties)"
" b = pm.Normal('beta', mu=mu_b, sigma=sigma_b, shape=n_counties)"
]
},
{
@@ -139,7 +139,7 @@
" eps = pm.Uniform('eps', lower=0, upper=100) \n",
" \n",
" # Data likelihood\n",
" radon_like = pm.Normal('radon_like', mu=radon_est, sd=eps, observed=log_radon_t, total_size=len(data))"
" radon_like = pm.Normal('radon_like', mu=radon_est, sigma=eps, observed=log_radon_t, total_size=len(data))"
]
},
{
@@ -238,21 +238,21 @@
"# Inference button (TM)!\n",
"with pm.Model():\n",
"\n",
" mu_a = pm.Normal('mu_alpha', mu=0., sd=100**2)\n",
" mu_a = pm.Normal('mu_alpha', mu=0., sigma=100**2)\n",
" sigma_a = pm.Uniform('sigma_alpha', lower=0, upper=100)\n",
" mu_b = pm.Normal('mu_beta', mu=0., sd=100**2)\n",
" mu_b = pm.Normal('mu_beta', mu=0., sigma=100**2)\n",
" sigma_b = pm.Uniform('sigma_beta', lower=0, upper=100)\n",
" \n",
" a = pm.Normal('alpha', mu=mu_a, sd=sigma_a, shape=n_counties)\n",
" b = pm.Normal('beta', mu=mu_b, sd=sigma_b, shape=n_counties)\n",
" a = pm.Normal('alpha', mu=mu_a, sigma=sigma_a, shape=n_counties)\n",
" b = pm.Normal('beta', mu=mu_b, sigma=sigma_b, shape=n_counties)\n",
" \n",
" # Model error\n",
" eps = pm.Uniform('eps', lower=0, upper=100)\n",
" \n",
" radon_est = a[county_idx] + b[county_idx] * data.floor.values\n",
" \n",
" radon_like = pm.Normal(\n",
" 'radon_like', mu=radon_est, sd=eps, observed=data.log_radon.values)\n",
" 'radon_like', mu=radon_est, sigma=eps, observed=data.log_radon.values)\n",
" \n",
" step = pm.NUTS(scaling=approx.cov.eval(), is_cov=True)\n",
" hierarchical_trace = pm.sample(2000, step, start=approx.sample()[0], progressbar=True)"
16 changes: 8 additions & 8 deletions docs/source/notebooks/GLM-hierarchical.ipynb
@@ -220,8 +220,8 @@
"with pm.Model() as unpooled_model:\n",
" \n",
" # Independent parameters for each county\n",
" a = pm.Normal('a', 0, sd=100, shape=n_counties)\n",
" b = pm.Normal('b', 0, sd=100, shape=n_counties)\n",
" a = pm.Normal('a', 0, sigma=100, shape=n_counties)\n",
" b = pm.Normal('b', 0, sigma=100, shape=n_counties)\n",
" \n",
" # Model error\n",
" eps = pm.HalfCauchy('eps', 5)\n",
@@ -233,7 +233,7 @@
" radon_est = a[county_idx] + b[county_idx]*data.floor.values\n",
" \n",
" # Data likelihood\n",
" y = pm.Normal('y', radon_est, sd=eps, observed=data.log_radon)\n",
" y = pm.Normal('y', radon_est, sigma=eps, observed=data.log_radon)\n",
" "
]
},
@@ -283,26 +283,26 @@
"source": [
"with pm.Model() as hierarchical_model:\n",
" # Hyperpriors for group nodes\n",
" mu_a = pm.Normal('mu_a', mu=0., sd=100**2)\n",
" mu_a = pm.Normal('mu_a', mu=0., sigma=100**2)\n",
" sigma_a = pm.HalfCauchy('sigma_a', 5)\n",
" mu_b = pm.Normal('mu_b', mu=0., sd=100**2)\n",
" mu_b = pm.Normal('mu_b', mu=0., sigma=100**2)\n",
" sigma_b = pm.HalfCauchy('sigma_b', 5)\n",
" \n",
" # Intercept for each county, distributed around group mean mu_a\n",
" # Above we just set mu and sd to a fixed value while here we\n",
" # plug in a common group distribution for all a and b (which are\n",
" # vectors of length n_counties).\n",
" a = pm.Normal('a', mu=mu_a, sd=sigma_a, shape=n_counties)\n",
" a = pm.Normal('a', mu=mu_a, sigma=sigma_a, shape=n_counties)\n",
" # Intercept for each county, distributed around group mean mu_a\n",
" b = pm.Normal('b', mu=mu_b, sd=sigma_b, shape=n_counties)\n",
" b = pm.Normal('b', mu=mu_b, sigma=sigma_b, shape=n_counties)\n",
" \n",
" # Model error\n",
" eps = pm.HalfCauchy('eps', 5)\n",
" \n",
" radon_est = a[county_idx] + b[county_idx] * data.floor.values\n",
" \n",
" # Data likelihood\n",
" radon_like = pm.Normal('radon_like', mu=radon_est, sd=eps, observed=data.log_radon)"
" radon_like = pm.Normal('radon_like', mu=radon_est, sigma=eps, observed=data.log_radon)"
]
},
{