PyMC3 3.5 (July 21, 2018)
- Add documentation section on survival analysis and censored data models
- Improve error message `Mass matrix contains zeros on the diagonal. Some derivatives might always be zero` during tuning of `pm.sample`
- Improve error message `NaN occurred in optimization.` during ADVI
- Save and load traces without `pickle` using `pm.save_trace` and `pm.load_trace`
- Rewrite parallel sampling of multiple chains on py3. This resolves long-standing issues when transferring large traces to the main process, avoids pickling issues on UNIX, and allows us to show a progress bar for all chains. If parallel sampling is interrupted, we now return partial results.
- Add `sample_prior_predictive`, which allows for efficient sampling from the unconditioned model.
- SMC: remove experimental warning, allow sampling using `sample`, reduce autocorrelation from final trace.
- Add `model_to_graphviz` (which uses the optional dependency `graphviz`) to plot a directed graph of a PyMC3 model using plate notation.
- Add beta-ELBO variational inference as in beta-VAE model (Christopher P. Burgess et al. NIPS, 2017)
- Add `__dir__` to `SingleGroupApproximation` to improve autocompletion in interactive environments
- Fixed grammar in divergence warning; previously `There were 1 divergences ...` could be raised.
- Fixed `KeyError` raised when only a subset of variables are specified to be recorded in the trace.
- Removed unused `repeat=None` arguments from all `random()` methods in distributions.
- Deprecated the `sigma` argument in `MarginalSparse.marginal_likelihood` in favor of `noise`.
- Fixed unexpected behavior in `random`. Now the `random` functionality is more robust and will work better for `sample_prior` when that is implemented.
- Fixed `scale_cost_to_minibatch` behaviour; previously this was not working and always `False`.
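The prior-predictive sampling mentioned above is, conceptually, ancestral sampling: each variable is drawn from its prior in topological order, with parent draws fed into children and observed data ignored. A toy NumPy sketch of the idea (the model and function here are illustrative, not the PyMC3 API):

```python
import numpy as np

def sample_prior_predictive(samples, rng):
    # Ancestral sampling from the unconditioned model. Toy model:
    #   mu  ~ Normal(0, 1)
    #   obs ~ Normal(mu, 1)   (no conditioning on observed data)
    mu = rng.standard_normal(samples)           # draw the prior on mu
    obs = mu + rng.standard_normal(samples)     # feed mu into the likelihood
    return {"mu": mu, "obs": obs}

draws = sample_prior_predictive(1000, np.random.default_rng(0))
```

The returned dict keyed by variable name mirrors the general shape of prior-predictive output; note that `Var[obs] = Var[mu] + 1 = 2` here, since the prior uncertainty in `mu` propagates into `obs`.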
PyMC3 3.4.1 (April 18, 2018)
- Add `logit_p` keyword to `pm.Bernoulli`, so that users can specify the logit of the success probability. This is faster and more stable than using `p=tt.nnet.sigmoid(logit_p)`.
- Add `random` keyword to `pm.DensityDist`, thus enabling users to pass a custom random method, which in turn makes sampling from a `DensityDist` possible.
- Effective sample size computation is updated. The estimation uses Geyer's initial positive sequence, which no longer truncates the autocorrelation series inaccurately.
- `pm.diagnostics.effective_n` can now report N_eff > N.
- Add `KroneckerNormal` distribution and a corresponding `MarginalKron` Gaussian Process implementation for efficient inference, along with lower-level functions such as `cartesian` and `kronecker` products.
- Add new `pairplot` function for plotting scatter or hexbin matrices of sampled parameters; optionally it can plot divergences.
- Plots of discrete distributions in the docstrings
- Add logitnormal distribution
- Densityplot: add support for discrete variables
- Fix the Binomial likelihood in `.glm.families.Binomial`, with the flexibility of specifying the `link` function.
- Changed the `compare` function to accept a dictionary of model-trace pairs instead of two separate lists of models and traces.
- Add test and support for creating multivariate mixtures and mixtures of mixtures.
- `distribution.draw_values` is now also able to draw values from conditionally dependent RVs, such as autotransformed RVs (refer to PR #2902).
- `VonMises` does not overflow for large values of kappa. `i0` and `i1` have been removed and we now use `log_i0` to compute the logp.
- The bandwidth for KDE plots is computed using a modified version of Scott's rule. The new version uses entropy instead of standard deviation, which works better for multimodal distributions. Functions using KDE plots have a new argument `bw` controlling the bandwidth.
- Fix PyMC3 variable not being replaced if provided in `more_replacements` (#2890).
- Fix for issue #2900. In many situations, named node-inputs do not have a `random` method, while some intermediate node may have one. This meant that if a named node-input at a leaf of the graph did not have a fixed value, `theano` would try to compile it and fail to find inputs, raising a `theano.gof.fg.MissingInputError`. This was fixed by walking the theano variable's owner-inputs graph and trying to get intermediate named-node values when the leaves had failed.
- In `distribution.draw_values`, some named nodes could be `theano.tensor.sharedvar.SharedVariable`s. Nevertheless, in `distribution._draw_value`, these would be passed to `distribution._compile_theano_function` as if they were `theano.tensor.TensorVariable`s. This could lead to the exceptions `TypeError: ('Constants not allowed in param list', ...)` or `TypeError: Cannot use a shared variable (...)`. The fix was to not add `theano.tensor.sharedvar.SharedVariable` named nodes into the `givens` dict used in `distribution._compile_theano_function`.
- Exponential support changed to include zero values.
- DIC and BPIC calculations have been removed
- `df_summary` has been removed; use `summary` instead.
- The `njobs` and `nchains` kwargs are deprecated in favor of `cores` and `chains` in `sample`.
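On the `logit_p` addition above: computing the Bernoulli log-probability directly from the logit avoids the precision loss of a sigmoid round-trip, since `log(sigmoid(l))` underflows to `log(1.0) = 0` for large `l`. A stdlib sketch of the idea (not the PyMC3 implementation):

```python
import math

def softplus(z):
    # Numerically stable log(1 + exp(z)).
    return math.log1p(math.exp(-abs(z))) + max(z, 0.0)

def bernoulli_logp(value, logit_p):
    # log sigmoid(l) = -softplus(-l);  log(1 - sigmoid(l)) = -softplus(l).
    # This stays exact even for very large |logit_p|, where the naive
    # log(sigmoid(l)) would round to 0.0.
    return -softplus(-logit_p) if value else -softplus(logit_p)
```

For example, `bernoulli_logp(0, 50.0)` returns essentially `-50.0`, while the naive formulation would lose all precision in the intermediate sigmoid.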
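The updated effective-sample-size estimate mentioned above truncates the autocorrelation sum using Geyer's initial positive sequence: autocorrelations are accumulated only while consecutive pairs remain positive. A simplified single-chain sketch (the PyMC3 implementation handles multiple chains and other details):

```python
import numpy as np

def effective_n(x):
    # ESS = n / (1 + 2 * sum(rho_k)), with the sum truncated where the
    # pair rho[2k-1] + rho[2k] first goes negative (Geyer's initial
    # positive sequence criterion).
    x = np.asarray(x, dtype=float)
    n = len(x)
    x = x - x.mean()
    acov = np.correlate(x, x, mode="full")[n - 1:] / n  # autocovariances
    rho = acov / acov[0]                                # autocorrelations
    tau = 1.0
    for k in range(1, n // 2):
        pair = rho[2 * k - 1] + rho[2 * k]
        if pair < 0:
            break
        tau += 2.0 * pair
    return n / tau
```

For a nearly uncorrelated chain this returns roughly `n`; for a strongly autocorrelated chain it is much smaller.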
PyMC3 3.3 (January 9, 2018)
- Improve NUTS initialization
- Added `MatrixNormal` class for representing vectors of multivariate normal variables.
- New benchmark suite added (see http://pandas.pydata.org/speed/pymc3/)
- Generalized random seed types
- Update loo, new improved algorithm (#2730)
- New CSG (Constant Stochastic Gradient) approximate posterior sampling algorithm (#2544)
- Michael Osthege added support for population samplers and implemented differential evolution metropolis (`DEMetropolis`). For models with correlated dimensions that cannot use gradient-based samplers, the `DEMetropolis` sampler can give higher effective sampling rates (see also PR #2735).
- Forestplot supports multiple traces (#2736)
- Add new plot, densityplot (#2741)
- DIC and BPIC calculations have been deprecated
- Refactor HMC and implemented new warning system (#2677, #2808)
- Improved `posteriorplot` to scale fonts.
- `df_summary` function renamed to `summary`.
- Add test for `sample_ppc_w` to iterate all chains (#2633, #2748).
- Add Bayesian R2 score (for GLMs), `stats.r2_score` (#2696), and test (#2729).
- SMC works with transformed variables (#2755)
- Speedup OPVI (#2759)
- Multiple minor fixes and improvements in the docs (#2775, #2786, #2787, #2789, #2790, #2794, #2799, #2809)
- Old (`minibatch-`)`advi` is removed (#2781).
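The `DEMetropolis` sampler mentioned above moves each chain along the difference of two other randomly chosen chains, so the proposal scale and orientation adapt automatically to correlated posteriors. A schematic version of the proposal step only (a sketch after ter Braak's differential evolution MCMC, not the PyMC3 sampler):

```python
import numpy as np

def de_propose(chains, i, rng, gamma=None, eps=1e-4):
    # chains: (n_chains, n_dim) array of current population states.
    # Proposal for chain i:  x* = x_i + gamma * (x_r1 - x_r2) + eps * noise,
    # with r1, r2 two distinct chains different from i. The default
    # gamma = 2.38 / sqrt(2 d) is the standard choice from ter Braak (2006).
    n_chains, n_dim = chains.shape
    if gamma is None:
        gamma = 2.38 / np.sqrt(2.0 * n_dim)
    others = [j for j in range(n_chains) if j != i]
    r1, r2 = rng.choice(others, size=2, replace=False)
    return chains[i] + gamma * (chains[r1] - chains[r2]) + eps * rng.standard_normal(n_dim)
```

Because the jump vector is a difference of population members, its covariance matches that of the population, which is what makes the method effective on correlated dimensions without gradients.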
PyMC3 3.2 (October 10, 2017)
This version includes two major contributions from our Google Summer of Code 2017 students:
- Maxim Kochurov extended and refactored the variational inference module. This primarily adds two important classes, representing operator variational inference (`OPVI`) objects and `Approximation` objects. These make it easier to extend existing `variational` classes, and to derive inference from `variational` optimizations, respectively. The `variational` module now also includes normalizing flows (`NFVI`).
- Bill Engels added an extensive new Gaussian processes (`gp`) module. Standard GPs can be specified using either `Latent` or `Marginal` classes, depending on the nature of the underlying function. A Student-T process `TP` has been added. In order to accommodate larger datasets, approximate marginal Gaussian processes (`MarginalSparse`) have been added.
Documentation has been improved as the result of the project's monthly "docathons".
- An experimental stochastic gradient Fisher scoring (`SGFS`) sampling step method has been added.
- The API for `find_MAP` was enhanced.
- SMC now estimates the marginal likelihood.
- Added the `HalfFlat` distribution to the set of continuous distributions.
- Bayesian fraction of missing information (`bfmi`) function added to `stats`.
- `QuadPotential` adaptation has been implemented.
- Script added to build and deploy documentation.
- MAP estimates now available for transformed and non-transformed variables.
- The `Constant` variable class has been deprecated, and will be removed in 3.3.
- DIC and BPIC calculations have been sped up.
- Arrays are now accepted as arguments for the `Bound` class.
- `random` method was added to the
- Progress bars have been added to LOO and WAIC calculations.
- All example notebooks updated to reflect changes in API since 3.1.
- Parts of the test suite have been refactored.
- Fixed sampler stats error in NUTS for non-RAM backends.
- Matplotlib is no longer a hard dependency, making it easier to use in settings where installing Matplotlib is problematic. PyMC3 will only complain if plotting is attempted.
- Several bugs in the Gaussian process covariance were fixed.
- All chains are now used to calculate WAIC and LOO.
- AR(1) log-likelihood function has been fixed.
- Slice sampler fixed to sample from 1D conditionals.
- Several docstring fixes.
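The `bfmi` diagnostic listed above compares energy transitions between successive draws to the marginal energy distribution; values well below 1 indicate that momentum resampling explores the energy distribution poorly. A sketch of the common estimator (after Betancourt's E-BFMI; an illustration, not the exact PyMC3 code):

```python
import numpy as np

def bfmi(energy):
    # E-BFMI: mean squared change in the Hamiltonian energy between
    # successive draws, divided by the marginal energy variance.
    # Values much below 1 (rule of thumb: < 0.3) suggest the sampler
    # cannot traverse the energy distribution efficiently.
    energy = np.asarray(energy, dtype=float)
    return np.square(np.diff(energy)).mean() / np.var(energy)
```

An independent energy series gives a value near 2, while a slowly drifting one (large marginal variance, tiny per-step changes) gives a value near 0.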
The following people contributed to this release (ordered by number of commits):
Maxim Kochurov, Bill Engels, Chris Fonnesbeck, Junpeng Lao, Adrian Seyboldt, AustinRochford, Osvaldo Martin, Colin Carroll, Hannes Vasyura-Bathke, Thomas Wiecki, michaelosthege, Marco De Nadai, Kyle Beauchamp, Massimo, ctm22396, Max Horn, Hennadii Madan, Hassan Naseri, Peadar Coyle, Saurav R. Tuladhar, Shashank Shekhar, Eric Ma, Ed Herbst, tsdlovell, zaxtax, Dan Nichol, Benjamin Yetton, jackhansom, Jack Tsai, Andrés Asensio Ramos
PyMC3 3.1 (June 23, 2017)
- New user forum at http://discourse.pymc.io
- Much improved variational inference support:
  - Added various optimizers including ADAM.
  - Stopping criterion implemented via callbacks.
- `sample()` defaults changed: tuning is enabled for the first 500 samples, which are then discarded from the trace as burn-in.
- `MvNormal` supports Cholesky decomposition now for increased speed and numerical stability.
- Many optimizations and speed-ups.
- NUTS implementation now matches current Stan implementation.
- Add higher-order integrators for HMC.
- ADVI stopping criterion implemented.
- Improved support for theano's floatX setting to enable GPU computations (work in progress).
- Added support for multidimensional minibatches.
- Added the `Approximation` class and the ability to convert a sampled trace into an approximation.
- `Model` can now be inherited from and act as a base class for user-specified models (see pymc3.models.linear).
- Add `MvGaussianRandomWalk` and `MvStudentTRandomWalk` distributions.
- GLM models do not need a left-hand variable anymore.
- Refactored HMC and NUTS for better readability.
- Add support for Python 3.6.
- `Bound` now works for discrete distributions as well.
- Random sampling now returns the correct shape even for higher-dimensional RVs.
- Use theano Psi and GammaLn functions to enable GPU support for them.
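On the `MvNormal` Cholesky support in the list above: the speed and stability come from factoring the covariance once and reusing the triangular factor, never forming an explicit matrix inverse. A NumPy sketch of the sampling side of the idea (not PyMC3 code):

```python
import numpy as np

def mvnormal_draws(mu, cov, size, rng):
    # Draw from N(mu, cov) via the Cholesky factor: cov = L @ L.T,
    # x = mu + z @ L.T with z ~ N(0, I). The O(d^3) factorization
    # happens once; each draw is then a cheap triangular product.
    L = np.linalg.cholesky(cov)
    z = rng.standard_normal((size, len(mu)))
    return np.asarray(mu) + z @ L.T
```

The same factor can be reused for the log-density (solving triangular systems instead of inverting `cov`), which is where most of the numerical-stability benefit lives.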
PyMC3 3.0 (January 9, 2017)
We are proud and excited to release the first stable version of PyMC3, the product of more than 5 years of ongoing development and contributions from over 80 individuals. PyMC3 is a Python module for Bayesian modeling which focuses on modern Bayesian computational methods, primarily gradient-based (Hamiltonian) MCMC sampling and variational inference. Models are specified in Python, which allows for great flexibility. The main technological difference in PyMC3 relative to previous versions is the reliance on Theano for the computational backend, rather than on Fortran extensions.
Since the beta release last year, the following improvements have been implemented:
- Added `variational` submodule, which features the automatic differentiation variational inference (ADVI) fitting method. Also supports mini-batch ADVI for large data sets. Much of this work was due to the efforts of Taku Yoshioka, and important guidance was provided by the Stan team (specifically Alp Kucukelbir and Daniel Lee).
- Added model checking utility functions, including leave-one-out (LOO) cross-validation, BPIC, WAIC, and DIC.
- Implemented posterior predictive sampling (`sample_ppc`).
- Implemented auto-assignment of step methods by `sample` function.
- Enhanced IPython Notebook examples, featuring more complete narratives accompanying code.
- Extensive debugging of NUTS sampler.
- Updated documentation to reflect changes in code since beta.
- Refactored test suite for better efficiency.
- Added von Mises, zero-inflated negative binomial, and Lewandowski, Kurowicka and Joe (LKJ) distributions.
- Adopted `joblib` for managing parallel computation of chains.
- Added contributor guidelines, contributor code of conduct and governance document.
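Among the model-checking utilities above, WAIC has a compact definition: the log pointwise predictive density minus an effective-parameter penalty given by the posterior variance of the pointwise log-likelihood. A NumPy sketch under the standard definition (Watanabe 2010; Gelman et al. 2014), not the PyMC3 implementation:

```python
import numpy as np

def waic(log_lik):
    # WAIC on the deviance scale: -2 * (lppd - p_waic), where
    #   lppd_i   = log mean_s p(y_i | theta_s)   (stable log-mean-exp)
    #   p_waic_i = Var_s log p(y_i | theta_s)    (effective parameters)
    # log_lik has shape (n_posterior_draws, n_observations).
    m = log_lik.max(axis=0)
    lppd = m + np.log(np.mean(np.exp(log_lik - m), axis=0))
    p_waic = log_lik.var(axis=0, ddof=1)
    return -2.0 * (lppd.sum() - p_waic.sum())
```

With a degenerate posterior (identical log-likelihoods across draws) the penalty vanishes and WAIC reduces to the plain deviance.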
- Argument order of `tau` and `sd` was switched for distributions of the normal family: old `Normal(name, mu, tau)`, new `Normal(name, mu, sd)` (supplying keyword arguments is unaffected).
- `MvNormal` calling signature changed: old `MvNormal(name, mu, tau)`, new `MvNormal(name, mu, cov)` (supplying keyword arguments is unaffected).
We on the PyMC3 core team would like to thank everyone for contributing and now feel that this is ready for the big time. We look forward to hearing about all the cool stuff you use PyMC3 for, and look forward to continued development on the package.
The following authors contributed to this release:
Chris Fonnesbeck, John Salvatier, Thomas Wiecki, Colin Carroll, Maxim Kochurov, Taku Yoshioka, Peadar Coyle (springcoil), Austin Rochford, Osvaldo Martin, Shashank Shekhar
In addition, the following community members contributed to this release:
A Kuz, A. Flaxman, Abraham Flaxman, Alexey Goldin, Anand Patil, Andrea Zonca, Andreas Klostermann, Andres Asensio Ramos, Andrew Clegg, Anjum48, Benjamin Edwards, Boris Avdeev, Brian Naughton, Byron Smith, Chad Heyne, Corey Farwell, David Huard, David Stück, DeliciousHair, Dustin Tran, Eigenblutwurst, Gideon Wulfsohn, Gil Raphaelli, Gogs, Ilan Man, Imri Sofer, Jake Biesinger, James Webber, John McDonnell, Jon Sedar, Jordi Diaz, Jordi Warmenhoven, Karlson Pfannschmidt, Kyle Bishop, Kyle Meyer, Lin Xiao, Mack Sweeney, Matthew Emmett, Michael Gallaspy, Nick, Osvaldo Martin, Patricio Benavente, Raymond Roberts, Rodrigo Benenson, Sergei Lebedev, Skipper Seabold, Thomas Kluyver, Tobias Knuth, Volodymyr Kazantsev, Wes McKinney, Zach Ploskey, akuz, brandon willard, dstuck, ingmarschuster, jan-matthis, jason, kiudee, maahnman, macgyver, mwibrow, olafSmits, paul sorenson, redst4r, santon, sgenoud, stonebig, Tal Yarkoni, x2apps, zenourn
PyMC3 3.0b (June 16th, 2015)
Probabilistic programming allows for flexible specification of Bayesian statistical models in code. PyMC3 is a new, open-source probabilistic programming framework with an intuitive, readable, concise, yet powerful syntax that is close to the natural notation statisticians use to describe models. It features next-generation fitting techniques, such as the No-U-Turn Sampler, that allow fitting complex models with thousands of parameters without specialized knowledge of fitting algorithms.
PyMC3 has recently seen rapid development. With the addition of two major new features, automatic transforms and missing value imputation, PyMC3 has become ready for wider use. PyMC3 is now refined enough that adding features is easy, so we don't expect adding features in the future will require drastic changes. It has also become user-friendly enough for a broader audience. Automatic transformations mean NUTS and find_MAP work with less effort, and friendly error messages mean it's easy to diagnose problems with your model.
Thus, Thomas, Chris and I are pleased to announce that PyMC3 is now in Beta.
- Transforms now automatically applied to constrained distributions
- Transforms now specified with a `transform=` argument on Distributions.
- Transparent missing value imputation support added with MaskedArrays or pandas.DataFrame NaNs.
- Bad default values now ignored
- Profile theano functions using `model.profile(model.logpt)`.
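The automatic transforms above map constrained parameters onto the whole real line, so a sampler can never step outside the support; the transformed density then picks up a Jacobian term. A minimal stdlib sketch for a positive-valued variable under a log transform (illustrative only, not the PyMC3 transform machinery):

```python
import math

def exp_logp(x, lam=1.0):
    # Exponential(lam) log-density, defined only on x > 0.
    return math.log(lam) - lam * x

def transformed_logp(y, lam=1.0):
    # Sample y = log(x) on the real line instead of x > 0. The density
    # gains the log-Jacobian of x = exp(y), which is y itself:
    #   logp(y) = logp(exp(y)) + log|dx/dy| = logp(exp(y)) + y
    return exp_logp(math.exp(y), lam) + y
```

The Jacobian term is what keeps the transformed density properly normalized: integrating `exp(transformed_logp(y))` over the real line still gives 1.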
Contributors since 3.0a
- A. Flaxman
- Andrea Zonca
- Andreas Klostermann
- Andrew Clegg
- AustinRochford
- Benjamin Edwards
- Brian Naughton
- Chad Heyne
- Chris Fonnesbeck
- Corey Farwell
- John Salvatier
- Karlson Pfannschmidt
- Kyle Bishop
- Kyle Meyer
- Mack Sweeney
- Osvaldo Martin
- Raymond Roberts
- Rodrigo Benenson
- Thomas Wiecki
- Zach Ploskey
- maahnman
- paul sorenson
- zenourn