- A new AutoNormal guide that supports data subsampling, thanks to @patrickeganfoley.
- Data subsampling support for the AutoDelta guide.
- A new pyro.subsample primitive to aid in subsampling.
- An AutoNormalizingFlow autoguide.
- A new pyro.contrib.forecasting module for multivariate hierarchical heavy-tailed forecasting, together with tutorials on univariate forecasting, heavy-tailed modeling, state space models, and hierarchical models.
- A tutorial on Boosting Black Box Variational Inference, thanks to @lorenzkuhn, @gideonite, @sharrison5, and @TNU-yaoy.
- A NeuTraReparam example.
- PyroModule's interface is now stable, no longer EXPERIMENTAL.
- Better validation for GaussianHMM and LinearHMM distributions.
- Miscellaneous new GaussianHMM methods to handle conjugacy.
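Under the hood, data subsampling keeps stochastic objective estimates unbiased by scaling minibatch log-likelihood terms by N/B (full size over batch size). A minimal pure-Python sketch of that bookkeeping, where `scaled_minibatch_sum` is a hypothetical helper and not Pyro's API:

```python
# Sketch of the N/B scaling behind data subsampling (hypothetical helper,
# not Pyro's actual implementation).

def scaled_minibatch_sum(values, batch_indices, full_size):
    """Sum `values` over a minibatch, rescaled so its expectation over
    uniformly drawn minibatches equals the full-data sum."""
    batch = [values[i] for i in batch_indices]
    return (full_size / len(batch)) * sum(batch)

data = [0.5, 1.5, 2.0, 4.0]          # per-datapoint log-likelihood terms
full = sum(data)                      # 8.0

# Two disjoint minibatches that cover the data: the scaled sums average
# back to the full sum, illustrating unbiasedness.
s1 = scaled_minibatch_sum(data, [0, 1], len(data))  # 2 * (0.5 + 1.5) = 4.0
s2 = scaled_minibatch_sum(data, [2, 3], len(data))  # 2 * (2.0 + 4.0) = 12.0
assert abs((s1 + s2) / 2 - full) < 1e-12
```

This is the same scaling that subsampled `pyro.plate` statements apply to the log-density of sites inside them.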
Patches 1.2.0 with the following bug fixes:
- Fix for MCMC with parallel chains using multiprocessing, where transforms to the latent sites' supports were not being correctly stored.
- Other minor rendering related fixes for tutorials.
- Updated to PyTorch 1.4.0 and torchvision 0.5.0.
- Changed the license from MIT to Apache 2.0 and removed the Uber CLA as part of Pyro's move to the Linux Foundation.
This release adds a new effect handler and a collection of strategies that reparameterize models to improve geometry. These tools are largely orthogonal to other inference tools in Pyro, and can be used with SVI, MCMC, and other inference algorithms.
- poutine.reparam() is a new effect handler that transforms models into other models for which inference may be easier (Gorinova et al. 2019).
- pyro.infer.reparam is a collection of reparameterization strategies following a standard Reparam interface:
- Decentering transforms for location-scale families (Gorinova et al. 2019).
- Transform unwrapping to deconstruct transformed distributions.
- Discrete Cosine transforms for frequency-domain parameterizations (useful for inference in time series).
- Auxiliary variable methods for Lévy α-stable and StudentT distributions.
- Linear Hidden Markov Model reparameterization, allowing a range of non-Gaussian HMMs to be treated as conditional Gaussian processes.
- Neural Transport uses SVI to learn the geometry of a model before drawing samples using HMC (Hoffman et al. 2019).
- A tutorial on inference with Lévy α-stable distributions, demonstrating StableReparam, DiscreteCosineReparam, and EnergyDistance.
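The decentering strategy above swaps x ~ Normal(loc, scale) for an auxiliary z ~ Normal(0, 1) plus the deterministic map x = loc + scale * z. The change-of-variables identity behind it can be checked in a few lines of pure Python (`normal_logpdf` is a hypothetical helper, not Pyro code):

```python
import math

def normal_logpdf(x, loc, scale):
    """Log-density of Normal(loc, scale) at x."""
    return -0.5 * ((x - loc) / scale) ** 2 - math.log(scale * math.sqrt(2 * math.pi))

loc, scale, x = 2.0, 3.0, 4.5
z = (x - loc) / scale                      # decentered coordinate

# Change of variables: p_x(x) = p_z(z) * |dz/dx| = p_z(z) / scale,
# so the two parameterizations define the same model.
lhs = normal_logpdf(x, loc, scale)
rhs = normal_logpdf(z, 0.0, 1.0) - math.log(scale)
assert abs(lhs - rhs) < 1e-12
```

The payoff is geometric: the auxiliary site z has unit scale regardless of loc and scale, which removes the funnel-shaped posteriors that hurt HMC and SVI.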
Other new features
- A tutorial on Dirichlet process mixture modeling, contributed by @m-k-S
- Added a LinearHMM distribution with an .rsample() method. This supports non-Gaussian noise such as Lévy α-stable and StudentT, but requires reparameterization for inference.
- Implemented a GaussianHMM.rsample() method for drawing joint samples from a linear-Gaussian HMM.
- Added a LowerCholeskyAffine transform.
- #2264 improves speed and numerical stability.
- A pyro.deterministic primitive to record deterministic values in the trace.
- A default implementation of Distribution.expand(), available to all Pyro distributions that subclass TorchDistribution, making it easier to create custom distributions.
New distributions and transforms
- MultivariateStudentT is a heavy-tailed multivariate distribution.
- Stable implements a Lévy α-stable distribution with a reparametrized .rsample() method but no .log_prob(). It can be fit using EnergyDistance inference.
- ZeroInflatedNegativeBinomial is a distribution for count data.
- LowerCholeskyAffine is a multivariate affine transform.
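A LowerCholeskyAffine-style transform maps x to loc + L @ x with L lower-triangular and positive on the diagonal, which makes inversion (forward substitution) and the log-determinant (sum of log-diagonals) cheap. The following is a pure-Python sketch under those assumptions, not Pyro's implementation:

```python
import math

def lower_cholesky_affine(x, loc, L):
    """y = loc + L @ x, with L lower-triangular (positive diagonal)."""
    n = len(x)
    return [loc[i] + sum(L[i][j] * x[j] for j in range(i + 1)) for i in range(n)]

def inverse(y, loc, L):
    """Invert the transform by forward substitution."""
    n = len(y)
    x = [0.0] * n
    for i in range(n):
        s = y[i] - loc[i] - sum(L[i][j] * x[j] for j in range(i))
        x[i] = s / L[i][i]
    return x

def log_abs_det_jacobian(L):
    """Triangular Jacobian: |det| is the product of the diagonal."""
    return sum(math.log(L[i][i]) for i in range(len(L)))

L = [[2.0, 0.0], [1.0, 3.0]]
loc = [1.0, -1.0]
x = [0.5, 2.0]
y = lower_cholesky_affine(x, loc, L)   # [1 + 2*0.5, -1 + 1*0.5 + 3*2]
assert y == [2.0, 5.5]
assert inverse(y, loc, L) == [0.5, 2.0]
assert abs(log_abs_det_jacobian(L) - math.log(6.0)) < 1e-12
```

The cheap log-determinant is what makes this transform convenient inside normalizing flows and multivariate reparameterizations.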
Other Changes / Bug Fixes
- pyro.util.save_visualization has been deprecated.
- #2197 fixed a naming bug in PyroModule that affected multiple sub-PyroModules with conflicting names.
- #2192 Bug fix in Planar normalizing flow implementation
- #2188 Make error messages for incorrect arguments to effect handlers more informative
The objective of this release is to stabilize Pyro's interface and thereby make it safer to build high level components on top of Pyro.
- Behavior of documented APIs will remain stable across minor releases, except for bug fixes and features marked EXPERIMENTAL or DEPRECATED.
- Serialization formats will remain stable across patch releases, but may change across minor releases (e.g. if you save a model in 1.0.0, it will be safe to load it in 1.0.1, but not in 1.1.0).
- Undocumented APIs, features marked EXPERIMENTAL or DEPRECATED, and anything in pyro.contrib may change at any time (though we aim for stability).
- All deprecated features throw a FutureWarning and specify possible work-arounds. Features marked as deprecated will not be maintained, and are likely to be removed in a future release.
- If you want more stability for a particular feature, contribute a unit test.
- pyro.infer.Predictive is a new utility for serving models, supporting jit tracing and serialization.
- pyro.distributions.transforms has many new transforms, and includes helper functions to easily create a variety of normalizing flows. The transforms library has also been reorganized.
- pyro.contrib.timeseries is an experimental new module with fast Gaussian Process inference for univariate and multivariate time series and state space models.
- pyro.nn.PyroModule is an experimental new interface that adds Pyro effects to an nn.Module. PyroModule is already used internally by pyro.contrib.timeseries and elsewhere.
- FoldedDistribution is a new distribution factory, essentially equivalent to TransformedDistribution(-, AbsTransform()) but providing a .log_prob() method.
- A new tutorial illustrates the usage of pyro.contrib.oed in the context of adaptive election polling.
- Autoguides have slightly changed interfaces:
  - Autoguides are now nn.Modules and can be serialized separately from the param store. This enables serving via torch.jit.trace_module.
  - The Auto*Normal family of autoguides now has better support for init_loc_fn. Autoguides no longer support initialization by writing directly to the param store.
- Many transforms have been renamed to enforce a consistent interface.
- pyro.generic has been moved to a separate project, pyroapi.
- poutine.do has slightly changed semantics to follow Single World Intervention Graph semantics.
- pyro.contrib.glmm has been moved to pyro.contrib.oed.glmm and will eventually be replaced by BRMP.
- DeprecationWarnings have been promoted to FutureWarnings.
- The pyro.random_module primitive has been deprecated in favor of PyroModule, which can be used to create Bayesian modules from torch.nn.Module instances.
- The SVI.run method is deprecated; users are encouraged to use the .step method directly to run inference. For drawing samples from the posterior distribution, we recommend the Predictive utility class.
- The TracePredictive class is deprecated in favor of Predictive, which can be used to gather samples from the posterior and predictive distributions in SVI and MCMC.
- mcmc.predictive has been absorbed into the more general Predictive class.
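What makes a folded distribution tractable is that the density of |X| simply adds the base mass at x and -x, which is why a factory like FoldedDistribution can offer a log-density even though an AbsTransform has no single-valued inverse. A pure-Python sketch for a folded normal (not Pyro's implementation):

```python
import math

def normal_pdf(x, loc, scale):
    """Density of Normal(loc, scale) at x."""
    return math.exp(-0.5 * ((x - loc) / scale) ** 2) / (scale * math.sqrt(2 * math.pi))

def folded_log_prob(x, loc, scale):
    """Density of |X| where X ~ Normal(loc, scale): the mass at x and -x folds together."""
    assert x >= 0.0
    return math.log(normal_pdf(x, loc, scale) + normal_pdf(-x, loc, scale))

# Numerical sanity check: the folded density integrates to ~1 over [0, inf).
step, total, x = 0.001, 0.0, 0.0
while x < 20.0:
    total += math.exp(folded_log_prob(x + step / 2, 1.0, 2.0)) * step
    x += step
assert abs(total - 1.0) < 1e-4
```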
Patches 0.5.0 with the following bug fixes:
- Removes an f-string, which is only supported in Python 3.6+, so that Python 3.5 remains supported.
- Fix incompatibility with recent tqdm releases, which broke multiple progress bars in notebook environments (for MCMC with multiple chains).
- pyro.factor, to add an arbitrary log-probability factor to a probabilistic model.
- Conditional MADE Autoregressive Network available in pyro.nn.
- Tutorial on adaptive experiment design for studying working memory.
- KL divergence support for additional distributions.
- A fast n log(n) implementation of the Continuous Ranked Probability Score (CRPS) for sample sets: pyro.ops.stats.crps_empirical.
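The n log(n) cost comes from a sorting trick: CRPS = E|X - y| - 0.5 E|X - X'|, and the pairwise term collapses to a weighted sum over sorted samples instead of an O(n^2) double loop. A pure-Python sketch of that trick (not the pyro.ops.stats implementation):

```python
def crps_empirical(samples, y):
    """CRPS of an empirical sample set against observation y, in O(n log n):
    CRPS = E|X - y| - 0.5 * E|X - X'|."""
    n = len(samples)
    xs = sorted(samples)
    term1 = sum(abs(x - y) for x in xs) / n
    # sum_{i<j} (x_(j) - x_(i)): the k-th sorted sample appears k times with
    # a + sign and (n-1-k) times with a - sign, hence weight (2k - n + 1).
    pair_sum = sum((2 * k - n + 1) * x for k, x in enumerate(xs))
    return term1 - pair_sum / (n * n)

samples = [0.0, 1.0, 2.0, 4.0]
# Brute-force O(n^2) reference for comparison.
n = len(samples)
ref = sum(abs(x - 1.5) for x in samples) / n - 0.5 * sum(
    abs(a - b) for a in samples for b in samples) / (n * n)
assert abs(crps_empirical(samples, 1.5) - ref) < 1e-12
```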
Code changes and bug fixes
- Moved pyro.generic to a separate pyro-api package.
- Minor changes to ensure compatibility with pyro-api, a generic modeling and inference API for dispatch to different Pyro backends.
- Improved numerical stability of the MixtureOfDiagonals distribution.
- Improved U-Turn check condition in NUTS for better sampling efficiency.
transformsmodule to match
- Fixed AutoGuide initialization strategies.
- *HMM.filter() methods for forecasting.
- Support for Independent(Normal) observations in GaussianHMM.
- Fix for HMC / NUTS to handle errors arising from numerical issues when computing Cholesky decomposition.
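One common way to absorb numerical non-positive-definiteness in a Cholesky factorization is to retry with a growing diagonal jitter. The sketch below illustrates that general technique in pure Python; it is an assumption about the approach, not Pyro's actual fix:

```python
import math

def cholesky(A):
    """Plain Cholesky; raises ValueError if A is not positive definite."""
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = A[i][j] - sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                if s <= 0.0:
                    raise ValueError("not positive definite")
                L[i][j] = math.sqrt(s)
            else:
                L[i][j] = s / L[j][j]
    return L

def safe_cholesky(A, max_tries=5):
    """Retry with growing diagonal jitter to absorb numerical issues."""
    jitter = 1e-9
    for _ in range(max_tries):
        try:
            return cholesky([[A[i][j] + (jitter if i == j else 0.0)
                              for j in range(len(A))] for i in range(len(A))])
        except ValueError:
            jitter *= 10.0
    raise ValueError("Cholesky failed even with jitter")

# A rank-deficient matrix fails plain Cholesky but succeeds with jitter.
A = [[1.0, 1.0], [1.0, 1.0]]
L = safe_cholesky(A)
assert L[0][0] > 0.0 and L[1][1] > 0.0
```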
This release drops support for Python 2. Additionally, it includes a few fixes to enable Pyro to use the latest PyTorch release.
Some other additions / minor changes:
- Add option for sequential predictions for MCMC predictive.
- Moved pyro.contrib.autoguide to the core Pyro repo.
- Additional inference algorithms
- Add a GaussianHMM distribution for fast tuning of Gaussian state space models / Kalman filters
- A more flexible easyguide module. Refer to the tutorial for usage instructions.
- Different initialization methods for autoguides.
- More normalizing flows - Block Neural Autoregressive Flow, Sum of Squares, Sylvester Flow, DeepELUFlow, Householder Flow, RealNVP.
- Support ReduceLROnPlateau scheduler.
- New interface for MCMC inference:
- A DiscreteHMM distribution for fast parallel training of discrete-state Hidden Markov Models with arbitrary observation distributions. See examples/hmm.py for example usage in a neural HMM.
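The speedup in DiscreteHMM comes from marginalizing over state paths without enumerating them; Pyro additionally parallelizes this over the time dimension, but the underlying forward recursion can be sketched in pure Python (names here are illustrative, not Pyro's API):

```python
import math
from itertools import product

def logsumexp(xs):
    """Numerically stable log(sum(exp(x) for x in xs))."""
    m = max(xs)
    return m + math.log(sum(math.exp(x - m) for x in xs))

def hmm_log_likelihood(init, trans, obs_logps):
    """log p(observations) for a discrete-state HMM via the forward algorithm.
    init[s]: initial state probs; trans[s][t]: transition probs;
    obs_logps[time][s]: log-likelihood of the observation at `time` given state s."""
    n = len(init)
    log_alpha = [math.log(init[s]) + obs_logps[0][s] for s in range(n)]
    for t in range(1, len(obs_logps)):
        log_alpha = [
            logsumexp([log_alpha[s] + math.log(trans[s][s2]) for s in range(n)])
            + obs_logps[t][s2]
            for s2 in range(n)
        ]
    return logsumexp(log_alpha)

# Check against brute-force enumeration over all state paths.
init = [0.6, 0.4]
trans = [[0.7, 0.3], [0.2, 0.8]]
obs_logps = [[math.log(0.9), math.log(0.5)],
             [math.log(0.1), math.log(0.4)]]
brute = logsumexp([
    math.log(init[p[0]]) + obs_logps[0][p[0]]
    + math.log(trans[p[0]][p[1]]) + obs_logps[1][p[1]]
    for p in product(range(2), repeat=2)
])
assert abs(hmm_log_likelihood(init, trans, obs_logps) - brute) < 1e-12
```

Because the observation terms enter only as per-timestep log-likelihoods, arbitrary observation distributions can be plugged in, as the release note describes.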
Code changes and bug fixes
- Addressed a pickling issue with Pyro handlers, making it possible to pickle a much larger class of models.
- Multiple fixes for multiprocessing bugs with MCMC. With the new interface, memory consumption is low, thereby allowing many more samples to be collected.
- Performance enhancements for models with many sample sites.