Getting Started
sbi is a PyTorch package for simulation-based inference. Simulation-based inference is the process of finding the parameters of a simulator from observations. sbi takes a Bayesian approach and returns a full posterior distribution over the parameters of the simulator, conditional on the observations. The package implements a variety of inference algorithms, including amortized and sequential methods. Amortized methods return a posterior that can be applied to many different observations without retraining; sequential methods focus the inference on one particular observation to be more simulation-efficient. See below for an overview of the implemented methods.
sbi offers a simple interface for posterior inference in a few lines of code:
from sbi.inference import SNPE
# import your simulator, define your prior over the parameters
# sample parameters theta and observations x
inference = SNPE(prior=prior)
_ = inference.append_simulations(theta, x).train()
posterior = inference.build_posterior()
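For a fully runnable sketch, the placeholders above can be filled in with a toy Gaussian simulator and a uniform prior. The simulator, prior bounds, and sample sizes below are illustrative choices, not part of the sbi API:

import torch
from sbi.inference import SNPE
from sbi.utils import BoxUniform

# Toy simulator (illustrative): the observation is the parameter vector plus Gaussian noise.
def simulator(theta):
    return theta + 0.1 * torch.randn_like(theta)

# Uniform prior over a 3-dimensional parameter space.
prior = BoxUniform(low=-2 * torch.ones(3), high=2 * torch.ones(3))

# Sample parameters from the prior and simulate the corresponding observations.
theta = prior.sample((1000,))
x = simulator(theta)

# Train the neural posterior estimator and build the posterior.
inference = SNPE(prior=prior)
_ = inference.append_simulations(theta, x).train()
posterior = inference.build_posterior()

# The amortized posterior can then be sampled for any observation x_o without retraining.
samples = posterior.sample((1000,), x=torch.zeros(3))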
sbi requires Python 3.8 or higher. A GPU is not required, but can lead to speed-ups in some cases. We recommend using a conda virtual environment (see the Miniconda installation instructions). If conda is installed on the system, an environment for installing sbi can be created as follows:
# Create an environment for sbi (indicate Python 3.8 or higher); activate it
$ conda create -n sbi_env python=3.10 && conda activate sbi_env
Whether or not you use conda, sbi can be installed with pip:
pip install sbi
To test the installation, drop into a Python prompt and run:
from sbi.examples.minimal import simple
posterior = simple()
print(posterior)
If you're new to sbi, we recommend starting with our Getting Started tutorial.
You can easily access and run these tutorials by opening a Codespace on this repo. To do so, click on the green "Code" button and select "Open with Codespaces". This will provide you with a fully functional environment where you can run the tutorials as Jupyter notebooks.
The following inference algorithms are currently available. You can find instructions on how to run each of these methods here; a short sketch of selecting a different method and running multi-round inference follows the list.
- SNPE_A (including amortized single-round NPE) from Papamakarios G and Murray I, Fast ε-free Inference of Simulation Models with Bayesian Conditional Density Estimation (NeurIPS 2016).
- SNPE_C or APT from Greenberg D, Nonnenmacher M, and Macke J, Automatic Posterior Transformation for likelihood-free inference (ICML 2019).
- TSNPE from Deistler M, Goncalves P, and Macke J, Truncated proposals for scalable and hassle-free simulation-based inference (NeurIPS 2022).
- SNLE_A or just SNL from Papamakarios G, Sterratt DC, and Murray I, Sequential Neural Likelihood (AISTATS 2019).
- (S)NRE_A or AALR from Hermans J, Begy V, and Louppe G, Likelihood-free Inference with Amortized Approximate Likelihood Ratios (ICML 2020).
- (S)NRE_B or SRE from Durkan C, Murray I, and Papamakarios G, On Contrastive Learning for Likelihood-free Inference (ICML 2020).
- BNRE from Delaunoy A, Hermans J, Rozet F, Wehenkel A, and Louppe G, Towards Reliable Simulation-Based Inference with Balanced Neural Ratio Estimation (NeurIPS 2022).
- (S)NRE_C or NRE-C from Miller BK, Weniger C, and Forré P, Contrastive Neural Ratio Estimation (NeurIPS 2022).
- SNVI from Glöckler M, Deistler M, and Macke J, Variational methods for simulation-based inference (ICLR 2022).
- MNLE from Boelts J, Lueckmann JM, Gao R, and Macke J, Flexible and efficient simulation-based inference for models of decision-making (eLife 2022).
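As a rough sketch of how a method is selected (assuming the SNLE and SNPE classes exported by sbi.inference, and reusing the illustrative toy simulator and prior from above), switching to a different algorithm only changes the class that is instantiated; sequential methods are run by looping over rounds and passing the current posterior as the proposal. The observation x_o, number of rounds, and simulation budget below are illustrative:

import torch
from sbi.inference import SNLE, SNPE
from sbi.utils import BoxUniform

# Same illustrative setup as above: toy simulator and 3-dimensional uniform prior.
def simulator(theta):
    return theta + 0.1 * torch.randn_like(theta)

prior = BoxUniform(low=-2 * torch.ones(3), high=2 * torch.ones(3))
theta = prior.sample((1000,))
x = simulator(theta)

# Likelihood estimation (SNLE) follows the same append_simulations/train/build_posterior pattern.
inference = SNLE(prior=prior)
_ = inference.append_simulations(theta, x).train()
likelihood_posterior = inference.build_posterior()

# Sequential (multi-round) NPE: focus the simulation budget on one observation x_o.
x_o = torch.zeros(3)
inference = SNPE(prior=prior)
proposal = prior
for _ in range(2):  # two rounds of simulation and training
    theta_r = proposal.sample((500,))
    x_r = simulator(theta_r)
    _ = inference.append_simulations(theta_r, x_r, proposal=proposal).train()
    posterior = inference.build_posterior()
    proposal = posterior.set_default_x(x_o)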
We welcome any feedback on how sbi is working for your inference problems (see Discussions) and are happy to receive bug reports, pull requests, and other contributions (see contribute). We wish to maintain a positive community; please read our Code of Conduct.
sbi is the successor (using PyTorch) of the delfi package. It started as a fork of Conor M. Durkan's lfi. sbi runs as a community project; see also credits.
sbi has been supported by the German Federal Ministry of Education and Research (BMBF) through project ADIMEM (FKZ 01IS18052 A-D), project SiMaLeSAM (FKZ 01IS21055A), and the Tübingen AI Center (FKZ 01IS18039A).
Apache License Version 2.0 (Apache-2.0)
If you use sbi, consider citing the sbi software paper, in addition to the original research articles describing the specific sbi algorithm(s) you are using.
@article{tejero-cantero2020sbi,
doi = {10.21105/joss.02505},
url = {https://doi.org/10.21105/joss.02505},
year = {2020},
publisher = {The Open Journal},
volume = {5},
number = {52},
pages = {2505},
author = {Alvaro Tejero-Cantero and Jan Boelts and Michael Deistler and Jan-Matthis Lueckmann and Conor Durkan and Pedro J. Gonçalves and David S. Greenberg and Jakob H. Macke},
title = {sbi: A toolkit for simulation-based inference},
journal = {Journal of Open Source Software}
}
The above citation refers to the original version of the sbi project and has a persistent DOI. Additionally, new releases of sbi are citable via Zenodo, where we create a new DOI for every release.