
Commit

replace "blockings" by "blocks"
axsk authored and brian-j-smith committed Sep 8, 2015
1 parent 82f0f8b commit 6550b33
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion doc/intro.rst
@@ -15,7 +15,7 @@ The Mamba Package

*Mamba* :cite:`smith:2014:Mamba` is a **julia** :cite:`julia:2014` package designed for general Bayesian model fitting via MCMC. Like *OpenBUGS* and *JAGS*, it supports a wide range of model and distributional specifications, and provides a syntax for model specification. Unlike those two, and like *PyMC*, *Mamba* provides a unified environment in which all interactions with the software are made through a single, interpreted language. Any **julia** operator, function, type, or package can be used for model specification; and custom distributions and samplers can be written in **julia** to extend the package. Conversely, interactions with and extensions to *OpenBUGS* and *JAGS* can involve three different programming environments --- **R** wrappers used to call the programs, their DSLs, and the underlying implementations in Component Pascal and C++. Advantages of a unified environment include more flexible model specification; tighter integration with supplied functions for convergence diagnostics and posterior inference; and faster development, testing, and debugging of extensions. Advantages of the `BUGS` DSLs include more concise model specification and facilitation of automated sampling scheme formulation. Indeed, sampling schemes must be selected manually in the initial release of *Mamba*. Nevertheless, *Mamba* holds other distinct advantages over existing offerings. In particular, it provides arbitrary blocking of model parameters and designation of block-specific samplers; samplers that can be used with the included simulation engine or apart from it; and command-line access to all package functionality, including its simulation API. Likewise, advantages of the **julia** language include its familiar syntax, focus on technical computing, and benchmarks showing it to be one or more orders of magnitude faster than **R** and **Python** :cite:`bezanson:2012:JFD`. Finally, the intended audience for *Mamba* includes individuals interested in programming in **julia**; who wish to have low-level access to model design and implementation; and, in some cases, are able to derive full conditional distributions of model parameters (up to normalizing constants).

- *Mamba* allows for the implementation of an MCMC sampling scheme to simulate draws for a set of Bayesian model parameters :math:`(\theta_1, \ldots, \theta_p)` from their joint posterior distribution. The package supports the general Gibbs :cite:`gelfand:1990:SBA,geman:1984:SRG` scheme outlined in the algorithm below. In its implementation with the package, the user may specify any blocking :math:`\{\Theta_j\}_{j=1}^{B}` of the parameters and corresponding functions :math:`\{f_j\}_{j=1}^{B}` to sample each :math:`\Theta_j` from its full conditional distribution :math:`p(\Theta_j | \Theta \setminus \Theta_{j})`. Simulation performance (efficiency and runtime) can be affected greatly by the choice of blocking scheme and sampling functions. For some models, an optimal choice may not be obvious, and different choices may need to be tried to find one that gives a desired level of performance. This can be a time-consuming process. The *Mamba* package provides a set of **julia** types and method functions to facilitate the specification of different schemes and functions. Supported sampling functions include those provided by the package, user-defined functions, and functions from other packages; thus providing great flexibility with respect to sampling methods. Furthermore, a sampling engine is provided to save the user from having to implement tasks common to all MCMC simulators. Therefore, time and energy can be focused on implementation aspects that most directly affect performance.
+ *Mamba* allows for the implementation of an MCMC sampling scheme to simulate draws for a set of Bayesian model parameters :math:`(\theta_1, \ldots, \theta_p)` from their joint posterior distribution. The package supports the general Gibbs :cite:`gelfand:1990:SBA,geman:1984:SRG` scheme outlined in the algorithm below. In its implementation with the package, the user may specify blocks :math:`\{\Theta_j\}_{j=1}^{B}` of parameters and corresponding functions :math:`\{f_j\}_{j=1}^{B}` to sample each :math:`\Theta_j` from its full conditional distribution :math:`p(\Theta_j | \Theta \setminus \Theta_{j})`. Simulation performance (efficiency and runtime) can be affected greatly by the choice of blocking scheme and sampling functions. For some models, an optimal choice may not be obvious, and different choices may need to be tried to find one that gives a desired level of performance. This can be a time-consuming process. The *Mamba* package provides a set of **julia** types and method functions to facilitate the specification of different schemes and functions. Supported sampling functions include those provided by the package, user-defined functions, and functions from other packages; thus providing great flexibility with respect to sampling methods. Furthermore, a sampling engine is provided to save the user from having to implement tasks common to all MCMC simulators. Therefore, time and energy can be focused on implementation aspects that most directly affect performance.

.. _figure-Gibbs:

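For illustration, the block-Gibbs scheme described in the changed paragraph can be sketched in a few lines of **julia**. This is a stand-alone toy example, not *Mamba*'s actual interface: it assumes a bivariate normal target whose two full conditionals have closed forms, and the names ``samplers``, ``n_iter``, and ``draws`` are illustrative only.

.. code-block:: julia

    # Minimal sketch of the block-Gibbs scheme (not Mamba's API): each block j
    # has a sampling function f_j that draws Θ_j from its full conditional
    # given the rest of the current state.
    using Statistics

    ρ = 0.8                 # correlation of the toy bivariate normal target
    σ = sqrt(1 - ρ^2)       # conditional standard deviation

    # One sampling function per block; each receives the full current state θ
    # and returns an updated value for its own block.
    samplers = [
        θ -> ρ * θ[2] + σ * randn(),   # f_1: draw θ_1 | θ_2
        θ -> ρ * θ[1] + σ * randn()    # f_2: draw θ_2 | θ_1
    ]

    n_iter = 10_000
    θ = zeros(2)                       # initial values
    draws = zeros(n_iter, 2)

    for t in 1:n_iter
        for (j, f) in enumerate(samplers)   # one Gibbs sweep over the blocks
            θ[j] = f(θ)
        end
        draws[t, :] = θ
    end

    # Posterior summaries from the simulated draws
    println("means ≈ ", vec(mean(draws, dims = 1)))
    println("corr  ≈ ", cor(draws[:, 1], draws[:, 2]))

Trying a different blocking or different sampling functions only changes the ``samplers`` vector here; that kind of experimentation with blocks and block-specific samplers is what the paragraph above describes.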
