Accelerating Monte Carlo methods for Bayesian inference in dynamical models

phd-thesis

This repository was downloaded from <https://github.com/compops/phd-thesis> and contains the code used to produce the results in the thesis:

J. Dahlin, Accelerating Monte Carlo methods for Bayesian inference in dynamical models, Linköping studies in science and technology no. 1754, Linköping University, Sweden, 2016.

The thesis is available as full text from <http://www.johandahlin.com/files/theses/dahlin-phdthesis.pdf>.

Dependencies

The code is written and tested for R 3.2.2 and Python 2.7.6 with some additional libraries/packages (see below).

The implementation in Python makes use of NumPy 1.9.2, SciPy 0.15.1, Matplotlib 1.4.3 and Pandas 0.13.1. On Ubuntu, these packages can be installed/upgraded using

sudo pip install --upgrade package-name

The implementation in R makes use of the packages MCMCpack and RColorBrewer. They can be installed by the command

install.packages(c("MCMCpack", "RColorBrewer"))


Example: Reconstructing the temperature of pre-historic Earth

The code in the folder example-icevarve replicates the example in Section 1.1.1. Two different models are used: (i) a Gaussian process regression and (ii) a non-linear state space model (SSM).

For (i), we make use of a zero mean function and a kernel consisting of a bias kernel and a Matérn 5/2 kernel. The hyperparameters in the prior are hand-tuned, guided by a run of empirical Bayes. The computations are carried out by ex-icevarve-gp.py and the plotting is done using ex-icevarve-plot.R.
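As an illustration, the bias-plus-Matérn 5/2 kernel and the zero-mean GP regression described above can be sketched in a few lines of NumPy. Function names and hyperparameter defaults here are assumptions for this sketch, not the values or API used in ex-icevarve-gp.py:

```python
import numpy as np

def matern52(x1, x2, sigma_f=1.0, ell=1.0):
    # Matern 5/2 covariance between two scalar inputs
    r = np.abs(x1 - x2) / ell
    return sigma_f**2 * (1.0 + np.sqrt(5.0) * r + 5.0 * r**2 / 3.0) \
        * np.exp(-np.sqrt(5.0) * r)

def bias_plus_matern52(x1, x2, sigma_b=1.0, sigma_f=1.0, ell=1.0):
    # Sum of the constant (bias) kernel and the Matern 5/2 kernel
    return sigma_b**2 + matern52(x1, x2, sigma_f, ell)

def gp_posterior_mean(x_train, y_train, x_test, sigma_n=0.1, **kernel_args):
    # Standard GP regression posterior mean with a zero mean function
    K = np.array([[bias_plus_matern52(a, b, **kernel_args) for b in x_train]
                  for a in x_train])
    Ks = np.array([[bias_plus_matern52(a, b, **kernel_args) for b in x_train]
                   for a in x_test])
    alpha = np.linalg.solve(K + sigma_n**2 * np.eye(len(x_train)), y_train)
    return Ks.dot(alpha)
```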

For (ii), we make use of a state space model proposed by Langrock (2011) <http://www.tandfonline.com/doi/abs/10.1080/02664763.2011.573543>; see Section 1.1.1 of the thesis for the model equations and the definition of its parameters. The parameters are estimated using the particle Metropolis-Hastings algorithm by running the file ex-icevarve-pmh0.py, which also estimates the latent state by marginalising over the parameter posterior. The results are plotted using ex-icevarve-plot.R. Note that the folder python-base needs to be the current working directory for the Python code to work, as many subroutines required for particle filtering and particle Metropolis-Hastings are provided in that folder.
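The zeroth-order particle Metropolis-Hastings loop used here can be sketched generically: a Gaussian random walk proposal on the parameters, accepted using a particle-filter estimate of the likelihood. The function and argument names below (log_like_hat, log_prior) are assumptions for this sketch, not the API in python-base:

```python
import numpy as np

def pmh0(log_like_hat, log_prior, theta0, n_iter=5000, step=0.1, seed=0):
    # log_like_hat(theta) would wrap a particle filter returning an
    # unbiased estimate of the log-likelihood for the state space model.
    rng = np.random.RandomState(seed)
    theta = np.atleast_1d(np.asarray(theta0, dtype=float))
    lp = log_prior(theta) + log_like_hat(theta)
    trace = np.empty((n_iter, theta.size))
    for i in range(n_iter):
        prop = theta + step * rng.randn(theta.size)
        lp_prop = log_prior(prop) + log_like_hat(prop)
        # Accept with probability min(1, exp(lp_prop - lp))
        if np.log(rng.rand()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        trace[i] = theta
    return trace
```

The returned trace can then be used both for the parameter posterior and, by averaging state estimates over its draws, for marginalising the latent state over the parameters.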

Example: How does unemployment affect inflation?

The code in the folder example-philips replicates the Phillips curve example introduced in Section 2.1 (Example 2.1). The data is collected by Statistics Sweden (http://www.scb.se/en_/) and consists of the monthly inflation rate (in percent) and unemployment rate (in percent of the total able-bodied population) during the period between 1987 and 2015. Note that the folder python-base needs to be the current working directory for the Python code to work, as many subroutines required for particle filtering and particle Metropolis-Hastings are provided in that folder. To replicate the full example, run the files in the following order:

ex-philips-data.R This file replicates Figure 2.2 on page 20 of the thesis.

ex-philips-smc.py / ex-philips-state.R These files replicate the results presented in Example 3.3 on page 54 and Example 3.6 on page 57 in the thesis. The script executes a bootstrap particle filter and estimates the gradient of the log-posterior using fixed-lag particle smoothing; see the main text for details. The R-file creates Figure 3.6 using the outputs generated by ex-philips-smc.py and ex-philips-ll-pmh0.py.
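The bootstrap particle filter at the heart of these scripts can be sketched as follows. This is a generic implementation under assumed function-valued arguments, not the code shipped in python-base:

```python
import numpy as np

def bootstrap_pf(y, propagate, log_obs_dens, sample_initial, n_part=500, seed=0):
    # Generic bootstrap particle filter returning an estimate of the
    # log-likelihood; the callables are placeholders for model-specific code.
    rng = np.random.RandomState(seed)
    x = sample_initial(n_part, rng)
    ll = 0.0
    for yt in y:
        x = propagate(x, rng)                 # sample from the dynamics
        logw = log_obs_dens(yt, x)            # weight by the observation density
        c = logw.max()                        # log-sum-exp for stability
        w = np.exp(logw - c)
        ll += c + np.log(w.mean())            # log-likelihood increment
        idx = rng.choice(n_part, n_part, p=w / w.sum())
        x = x[idx]                            # multinomial resampling
    return ll
```

Fixed-lag smoothing, used for the gradient estimates, additionally stores the resampling ancestry over a short lag; the sketch above only covers the filtering and likelihood-estimation part.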

ex-philips-ll-score.R This file replicates Figure 3.7 after a run of ex-philips-smc.py.

ex-philips-pmh0.py / ex-philips-posteriors.R These files run the particle Metropolis-Hastings algorithm to estimate the NAIRU and the parameter posteriors. The algorithm is calibrated using pilot runs and the rule of thumb suggested by Sherlock et al. (2015) <http://arxiv.org/pdf/1309.7209>.
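As we understand the guideline in Sherlock et al. (2015), the calibration amounts to scaling a pilot estimate of the posterior covariance by 2.562^2 / d, and choosing the number of particles so the variance of the log-likelihood estimate is roughly 3.3 (giving an acceptance rate of about 7%). A minimal sketch of the scaling step, with constants and function name taken as assumptions here:

```python
import numpy as np

def pmh_proposal_cov(pilot_cov, scale=2.562):
    # Rule-of-thumb random walk proposal covariance: 2.562^2 / d times a
    # pilot estimate of the posterior covariance, where d is the number
    # of parameters. The companion advice is to tune the number of
    # particles so the log-likelihood estimate has variance around 3.3.
    pilot_cov = np.atleast_2d(np.asarray(pilot_cov, dtype=float))
    d = pilot_cov.shape[0]
    return (scale**2 / d) * pilot_cov
```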

ex-philips-qpmh2.[py,R] These files make use of the quasi-Newton particle Metropolis-Hastings algorithm to estimate the parameter posteriors as discussed in Example 4.1 on page 83. They also make use of the zero-variance estimator to reduce the variance after the run of the algorithm. The R-file recreates the plot in Figure 4.5.

ex-philips-predictions.R This file generates predictions of future inflation given the scenario discussed in Example 5.1 on page 87. This is done using 1,000 Monte Carlo runs, where each outcome is simulated from the model given a draw from the parameter posterior generated by the particle Metropolis-Hastings algorithm. The end result is a replication of Figure 5.1.
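The prediction step described above can be sketched generically: draw a parameter vector from the stored posterior sample, then simulate one outcome from the model. Here `simulate` and the argument names are placeholders for the model-specific code in the repository:

```python
import numpy as np

def posterior_predictive(posterior_draws, simulate, n_runs=1000, seed=0):
    # Each run uses one random draw from the posterior sample, so the
    # predictions account for parameter uncertainty as well as the
    # stochasticity of the model itself.
    rng = np.random.RandomState(seed)
    idx = rng.choice(len(posterior_draws), n_runs)
    return np.array([simulate(posterior_draws[i], rng) for i in idx])
```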

Example: Voting behaviour in the US Supreme Court

The code in the folder example-supreme replicates the example regarding the ideological leaning of the US Supreme Court introduced in Section 2.1 (Example 2.2). The data is collected from the Supreme Court Database (http://supremecourtdatabase.org/index.php). We make use of Version 2016, Release 01 and extract all cases with non-unanimous votes in the period between 2010 and 2015, during which the same justices served on the court. To replicate the full example, run the files in the following order:

ex-supreme-data.R This file replicates Figure 2.3 on page 22 of the thesis.

ex-supreme-posteriors.R This file estimates the posterior distribution of the parameters using Gibbs sampling as outlined in Example 2.6 (page 31) and Example 3.8 (page 63). The end result is Figure 3.8.
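Gibbs sampling alternates draws from the full conditional distributions of each block of unknowns. A self-contained toy illustration of this mechanic (a bivariate normal target, deliberately not the thesis model) is:

```python
import numpy as np

def gibbs_bivariate_normal(rho, n_iter=5000, seed=0):
    # Targets a bivariate normal with correlation rho by alternating
    # draws from the two full conditionals:
    #   x | y ~ N(rho * y, 1 - rho^2)  and  y | x ~ N(rho * x, 1 - rho^2)
    rng = np.random.RandomState(seed)
    s = np.sqrt(1.0 - rho**2)
    x = y = 0.0
    out = np.empty((n_iter, 2))
    for i in range(n_iter):
        x = rho * y + s * rng.randn()
        y = rho * x + s * rng.randn()
        out[i] = x, y
    return out
```

In the thesis example the same alternation is carried out over the model parameters and the latent quantities of the voting model, with the full conditionals given in the referenced examples.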

ex-supreme-predictions.R This file makes use of Monte Carlo simulation to make predictions as discussed in Example 5.2 on page 88. This is done by sampling a set of parameters from the posterior and then simulating the outcomes of 10,000 randomly generated cases. The end result is Figure 5.2.