---
title: "Adaptive Rejection Sampler"
author: "Philipp Hunziker"
date: "June 3, 2018"
output: github_document
---
```{r setup, include=FALSE}
knitr::opts_chunk$set(echo = TRUE)
```
A very fast univariate Adaptive Rejection Sampler for R, implemented in C++.
## Installation
```{r eval=FALSE}
devtools::install_github('hunzikp/arscpp')
```
## Quick Start
Sample from the standard normal distribution:
```{r quick}
# Normal log pdf (up to an additive constant)
f <- function(x, mu, sd) {
  -0.5*(sd^(-2))*(x-mu)^2
}

# Derivative of normal log pdf
f_prime <- function(x, mu, sd) {
  -1*(sd^(-2))*(x-mu)
}

# Prepare the sampler
sampler <- arscpp::ars(f = f, f_prime = f_prime, x = c(-1, 1),
                       mu = 0, sd = 1)

# Draw 1000 samples
set.seed(1)
x <- sampler$sample(1000)
qqnorm(x)
```
## Adaptive Rejection Sampling
The `arscpp` package implements the Adaptive Rejection Sampling (ARS) algorithm by Wild and Gilks (1993, Algorithm AS 287). ARS allows sampling from arbitrary univariate log-concave densities. Importantly, the density does not have to be normalized, which makes ARS particularly useful for Bayesian inference problems.
Without going into any detail, ARS works by building a piecewise log-linear density function (called the 'upper hull') that serves as the proposal distribution for the rejection sampler. In addition, the algorithm defines a piecewise log-linear lower hull that allows performing the rejection step without having to evaluate the target density at every iteration. The algorithm is 'adaptive' because the upper and lower hulls are adjusted to fit the target density more closely with every rejected sample. The animation below illustrates this procedure, showing the sampling of 100 points from an unnormalized standard normal density using ARS. The red and blue lines correspond to the upper and lower hulls, respectively. The animation shows nicely how ARS arrives at a near-optimal proposal distribution as the number of samples increases.
![ARS animation](examples/gif/ars.gif)
## Implementation Details
`arscpp` is implemented in C++ using the excellent [Rcpp](https://cran.r-project.org/package=Rcpp) package, with a focus on speed and user-friendliness. `arscpp` is quite a bit faster than the implementation provided in the [ars](https://cran.r-project.org/package=ars) package:
```{r speed}
# Prepare the arscpp sampler (this package)
sampler <- arscpp::ars(f = f, f_prime = f_prime, x = c(-2, -1, 1),
                       mu = 0, sd = 1)
ars_new_sample <- function() {sampler$sample(1000)}

# Prepare the sampler from the 'ars' package
ars_old_sample <- function() {ars::ars(n = 1000, f = f, fprima = f_prime, x = c(-2, -1, 1),
                                       mu = 0, sd = 1)}
# Benchmark
library(rbenchmark)
res <- benchmark(
"arscpp" = ars_new_sample(),
"ars" = ars_old_sample()
)
print(res)
```
`arscpp` also aims to provide a 'black-box' implementation where the user does not have to worry about providing correct starting values (see the Wild and Gilks article for details). Instead, upon encountering invalid starting values, `arscpp` searches for new ones automatically:
```{r start, error=TRUE}
# Try invalid starting values with 'ars' implementation
smp <- ars::ars(n = 1, f = f, fprima = f_prime, x = c(1,2,3), mu = 0, sd = 1)
# Try invalid starting values with 'arscpp'
sampler <- arscpp::ars(f = f, f_prime = f_prime, x = c(1,2,3), mu = 0, sd = 1)
smp <- sampler$sample(1)
print(smp)
```
## C++ Extensions
If you'd rather use the `arscpp` sampler from within C++, you can do so quite easily using run-time polymorphism. Simply extend the `LogDensity` abstract class and implement all of its virtual member functions. The main ARS sampler in `arscpp` is defined in the `ARS` class, which takes a pointer to a `LogDensity` child object as a constructor parameter.