---
title: 'Simulate! Simulate! - Part 3: The Poisson edition'
author: Ariel Muldoon
date: '2018-07-18'
slug: simulate-poisson-edition
categories:
- r
- statistics
tags:
- simulation
- glmm
draft: FALSE
description: "Extending my simulation examples into the world of generalized linear models, I simulate Poisson data to explore what a quadratic relationship looks like on the scale of the data when fitting a generalized linear model with a log link."
---
```{r setup, echo = FALSE, message = FALSE, purl = FALSE}
knitr::opts_chunk$set(comment = "#")
devtools::source_gist("2500a85297b742c6f2fb3a14549f5851",
                      filename = 'render_toc.R')
```
One of the things I like about simulations is that, with practice, they can be a quick way to check your intuition about a model or relationship.
My most recent example is based on a discussion with a student about quadratic effects.
I've never had a great grasp on what the coefficients that define a quadratic relationship mean. Luckily there is this very nice [FAQ page](https://stats.idre.ucla.edu/other/mult-pkg/faq/general/faqhow-do-i-interpret-the-sign-of-the-quadratic-term-in-a-polynomial-regression/) from the Institute for Digital Research and Education at UCLA that goes over the meaning of the coefficients in detail, with examples. This has become my go-to page when I need a review (which is apparently every time the topic comes up `r emo::ji("stuck_out_tongue_winking_eye")`).
So while we understood what the quadratic effect "meant" in the model, in this particular case the student was working with a generalized linear mixed model for count data. This model was *linear on the log scale*. If something is quadratic on the log scale (the scale of the model), what does the relationship look like on the original scale (the scale of the data)?
I wasn't sure. Logically I would say that if something that is a straight line on the log scale is curved on the data scale then something that is curved on the log scale should be *more* curved on the original scale. Right? But how much more curved? Hmm.
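Here's a tiny sketch of that first piece of intuition (this isn't from the original analysis, and the coefficients are made up): a relationship that is a straight line on the log scale is already curved once exponentiated back to the original scale.
```{r intuition}
# Made-up coefficients, purely to illustrate that exponentiating a
# straight line on the log scale gives a curve on the original scale
curve(1 + 2*x, from = 0, to = 1, ylab = "log(mean): straight line")
curve(exp(1 + 2*x), from = 0, to = 1, ylab = "mean: now a curve")
```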
Simulation to the rescue! I decided to simulate some data to see what a relationship like the one the student had estimated could look like on the original scale.
## Table of Contents
```{r toc, echo = FALSE, purl = FALSE}
render_toc("2018-07-13-simulate-poisson-edition.Rmd")
```
# The statistical model
Even though what I did was a single iteration simulation, I think it is still useful to write out the statistical model I used for building the simulated data. It has been the combination of seeing the equations and then doing simulations (and then seeing the equations in a new light `r emo::ji("grin")`) that has really helped me understand generalized linear (mixed) models and so I like to include the models explicitly in these posts. But if you're at a place where you find looking at these equations makes your eyes glaze over, [jump down to the code](#code-for-a-single-simulation).
The statistical model for a generalized linear model looks pretty different from the statistical model of the very special case where we assume normality. Since so many of us (like me!) learned that special case first, this different approach takes some getting used to.
Instead of defining the distribution of the *errors* we'll now directly define the distribution of the response variable. (*For a more formal coverage of the statistical model for generalized linear (mixed) models see Stroup's [Rethinking the Analysis of Non-Normal Data in Plant and Soil Science](http://lira.pro.br/wordpress/wp-content/uploads/2015/06/stroup-2015.pdf).*)
I'm going to use a Poisson generalized linear model for my simulation, so the response variable will be discrete counts. In my statistical model I first define a response variable that comes from the Poisson distribution.
$$y_t \thicksim Poisson(\lambda_t)$$
+ $y_t$ is the recorded count for the $t$th observation of the discrete response variable.
+ $\lambda_t$ is the unobserved true mean of the Poisson distribution for the $t$th observation. The Poisson distribution is a single-parameter distribution, where the variance is exactly equal to the mean.
We will assume that the relationship between the *mean* of the response and any explanatory variables is linear on the log scale. This can be described as using a *log link*, since the log is the function that "links" the mean to the linear predictor. If you're coming from the world of linear models you may be used to describing the relationship between the *response variable* and any explanatory variables, not the relationship between the *mean of the response variable* and explanatory variables.
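As a quick aside (this check isn't part of the original analysis), the log link is also what R's `poisson()` family object carries around for `glm()`: `linkfun` maps a mean to the linear predictor scale and `linkinv` maps it back.
```{r linkfun}
# The family object used by glm() for Poisson models stores the link
# function and its inverse; "log" is the default link
fam = poisson(link = "log")
fam$linkfun(10)       # log(10): from the mean to the linear predictor scale
fam$linkinv(log(10))  # back to the mean scale
```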
The model I define here is a quadratic model for a single, continuous explanatory variable.
$$log(\lambda_t) = \beta_0 + \beta_1*x_t + \beta_2*x^2_t$$
+ $x_t$ is the recorded value of the $t$th observation of the continuous explanatory variable.
+ $x^2_t$ is the square of $x_t$.
+ $\beta_0$, $\beta_1$, and $\beta_2$ are parameters (the intercept and slope coefficients) of the linear model.
If you are new to generalized linear models you might want to take a moment to note the absence of epsilon in the linear predictor.
Notice we can calculate the mean on the original scale instead of the log scale by exponentiating both sides of the above equation. This will be important when we get to writing code to simulate data.
$$\lambda_t = exp(\beta_0 + \beta_1*x_t + \beta_2*x^2_t)$$
# Code for a single simulation
The first thing I will do in this simulation is define my true parameter values. I'm simulating a relationship between x and y that is similar to the student's results so I'll set the intercept and the linear coefficient ($\beta_0$ and $\beta_1$, respectively) both to 0.5 and the quadratic coefficient ($\beta_2$) to 5.
```{r}
b0 = .5
b1 = .5
b2 = 5
```
Next I need an explanatory variable, which I'll call `x`. I decided to make this a continuous variable between 0 and 1 by taking 100 random draws from a uniform distribution with a minimum of 0 and a maximum of 1 via `runif()`. Since I'm taking 100 draws, $t$ in the statistical model goes from 1 to 100.
I'll set my seed prior to generating random numbers so you'll get identical results if you run this code.
```{r}
set.seed(16)
x = runif(100, min = 0, max = 1)
head(x) # First six values of x
```
Once I have my parameters set and an explanatory variable created I can calculate $\lambda_t$. This is where I find the statistical model to be really handy, as it directly shows me how to write the code. Because I want to calculate the means on the original scale and not the log of the means I use the model equation after exponentiating both sides.
I'll simulate the 100 means via
$$\lambda_t = exp(0.5 + 0.5*x_t + 5*x^2_t)$$
```{r}
lambda = exp(b0 + b1*x + b2*x^2)
```
The step above calculates the *mean* for each value of the response variable. These values are continuous, not discrete counts.
```{r}
head(lambda)
```
Now that I have a vector of means I can use it to generate a count for each value of `lambda` based on the Poisson distribution. I do this via `rpois()`.
The next bit of code is based on the distribution defined in the statistical model. Remember that we defined `y` as:
$$y_t \thicksim Poisson(\lambda_t)$$
This is the step where we add "Poisson errors" to the means to generate the response variable. For a fixed value of `x`, the variation of each simulated `y` value around its mean is based on the Poisson variance. In linear model simulations we usually add variability to the means by simulating the errors directly from a normal distribution with a mean of 0. Since the variance of the Poisson distribution is determined by its mean, the added variability is less obvious here. I've seen this referred to as adding "Poisson noise", but "Poisson errors" may be a better term.
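If you want to see that mean-variance relationship in action, here is a small check that isn't part of the original simulation. I show it with `eval = FALSE` so it doesn't advance the random-number stream and change the seed-based results below; the variance of Poisson draws tracks the mean.
```{r poissonvar, eval = FALSE, purl = FALSE}
# Not evaluated here so the seeded results below stay reproducible;
# the spread of Poisson draws grows with the mean itself
var( rpois(10000, lambda = 2) )  # roughly 2
var( rpois(10000, lambda = 20) ) # roughly 20
```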
I randomly draw 100 counts, one for each of the 100 means stored in `lambda`.
```{r}
y = rpois(100, lambda = lambda)
```
Unlike `lambda`, the `y` variable is a discrete count. This is the response variable that will be used in analysis.
```{r}
head(y)
```
# Results!
Now that I have simulated values for both the response and explanatory variable I can take a look at the relationship between `x` and `y`.
Below is what things look like on the log scale (the scale of the model). I was interested to see that, while the relationship curved upward as expected given the positive quadratic coefficient I used, the curve was really quite shallow.
```{r}
plot(x, log(y) )
```
How do things look on the original scale? The curve is more extreme, much more than I realized it would be.
```{r}
plot(x, y)
```
This was good to see, as it matched pretty well with an added variable plot the student had made. We had originally been concerned that there was some mistake in the plotting code, and being able to explore things via simulation helped allay these fears. Simulation success! `r emo::ji("clinking_glasses")`
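As one last sanity check (this isn't something from the original analysis), we could refit the generating model to the simulated data with `glm()` and a log link; the estimated coefficients should land reasonably close to the true values of 0.5, 0.5, and 5.
```{r checkfit}
# Refit the generating model to the simulated data as a sanity check
fit = glm(y ~ x + I(x^2), family = poisson(link = "log") )
coef(fit)
```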
# Just the code, please
```{r getlabels, echo = FALSE, purl = FALSE}
labs = knitr::all_labels()
labs = labs[!labs %in% c("setup", "toc", "getlabels", "allcode", "makescript")]
```
```{r makescript, include = FALSE, purl = FALSE}
options(knitr.duplicate.label = "allow") # Needed to purl like this
input = knitr::current_input() # filename of input document
output = here::here("static", "script", paste(tools::file_path_sans_ext(input), "R", sep = ".") )
knitr::purl(input, output, documentation = 0, quiet = TRUE)
```
Here's the code without all the discussion. Copy and paste the code below or you can download an R script of uncommented code [from here](/script/2018-07-13-simulate-poisson-edition.R).
```{r allcode, ref.label = labs, eval = FALSE, purl = FALSE}
```