math stat exercises - add some from ch 1

@@ -26,6 +26,8 @@
 \newcommand{\C}{{\mathbb C}}
 \newcommand{\HQ}{{\mathbb H}}
 
+\newcommand{\Prob}{{\mathbb P}}
+
 \pdfinfo{% Give info that will be linked to the resulting pdf file
 /Title (Mathematical Statistics: exercises)
 /Author (Ward Muylaert)
@@ -194,6 +196,9 @@ \section{Estimation, basic concepts}
 $
 \hat\theta = \frac{\sum X_i}{n}
 $
+	\item
+	Take $g(t(x_1, \dots, x_n), \theta) = f(x_1, \dots, x_n, \theta)$
+	and take $h(x_1, \dots, x_n) = 1$. Then $T = \sum X_i$.
 \end{enumerate}
 \end{oplossing}
 
@@ -208,11 +213,45 @@ \section{Estimation, basic concepts}
 \]
 \end{opgave}
 \begin{oplossing}
+	Remember that a sufficient statistic is a statistic $T$ for which
+	$
+	\Prob [ X_1 = x_1, \dots, X_n = x_n \mid T_n = c ]
+	$
+	is independent of $\theta$, for every value of $c$. Alternatively, we
+	could use the factorization theorem of Fisher and Neyman: $T$ is
+	sufficient if and only if
+	$
+	f(x_1, \dots, x_n, \theta) = g(t(x_1, \dots, x_n), \theta) h(x_1, \dots, x_n).
+	$
+	Adjusting for a second parameter gives us
+	$
+	f(x_1, \dots, x_n, \theta_1, \theta_2)
+	= g(t_1(x_1, \dots, x_n), t_2(x_1, \dots, x_n), \theta_1, \theta_2)
+	h(x_1, \dots, x_n).
+	$
+
+	The density function of a normal distribution is
+	$
+	f(x, \mu, \sigma^2)
+	= \frac{1}{\sigma \sqrt{2\pi}} \exp\left( - \frac{(x - \mu)^2}{2\sigma^2}\right),
+	$
+	thus
+	\begin{align*}
+	f(x_1, \dots, x_n, \mu, \sigma^2)
+	&= \prod f(x_i, \mu, \sigma^2)\\
+	&= \left(\frac{1}{\sigma \sqrt{2\pi}}\right)^n
+	\exp\left( - \frac{\sum (x_i - \mu)^2}{2\sigma^2}\right)\\
+	&= \left(\frac{1}{\sigma \sqrt{2\pi}}\right)^n
+	\exp\left( - \frac{1}{2\sigma^2} \left(\sum x_i^2 - 2\mu\sum x_i + n\mu^2\right)\right)\\
+	&= g\left(\sum x_i, \sum x_i^2, \mu, \sigma^2\right)
+	\end{align*}
+	with $h(x_1, \dots, x_n) = 1$. We conclude that
+	$\left(\sum X_i, \sum X_i^2\right)$ is a sufficient statistic for
+	$(\mu, \sigma^2)$.
 \end{oplossing}
 
 \begin{opgave}
 Let $X_1, \dots, X_n$ be a random sample from
-$X \sim \text{ Poisson}(\theta)$ with $\theta > 0$.
+$X \sim \text{Poisson}(\theta)$ with $\theta > 0$.
 \begin{enumerate}
 \item
 Find a sufficient statistic for $\theta$.
@@ -221,6 +260,29 @@ \section{Estimation, basic concepts}
 \end{enumerate}
 \end{opgave}
 \begin{oplossing}
+	Remember that the Poisson distribution is of the form
+	$
+	f(x, \theta) =
+	\begin{cases}
+	\frac{e^{-\theta} \theta^x}{x!} &x = 0, 1, 2, \dots\\
+	0 &\text{otherwise}
+	\end{cases}
+	$
+	so
+	\begin{align*}
+	f(x_1, \dots, x_n, \theta) &= \prod f(x_i, \theta)\\
+	&=
+	\begin{cases}
+	\frac{e^{-n\theta} \theta^{\sum x_i}}{x_1! \dots x_n!} &\forall x_i: x_i = 0, 1, 2, \dots\\
+	0 &\text{otherwise}
+	\end{cases}
+	\end{align*}
+	Now it is clear that $T = \sum X_i$ is a sufficient statistic:
+	simply take
+	\begin{align*}
+	g\left(\sum x_i, \theta\right) &= e^{-n\theta} \theta^{\sum x_i}\\
+	h(x_1, \dots, x_n) &= \frac{1}{x_1! \dots x_n!}.
+	\end{align*}
 \end{oplossing}
 
 \begin{opgave}
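As a cross-check not included in the patch itself, the Poisson result can also be verified directly from the definition of sufficiency, using the standard fact that $T = \sum X_i \sim \text{Poisson}(n\theta)$ for i.i.d. $\text{Poisson}(\theta)$ samples (a sketch, in the notation of the diff above):

```latex
% Condition on T = \sum X_i = c; the joint pmf over the event \sum x_i = c
% divided by the pmf of T \sim \text{Poisson}(n\theta) at c:
\begin{align*}
\Prob\left[ X_1 = x_1, \dots, X_n = x_n \,\middle|\, \sum X_i = c \right]
&= \frac{\prod_i e^{-\theta} \theta^{x_i} / x_i!}{e^{-n\theta} (n\theta)^c / c!}\\
&= \frac{c!}{x_1! \dots x_n!} \left(\frac{1}{n}\right)^c
\qquad \text{whenever } \sum x_i = c,
\end{align*}
% a multinomial distribution with equal cell probabilities 1/n.
```

The conditional distribution does not involve $\theta$, which is exactly the definition of sufficiency quoted in the solution, so the two approaches agree that $T = \sum X_i$ is sufficient.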