---
course: Statistics
course_year: IB
question_number: 66
tags:
  - IB
  - 2008
  - Statistics
title: '$1.\mathrm{I}.7\mathrm{H}$'
year: 2008
---

A Bayesian statistician observes a random sample $X_{1}, \ldots, X_{n}$ drawn from a $N\left(\mu, \tau^{-1}\right)$ distribution. He has a prior density for the unknown parameters $\mu, \tau$ of the form

$$\pi_{0}(\mu, \tau) \propto \tau^{\alpha_{0}-1} \exp \left(-\frac{1}{2} K_{0} \tau\left(\mu-\mu_{0}\right)^{2}-\beta_{0} \tau\right) \sqrt{\tau},$$

where $\alpha_{0}, \beta_{0}, \mu_{0}$ and $K_{0}$ are constants which he chooses. Show that after observing $X_{1}, \ldots, X_{n}$ his posterior density $\pi_{n}(\mu, \tau)$ is again of the form

$$\pi_{n}(\mu, \tau) \propto \tau^{\alpha_{n}-1} \exp \left(-\frac{1}{2} K_{n} \tau\left(\mu-\mu_{n}\right)^{2}-\beta_{n} \tau\right) \sqrt{\tau}$$

where you should find explicitly the form of $\alpha_{n}, \beta_{n}, \mu_{n}$ and $K_{n}$.
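A quick numerical check of this conjugate update is to compare the prior times the likelihood with the claimed posterior kernel on a grid of $(\mu, \tau)$ values; the two should differ only by a constant. The expressions for $\alpha_n, \beta_n, \mu_n, K_n$ used below are the standard Normal-Gamma update obtained by completing the square in $\mu$, included here as an assumed answer sketch rather than as part of the question text, and all numerical values are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary prior constants and a simulated sample standing in for X_1, ..., X_n.
alpha0, beta0, mu0, K0 = 2.0, 1.5, 0.3, 0.8
n = 10
x = rng.normal(loc=1.0, scale=2.0, size=n)
xbar = x.mean()
ss = ((x - xbar) ** 2).sum()

# Assumed (standard Normal-Gamma) posterior hyperparameters.
alpha_n = alpha0 + n / 2
K_n = K0 + n
mu_n = (K0 * mu0 + n * xbar) / K_n
beta_n = beta0 + 0.5 * ss + 0.5 * K0 * n * (xbar - mu0) ** 2 / K_n

def log_kernel(mu, tau, a, b, m, K):
    """log of tau^(a-1) * sqrt(tau) * exp(-K*tau*(mu-m)^2/2 - b*tau)."""
    return (a - 0.5) * np.log(tau) - 0.5 * K * tau * (mu - m) ** 2 - b * tau

def log_likelihood(mu, tau):
    """log N(mu, 1/tau) likelihood of the sample, up to an additive constant."""
    return 0.5 * n * np.log(tau) - 0.5 * tau * ((x - mu) ** 2).sum()

# prior kernel + log-likelihood should match the posterior kernel up to a constant,
# so the differences below should all be (numerically) equal.
mus = np.linspace(-2.0, 3.0, 7)
taus = np.linspace(0.1, 5.0, 7)
diffs = [log_kernel(m, t, alpha0, beta0, mu0, K0) + log_likelihood(m, t)
         - log_kernel(m, t, alpha_n, beta_n, mu_n, K_n)
         for m in mus for t in taus]
print(np.ptp(diffs))  # spread of the differences: ~0 up to floating-point error
```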