---
course: Statistics
course_year: IB
question_number: 76
tags:
  - IB
  - 2017
  - Statistics
title: Paper 3, Section II, 20H
year: 2017
---

Consider the general linear model

$$\boldsymbol{Y}=X \boldsymbol{\beta}+\varepsilon$$

where $X$ is a known $n \times p$ matrix of full rank $p<n$, $\varepsilon \sim \mathcal{N}_{n}\left(0, \sigma^{2} I\right)$ with $\sigma^{2}$ known, and $\boldsymbol{\beta} \in \mathbb{R}^{p}$ is an unknown vector.

(a) State without proof the Gauss-Markov theorem.

Find the maximum likelihood estimator $\widehat{\boldsymbol{\beta}}$ for $\boldsymbol{\beta}$. Is it unbiased?
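[A minimal sketch of the standard derivation, added for reference: maximizing the Gaussian log-likelihood in $\boldsymbol{\beta}$ is equivalent to minimizing $\|\boldsymbol{Y}-X \boldsymbol{\beta}\|^{2}$, which yields the least-squares estimator

$$\widehat{\boldsymbol{\beta}}=\left(X^{T} X\right)^{-1} X^{T} \boldsymbol{Y}, \qquad \mathbb{E}[\widehat{\boldsymbol{\beta}}]=\left(X^{T} X\right)^{-1} X^{T} X \boldsymbol{\beta}=\boldsymbol{\beta},$$

so $\widehat{\boldsymbol{\beta}}$ is unbiased, with $\widehat{\boldsymbol{\beta}} \sim \mathcal{N}_{p}\left(\boldsymbol{\beta}, \sigma^{2}\left(X^{T} X\right)^{-1}\right)$.]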

Let $\boldsymbol{\beta}^{*}$ be any unbiased estimator for $\boldsymbol{\beta}$ which is linear in $\left(Y_{i}\right)$. Show that

$$\operatorname{var}\left(\boldsymbol{t}^{T} \widehat{\boldsymbol{\beta}}\right) \leqslant \operatorname{var}\left(\boldsymbol{t}^{T} \boldsymbol{\beta}^{*}\right)$$

for all $\boldsymbol{t} \in \mathbb{R}^{p}$.
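[One standard route to this inequality, sketched here for reference: write $\boldsymbol{\beta}^{*}=A \boldsymbol{Y}$ for some $p \times n$ matrix $A$; unbiasedness for every $\boldsymbol{\beta}$ forces $A X=I_{p}$, and then

$$\operatorname{var}\left(\boldsymbol{t}^{T} \boldsymbol{\beta}^{*}\right)-\operatorname{var}\left(\boldsymbol{t}^{T} \widehat{\boldsymbol{\beta}}\right)=\sigma^{2} \boldsymbol{t}^{T}\left(A A^{T}-\left(X^{T} X\right)^{-1}\right) \boldsymbol{t}=\sigma^{2}\left\|\left(A-\left(X^{T} X\right)^{-1} X^{T}\right)^{T} \boldsymbol{t}\right\|^{2} \geqslant 0,$$

using $A X=I_{p}$ to simplify the cross terms.]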

(b) Suppose now that $p=1$ and that $\boldsymbol{\beta}$ and $\sigma^{2}$ are both unknown. Find the maximum likelihood estimator for $\sigma^{2}$. What is the joint distribution of $\widehat{\boldsymbol{\beta}}$ and $\widehat{\sigma}^{2}$ in this case? Justify your answer.
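[A sketch of the standard answer, added for reference: substituting $\widehat{\boldsymbol{\beta}}$ into the likelihood and maximizing over $\sigma^{2}$ gives

$$\widehat{\sigma}^{2}=\frac{1}{n}\left\|\boldsymbol{Y}-X \widehat{\boldsymbol{\beta}}\right\|^{2}.$$

Since $\widehat{\boldsymbol{\beta}}$ depends on $\boldsymbol{Y}$ only through the orthogonal projection $P \boldsymbol{Y}$, where $P=X\left(X^{T} X\right)^{-1} X^{T}$, while the residuals are $(I-P) \boldsymbol{Y}$, the two estimators are independent, with $\widehat{\boldsymbol{\beta}} \sim \mathcal{N}_{1}\left(\boldsymbol{\beta}, \sigma^{2}\left(X^{T} X\right)^{-1}\right)$ and $n \widehat{\sigma}^{2} / \sigma^{2} \sim \chi_{n-1}^{2}$ (here $p=1$).]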