---
course: Statistics
course_year: IB
question_number: 74
tags:
  - IB
  - 2010
  - Statistics
title: Paper 1, Section II, E
year: 2010
---

Consider the linear regression model

$$Y_{i}=\beta x_{i}+\epsilon_{i},$$

where the numbers $x_{1}, \ldots, x_{n}$ are known, the independent random variables $\epsilon_{1}, \ldots, \epsilon_{n}$ have the $N\left(0, \sigma^{2}\right)$ distribution, and the parameters $\beta$ and $\sigma^{2}$ are unknown. Find the maximum likelihood estimator for $\beta$.
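[As a numerical aside, not part of the question: the maximum likelihood estimator here is the least-squares estimator through the origin, $\widehat{\beta}=\sum_i x_i Y_i / \sum_i x_i^2$. A minimal sketch with hypothetical data (the particular $x_i$, $\beta$ and $\sigma$ below are illustrative choices only) cross-checks this formula against a grid search over the residual sum of squares.]

```python
import numpy as np

# Hypothetical data for the model Y_i = beta * x_i + eps_i (illustration only).
rng = np.random.default_rng(0)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
beta_true, sigma = 2.0, 0.5
y = beta_true * x + rng.normal(0.0, sigma, size=x.size)

# Maximising the log-likelihood in beta amounts to least squares through the
# origin, giving the claimed MLE: beta_hat = sum(x_i * Y_i) / sum(x_i^2).
beta_hat = np.sum(x * y) / np.sum(x ** 2)

# Numerical cross-check: beta_hat should agree with a fine grid search over
# the residual sum of squares sum_i (Y_i - b * x_i)^2.
grid = np.linspace(beta_hat - 1.0, beta_hat + 1.0, 10001)
rss = np.array([np.sum((y - b * x) ** 2) for b in grid])
assert abs(grid[np.argmin(rss)] - beta_hat) < 1e-3
print(beta_hat)
```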

State and prove the Gauss-Markov theorem in the context of this model.

Write down the distribution of an arbitrary linear estimator for $\beta$. Hence show that there exists a linear, unbiased estimator $\widehat{\beta}$ for $\beta$ such that

$$\mathbb{E}_{\beta, \sigma^{2}}\left[(\widehat{\beta}-\beta)^{4}\right] \leqslant \mathbb{E}_{\beta, \sigma^{2}}\left[(\widetilde{\beta}-\beta)^{4}\right]$$

for all linear, unbiased estimators $\widetilde{\beta}$.

[Hint: If $Z \sim N\left(a, b^{2}\right)$ then $\mathbb{E}\left[(Z-a)^{4}\right]=3 b^{4}$.]
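[A further illustrative aside: a linear unbiased estimator $\sum_i c_i Y_i$ with $\sum_i c_i x_i = 1$ has the $N(\beta, \sigma^2 \sum_i c_i^2)$ distribution, so by the hint its fourth central moment is $3(\sigma^2 \sum_i c_i^2)^2$, and minimising variance also minimises the fourth moment. The sketch below, with hypothetical $x_i$ and $\sigma^2$, compares the least-squares weights against one other unbiased choice.]

```python
import numpy as np

# Hypothetical illustration: a linear estimator sum(c_i Y_i) with
# sum(c_i x_i) = 1 is unbiased and has distribution N(beta, sigma^2 * sum(c_i^2)),
# so by the hint its fourth central moment is 3 * (sigma^2 * sum(c_i^2))^2.
# The least-squares weights c_i = x_i / sum(x_j^2) minimise sum(c_i^2).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
sigma2 = 1.0

c_ls = x / np.sum(x ** 2)                        # least-squares (MLE) weights
c_alt = np.zeros_like(x)
c_alt[-1] = 1.0 / x[-1]                          # another unbiased choice: Y_n / x_n

for c in (c_ls, c_alt):
    assert abs(np.sum(c * x) - 1.0) < 1e-12      # unbiasedness constraint

m4_ls = 3 * (sigma2 * np.sum(c_ls ** 2)) ** 2    # fourth moment of least-squares
m4_alt = 3 * (sigma2 * np.sum(c_alt ** 2)) ** 2  # fourth moment of alternative
assert m4_ls <= m4_alt
print(m4_ls, m4_alt)
```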