---
course: Statistics
course_year: IB
question_number: 74
tags:
- IB
- 2015
- Statistics
title: Paper 4, Section II, H
year: 2015
---

Consider a linear model $\mathbf{Y}=X \boldsymbol{\beta}+\varepsilon$ where $\mathbf{Y}$ is an $n \times 1$ vector of observations, $X$ is a known $n \times p$ matrix, $\boldsymbol{\beta}$ is a $p \times 1(p<n)$ vector of unknown parameters and $\varepsilon$ is an $n \times 1$ vector of independent normally distributed random variables each with mean zero and unknown variance $\sigma^{2}$. Write down the log-likelihood and show that the maximum likelihood estimators $\hat{\boldsymbol{\beta}}$ and $\hat{\sigma}^{2}$ of $\boldsymbol{\beta}$ and $\sigma^{2}$ respectively satisfy

$$X^{T} X \hat{\boldsymbol{\beta}}=X^{T} \mathbf{Y}, \quad \frac{1}{\hat{\sigma}^{4}}(\mathbf{Y}-X \hat{\boldsymbol{\beta}})^{T}(\mathbf{Y}-X \hat{\boldsymbol{\beta}})=\frac{n}{\hat{\sigma}^{2}}$$
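[As a sketch of the first step (standard normal-theory working, not part of the original question text), the log-likelihood here is

$$\ell\left(\boldsymbol{\beta}, \sigma^{2}\right)=-\frac{n}{2} \log \left(2 \pi \sigma^{2}\right)-\frac{1}{2 \sigma^{2}}(\mathbf{Y}-X \boldsymbol{\beta})^{T}(\mathbf{Y}-X \boldsymbol{\beta}),$$

and differentiating with respect to $\boldsymbol{\beta}$ and $\sigma^{2}$ and setting the derivatives to zero gives the two displayed equations.]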

$(T$ denotes the transpose $)$. Assuming that $X^{T} X$ is invertible, find the solutions $\hat{\boldsymbol{\beta}}$ and $\hat{\sigma}^{2}$ of these equations and write down their distributions.
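[A sketch of the expected solutions, by standard linear-model theory (not part of the question itself): when $X^{T} X$ is invertible,

$$\hat{\boldsymbol{\beta}}=\left(X^{T} X\right)^{-1} X^{T} \mathbf{Y}, \quad \hat{\sigma}^{2}=\frac{1}{n}(\mathbf{Y}-X \hat{\boldsymbol{\beta}})^{T}(\mathbf{Y}-X \hat{\boldsymbol{\beta}}),$$

with $\hat{\boldsymbol{\beta}} \sim N_{p}\left(\boldsymbol{\beta}, \sigma^{2}\left(X^{T} X\right)^{-1}\right)$ and $n \hat{\sigma}^{2} / \sigma^{2} \sim \chi_{n-p}^{2}$.]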

Prove that $\hat{\boldsymbol{\beta}}$ and $\hat{\sigma}^{2}$ are independent.

Consider the model $Y_{i j}=\mu_{i}+\gamma x_{i j}+\varepsilon_{i j}, i=1,2,3$ and $j=1,2,3$. Suppose that, for all $i, x_{i 1}=-1, x_{i 2}=0$ and $x_{i 3}=1$, and that $\varepsilon_{i j}, i, j=1,2,3$, are independent $N\left(0, \sigma^{2}\right)$ random variables where $\sigma^{2}$ is unknown. Show how this model may be written as a linear model and write down $\mathbf{Y}, X, \boldsymbol{\beta}$ and $\varepsilon$. Find the maximum likelihood estimators of $\mu_{i}$ $(i=1,2,3), \gamma$ and $\sigma^{2}$ in terms of the $Y_{i j}$. Derive a $100(1-\alpha) \%$ confidence interval for $\sigma^{2}$ and for $\mu_{2}-\mu_{1}$.
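[One way to write this as a linear model (a sketch only; the ordering of observations is a choice, not dictated by the question): stack the observations row by row as $\mathbf{Y}=\left(Y_{11}, Y_{12}, Y_{13}, Y_{21}, \ldots, Y_{33}\right)^{T}$, take $\boldsymbol{\beta}=\left(\mu_{1}, \mu_{2}, \mu_{3}, \gamma\right)^{T}$, and let

$$X=\begin{pmatrix}1 & 0 & 0 & -1 \\ 1 & 0 & 0 & 0 \\ 1 & 0 & 0 & 1 \\ 0 & 1 & 0 & -1 \\ 0 & 1 & 0 & 0 \\ 0 & 1 & 0 & 1 \\ 0 & 0 & 1 & -1 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 1 & 1\end{pmatrix},$$

so that $n=9$, $p=4$ and the $i$-th block of three rows carries the indicator of $\mu_{i}$ alongside the covariate values $x_{i j}=-1,0,1$.]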

[You may assume that, if $\mathbf{W}=\left(\mathbf{W}_{1}^{T}, \mathbf{W}_{2}^{T}\right)^{T}$ is multivariate normal with $\operatorname{cov}\left(\mathbf{W}_{1}, \mathbf{W}_{2}\right)=0$, then $\mathbf{W}_{1}$ and $\mathbf{W}_{2}$ are independent.]