---
course: Statistics
course_year: IB
question_number: 68
tags:
  - IB
  - 2008
  - Statistics
title: 2.II.19H
year: 2008
---

Suppose that the joint distribution of random variables $X, Y$ taking values in $\mathbb{Z}^{+}=\{0,1,2,\ldots\}$ is given by the joint probability generating function

$$\varphi(s, t) \equiv E\left[s^{X} t^{Y}\right]=\frac{1-\alpha-\beta}{1-\alpha s-\beta t}$$

where the unknown parameters $\alpha$ and $\beta$ are positive, and satisfy the inequality $\alpha+\beta<1$. Find $E(X)$. Prove that the probability mass function of $(X, Y)$ is

$$f(x, y \mid \alpha, \beta)=(1-\alpha-\beta)\binom{x+y}{x} \alpha^{x} \beta^{y} \quad\left(x, y \in \mathbb{Z}^{+}\right)$$

and prove that the maximum-likelihood estimators of $\alpha$ and $\beta$ based on a sample of size $n$ drawn from the distribution are

$$\hat{\alpha}=\frac{\bar{X}}{1+\bar{X}+\bar{Y}}, \quad \hat{\beta}=\frac{\bar{Y}}{1+\bar{X}+\bar{Y}},$$

where $\bar{X}$ (respectively, $\bar{Y}$ ) is the sample mean of $X_{1}, \ldots, X_{n}$ (respectively, $Y_{1}, \ldots, Y_{n}$ ).
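
*A brief solution sketch (not part of the original question paper), using only the definitions above.* The expectation follows by differentiating the pgf and setting $s=t=1$:

$$E(X)=\left.\frac{\partial \varphi}{\partial s}\right|_{s=t=1}=\left.\frac{(1-\alpha-\beta)\,\alpha}{(1-\alpha s-\beta t)^{2}}\right|_{s=t=1}=\frac{\alpha}{1-\alpha-\beta}.$$

The pmf follows by expanding the pgf as a geometric series and collecting the coefficient of $s^{x} t^{y}$ via the binomial theorem:

$$\varphi(s, t)=(1-\alpha-\beta) \sum_{n=0}^{\infty}(\alpha s+\beta t)^{n}=(1-\alpha-\beta) \sum_{x, y \geq 0}\binom{x+y}{x} \alpha^{x} \beta^{y} s^{x} t^{y}.$$

For the estimators, the log-likelihood of a sample $(X_{1}, Y_{1}), \ldots,(X_{n}, Y_{n})$ is

$$\ell(\alpha, \beta)=n \log (1-\alpha-\beta)+n \bar{X} \log \alpha+n \bar{Y} \log \beta+\text { const },$$

and setting $\partial \ell / \partial \alpha=\partial \ell / \partial \beta=0$ gives $\alpha=\bar{X}(1-\alpha-\beta)$ and $\beta=\bar{Y}(1-\alpha-\beta)$; adding these two equations yields $1-\alpha-\beta=1 /(1+\bar{X}+\bar{Y})$, from which the stated $\hat{\alpha}, \hat{\beta}$ follow.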

By considering $\hat{\alpha}+\hat{\beta}$, or otherwise, prove that the maximum-likelihood estimator is biased. Stating clearly any results to which you appeal, prove that as $n \rightarrow \infty, \hat{\alpha} \rightarrow \alpha$, making clear the sense in which this convergence happens.
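
*A sketch of the final part, again not part of the original question paper.* Write $\hat{\alpha}+\hat{\beta}=g(\bar{X}+\bar{Y})$ with $g(u)=u /(1+u)$. Since $g$ is strictly concave and $\bar{X}+\bar{Y}$ is non-degenerate, Jensen's inequality gives a strict inequality:

$$E(\hat{\alpha}+\hat{\beta})=E\, g(\bar{X}+\bar{Y})<g\!\left(E(\bar{X}+\bar{Y})\right)=g\!\left(\frac{\alpha+\beta}{1-\alpha-\beta}\right)=\alpha+\beta,$$

so $(\hat{\alpha}, \hat{\beta})$ cannot be unbiased. For the convergence, the weak law of large numbers gives $\bar{X} \rightarrow \alpha /(1-\alpha-\beta)$ and $\bar{Y} \rightarrow \beta /(1-\alpha-\beta)$ in probability, and since $(u, v) \mapsto u /(1+u+v)$ is continuous at this limit, the continuous mapping theorem gives

$$\hat{\alpha}=\frac{\bar{X}}{1+\bar{X}+\bar{Y}} \xrightarrow{\ p\ } \frac{\alpha /(1-\alpha-\beta)}{1+(\alpha+\beta) /(1-\alpha-\beta)}=\alpha,$$

i.e. $\hat{\alpha} \rightarrow \alpha$ in probability.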