---
course: Markov Chains
course_year: IB
question_number: 75
tags:
- IB
- 2007
- Markov Chains
title: 1.II.19C
year: 2007
---

Consider a Markov chain $\left(X_{n}\right)_{n \geqslant 0}$ on states $\{0,1, \ldots, r\}$ with transition matrix $\left(P_{i j}\right)$, where $P_{0,0}=1=P_{r, r}$, so that 0 and $r$ are absorbing states. Let

$$A=\left(X_{n}=0 \text{ for some } n \geqslant 0\right),$$

be the event that the chain is absorbed in 0. Assume that $h_{i}=\mathbb{P}\left(A \mid X_{0}=i\right)>0$ for $1 \leqslant i<r$.
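
For reference, a sketch of the standard one-step analysis for hitting probabilities: conditioning on the first jump shows that the $h_{i}$ satisfy

$$h_{i}=\sum_{j=0}^{r} P_{i j} h_{j} \quad(1 \leqslant i<r), \qquad h_{0}=1, \quad h_{r}=0 .$$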

Show carefully that, conditional on the event $A$, $\left(X_{n}\right)_{n \geqslant 0}$ is a Markov chain and determine its transition matrix.
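
For orientation, a sketch of the standard conclusion (the Doob $h$-transform): since $0$ is absorbing, given $X_{n}=i$ with $h_{i}>0$ the event $A$ depends only on the chain after time $n$, and the conditioned transition probabilities take the form

$$\widehat{P}_{i j}=\mathbb{P}\left(X_{n+1}=j \mid X_{n}=i, A\right)=\frac{P_{i j} h_{j}}{h_{i}} .$$

These sum to one over $j$ precisely because $h_{i}=\sum_{j} P_{i j} h_{j}$.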

Now consider the case where $P_{i, i+1}=\frac{1}{2}=P_{i, i-1}$, for $1 \leqslant i<r$. Suppose that $X_{0}=i, 1 \leqslant i<r$, and that the event $A$ occurs; calculate the expected number of transitions until the chain is first in the state 0 .
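
A sketch of one route through this final part: for the symmetric walk, $h_{i}=(r-i)/r$, so the conditioned chain above has

$$\widehat{P}_{i, i-1}=\frac{r-i+1}{2(r-i)}, \qquad \widehat{P}_{i, i+1}=\frac{r-i-1}{2(r-i)} \qquad(1 \leqslant i<r),$$

and, writing $\mu_{i}$ for its expected number of transitions to reach $0$ from $i$, a first-step decomposition gives

$$\mu_{i}=1+\widehat{P}_{i, i-1}\, \mu_{i-1}+\widehat{P}_{i, i+1}\, \mu_{i+1} \quad(1 \leqslant i<r), \qquad \mu_{0}=0,$$

which is solved by $\mu_{i}=\tfrac{1}{3}\, i(2 r-i)$ (readily checked by substitution into the recurrence).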