---
course: Markov Chains
course_year: II
question_number: 38
tags:
- II
- 2001
- Markov Chains
title: A3.1 B3.1
year: 2001
---

(i) Explain what is meant by the transition semigroup $\left\{P_{t}\right\}$ of a Markov chain $X$ in continuous time. If the state space is finite, show, under assumptions to be stated clearly, that $P_{t}^{\prime}=G P_{t}$ for some matrix $G$. Show that a distribution $\pi$ satisfies $\pi G=0$ if and only if $\pi P_{t}=\pi$ for all $t \geqslant 0$, and explain the importance of such $\pi$.
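*Aside (not part of the question):* the equivalence $\pi G=0 \iff \pi P_{t}=\pi$ can be checked numerically for any finite-state generator. The sketch below is a minimal illustration, assuming NumPy and SciPy are available; the three-state generator is an arbitrary example chosen here, not taken from the question. It finds $\pi$ as the normalised left null vector of $G$ and verifies invariance under $P_{t}=\exp(tG)$.

```python
# Illustration only: a distribution pi with pi G = 0 is invariant for every P_t.
import numpy as np
from scipy.linalg import expm

# Arbitrary example generator on three states (each row sums to zero).
G = np.array([[-2.0,  1.0,  1.0],
              [ 0.5, -1.0,  0.5],
              [ 1.0,  2.0, -3.0]])

# Solve pi G = 0 subject to pi summing to 1 (left null vector of G).
A = np.vstack([G.T, np.ones(len(G))])
b = np.concatenate([np.zeros(len(G)), [1.0]])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

for t in (0.1, 1.0, 10.0):
    Pt = expm(t * G)                 # transition semigroup P_t = exp(tG)
    assert np.allclose(pi @ Pt, pi)  # pi P_t = pi for all t >= 0
print("stationary distribution pi =", pi)
```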

(ii) Let $X$ be a continuous-time Markov chain on the state space $S=\{1,2\}$ with generator

$$G=\left(\begin{array}{cc} -\beta & \beta \\ \gamma & -\gamma \end{array}\right), \quad \text { where } \beta, \gamma>0 .$$

Show that the transition semigroup $P_{t}=\exp (t G)$ is given by

$$(\beta+\gamma) P_{t}=\left(\begin{array}{cc} \gamma+\beta h(t) & \beta(1-h(t)) \\ \gamma(1-h(t)) & \beta+\gamma h(t) \end{array}\right),$$

where $h(t)=e^{-t(\beta+\gamma)}$.
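A quick numerical sanity check of this closed form (illustration only, assuming SciPy's `expm`; the values of $\beta$, $\gamma$ and $t$ below are arbitrary samples):

```python
# Compare expm(tG) with the closed form (beta+gamma) P_t stated above.
import numpy as np
from scipy.linalg import expm

beta, gamma, t = 1.3, 0.7, 0.5             # arbitrary sample parameters
G = np.array([[-beta, beta], [gamma, -gamma]])
h = np.exp(-t * (beta + gamma))            # h(t) = e^{-t(beta+gamma)}
Pt_closed = np.array([[gamma + beta * h, beta * (1 - h)],
                      [gamma * (1 - h), beta + gamma * h]]) / (beta + gamma)
assert np.allclose(expm(t * G), Pt_closed)
```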

For $0<\alpha<1$, let

$$H(\alpha)=\left(\begin{array}{cc} \alpha & 1-\alpha \\ 1-\alpha & \alpha \end{array}\right)$$

For a continuous-time chain $X$, let $M$ be a matrix with $(i, j)$ entry $P(X(1)=j \mid X(0)=i)$, for $i, j \in S$. Show that there is a chain $X$ with $M=H(\alpha)$ if and only if $\alpha>\frac{1}{2}$.
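*Aside (not part of the question):* the existence half of this claim can be illustrated numerically. When $\alpha>\frac{1}{2}$, the symmetric choice $\beta=\gamma=-\frac{1}{2}\log(2\alpha-1)$ is positive precisely because $\alpha>\frac{1}{2}$, and the closed form above then gives $h(1)=2\alpha-1$ and $P_{1}=H(\alpha)$. The sketch below assumes SciPy; $\alpha=0.8$ is an arbitrary sample value.

```python
# For alpha > 1/2, beta = gamma = -log(2*alpha - 1)/2 gives P_1 = H(alpha).
import numpy as np
from scipy.linalg import expm

alpha = 0.8                                   # any alpha in (1/2, 1)
beta = gamma = -0.5 * np.log(2 * alpha - 1)   # > 0 exactly when alpha > 1/2
G = np.array([[-beta, beta], [gamma, -gamma]])
H = np.array([[alpha, 1 - alpha], [1 - alpha, alpha]])
assert np.allclose(expm(G), H)                # M = P_1 = H(alpha)
```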