Description
HMMs can be viewed as dynamic Bayesian networks (DBNs).
- We learned that DBNs can be used to model a given set of conditional independencies.
Discrete Markov Process
Transition probabilities of states
- Consider a system that may be described at any time as being in one of a set of N distinct states, S1, ..., SN.
- Use the Markov property to describe the transitions: in general, the state at time t could depend on all earlier states, and each of those takes one of N values, so without limiting the history the conditional probability is hard to specify.
- Given the state at time t - 1, we no longer care about t - 2, because this is a first-order Markov chain.
- a_ij denotes the probability of the transition from state i to state j; these values are collected in the transition matrix A.
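Written out explicitly (a sketch using Rabiner-style notation, where q_t denotes the state at time t; this notation is assumed, not given in the notes):

```latex
% First-order Markov assumption: the next state depends only on the current one.
P(q_t = S_j \mid q_{t-1} = S_i,\; q_{t-2} = S_k, \dots) = P(q_t = S_j \mid q_{t-1} = S_i)

% State-transition probabilities, collected in the matrix A = [a_{ij}]:
a_{ij} = P(q_t = S_j \mid q_{t-1} = S_i), \qquad
a_{ij} \ge 0, \qquad \sum_{j=1}^{N} a_{ij} = 1 .
```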
If I know today's weather, I can guess tomorrow's weather based on an understanding of past weather.
For example, if today is rainy, the probability that tomorrow will be cloudy is 0.3, based on the matrix.
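As a concrete sketch, the classic three-state weather example from Rabiner's tutorial uses the matrix below (the exact values are an assumption here, but they are consistent with the rainy-to-cloudy entry of 0.3 mentioned above):

```python
import numpy as np

# Rows = today's weather, columns = tomorrow's weather.
# State order assumed from the example: S1 = rain, S2 = cloudy, S3 = sunny.
A = np.array([
    [0.4, 0.3, 0.3],   # rain   -> rain / cloudy / sunny
    [0.2, 0.6, 0.2],   # cloudy -> rain / cloudy / sunny
    [0.1, 0.1, 0.8],   # sunny  -> rain / cloudy / sunny
])

assert np.allclose(A.sum(axis=1), 1.0)   # each row is a probability distribution
print(A[0, 1])                           # P(cloudy tomorrow | rainy today) = 0.3
```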
Given that the weather on day 1 is sunny, what is the probability that the following 7 days are sun-sun-rain-rain-sun-cloudy-sun?
Converted to a more formal format (with S1 = rain, S2 = cloudy, S3 = sunny), the observation sequence is
O = {S3, S3, S3, S1, S1, S3, S2, S3}
=> Find P(O | Model)
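A minimal sketch of this computation, reusing the assumed matrix A from above: since the weather on day 1 is given, its probability is 1, and P(O | Model) is just the product of the transition probabilities along the sequence.

```python
# Observation sequence O = {S3, S3, S3, S1, S1, S3, S2, S3}, as 0-based state indices.
O = [2, 2, 2, 0, 0, 2, 1, 2]

p = 1.0                          # P(day 1 = sunny) = 1, since day 1 is given
for prev, cur in zip(O, O[1:]):  # multiply transition probabilities along the chain
    p *= A[prev, cur]

print(p)   # 0.8*0.8*0.1*0.4*0.3*0.1*0.2 = 1.536e-04 with the assumed matrix A
```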
Extension to an HMM
- The previous example is an "observable" Markov model: the output of the system/process is the state itself, which is the event of interest.
Now:
We may also want to know the probability of observing the same weather for the next few days.
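If the question alluded to here is the probability of seeing the same weather over the next d days, then under the first-order Markov assumption it follows directly from the self-transition probability a_ii (this reading of the question is an assumption):

```latex
% Probability of staying in state S_i for at least d more days:
P(\text{same weather for the next } d \text{ days}) = (a_{ii})^{d}

% Probability that the stay in S_i lasts exactly d days in total:
p_i(d) = (a_{ii})^{d-1}\,(1 - a_{ii})
```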
Specifying an HMM
Given an HMM:
- What is the most likely state sequence?
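The standard answer to this decoding question is the Viterbi algorithm. Below is a minimal sketch, assuming the HMM is specified in the usual way by a transition matrix A, an emission matrix B, and an initial state distribution pi (the function and variable names are illustrative, not from the notes):

```python
import numpy as np

def viterbi(A, B, pi, obs):
    """Most likely hidden-state sequence for an observation sequence.

    A:   (N, N) transition matrix, A[i, j] = P(q_t = j | q_{t-1} = i)
    B:   (N, M) emission matrix,   B[i, k] = P(o_t = k | q_t = i)
    pi:  (N,)   initial state distribution
    obs: list of observation indices o_1 .. o_T
    """
    N, T = A.shape[0], len(obs)
    delta = np.zeros((T, N))            # best path probability ending in each state
    psi = np.zeros((T, N), dtype=int)   # back-pointers to the best previous state

    delta[0] = pi * B[:, obs[0]]
    for t in range(1, T):
        scores = delta[t - 1][:, None] * A        # (N, N): from-state x to-state
        psi[t] = scores.argmax(axis=0)            # best predecessor for each state
        delta[t] = scores.max(axis=0) * B[:, obs[t]]

    # Backtrack from the best final state.
    path = [int(delta[T - 1].argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(psi[t][path[-1]]))
    return list(reversed(path)), float(delta[T - 1].max())
```

For the observable weather model above, B would be the identity matrix (each state emits itself), in which case the most likely state sequence is simply the observation sequence.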