
Hidden Markov Models #44


HMMs can be viewed as dynamic Bayesian networks (DBNs).

  • We learned that DBNs can be used to model a given set of conditional independencies.

Discrete Markov Process

Transition prob of states

  • Consider a system that, at any time, can be described as being in one of a set of N distinct states S1, ..., SN.

We use a Markov chain to describe the transitions between states.

(Screenshot: probability of the current state conditioned on the full state history)

Here S denotes a state and t - 1 a time index; each state variable can take N values, so without any restriction on the dependence structure, this conditional probability is hard to specify.
(Screenshot: first-order Markov assumption)

Given the state at time t - 1, we no longer care about the state at t - 2 (or earlier), because this is a first-order Markov chain.

(Screenshot: transition probability a_ij)

a_ij is the probability of the transition from state i to state j.
(Screenshot: state-transition probability matrix)
If I know today's weather, I can guess tomorrow's weather based on what we have learned from past weather.
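
A minimal reconstruction of the standard notation behind the screenshots above, assuming the usual first-order Markov chain definitions (the exact symbols in the original images may differ):

```latex
% In general, the current state may depend on the entire history of states:
P(q_t = S_j \mid q_{t-1} = S_i, q_{t-2} = S_k, \ldots)

% First-order Markov assumption: only the previous state matters
P(q_t = S_j \mid q_{t-1} = S_i, q_{t-2} = S_k, \ldots) = P(q_t = S_j \mid q_{t-1} = S_i)

% Time-independent (homogeneous) state-transition probabilities
a_{ij} = P(q_t = S_j \mid q_{t-1} = S_i), \quad 1 \le i, j \le N,
\qquad a_{ij} \ge 0, \quad \sum_{j=1}^{N} a_{ij} = 1
```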

An example
(Screenshot: weather example with states S1 = rain, S2 = cloudy, S3 = sunny and their transition matrix)

For example, if today is rainy, the probability that tomorrow will be cloudy is 0.3, according to the transition matrix.

Given that the weather on day 1 is sunny, what is the probability that the following 7 days are sun-sun-rain-rain-sun-cloudy-sun?
Converted to a more formal form, the observation sequence is
O = {S3, S3, S3, S1, S1, S3, S2, S3}
=> find P(O | Model)
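
A small sketch of how P(O | Model) would be computed for this observable Markov model. The transition values below are an assumption taken from the classic Rabiner weather example (they are consistent with the rain-to-cloudy probability of 0.3 quoted above, but should be checked against the matrix in the screenshot):

```python
# Probability of an observation sequence under a fully observable Markov model.
# States: S1 = rain, S2 = cloudy, S3 = sunny (0-indexed below).
# Matrix values are assumed (classic Rabiner weather example), not read from the screenshot.
A = [
    [0.4, 0.3, 0.3],  # from rain:   P(rain), P(cloudy), P(sunny)
    [0.2, 0.6, 0.2],  # from cloudy
    [0.1, 0.1, 0.8],  # from sunny
]

def sequence_probability(states, A):
    """P(O | Model) when the first state is given (day 1 is known to be sunny)."""
    prob = 1.0
    for prev, curr in zip(states, states[1:]):
        prob *= A[prev][curr]
    return prob

# O = {S3, S3, S3, S1, S1, S3, S2, S3}  ->  0-indexed: [2, 2, 2, 0, 0, 2, 1, 2]
O = [2, 2, 2, 0, 0, 2, 1, 2]
print(sequence_probability(O, A))  # ~1.536e-4 with the assumed matrix
```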

Extension to an HMM

  • The previous example is an "observable" Markov model: the output of the process is the sequence of states itself, and each state corresponds to a directly observable event.

Now:
We also want to know the probability of having the same weather for the next few days.
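
If the question here is the classic one of staying in the same state (the same weather) for a run of consecutive days, a minimal sketch under that reading:

```latex
% Probability of staying in state S_i for exactly d consecutive days
p_i(d) = (a_{ii})^{d-1}\,(1 - a_{ii})

% Expected number of consecutive days spent in state S_i
\bar{d}_i = \sum_{d=1}^{\infty} d \, p_i(d) = \frac{1}{1 - a_{ii}}
```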

(Screenshot: extending the observable Markov model to a hidden Markov model)

Specifying an HMM

(Screenshot: the elements that specify an HMM)
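
A minimal sketch of the pieces that specify an HMM, λ = (A, B, π), with N hidden states and M observation symbols. The concrete numbers below are illustrative assumptions, not the values in the screenshot:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class HMM:
    """λ = (A, B, π): the parameters that specify a discrete HMM."""
    A: List[List[float]]   # N x N: A[i][j] = P(next hidden state j | current hidden state i)
    B: List[List[float]]   # N x M: B[i][k] = P(observation symbol k | hidden state i)
    pi: List[float]        # length N: pi[i] = P(initial hidden state i)

# Illustrative 2-state, 3-symbol HMM (values are assumptions)
hmm = HMM(
    A=[[0.7, 0.3],
       [0.4, 0.6]],
    B=[[0.5, 0.4, 0.1],
       [0.1, 0.3, 0.6]],
    pi=[0.6, 0.4],
)
```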

Given an HMM

(Screenshot: the given HMM)

  1. What is the most likely state sequence?
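
For question 1, the most likely hidden state sequence is normally found with the Viterbi algorithm; a minimal, self-contained sketch (the model values are illustrative assumptions, not the HMM in the screenshot):

```python
def viterbi(A, B, pi, observations):
    """Most likely hidden state sequence for a list of 0-indexed observation symbols."""
    N, T = len(pi), len(observations)

    # delta[t][i]: probability of the best path ending in state i at time t
    delta = [[0.0] * N for _ in range(T)]
    # psi[t][i]: best predecessor of state i at time t (used for backtracking)
    psi = [[0] * N for _ in range(T)]

    for i in range(N):
        delta[0][i] = pi[i] * B[i][observations[0]]

    for t in range(1, T):
        for j in range(N):
            best = max(range(N), key=lambda i: delta[t - 1][i] * A[i][j])
            psi[t][j] = best
            delta[t][j] = delta[t - 1][best] * A[best][j] * B[j][observations[t]]

    # Backtrack from the most probable final state
    path = [max(range(N), key=lambda i: delta[T - 1][i])]
    for t in range(T - 1, 0, -1):
        path.append(psi[t][path[-1]])
    return list(reversed(path))

# Illustrative 2-state, 3-symbol model (assumed values)
A = [[0.7, 0.3], [0.4, 0.6]]
B = [[0.5, 0.4, 0.1], [0.1, 0.3, 0.6]]
pi = [0.6, 0.4]
print(viterbi(A, B, pi, [0, 2, 1]))  # most likely hidden states for observations [0, 2, 1]
```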
