kesmarag/gaussian-hmm-tf

GaussianHMM

GaussianHMM(self, num_states, data_dim)

A Hidden Markov Model class with Gaussian emission distributions.

fit

GaussianHMM.fit(self, data, max_steps=100, batch_size=None, TOL=0.01, min_var=0.1, num_runs=1)

Implements the Baum-Welch algorithm.

Args:
  data: A numpy array of rank two or three.
  max_steps: The maximum number of training steps.
  batch_size: None, or the batch size.
  TOL: The tolerance for stopping the training process.
  min_var: A lower bound on the emission variances.
  num_runs: The number of independent training runs.

Returns:
  True if converged, False otherwise.
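The expectation-maximization update behind Baum-Welch can be illustrated with a standalone NumPy sketch for a 1-D Gaussian HMM on a single sequence. This is not the package's TensorFlow implementation; the function and variable names below are introduced here purely for illustration:

```python
import numpy as np

def baum_welch_step(pi, A, means, vars_, x):
    """One EM (Baum-Welch) iteration for a 1-D Gaussian HMM on a single
    sequence x, using per-step scaling to avoid underflow."""
    T, K = len(x), len(pi)
    # Gaussian emission likelihoods B[t, k]
    B = np.exp(-0.5 * (x[:, None] - means) ** 2 / vars_) / np.sqrt(2 * np.pi * vars_)
    # scaled forward pass
    alpha = np.empty((T, K)); c = np.empty(T)
    alpha[0] = pi * B[0]; c[0] = alpha[0].sum(); alpha[0] /= c[0]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[t]
        c[t] = alpha[t].sum(); alpha[t] /= c[t]
    # scaled backward pass
    beta = np.empty((T, K)); beta[-1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = (A @ (B[t + 1] * beta[t + 1])) / c[t + 1]
    gamma = alpha * beta  # posterior state probabilities
    # pairwise posteriors xi[t, i, j]
    xi = alpha[:-1, :, None] * A * (B[1:] * beta[1:])[:, None, :] / c[1:, None, None]
    # M-step updates
    pi_new = gamma[0]
    A_new = xi.sum(0) / gamma[:-1].sum(0)[:, None]
    means_new = (gamma * x[:, None]).sum(0) / gamma.sum(0)
    vars_new = (gamma * (x[:, None] - means_new) ** 2).sum(0) / gamma.sum(0)
    # last value: log-likelihood under the *input* parameters
    return pi_new, A_new, means_new, vars_new, np.log(c).sum()
```

Iterating this step until the log-likelihood improvement falls below a tolerance is the stopping criterion that TOL plays the role of in fit.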

posterior

GaussianHMM.posterior(self, data)

Runs the forward-backward algorithm to calculate the log-scale posterior probabilities.

Args:
  data: A numpy array with rank two or three.

Returns:
  A numpy array containing the log-scale posterior probabilities of
  each time series in data.
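The forward pass that underlies this computation can be sketched in log space with plain NumPy. This is an illustrative standalone function, not the package's internals, and the exact output convention of posterior may differ; forward_log_likelihood and its arguments are names introduced here:

```python
import numpy as np

def forward_log_likelihood(log_pi, log_A, log_B):
    """Log-likelihood of one observation sequence via the forward
    algorithm in log space. log_B[t, k] is the log emission density
    of time step t under state k."""
    alpha = log_pi + log_B[0]
    for t in range(1, len(log_B)):
        # log-sum-exp over previous states, done stably
        m = alpha.max()
        alpha = m + np.log(np.exp(alpha - m) @ np.exp(log_A)) + log_B[t]
    m = alpha.max()
    return m + np.log(np.exp(alpha - m).sum())
```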

run_viterbi

GaussianHMM.run_viterbi(self, data)

Implements the Viterbi algorithm. (Note: this implementation may not work correctly yet.)

Args:
  data: A numpy array with rank two or three.

Returns:
  The most probable state path.
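For reference, the dynamic-programming recursion behind the Viterbi algorithm can be sketched in plain NumPy. This is an illustrative standalone implementation, not this package's code; viterbi, log_pi, log_A and log_B are names introduced here:

```python
import numpy as np

def viterbi(log_pi, log_A, log_B):
    """Most probable state path given log initial probabilities, a log
    transition matrix, and per-step log emission likelihoods log_B[t, k]."""
    T, K = log_B.shape
    delta = log_pi + log_B[0]          # best log-score ending in each state
    psi = np.zeros((T, K), dtype=int)  # backpointers
    for t in range(1, T):
        scores = delta[:, None] + log_A          # scores[i, j]: path i -> j
        psi[t] = np.argmax(scores, axis=0)
        delta = scores[psi[t], np.arange(K)] + log_B[t]
    # backtrack from the best final state
    path = np.empty(T, dtype=int)
    path[-1] = np.argmax(delta)
    for t in range(T - 2, -1, -1):
        path[t] = psi[t + 1, path[t + 1]]
    return path
```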

generate

GaussianHMM.generate(self, num_samples)

Generates simulated data from the model.

Args:
  num_samples: The number of samples of the generated data.

Returns:
  The generated data.
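Sampling from a Gaussian HMM takes two steps: draw a Markov chain of hidden states, then draw a Gaussian observation conditioned on each state. An illustrative standalone sketch (assuming per-dimension independent emission noise; sample_ghmm and its parameters are names introduced here):

```python
import numpy as np

def sample_ghmm(pi, A, means, stds, num_samples, rng=None):
    """Draw one observation sequence of length num_samples from a
    Gaussian HMM with per-dimension emission standard deviations."""
    rng = np.random.default_rng(rng)
    K = len(pi)
    # step 1: sample the hidden Markov chain
    states = np.empty(num_samples, dtype=int)
    states[0] = rng.choice(K, p=pi)
    for t in range(1, num_samples):
        states[t] = rng.choice(K, p=A[states[t - 1]])
    # step 2: sample Gaussian observations given the states
    obs = means[states] + stds[states] * rng.standard_normal((num_samples, means.shape[1]))
    return states, obs
```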

installation using pip:

pip install git+https://github.com/kesmarag/gaussian-hmm-tf.git

usage

import numpy as np
from kesmarag.ghmm import GaussianHMM

# create a random data set: 3 time series of 100 steps with 2-dimensional observations.
data = np.random.randn(3, 100, 2)

# create a model with 10 hidden states.
model = GaussianHMM(10, 2)

# fit the model
model.fit(data)

# print the trained model
print(model)

# Good luck