MarkovModel


Description

MarkovModel is a Swift framework that uses a Markov model to process and calculate states in a known system. It can be used to achieve several goals in machine learning, games, and more. To learn more about Markov models, visit https://wikipedia.org/wiki/Markov_model

Features

  • Automatically creates a Markov chain from a given sequence of transitions;
  • Allows manual matrix manipulation for mutating members;
  • Pretty-printed matrix for debugging;
  • Markov decision process, including weighted random selection;
  • Next-state prediction;
  • Full API documentation.
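To illustrate the first feature, here is a minimal, self-contained sketch (not the library's actual implementation) of how a Markov chain can be derived from a sequence of transitions: count each pair of consecutive states, then normalize each state's outgoing counts into probabilities.

```swift
let transitions = ["A", "B", "C", "A", "C"]

// Count occurrences of each (from -> to) pair of consecutive states.
var counts: [String: [String: Double]] = [:]
for (from, to) in zip(transitions, transitions.dropFirst()) {
    counts[from, default: [:]][to, default: 0] += 1
}

// Normalize so the outgoing probabilities of each state sum to 1.
var chain: [String: [String: Double]] = [:]
for (from, tos) in counts {
    let total = tos.values.reduce(0, +)
    chain[from] = tos.mapValues { $0 / total }
}

// "A" was followed once by "B" and once by "C": 50% each.
print(chain["A"] ?? [:])
```

The framework encapsulates this bookkeeping for you; the sketch only shows the idea behind training from a transition sequence.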

Installation

Using CocoaPods

Add to your Podfile:

pod 'MarkovModel'

Using Carthage

Add to your Cartfile or Cartfile.private:

github "db-in/MarkovModel"

Using Swift Package Manager

Add to your Package.swift:

let package = Package(
    name: "myproject",
    dependencies: [
        .package(url: "https://github.com/db-in/MarkovModel.git"),
    ],
    targets: [
        .target(
            name: "myproject",
            dependencies: ["MarkovModel"]),
    ]
)

Requirements

Version | Language  | Xcode | iOS
1.0.0   | Swift 4.1 | 9.0   | 10.0

Programming Guide

The Markov Model can be used to achieve many goals. This section will explain the usage while providing some possible scenarios.

  • Training the Model
  • Decision Process
  • Debugging

Training the Model

Start by importing the package in the file where you want to use it. There are two ways of working with the model: by instantiating it, or by training it statically.

import MarkovModel
//...
let markovModel = MarkovModel(transitions: ["A", "B", "C", "A", "C"])

For very large amounts of data (transitions), you may prefer the static approach, since it trains the model and lets you operate on it all at once inside a closure.

import MarkovModel
//...
MarkovModel.process(transitions: ["A", "B", "C", "A", "C"]) { model in
	// perform the operations on model
}

Decision Process

For performance and cleaner API design, all the Markov decision process algorithms operate on the matrix itself. You can calculate a future state by calling next. There are three decision process options: predict, random, and weightedRandom.

markovModel.chain.next(given: "B", process: .random)

You can omit the process parameter; the default option is predict.

markovModel.chain.next(given: "B")
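The three options can be compared side by side. A short sketch, reusing the model from the training section; random is assumed to draw uniformly among the reachable next states, while weightedRandom respects the transition probabilities:

```swift
import MarkovModel

let markovModel = MarkovModel(transitions: ["A", "B", "C", "A", "C"])

// Deterministic: returns the most likely next state.
let predicted = markovModel.chain.next(given: "A", process: .predict)

// Uniform random draw among the states reachable from "A" ("B" or "C" here).
let random = markovModel.chain.next(given: "A", process: .random)

// Random draw weighted by the transition probabilities.
let weighted = markovModel.chain.next(given: "A", process: .weightedRandom)
```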

Sometimes you may want to inspect a single column of the matrix itself. The probabilities method retrieves all the possible transitions from a given state.

markovModel.chain.probabilities(given: "B")
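For example, with the model trained on ["A", "B", "C", "A", "C"], the only transition observed from "B" is to "C", so all the probability mass from "B" goes to "C". A sketch; the exact return type is assumed here to map states to their probabilities:

```swift
import MarkovModel

let markovModel = MarkovModel(transitions: ["A", "B", "C", "A", "C"])

// "B" was followed only by "C" in the training sequence,
// so "C" should carry probability 1.0 from "B".
let fromB = markovModel.chain.probabilities(given: "B")
```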

Debugging

The matrix can be pretty printed for debugging. Printing the model (or its chain) displays the full transition matrix, with each column holding the outgoing probabilities of one state.

let markovModel = MarkovModel(transitions: ["A", "A", "B"])
print(markovModel)
// or
print(markovModel.chain)

/*
It will print
   B     A    

| 0.00  0.50 |  B   
|            |
| 0.00  0.50 |  A   
*/

FAQ

What about the states with zero transitions?

  • For memory safety and performance, they are not considered by the MarkovModel, since they have no effect on the decision process.
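In practice this means asking for the next state from a symbol the model has never seen simply yields no transition. A sketch, assuming next returns an optional:

```swift
import MarkovModel

let markovModel = MarkovModel(transitions: ["A", "B", "C", "A", "C"])

// "Z" never appeared in the training transitions,
// so there is nothing to predict from it.
let unknown = markovModel.chain.next(given: "Z") // expected to be nil
```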