[WIP] Added continuous time Markov chain #17163

Open · wants to merge 8 commits into base: master
Conversation (4 participants)
@czgdp1807 (Member) commented Jul 7, 2019

References to other Issues or PRs

[1] http://u.math.biu.ac.il/~amirgi/CTMCnotes.pdf

Brief description of what is fixed or changed

Continuous time Markov chains have been added. The API is kept similar to DiscreteMarkovChain.

Other comments

ping @Upabjojr @sidhantnagpal

Release Notes

  • stats
    • ContinuousMarkovChain has been added to sympy.stats
@sympy-bot commented Jul 7, 2019
Hi, I am the SymPy bot (v147). I'm here to help you write a release notes entry. Please read the guide on how to write release notes.

Your release notes are in good order.

Here is what the release notes will look like:

  • stats

This will be added to https://github.com/sympy/sympy/wiki/Release-Notes-for-1.5.

Note: This comment will be updated with the latest check if you edit the pull request. You need to reload the page to see it.


@jksuom (Member) commented Jul 8, 2019

I think that this is what is usually known as a "Markov process". According to Wikipedia:

> A Markov chain is a stochastic model describing a sequence ...

@czgdp1807 (Member, Author) commented Jul 8, 2019

I cannot figure out what you want to say.
Some more references for ContinuousTimeMarkovChain:

  1. http://u.math.biu.ac.il/~amirgi/CTMCnotes.pdf

  2. https://www.utdallas.edu/~jjue/cs6352/markov/node5.html

  3. https://en.wikipedia.org/wiki/Markov_chain#Continuous-time_Markov_chain

As for why I haven't implemented the probability method yet: I am reading some notes and book chapters to find the specific results that can help with the computation.
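For context, the central computation for a continuous-time Markov chain relates the generator matrix Q to the transition matrix via the matrix exponential, P(t) = exp(Q*t). A minimal SymPy sketch of that relation, using a made-up two-state generator matrix for illustration:

```python
from sympy import Matrix, Symbol, simplify

t = Symbol('t', positive=True)
# A made-up generator matrix Q for a two-state chain: off-diagonal
# entries are transition rates, and each row sums to zero.
Q = Matrix([[-1, 1], [2, -2]])
# Transition probabilities over time t: P(t) = exp(Q*t)
P = (Q * t).exp()
# Each row of P(t) is a probability distribution, so it sums to 1.
```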

@jksuom (Member) commented Jul 8, 2019

> I cannot figure out what you want to say.

I only wanted to know if you might be planning to define a general Markov process (i.e. MarkovProcess instead of ContinuousMarkovChain).

@czgdp1807 (Member, Author) commented Jul 8, 2019

I think unifying the API would be very hard because the discrete and continuous counterparts of a Markov chain need different logic to handle queries. AFAIK, a discrete Markov chain is usually analysed with a transition matrix while a continuous one uses a generator matrix, and the two approaches diverge there. So we could probably add an interface function markov_process that constructs the continuous or discrete variant according to the arguments it receives. Or maybe an abstract class MarkovProcess?
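The proposed markov_process interface function could look roughly like the sketch below. The class bodies here are stubs standing in for the real sympy.stats classes, and the parameter names (trans_probs, gen_mat) are made up for illustration:

```python
# Stub classes mirroring the PR's class names; the real classes
# live in sympy.stats and take symbolic matrices.
class DiscreteMarkovChain:
    def __init__(self, sym, state_space, trans_probs):
        self.sym, self.state_space, self.trans_probs = sym, state_space, trans_probs

class ContinuousMarkovChain:
    def __init__(self, sym, state_space, gen_mat):
        self.sym, self.state_space, self.gen_mat = sym, state_space, gen_mat

def markov_process(sym, state_space, trans_probs=None, gen_mat=None):
    """Hypothetical dispatching constructor: build the discrete or the
    continuous chain depending on which matrix argument is supplied."""
    if (trans_probs is None) == (gen_mat is None):
        raise ValueError("supply exactly one of trans_probs or gen_mat")
    if gen_mat is not None:
        return ContinuousMarkovChain(sym, state_space, gen_mat)
    return DiscreteMarkovChain(sym, state_space, trans_probs)
```

A single entry point like this keeps the user-facing API unified even though the two implementations diverge internally.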

@codecov commented Jul 8, 2019

Codecov Report

Merging #17163 into master will increase coverage by 0.001%.
The diff coverage is 68.085%.

```diff
@@             Coverage Diff              @@
##            master   #17163       +/-   ##
============================================
+ Coverage   74.518%   74.52%   +0.001%     
============================================
  Files          623      623               
  Lines       161539   161584       +45     
  Branches     37910    37919        +9     
============================================
+ Hits        120377   120413       +36     
+ Misses       35833    35832        -1     
- Partials      5329     5339       +10
```
@czgdp1807 (Member, Author) commented Jul 12, 2019

ping @Upabjojr @sidhantnagpal
I have added an algorithm to handle probability queries for ContinuousMarkovChain. I did this because the mechanism in DiscreteMarkovChain.probability is hard to maintain, less generic, and difficult to extend. I have given an overview of my approach below, along with what I plan to do in the future.

Overview of the mechanism in ContinuousMarkovChain.probability
The algorithm recurses on the type of condition. The cases are listed below.

  1. condition is of type Relational: This is the base case of the algorithm. The condition is converted to a set using Intersection(condition.as_set(), state_space). Since, AFAIK, only one variable is allowed in a Relational, the conversion to a set is always possible. Then the given_condition (which is pre-processed using the _pre_process method and is always either a Relational or of type And) is analysed and condensed. The result is then computed from the above.

  2. condition is of type Not: The probability of the expression inside Not is calculated and the result is subtracted from S(1).

  3. condition is of type And: The probabilities of all the args are calculated recursively and the results are multiplied. This part needs improvements for corner cases: for example, Eq(C(0), 1) & Eq(C(1), 1) will not work; the answer should be zero but the result will not be zero. I will think it over and push the changes here.

  4. condition is of type Or: The probabilities of all the args are calculated recursively and the results are added.

I have restricted condition to combinations of And, Or, and Not because any condition can be represented using these three operators, and they are the most intuitive.
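The set conversion in the base case (case 1) can be illustrated with plain SymPy; the state space here is made up for illustration:

```python
from sympy import Symbol, Intersection, FiniteSet

x = Symbol('x')
state_space = FiniteSet(0, 1, 2)   # made-up discrete state space
cond = (x <= 1)                    # a single-variable Relational
# Convert the relational to a set and keep only reachable states.
states = Intersection(cond.as_set(), state_space)
# states is FiniteSet(0, 1): the states satisfying the condition
```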

I plan to replace the algorithm in DiscreteMarkovChain.probability with the one above, if there are no critical objections to it. However, the implementations for the discrete and continuous cases cannot be unified, for the following reason:

> AFAIK, a discrete Markov chain is usually analysed with a transition matrix while a continuous one uses a generator matrix, and the two approaches diverge there.

I thought a lot and searched for a generic algorithm, but was unable to find one. A generic algorithm could be implemented if as_set worked for multivariate conditions.
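The recursive dispatch described above can be sketched as follows. The node classes are stand-ins for SymPy's Relational/Not/And/Or, and the base-case probability is assumed to be already computed (in the real code it comes from the generator matrix); note that the Or case assumes disjoint events and the And case assumes independence, matching the simplifications described in the overview:

```python
from dataclasses import dataclass

@dataclass
class Rel:        # base case: an atomic relational, e.g. Eq(C(t), k)
    p: float      # its probability, assumed precomputed here

@dataclass
class Not:
    arg: object

@dataclass
class And:
    args: tuple

@dataclass
class Or:
    args: tuple

def probability(cond):
    """Recurse on the type of the condition, as in the overview."""
    if isinstance(cond, Rel):
        return cond.p                                   # base case
    if isinstance(cond, Not):
        return 1 - probability(cond.arg)                # complement rule
    if isinstance(cond, And):
        result = 1.0
        for a in cond.args:
            result *= probability(a)                    # independence assumed
        return result
    if isinstance(cond, Or):
        return sum(probability(a) for a in cond.args)   # disjoint events assumed
    raise NotImplementedError(type(cond))
```

The And branch is exactly where the Eq(C(0), 1) & Eq(C(1), 1) corner case mentioned above breaks down, since multiplying marginals ignores the dependence between the two time points.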


```python
class ContinuousMarkovChain(ContinuousTimeStochasticProcess, StochasticProcessUtil):
    """
    Represents a continuous-time Markov chain.
```
@oscarbenjamin (Contributor) commented on the diff, Jul 13, 2019

This class represents a continuous-time, discrete-state Markov process. To me it seems unusual to call this a "chain" when it is in continuous time. It also does not represent every continuous-time Markov "chain", since it needs to have a discrete state space (unlike, e.g., a Wiener process).

I dislike the terminology but more importantly whatever terminology you use you should provide a mathematical definition so that it is unambiguous to users.

@czgdp1807 (Member, Author) replied Jul 13, 2019

Sure, I will add the definitions along with the references.

@oscarbenjamin (Contributor) commented Jul 13, 2019

I wonder if it would be better to simultaneously implement a fully continuous Markov process (like a Wiener process) so that you can see how best to factor the code between them. As @jksuom said (somewhere), there is commonality in the implementation of these classes if all queries are reduced to conditional probabilities.

@czgdp1807 (Member, Author) commented Jul 13, 2019

Yes, we can factor out the common logic. I am thinking of doing that in this PR as well.
