sampling from distributions over lazy lists? #32
Indeed, it is very tempting to sample infinite data structures, and there is no fundamental reason why you shouldn't. There are, however, certain technical challenges involved that made me decide not to support it in monad-bayes at this point. The main problem with laziness is inference. Since we allow conditioning in arbitrary places in the program, and those conditions can have a global effect on the distribution defined by the program, we can't really do inference lazily: we need to make sure we find all the conditions first. Since the main focus of monad-bayes is inference, I decided not to bother with lazy sampling at all. Having said that, if you put restrictions on where conditions are allowed, you may be able to do inference on lazy data structures. This is what the original code did; you can still find it on
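The tension between laziness and global conditioning can be made concrete with a toy weighted sampler (an illustrative sketch only, not the monad-bayes API): the sample itself can be produced lazily, but the total weight multiplies in a factor from every condition, so for an infinite program it can never be fully evaluated.

```haskell
-- Hypothetical sketch, NOT the monad-bayes API: a lazy Writer-based
-- "weighted sampler" where each score multiplies a weight into the trace.
import Control.Monad.Trans.Writer (Writer, runWriter, tell)
import Data.Monoid (Product (..))

type Weighted = Writer (Product Double)

-- Record a conditioning weight.
score :: Double -> Weighted ()
score w = tell (Product w)

-- An infinite program with a condition at every step.
chain :: Int -> Weighted [Int]
chain n = do
  score 0.5            -- a condition buried inside the lazy structure
  rest <- chain (n + 1)
  return (n : rest)

-- The sampled value is available lazily: this evaluates to 0,
-- because the lazy Writer never forces the accumulated weight.
firstState :: Int
firstState = head (fst (runWriter (chain 0)))

-- But the weight -- the product of ALL the conditions -- diverges:
--   snd (runWriter (chain 0))   -- never terminates
```

This is exactly the problem described above: inference needs the weight, the weight needs every condition, and an infinite lazy program has infinitely many of them.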
Thanks! This is a very helpful explanation.
Hi @adscib. I've been working on something that seems to intersect with your work here, and I'm curious whether you have any thoughts on it. I've been working on a paradigm similar to monad-bayes, and I ran into some of the problems you mention above. For example, if you make the monad interpreter lazy, it may never evaluate the observation statements, since their results are never used, among other problems. To allow the interpreter to be lazy, I split the random monad into two parts:
I investigated using observations inside the random sampling part to generate conditioned samples, but I could not see how to make this work on contingent variables for MCMC. Perhaps I'm missing something...?
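One way to make the sampling half genuinely lazy, in the spirit of the split described above (the names here are illustrative, not this project's or LazyPPL's actual API), is to give every subcomputation its own generator via `split`, rather than threading a single generator sequentially through the program:

```haskell
-- Hypothetical sketch of a purely lazy sampling monad. Each bind splits
-- the generator, so subcomputations draw from independent generators and
-- can be forced independently of one another.
import System.Random (StdGen, mkStdGen, randomR, split)

newtype Lazy a = Lazy { runLazy :: StdGen -> a }

instance Functor Lazy where
  fmap f (Lazy m) = Lazy (f . m)

instance Applicative Lazy where
  pure x = Lazy (const x)
  Lazy f <*> Lazy x = Lazy (\g -> let (g1, g2) = split g in f g1 (x g2))

instance Monad Lazy where
  Lazy m >>= k = Lazy (\g -> let (g1, g2) = split g
                             in runLazy (k (m g1)) g2)

uniform :: Lazy Double
uniform = Lazy (fst . randomR (0, 1))

-- An infinite list of uniforms; `take` forces only a finite prefix,
-- because no element's generator depends on evaluating the others.
uniforms :: Lazy [Double]
uniforms = do
  x  <- uniform
  xs <- uniforms
  return (x : xs)
```

With this, `take 3 (runLazy uniforms (mkStdGen 42))` terminates and yields three values in [0, 1]. The design choice is the usual trade-off: splitting buys laziness, but it makes the correspondence between program position and randomness non-sequential, which is part of what makes observation/MCMC over contingent variables tricky.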
LazyPPL has a lazy sampling monad. As discussed above, this interacts with the monad-bayes transformer stack only to a limited degree, but it will nevertheless be made available with examples and documentation in an upcoming release of monad-bayes.
For educational purposes, I modified your HMM example to work as an even simpler Markov chain. I decided to really live the Haskell lifestyle and try to define a distribution over lazy infinite lists.
Ideally, I'd be able to write, say,

sampleIO $ fmap head (mmlazy start)

and get a sample of just the first state; likewise use take, etc. In practice, although this typechecks, the computation just hangs. Perhaps there are good reasons why I shouldn't even be trying to do this -- in which case, is there any piece of the code, paper, etc. that I might look at to better understand why it won't work?
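The hang has a simple shape that can be sketched independently of monad-bayes's internals (this is an illustrative stand-in, not the actual SamplerIO): when sampling runs in IO, constructing the list is itself an effect, and IO sequencing must finish every effect before returning any part of the result, so an infinite recursion never yields even its head.

```haskell
-- Illustrative sketch of why `fmap head` can hang on an infinite chain:
-- each transition is an IO effect, and IO's bind runs the whole (infinite)
-- recursion before the cons cell is ever constructed.
import System.Random (randomRIO)

chainIO :: Int -> IO [Int]
chainIO s = do
  step <- randomRIO (-1, 1 :: Int)  -- each element performs IO
  rest <- chainIO (s + step)        -- recurses forever...
  return (s : rest)                 -- ...so this line is never reached

-- fmap head (chainIO 0) :: IO Int   -- typechecks, but diverges
```

The usual escape hatches are `unsafeInterleaveIO`, or (as LazyPPL does) deriving fresh seeds lazily with a splitting generator so each element can be forced on demand; but neither addresses the inference problem described earlier, since conditions can still hide arbitrarily deep in the unevaluated tail.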