This issue was moved to a discussion.
You can continue the conversation there.
HierarchicalModelLogPDF #1134
Comments
If you can do it, then yes, please! I think @martinjrobins and @chonlei both felt they couldn't generalise their hierarchical schemes enough?
A sum of PDFs with disjoint parameter sets should be easy enough. It would be somewhat analogous to 'pints.SumOfIndependentLogPDFs', but conditioning on disjoint parameter sets and introducing some rule for the mapping, say that the parameters have to be in the order [psi, theta].
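As a rough sketch of that idea (the class and its interface are hypothetical, not existing pints API): each component log-PDF is evaluated on its own contiguous slice of the stacked parameter vector, with the slices laid out in a fixed order such as [psi, theta]:

```python
import numpy as np

class DisjointSumLogPDF:
    """Sum of log-PDFs over disjoint slices of a stacked vector.

    Hypothetical sketch: `pdfs` is a list of callables and `sizes`
    says how many parameters each one consumes, in order, so the
    stacked vector is e.g. [psi, theta].
    """
    def __init__(self, pdfs, sizes):
        self._pdfs = pdfs
        self._slices = []
        start = 0
        for n in sizes:
            self._slices.append(slice(start, start + n))
            start += n
        self._n = start

    def n_parameters(self):
        return self._n

    def __call__(self, x):
        x = np.asarray(x, dtype=float)
        return sum(f(x[s]) for f, s in zip(self._pdfs, self._slices))

def log_standard_normal(x):
    # log of a standard multivariate normal density at x
    return float(-0.5 * np.sum(x ** 2) - 0.5 * len(x) * np.log(2 * np.pi))

# Two params for psi, one for theta, stacked as [psi, theta]
pdf = DisjointSumLogPDF([log_standard_normal, log_standard_normal], [2, 1])
print(pdf.n_parameters())  # 3
```

The mapping rule here is simply "contiguous slices in declaration order"; anything fancier (index lists, shared blocks) would need an explicit mapping argument.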
But a varying psi messes up the intended functionality of the likelihoods a bit in p(psi|theta).
So would it be acceptable to alter the values in the likelihood p(psi|theta), even though usually the "data" is not meant to change? A quick solution would be to access the private variable
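One alternative to writing to a private variable (a sketch only; this class and its setter are hypothetical, not a pints class) is to expose the conditioned-on values explicitly, so that p(psi|theta) can be re-pointed at a new psi without touching internals:

```python
import numpy as np

class ConditionalGaussianLogPDF:
    """log p(psi | theta) with theta = (mu, sigma).

    Hypothetical sketch: the conditioned-on values psi (the 'data'
    of this conditional density) are mutable through an explicit
    setter instead of a private attribute.
    """
    def __init__(self, psi):
        self._psi = np.asarray(psi, dtype=float)

    def set_psi(self, psi):
        # Swap the values this density is conditioned on
        self._psi = np.asarray(psi, dtype=float)

    def __call__(self, theta):
        mu, sigma = theta
        z = (self._psi - mu) / sigma
        n = len(self._psi)
        return float(-0.5 * np.sum(z ** 2)
                     - n * np.log(sigma)
                     - 0.5 * n * np.log(2 * np.pi))

g = ConditionalGaussianLogPDF([0.0])
print(g((0.0, 1.0)))  # -0.5 * log(2*pi), about -0.919
```

Whether this is preferable to poking the private attribute is exactly the design question raised above; the setter at least makes the mutation part of the public contract.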
I'm looking at doing a hierarchical prior function, to at least add this functionality for a small subset of cases. I've written a docstring that describes what I intend it to do: does it sound reasonable for a first hierarchical prior function?
[image] <https://user-images.githubusercontent.com/55392151/83644381-252d4000-a5a9-11ea-90c3-9e9a5283b837.png>
In particular @ben18785 <https://github.com/ben18785>, the hierarchical guru.

Looks really good Simon -- thanks!
@MichaelClerx @ben18785 @martinjrobins @chonlei In addition to the really cool HierarchicalLogPrior that @simonmarchant is implementing, I've been thinking about how one could compose hierarchical posteriors that are not constrained to Normally distributed bottom-level parameters, and, more importantly, where you have the chance to pool some of the bottom-level parameters. In my problems you often want some parameters, like the noise, to be shared between individuals, while others may vary. I couldn't really work out how pymc3 could be integrated with pints to do this. So here is my suggestion (sorry for the unrendered TeX, but maybe just look at the example at the bottom to get a feel for the interface):
I'm grateful for any suggestions.
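One way to make "some parameters shared, others individual-specific" concrete (a sketch under an assumed flat-vector layout; the function and layout are not an existing pints interface) is a mapping that splits a stacked vector into per-individual varying blocks plus a trailing pooled block:

```python
import numpy as np

def split_pooled(x, n_individuals, n_varying):
    """Split a flat parameter vector into per-individual varying
    parameters and trailing pooled (shared) parameters.

    Assumed layout (hypothetical convention):
        x = [varying_1, varying_2, ..., varying_N, pooled]
    """
    x = np.asarray(x, dtype=float)
    per_individual = [
        x[i * n_varying:(i + 1) * n_varying] for i in range(n_individuals)
    ]
    pooled = x[n_individuals * n_varying:]
    return per_individual, pooled

# 3 individuals with 2 varying parameters each, plus one shared
# noise parameter sigma at the end:
per, pooled = split_pooled([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 0.1], 3, 2)
```

A hierarchical log-PDF built on such a mapping would evaluate each individual's likelihood on `per[i]` concatenated with `pooled`, so the pooled parameters enter every individual's term while the varying ones do not.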
I realised that for gradient-based sampling or optimisation, we would need to be able to compute the partials of the top-level likelihoods with respect to both their inputs \theta and \psi. If we used LogPriors for the top-level likelihoods, the priors would need an additional method returning the partials with respect to the hyperparameters.
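To illustrate what that extra method would compute (a sketch; the function name and the idea of returning hyperparameter partials alongside the usual evaluateS1-style output are assumptions, not pints API): for a Normal top-level density log N(psi | mu, sigma^2), the partials with respect to the input psi and the hyperparameters mu and sigma are available in closed form:

```python
import numpy as np

def gaussian_logpdf_S1(psi, mu, sigma):
    """log N(psi | mu, sigma^2) and its partials.

    Hypothetical sketch of the 'additional method' discussed above:
    besides d/dpsi (which evaluateS1-style methods already give),
    also return d/dmu and d/dsigma, the hyperparameter partials.
    """
    z = (psi - mu) / sigma
    logp = -0.5 * z ** 2 - np.log(sigma) - 0.5 * np.log(2 * np.pi)
    d_psi = -z / sigma              # d logp / d psi
    d_mu = z / sigma                # d logp / d mu
    d_sigma = (z ** 2 - 1.0) / sigma  # d logp / d sigma
    return logp, (d_psi, d_mu, d_sigma)
```

Note d_mu = -d_psi here, a symmetry specific to location parameters; for general top-level densities each hyperparameter partial would need its own expression or automatic differentiation.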
Would it be of interest to have a HierarchicalModelLogPDF class in pints?
So something that produces a joint LogPDF from individual LogPDFs with a specific dependence structure:
p(y, psi, theta) = p(y|psi)p(psi|theta)p(theta)
At the moment it is easy enough to implement the individual conditional PDFs with the existing functionality in pints; I just haven't seen how to easily create sums of LogPDFs over different parameter spaces, or with this hidden-state structure.
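A minimal sketch of what such a class might look like (the name and interface are hypothetical, not a proposal pints has adopted): it evaluates on the stacked vector x = [psi, theta] and sums the three conditional log-densities, with the data y fixed inside the first term:

```python
import numpy as np

class HierarchicalLogPDF:
    """log p(y, psi, theta) = log p(y|psi) + log p(psi|theta) + log p(theta).

    Hypothetical sketch: `log_lik` maps psi -> log p(y | psi) with y
    fixed inside it, `log_mid` maps (psi, theta) -> log p(psi | theta),
    and `log_top` maps theta -> log p(theta).
    """
    def __init__(self, log_lik, log_mid, log_top, n_psi):
        self._log_lik = log_lik
        self._log_mid = log_mid
        self._log_top = log_top
        self._n_psi = n_psi

    def __call__(self, x):
        x = np.asarray(x, dtype=float)
        psi, theta = x[:self._n_psi], x[self._n_psi:]
        return (self._log_lik(psi)
                + self._log_mid(psi, theta)
                + self._log_top(theta))

# Toy example with unnormalised Gaussian terms, 2 psi and 1 theta:
pdf = HierarchicalLogPDF(
    log_lik=lambda psi: -0.5 * float(np.sum(psi ** 2)),
    log_mid=lambda psi, theta: -0.5 * float(np.sum((psi - theta) ** 2)),
    log_top=lambda theta: -0.5 * float(np.sum(theta ** 2)),
    n_psi=2,
)
```

The dependence structure p(y|psi)p(psi|theta)p(theta) is encoded purely in which slice each callable sees, which is exactly the "rule for the mapping" question raised in the comments above.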