Added mcmc_traceplots_unigauss.ipynb
Fig 11.14 to 11.17 | Book2
#908
Conversation
LGTM.
Yes sir, that's a good idea. I'm thinking of creating an issue in
@@ -0,0 +1,965 @@
Line #6. `return states, {"states": states, "info": info}`

It's not clear what `info` is. Is this something your implementation adds, or something provided out of the box by BlackJax?
Yes, `info` is returned by BlackJax; I've added a comment to clarify it.
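For context, a minimal sketch of the kind of inference loop under discussion (not the notebook's exact code): a BlackJax step function returns a `(state, info)` pair, and `jax.lax.scan` stacks both pytrees across draws, which is why `info` comes back alongside the states. A stand-in random-walk kernel mimics BlackJax's `(key, state) -> (new_state, info)` signature here so the sketch runs without BlackJax installed.

```python
import jax
import jax.numpy as jnp

def inference_loop(rng_key, kernel, initial_state, num_samples):
    # lax.scan stacks each step's (state, info) pytree along a leading
    # draw axis, so states and info come back bundled together.
    def one_step(state, key):
        state, info = kernel(key, state)
        return state, {"states": state, "info": info}

    keys = jax.random.split(rng_key, num_samples)
    _, out = jax.lax.scan(one_step, initial_state, keys)
    return out["states"], out["info"]

# Stand-in kernel with BlackJax's (key, state) -> (new_state, info) shape;
# real BlackJax kernels return an info object with fields such as
# divergence flags instead of this hypothetical dict.
def random_walk_kernel(key, state):
    proposal = state + jax.random.normal(key)
    return proposal, {"is_divergent": jnp.abs(proposal) > 10.0}

states, info = inference_loop(
    jax.random.PRNGKey(0), random_walk_kernel, jnp.array(0.0), 100
)
```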
It may be good to add 1-2 lines of comments on why the axis swap is needed. Something like: the format expected by arviz is ... and our current format is ...
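To illustrate the suggestion, a hedged sketch of such a comment (the shapes are assumptions about the notebook's scan output, not taken from it): `arviz.from_dict` expects posterior arrays shaped `(chain, draw, *shape)`, while scanning over draws with chains carried inside yields `(draw, chain, *shape)`, hence the swap of the first two axes.

```python
import numpy as np

# Assumed shapes: scanning over draws with chains inside produces
# (num_draws, num_chains); arviz.from_dict expects (num_chains, num_draws).
num_draws, num_chains = 1000, 4
samples = np.random.default_rng(0).normal(size=(num_draws, num_chains))

samples_az = np.swapaxes(samples, 0, 1)  # now (num_chains, num_draws)
# idata = az.from_dict(posterior={"mu": samples_az})  # arviz conversion step
```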
Line #15. `samples[param] = states.position[param]`

For dims > 1 you considered burn-in, but not here.
Oh, thanks for pointing that out; I've fixed it.
We initialised `prior_alpha` in the initial cells, but now we're using `log_prob_alpha` etc. Should we reuse the variables?
Yes, done.
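A sketch of the kind of reuse being suggested (variable names here are illustrative, following the `prior_alpha` naming in the thread, and the normal prior/likelihood are assumptions): define the prior distribution object once and call its `logpdf` wherever a log-prior is needed, instead of keeping a parallel `log_prob_alpha` helper.

```python
from scipy import stats

prior_alpha = stats.norm(loc=0.0, scale=1.0)  # defined once in the initial cells

def log_joint(alpha, data):
    # Reuse prior_alpha directly rather than a separate log_prob_alpha helper.
    log_prior = prior_alpha.logpdf(alpha)
    log_lik = stats.norm(loc=alpha, scale=1.0).logpdf(data).sum()
    return log_prior + log_lik
```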
Line #1. `print(f"Number of divergences (bad prior) = {info.is_divergent[500:,:].sum()}")`

What is 500 here? Burn-in? If so, should we create a variable for it?
I have created `burn_in` as a global variable now.
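As a sketch of the resulting pattern (the value 500 comes from the thread; the array shapes and stand-in arrays are illustrative): with a single `burn_in` variable, both the divergence count and the per-parameter sample slicing drop the same warm-up draws.

```python
import numpy as np

burn_in = 500  # single global setting, replacing the hard-coded 500

# Illustrative stand-ins for the notebook's scan output.
num_draws, num_chains = 2000, 4
rng = np.random.default_rng(0)
positions = rng.normal(size=(num_draws, num_chains))        # stand-in for states.position[param]
is_divergent = rng.random((num_draws, num_chains)) < 0.01   # stand-in for info.is_divergent

samples = positions[burn_in:]  # burn-in applied to every parameter, any dim
n_divergences = is_divergent[burn_in:, :].sum()
print(f"Number of divergences = {n_divergences}")
```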
I agree too! It would be useful for the larger community. I have left some small comments on the notebook.
Description
Converted the numpyro implementation (`notebooks/book2/11/mcmc_traceplots_unigauss_numpyro.ipynb`) to blackjax. Please take note of the following points.
Figure Numbers: Fig 11.14, Fig 11.15, Fig 11.16, Fig 11.17
Figures
Before PR / After PR comparison images for Fig 11.14, Fig 11.15, Fig 11.16, and Fig 11.17 (images not reproduced here).
Issue
#890
Checklist
cc: Dr. @murphyk, Prof. @nipunbatra