
Random initialization for NUTS #1744

Closed
wants to merge 43 commits into from

Conversation

fonnesbeck
Member

Added a "random" initialization option for NUTS that simply draws from the prior. Seems to work pretty well in some cases. Might be a good idea for initializing other samplers as well.
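For readers skimming the thread, here is a rough sketch of the idea behind `init='random'`. This is not the PR's actual code: the prior specs below are hypothetical stand-ins (the real implementation would pull them from the PyMC3 model), but it shows what "draw the start point from the prior" means.

```python
import numpy as np
from scipy import stats

# Hypothetical priors; in the PR these come from the model's free variables.
priors = {
    "mu": stats.norm(loc=0.0, scale=10.0),
    "sigma": stats.halfnorm(scale=5.0),
}

def random_start(priors, seed=None):
    """Return one joint draw from the priors, usable as a sampler start dict."""
    rng = np.random.default_rng(seed)
    return {name: dist.rvs(random_state=rng) for name, dist in priors.items()}

start = random_start(priors, seed=42)
```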

@@ -1,5 +1,5 @@
import pymc3 as pm
from numpy import ones, array
from pymc3 import *
Member

I think we prohibit this now.

Member Author

Sorry, was testing and forgot to take it out.

@fonnesbeck
Member Author

Generalized to use with any step method

@@ -389,7 +389,7 @@ def _random(self, lower, upper, size=None):
# as array-like.
samples = stats.uniform.rvs(lower, upper - lower - np.finfo(float).eps,
size=size)
- return np.floor(samples).astype('int32')
+ return np.floor(samples).astype('int64')
Member

Do we need int64 here? This causes problems with float32 interaction.

Member Author

I doubt it. I think I was trying to diagnose a bug here. I will remove it.

Member Author

dtype='int64' is the default for the Discrete class. Should that be changed?
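For context on the float32 concern above, a small NumPy sketch (my illustration, not code from the PR): adding an `int64` array to a `float32` array promotes the result to `float64`, silently defeating a float32 pipeline, while a narrower integer type stays in `float32`.

```python
import numpy as np

# int64 + float32 promotes to float64 under NumPy's rules; the analogous
# upcast in Theano is what breaks floatX=float32 setups.
idx64 = np.arange(3, dtype=np.int64)
vals = np.ones(3, dtype=np.float32)
print((idx64 + vals).dtype)   # float64

# int16 fits safely in float32, so no upcast happens:
idx16 = idx64.astype(np.int16)
print((idx16 + vals).dtype)   # float32
```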

@fonnesbeck
Member Author

fonnesbeck commented Feb 9, 2017

I will never understand git rebase. I fixed the conflict below 3 TIMES on my local machine. Time to turn on rerere, I guess.
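For anyone following along, rerere ("reuse recorded resolution") is a one-line config switch: once enabled, git records how you resolve each conflict and replays that resolution automatically when the identical conflict reappears during a rebase.

```shell
# Enable rerere globally: git records each conflict resolution and
# reuses it the next time the same conflict shows up.
git config --global rerere.enabled true
```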

@twiecki
Member

twiecki commented Feb 9, 2017

Sometimes it's better to make a new branch from master and cherry-pick the commits you want (you can squash them first to make that faster).
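The workflow described above can be sketched end to end in a throwaway repo (all branch and commit names here are made up for the demo, not from this PR):

```shell
#!/bin/sh
# Demo of "branch fresh from master, cherry-pick the commits you want".
set -e
tmp=$(mktemp -d); cd "$tmp"
git init -q repo; cd repo
git config user.email you@example.com
git config user.name you
echo base > file.txt; git add file.txt; git commit -qm "base"
git branch -M master                 # force a known branch name

git checkout -qb messy-feature       # branch with a tangled history
echo wanted >> file.txt; git commit -qam "the commit we actually want"
wanted=$(git rev-parse HEAD)

# Start over cleanly from master and pick only that commit:
git checkout -q master
git checkout -qb clean-feature
git cherry-pick "$wanted" >/dev/null
git log --oneline
```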

@fonnesbeck
Member Author

Yes, I need to do that more. I am lazy.

@@ -1,4 +1,4 @@
- from collections import defaultdict
+ rom collections import defaultdict
Member

Missing an f

Member Author

Yeah, fixed. Thanks.

fonnesbeck and others added 5 commits February 9, 2017 08:19
* ENH Add Stein Variational Gradient Descent.

* Add updates.py, using Adagrad optimizer for SVGD as default

* WIP Replace updates.py with updates.py from Lasagne.

* MAINT Fix * imports and rename T to tt.

* Include license and reference.
@fonnesbeck
Member Author

Same rebase issue here. Closing.

@fonnesbeck fonnesbeck closed this Feb 15, 2017
@aseyboldt aseyboldt deleted the rand_init branch June 13, 2018 11:03
5 participants