
GeneratorOp logic fix #1702

Merged · 11 commits merged into pymc-devs:master on Feb 21, 2017

Conversation

ferrine (Member) commented Jan 23, 2017

I was playing with minibatch OPVI and MNIST and was "lucky" enough to break the generator so that it started yielding NaNs, as I expected. As a result I lost all the weights, and there was no way to detect this beforehand. That is the reason for changing the logic of GeneratorOp when the generator is exhausted: it now raises StopIteration, so exhaustion can be detected.
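
Roughly, the intended behaviour is the following (a simplified sketch, not the exact diff; the attribute names self.generator and self.default follow the excerpts below):

    # Sketch: GeneratorOp.perform with detectable exhaustion.
    def perform(self, node, inputs, output_storage, params=None):
        if self.default is not None:
            # A configured default is substituted once the generator runs out.
            output_storage[0][0] = next(self.generator, self.default)
        else:
            # No default: StopIteration from next() propagates out of the
            # compiled function, so the caller can detect exhaustion instead
            # of silently training on bad values.
            output_storage[0][0] = next(self.generator)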

CC @nouiz: I recently wrote to the theano-users group about GeneratorOp and you replied; I have finally put together a working example. Could you please review my implementation? I'm not sure my approach is good enough, as I use some hacks to make it work.

try:
    if self.default is not None:
        output_storage[0][0] = next(self.generator, self.default)
    else:

Contributor:

Maybe for backward compatibility you should keep the try..except and, if there is no default, do as before.

Member Author:

I tested it on Python 2.7 and Python 3; it seems to keep that compatibility.


def perform(self, node, inputs, output_storage, params=None):
    try:
        if self.default is not None:
            output_storage[0][0] = next(self.generator, self.default)

Contributor:

I can't comment on that next(p1, p2) syntax; I don't know it, and a quick Google search only turns up next(p1).

If what it does is return the default when the generator is finished, that isn't clear in the docs. When this happens, should it print that the generator has finished?

For a NN, I don't know what would be a good default. The first minibatch? That would bias the training. What about raising StopIteration and making Theano catch it (do some cleanup when that happens, and re-raise it)?
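
For reference, the two-argument form is standard Python: next(iterator, default) returns the default instead of raising StopIteration once the iterator is exhausted. A quick illustration:

    it = iter([1, 2])
    next(it)        # 1
    next(it)        # 2
    next(it, -1)    # -1: the iterator is exhausted, so the default is returned
    next(it)        # raises StopIteration, since no default is given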

Member Author:

I tested raising StopIteration. It is clear that this op causes it, and setting a new generator works without recompiling the function. Maybe the docstring should be improved.
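
Roughly, the usage pattern this enables looks like the sketch below (the names data, train_fn and make_minibatches are illustrative, and set_gen is assumed as the hook for attaching a fresh generator without recompiling):

    try:
        loss = train_fn()      # compiled Theano function reading from the generator
    except StopIteration:
        # The generator is exhausted: attach a fresh one and call the
        # same compiled function again, without recompiling it.
        data.set_gen(make_minibatches())
        loss = train_fn()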

twiecki (Member) commented Jan 24, 2017

Thanks for checking this out, @nouiz. @ferrine I'm not familiar enough with the peculiarities so feel free to merge if you're confident.

pymc3/theanof.py Outdated

def __call__(self, value):
    self.op.set_default(value)

@change_flags(compute_test_value='off')
def __call__(self, *args, **kwargs):

Member Author:

I don't like this part of the code; I think it is too ugly.
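
For context on the decorator in the diff above: Theano's change_flags (from theano.configparser) temporarily overrides configuration flags and can be used as a decorator or a context manager. A small illustration, not the PR code:

    import theano.tensor as tt
    from theano.configparser import change_flags

    # As a context manager: test-value computation is skipped inside the block.
    with change_flags(compute_test_value='off'):
        x = tt.vector('x')
        y = x ** 2

    # As a decorator, the form used in the diff above.
    @change_flags(compute_test_value='off')
    def build_square(v):
        return v ** 2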

ferrine (Member Author) commented Feb 21, 2017

Rebased; if the checks pass I'll merge into master.

@ferrine ferrine merged commit 0e41768 into pymc-devs:master Feb 21, 2017
@ferrine ferrine deleted the gen_logic_fix branch February 21, 2017 20:36