GeneratorOp logic fix #1702
Conversation
    try:
        if self.default is not None:
            output_storage[0][0] = next(self.generator, self.default)
        else:
Maybe for backward compatibility you should keep the try..except and, if there is no default, behave as before.
I tested it on Python 2.7 and Python 3; it seems to be compatible with both.
def perform(self, node, inputs, output_storage, params=None):
    try:
        if self.default is not None:
            output_storage[0][0] = next(self.generator, self.default)
I can't comment on that next(p1, p2) syntax; I don't know it, and a very quick Google search only mentions next(p1).
If what it does is return the default when the generator is exhausted, that isn't clear in the docs. When this happens, should it print that the generator finished?
For a NN, I don't know what a good default would be. The first minibatch? That would bias the training. What about raising StopIteration and having Theano catch it (do some cleanup when that happens, and re-raise)?
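For reference, the two-argument form next(iterator, default) is standard Python: it returns the default instead of raising StopIteration once the iterator is exhausted. A quick self-contained illustration:

```python
# next(iterator, default) returns `default` once the iterator is exhausted,
# instead of raising StopIteration. This is the behaviour the diff relies on.
gen = (x ** 2 for x in range(3))

print(next(gen, -1))  # 0
print(next(gen, -1))  # 1
print(next(gen, -1))  # 4
print(next(gen, -1))  # -1  -- exhausted: the default is returned, no exception
```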
I tested raising StopIteration. It is clear that this op causes it, and setting a new generator works without recompiling the function. Maybe the docstring should be improved.
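The pattern being discussed — let exhaustion surface as StopIteration, catch it, and install a fresh generator without rebuilding the consumer — can be sketched with a toy class (GenOpSketch, set_gen, and perform here are illustrative names, not the actual PyMC3/Theano API):

```python
# Minimal sketch of the proposed behaviour: with no default configured,
# the op lets StopIteration propagate so exhaustion is detectable, and a
# new generator can be swapped in without any recompilation step.
class GenOpSketch:
    def __init__(self, generator):
        self.generator = generator

    def set_gen(self, generator):
        # Swapping the generator does not require rebuilding the consumer.
        self.generator = generator

    def perform(self):
        # May raise StopIteration when the generator is exhausted.
        return next(self.generator)


op = GenOpSketch(iter([1, 2]))
print(op.perform())  # 1
print(op.perform())  # 2
try:
    op.perform()
except StopIteration:
    op.set_gen(iter([3, 4]))  # detected exhaustion: reset and keep going
print(op.perform())  # 3
```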
pymc3/theanof.py (outdated)
def __call__(self, value):
    self.op.set_default(value)


@change_flags(compute_test_value='off')
def __call__(self, *args, **kwargs):
I don't like this placement of the code; it is too ugly, I think.
(cherry picked from commit c5ec328)
Rebased; if the checks pass I'll merge into master.
I was playing with minibatch OPVI and MNIST and was "lucky" enough to break the generator, so it began to yield NaNs. As a result I lost all the weights, and there was no chance to detect this beforehand. That is the reason for changing the logic of GeneratorOp when the generator is exhausted: now it raises StopIteration and can thus be detected.
CC @nouiz: recently I wrote to the theano-users group about GeneratorOp and you replied; I have finally made a working example. Could you please review my implementation? I'm not sure my approach is good enough, as I use some hacks to make it work.
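The changed exhaustion logic from the diff above can be sketched as a plain function (perform_sketch is an illustrative stand-in; the real op is a Theano Op whose perform writes into output_storage):

```python
# Sketch of the new GeneratorOp exhaustion logic: with a default set,
# exhaustion silently yields the default; without one, StopIteration
# propagates so the caller can detect the exhausted generator.
def perform_sketch(generator, output_storage, default=None):
    if default is not None:
        output_storage[0][0] = next(generator, default)
    else:
        output_storage[0][0] = next(generator)  # may raise StopIteration


storage = [[None]]
g = iter([10])
perform_sketch(g, storage)
print(storage[0][0])  # 10
try:
    perform_sketch(g, storage)  # exhausted, no default
except StopIteration:
    print("generator exhausted")  # now detectable instead of yielding NaNs
```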