Replace zerograd(s) with cleargrad(s) #1528

Merged
merged 7 commits into master from cleargrad on Aug 29, 2016

Conversation

@gwtnb
Member

gwtnb commented Aug 19, 2016

Variable.zerograd allocates memory for the gradient array and fills it with zeros, but these operations are not necessary. This PR replaces zerograd(s) with cleargrad(s), which simply release the gradient arrays.
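
A minimal sketch of the behavioral difference, assuming cleargrad simply releases the gradient array as described above (the array shape here is arbitrary):

    import numpy as np
    import chainer

    x = chainer.Variable(np.ones((2, 3), dtype=np.float32))

    # zerograd: allocates a gradient array of the same shape and fills it with zeros.
    x.zerograd()
    assert (x.grad == 0).all()

    # cleargrad: releases the gradient array (x.grad becomes None);
    # nothing is allocated or filled.
    x.cleargrad()
    assert x.grad is None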

TODOs:

  • write tests
  • write docs (I will send another PR for docs)

@gwtnb gwtnb added the enhancement label Aug 19, 2016

@gwtnb gwtnb changed the title from [WIP] Replace zerograd(s) with cleargrad(s) to Replace zerograd(s) with cleargrad(s) Aug 19, 2016

@gwtnb gwtnb added the feature label Aug 23, 2016

@@ -384,8 +384,12 @@ def update(self, lossfun=None, *args, **kwds):
         """
         if lossfun is not None:
+            use_cleargrads = kwds.pop('use_cleargrads', False)

@beam2d

beam2d Aug 29, 2016

Member

I think this causes no problem in practice, but strictly speaking it breaks compatibility. What do you think about adding a flag to the optimizer object itself instead?

@gwtnb

gwtnb Aug 29, 2016

Member

This PR only breaks backward compatibility if a caller already passes a keyword argument named use_cleargrads. I think it is too pessimistic to consider such a case.

@beam2d

beam2d Aug 29, 2016

Member

OK, I agree.
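
As a sketch of how the use_cleargrads keyword discussed above might be consumed inside Optimizer.update: this is simplified and partly guessed from the diff, not the actual chainer/optimizer.py code, and it assumes the optimizer holds its target link as self.target.

    def update(self, lossfun=None, *args, **kwds):
        if lossfun is not None:
            # Pop the flag so it is not forwarded to lossfun; defaulting to
            # False keeps the existing zerograds behaviour for current callers.
            use_cleargrads = kwds.pop('use_cleargrads', False)
            loss = lossfun(*args, **kwds)
            if use_cleargrads:
                self.target.cleargrads()  # release gradient arrays (grad becomes None)
            else:
                self.target.zerograds()   # allocate and zero-fill gradient arrays
            loss.backward()
            # ... the optimizer's actual parameter update would follow here ...

Callers would then opt in per call, for example optimizer.update(lossfun, x, t, use_cleargrads=True).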

chainer/link.py
+ """Clears all gradient arrays.
+
+ This method should be called before the backward computation at every
+ iteration of the optimization.

@beam2d

beam2d Aug 29, 2016

Member

The note is almost the same as that of zerograds, so users may be confused. Could you also update the docstring of zerograds? I think new users should use cleargrads in most cases.
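
For reference, a minimal sketch of what Link.cleargrads could look like given the docstring shown above; this is a guess at the implementation rather than code taken from this PR, assuming Link.params() iterates over the link's parameter Variables:

    def cleargrads(self):
        """Clears all gradient arrays.

        This method should be called before the backward computation at every
        iteration of the optimization.
        """
        for param in self.params():
            # Release each gradient array instead of allocating and zero-filling it.
            param.cleargrad()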

@beam2d

Member

beam2d commented Aug 29, 2016

LGTM except for the comments above

@beam2d

Member

beam2d commented Aug 29, 2016

LGTM

@beam2d beam2d merged commit f05c796 into master Aug 29, 2016

2 of 4 checks passed

continuous-integration/appveyor/branch: Waiting for AppVeyor build to complete
continuous-integration/appveyor/pr: Waiting for AppVeyor build to complete
continuous-integration/travis-ci/pr: The Travis CI build passed
continuous-integration/travis-ci/push: The Travis CI build passed

@beam2d beam2d added this to the v1.15.0 milestone Aug 29, 2016

@gwtnb gwtnb deleted the cleargrad branch Aug 29, 2016
