Add RMSProp and Adagrad optimizers. #102

Merged
23 commits merged into tensorflow:master on Sep 16, 2017

Conversation

@mnottheone
Contributor

mnottheone commented Sep 6, 2017

Also resolved a texture leak and made the optimizers inherit from the base class in optimizer.ts.

@mnottheone

Contributor

mnottheone commented Sep 6, 2017

Indentation is still left to fix. Will do soon.

@nsthorat

Collaborator

nsthorat commented Sep 8, 2017

Are you using VS Code? If so, can you merge with master and make sure you have the clang-format extension installed? If you're not using VS Code, make sure you format with clang-format set to the Google style (clang-format.style = "Google"); I won't comment on the individual stylistic changes that this will fix.
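
A minimal setup sketch, assuming the VS Code clang-format extension and a .clang-format file at the repository root (only the clang-format.style = "Google" value above comes from this thread; the rest is an assumed typical configuration):

    // .vscode/settings.json (assuming the clang-format VS Code extension)
    {
      "clang-format.style": "Google",   // format with the Google style
      "editor.formatOnSave": true       // run the formatter on every save
    }

    # .clang-format at the repository root, for the command-line tool
    BasedOnStyle: Google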

Overall looks great after these minor changes!


Reviewed 1 of 7 files at r1.
Review status: 1 of 7 files reviewed at latest revision, 5 unresolved discussions, some commit checks failed.


demos/model-builder/model-builder.ts, line 853 at r1 (raw file):

}

document.registerElement(ModelBuilder.prototype.is, ModelBuilder);

revert this file


src/adagrad_optimizer.ts, line 54 at r1 (raw file):

        const gradientSquare = math.multiply(gradient, gradient);
        const cache = math.add(oldCache, gradientSquare);
        const variable = math.scaledArrayAdd(this.c!,

remove ! everywhere.


src/rmsprop_optimizer.ts, line 20 at r1 (raw file):

import {TensorArrayMap, SummedTensorArrayMap} from './tensor_array_map';

export class RmspropOptimizer extends Optimizer {

call this RMSPropOptimizer


src/rmsprop_optimizer.ts, line 36 at r1 (raw file):

    super.beforeBatch(math, batchSize, runtime,
      activationArrayMap, gradientArrayMap);
      if (this.cache.size() === 0) {

can we rename this from cache? How about accumulatedGradients?


src/rmsprop_optimizer.ts, line 55 at r1 (raw file):

        const oldCache = this.cache.get(node.output);
        const gradientSquare = math.multiply(gradient, gradient);
        const cache = math.scaledArrayAdd(this.g!, oldCache,

remove ! everywhere


Comments from Reviewable

@mnottheone

Contributor

mnottheone commented Sep 9, 2017

src/rmsprop_optimizer.ts, line 36 at r1 (raw file):

Previously, nsthorat (Nikhil Thorat) wrote…

can we rename this from cache? How about accumulatedGradients?

We have already used accumulatedGradients in optimizer.ts's after-example step, so reusing that name could be confusing. Here we are storing the sum of squared gradients, so we could use something like accumulatedSquaredGradients; for RMSProp the previous cache is discounted, so we could use accumulatedDiscountedSquaredGradients, or just accumulatedSquaredGradients for both. What do you think?
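
For reference, a minimal sketch of the two accumulators being discussed, written as plain TypeScript over number[] rather than the library's NDArray ops; the function names and the gamma parameter are illustrative assumptions, not the PR's actual code:

    // Adagrad keeps a running sum of squared gradients:
    //   cache_t = cache_{t-1} + gradient_t^2
    function adagradAccumulate(cache: number[], gradient: number[]): number[] {
      return cache.map((c, i) => c + gradient[i] * gradient[i]);
    }

    // RMSProp keeps an exponentially discounted sum instead:
    //   cache_t = gamma * cache_{t-1} + (1 - gamma) * gradient_t^2
    function rmspropAccumulate(
        cache: number[], gradient: number[], gamma: number): number[] {
      return cache.map(
          (c, i) => gamma * c + (1 - gamma) * gradient[i] * gradient[i]);
    }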


@nsthorat changed the title from "resolved texture leakage and optimizers inherited from optimizer.ts" to "Add RMSProp and Adagrad optimizers." on Sep 10, 2017

@nsthorat

Collaborator

nsthorat commented Sep 10, 2017

Let me know when you've got auto-formatting working and I will merge this!


Reviewed 2 of 7 files at r1, 4 of 4 files at r2.
Review status: all files reviewed at latest revision, 3 unresolved discussions.


src/adagrad_optimizer.ts, line 22 at r2 (raw file):

export class AdagradOptimizer extends Optimizer {
  constructor(protected learningRate: number,
    protected momentum: number, specifiedVariableList?: Node[]) {

With clang-format this should be indented; can you make sure you've got it running? (Same for the other files.)

Sorry about that; everything else looks good once the code gets formatted.


src/rmsprop_optimizer.ts, line 36 at r1 (raw file):

Previously, mnottheone (Aman Kumar Singh) wrote…

We have already used accumulatedGradients in optimizer.ts's after-example step, so reusing that name could be confusing. Here we are storing the sum of squared gradients, so we could use something like accumulatedSquaredGradients; for RMSProp the previous cache is discounted, so we could use accumulatedDiscountedSquaredGradients, or just accumulatedSquaredGradients for both. What do you think?

Ah, got it. accumulatedSquaredGradients sounds good.


@nsthorat

Collaborator

nsthorat commented Sep 11, 2017

Review status: all files reviewed at latest revision, 4 unresolved discussions, some commit checks failed.


src/adagrad_optimizer.ts, line 1 at r2 (raw file):

/* Copyright 2017 Google Inc. All Rights Reserved.

Quick update: this license header format changed slightly; can you update it (after merging)?


@mnottheone

Contributor

mnottheone commented Sep 16, 2017

Please review and let me know if anything is left. I think I've got the clang-format settings right now \m/ :)

@nsthorat

Collaborator

nsthorat commented Sep 16, 2017

Haha, unfortunately this indentation is still off because clang-format isn't quite set up properly (it's not formatting with clang-format, it's formatting with the VS Code settings). Do you have both the command-line clang-format tool and the Visual Studio Code extension installed?

Sorry to be pedantic; it's just important to have consistent-looking code :)


Review status: 0 of 8 files reviewed at latest revision, 4 unresolved discussions.


@nsthorat

Collaborator

nsthorat commented Sep 16, 2017

Review status: 0 of 8 files reviewed at latest revision, 3 unresolved discussions.


src/rmsprop_optimizer.ts, line 36 at r1 (raw file):

Previously, nsthorat (Nikhil Thorat) wrote…

Ah, got it. accumulatedSquaredGradients sounds good.

Still need to change the name from cache :)


@mnottheone

Contributor

mnottheone commented Sep 16, 2017

Hopefully it's okay now.


@nsthorat

Collaborator

nsthorat commented Sep 16, 2017

One more tiny comment then I will merge, thanks for bearing with me :)


Reviewed 2 of 8 files at r3, 6 of 6 files at r4.
Review status: all files reviewed at latest revision, 1 unresolved discussion.


src/index.ts, line 1 at r4 (raw file):

/**

Open this file and hit save to format it, so we don't commit these old formatter changes.


@nsthorat

Collaborator

nsthorat commented Sep 16, 2017

Reviewed 1 of 1 files at r5.
Review status: all files reviewed at latest revision, all discussions resolved.


@nsthorat

Collaborator

nsthorat commented Sep 16, 2017

:lgtm_strong:


Review status: all files reviewed at latest revision, all discussions resolved.


@nsthorat merged commit 53afaa6 into tensorflow:master on Sep 16, 2017

3 checks passed

cla/google: All necessary CLAs are signed
code-review/reviewable: 1/1 LGTMs
continuous-integration/travis-ci/pr: The Travis CI build passed

@mnottheone

Contributor

mnottheone commented Sep 16, 2017

Thanks! :)

Next things to do:

  • Implement Adadelta, Adam, and Adamax.
  • Add documentation in API docs for new optimizers.

You can add or change anything you want.

@nsthorat

Collaborator

nsthorat commented Sep 16, 2017

Awesome! BTW, I'm about to do a big directory structure change (optimizers in a directory, etc.). You can start working on those, but just know merging will be... fun ;)

We'll update the API docs with your new optimizers since they're automatically generated.
