Conversation

@freddyC (Contributor) commented Jan 30, 2018


Description

  1. I updated the training options to be a property of the neural network. You can now pass any of the training options into the constructor and they will be recognized; you can also set them on the network or pass them into training, and the property will be adjusted so training behaves appropriately.

  2. I prepended underscores to neural network methods that are intended for private use.

  3. I did some general cleanup now that the new properties live on the class and are no longer passed into the methods.

  4. I updated the tests to reflect these changes. I also moved the log tests into their own file.

  5. I included the training options when saving to JSON, so when the network is restored its training options are restored as well. If a log method was chosen, it is restored as simply console.log. The callback method is not restored. This behavior is mentioned in the README.
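The behavior described in items 1 and 5 can be sketched roughly like this. This is a minimal sketch only: the class, defaults, and property names follow the discussion later in this PR (`trainDefaults`, `trainOpts`), but they are hypothetical stand-ins, not the actual brain.js source.

```
// Minimal sketch (not the actual brain.js source): training options live
// on the network instance and can be set via the constructor or train().
class NeuralNetwork {
  static get trainDefaults() {
    // Hypothetical subset of the real defaults.
    return { iterations: 20000, errorThresh: 0.005, learningRate: 0.3, log: false };
  }

  constructor(options = {}) {
    this.trainOpts = Object.assign({}, NeuralNetwork.trainDefaults);
    // Options passed to the constructor are recognized immediately.
    this.updateTrainingOptions(options);
  }

  updateTrainingOptions(opts) {
    // Merge only recognized training options onto the instance.
    Object.keys(NeuralNetwork.trainDefaults).forEach(opt => {
      if (opts.hasOwnProperty(opt)) this.trainOpts[opt] = opts[opt];
    });
  }

  train(data, options = {}) {
    // Options passed to train() adjust the same property before training.
    this.updateTrainingOptions(options);
    return this.trainOpts;
  }
}

const net = new NeuralNetwork({ iterations: 500 });
net.train([], { errorThresh: 0.01 });
console.log(net.trainOpts); // iterations stays 500, errorThresh becomes 0.01
```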

Motivation and Context

Helps with #111, but I don't think it finishes it.
Finishes #120 for the neural-network class.

The main focus is maintainable code. This shouldn't change functionality other than extending it where logical. It should make the library simpler to implement as well as more robust and more maintainable in the future.

How Has This Been Tested?

I added tests to mirror the functional changes, including a test for passing options into the constructor and tests for JSON serialization of the training options.


Types of changes

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • Maintenance and simplification

Author's Checklist:

  • My code focuses on the main motivation and avoids scope creep.
  • My code passes current tests and adds new tests where possible.
  • My code is SOLID and DRY.
  • I have updated the documentation as needed.

Reviewer's Checklist:

  • I kept my comments to the author positive, specific, and productive.
  • I tested the code and didn't find any new problems.
  • I think the motivation is good for the project.
  • I think the code works to satisfy the motivation.

Made setActivation() accept a param to be more friendly for other users

Created log test class

Moved the options tests that were for training options into the training-options tests

Made async training-options tests

Updated some README content
@robertleeplummerjr (Contributor) left a comment:
can you remove run?

Fantastic work man!

test/README.md Outdated

```
npm test
npm run test
```
Contributor:

I don't think you need run with newer npm.

@robertleeplummerjr (Contributor) left a comment:

:shipit:

@freddyC freddyC merged commit 7cb31e0 into develop Jan 30, 2018
@freddyC freddyC deleted the options-update branch January 30, 2018 17:48
@freddyC freddyC mentioned this pull request Jan 30, 2018
35 tasks
* momentum: (number),
* activation: ['sigmoid', 'relu', 'leaky-relu', 'tanh']
*/
updateTrainingOptions(opts) {
Contributor:

This way of doing it seems a little cumbersome, especially if a new option needs to be added later. Exceptions to the standard (items contained in this.trainOpts) are a little obfuscated as well. Maybe you could change this to something more like:

    Object.keys(NeuralNetwork.trainDefaults).forEach(opt => this.trainOpts[opt] = opts[opt] || this.trainOpts[opt]);
    this._setLogMethod(opts.log || this.trainOpts.log);
    this.activation = opts.activation || this.activation;

@freddyC (Author):

This makes sense and is cleaner. I am a little less familiar with the new JS magic. The one thing I like about the longer, explicit if ... do this approach is that when functionality changes arise you don't have to re-parse the expression to add behavior, but that is a weak argument. Mostly, the constructs I am less familiar with are harder to read, and I try to make things easy to read; that doesn't mean it's the best, lol.

Contributor:

The problem with the big if (...) {} if (...) {} was that if you want to add a new option, you have to remember to add it to this list. All of the branches were doing the same thing, except for two exceptions (should those have even been in trainOpts?), so why not handle the same things the same way?

* Gets JSON of trainOpts object
* NOTE: Activation is stored directly on JSON object and not in the training options
*/
_getTrainOptsJSON() {
Contributor:

Same comment as the one for updateTrainingOptions, we could change this to something like

return Object.keys(NeuralNetwork.trainDefaults)
      .reduce((opts, opt) => {
        if (this.trainOpts[opt]) opts[opt] = this.trainOpts[opt];
        return opts;
      }, {});

@freddyC (Author):

The only place it gets weird is with the callbacks. I have it set up right now so that a callback doesn't get stored and log just stores true or false. When restored, the log method will be set to console.log. As long as we handle those cases, this approach seems much better.

Contributor:

this is only for log?

@freddyC (Author):

IIRC the callback isn't stored, but if log is a truthy value we store it as true; then, when restoring, we set it to the method console.log.
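The save/restore behavior described in this exchange could be sketched like this. These are hypothetical helper functions for illustration, not the PR's actual code:

```
// Hypothetical sketch: the callback is dropped on save, log is stored as a
// boolean, and a truthy log is restored as console.log.
function getTrainOptsJSON(trainOpts) {
  const json = Object.assign({}, trainOpts);
  delete json.callback;              // functions can't be serialized
  json.log = Boolean(trainOpts.log); // keep only the truthiness
  return json;
}

function restoreTrainOpts(json) {
  const opts = Object.assign({}, json);
  if (opts.log === true) opts.log = console.log; // restore to console.log
  return opts;
}

const saved = getTrainOptsJSON({ iterations: 100, log: msg => {}, callback: () => {} });
const restored = restoreTrainOpts(saved);
// saved.log is true, saved has no callback, and restored.log is console.log
```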

if (!hiddenSizes) {
sizes.push(Math.max(3, Math.floor(inputSize / 2)));
_verifyIsInitialized(data) {
if (this.sizes) return;
Contributor:

should this verify that the net was initialized correctly? Or just that at some point the sizes got set?

@freddyC (Author):

My idea was just to verify the sizes were set, but it would probably be a good place to ensure the network was initialized and to give the user feedback when they try to run without training.

let hiddenSizes = this.hiddenSizes;
if (!hiddenSizes) {
sizes.push(Math.max(3, Math.floor(inputSize / 2)));
_verifyIsInitialized(data) {
Contributor:

This name should probably change, as it doesn't just verify that initialization has occurred: it will actually do something if needed. Not a big deal, since it's intended as an internal-only function, but it might still be better to refactor it a little toward something more descriptive of its actual function.

let sizes = this._getSizesFromData(data);
this.initialize(sizes);
}
this._verifyIsInitialized(data);
Contributor:

This is why I don't like this name...but I commented at the function definition.

@freddyC (Author):

yeah good point


while ( status.iterations < options.iterations && status.error > options.errorThresh) {
this._checkTrainingTick(data, status, options);
while (status.iterations < this.trainOpts.iterations && status.error > this.trainOpts.errorThresh && Date.now() < endTime) {
Contributor:

This PR is supposed to add in timeouts?

@freddyC (Author):

Yeah, I completely forgot to mention that in the PR description, sorry. I probably should have broken it out into its own PR.


if (status.error < options.errorThresh) {
this._trainingTick(data, status);
if (status.error < this.trainOpts.errorThresh || Date.now() > endTime) {
Contributor:

adding in timeouts?

Contributor:

Also, why is this after the tick now?

train(data, options = {}) {
this.updateTrainingOptions(options);
data = this._formatData(data);
const endTime = Date.now() + this.trainOpts.timeout;
Contributor:

adding timeout?

@freddyC (Author):

Yeah, so users can train for a set time interval if they want.
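As a usage sketch, the timeout interacts with the loop conditions shown in the diff above roughly like this. The loop body below is a hypothetical stand-in for the real training tick, not brain.js code:

```
// Sketch of a timeout-bounded training loop, mirroring the endTime check in
// this diff. The error update is a hypothetical stand-in for a real tick.
function train(data, trainOpts) {
  const endTime = Date.now() + trainOpts.timeout;
  const status = { iterations: 0, error: 1 };
  while (
    status.iterations < trainOpts.iterations &&
    status.error > trainOpts.errorThresh &&
    Date.now() < endTime
  ) {
    status.iterations++;
    status.error *= 0.9; // stand-in for one real training tick
  }
  return status;
}

// Stops after 10 iterations here, well before the 5-second timeout.
const status = train([], { iterations: 10, errorThresh: 0.005, timeout: 5000 });
```

Whichever condition fails first (iteration cap, error threshold, or wall-clock timeout) ends training.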

@perkyguy (Contributor) commented:

This was kind of a large PR, with a little bit of scope creep.

@freddyC (Author) commented Jan 30, 2018

Yeah, sorry. I'll try to keep things more concise.

