
Add Time Step Recurrent Neural Network #193

Merged: 8 commits into develop from 192-time-series-baseline on Apr 21, 2018

Conversation

@robertleeplummerjr (Contributor) commented Apr 14, 2018

Add a basic implementation of a time step recurrent neural network so we can establish a baseline for 2.0 and provide some support for the community's requests.

This will answer #192
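
For readers, a minimal usage sketch of the new class, assuming the brain.recurrent namespace and the train/run signatures used by the library's other recurrent classes; the option values shown are illustrative, not the PR's defaults:

// Hedged usage sketch; the require path, option values, and train/run
// signatures are assumptions, not confirmed by this PR.
const brain = require('brain.js');

const net = new brain.recurrent.RNNTimeStep({
  inputSize: 1,
  hiddenSizes: [20],
  outputSize: 1,
});

// Train on a scalar time series, then ask for the next step.
net.train([[0.1, 0.2, 0.3, 0.4, 0.5]]);
const next = net.run([0.1, 0.2, 0.3, 0.4]); // expected to be close to 0.5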

A GIF or MEME to give some spice of the internet
[GIF]

Description

Motivation and Context

issue

How Has This Been Tested?

Screenshots (if appropriate):

Types of changes

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)

Author's Checklist:

  • My code focuses on the main motivation and avoids scope creep.
  • My code passes current tests and adds new tests where possible.
  • My code is SOLID and DRY.
  • I have updated the documentation as needed.

Reviewer's Checklist:

  • I kept my comments to the author positive, specific, and productive.
  • I tested the code and didn't find any new problems.
  • I think the motivation is good for the project.
  • I think the code works to satisfy the motivation.

@robertleeplummerjr (Contributor Author)

This will also provide an answer for #138

@robertleeplummerjr (Contributor Author)

This will also provide an answer for #109

@freddyC (Contributor) left a comment

I had to keep it gloomy because I have been so bad at contributing lately.

[GIF]

I have pointed out some things below; I think you will be fine with most of them.

Let me know what you think.

left: m,
product: product,
forwardFn: copy,
// backpropagationFn: () => {

@freddyC (Contributor)

Do you want to leave in the commented-out code?

@robertleeplummerjr (Contributor Author)

I did not actually mean to add this method; removing it.

let outputSize = this.outputSize;
let lastHiddenSize = this.hiddenSizes[this.hiddenSizes.length - 1];

//whd

@freddyC (Contributor)

What do whd and bd mean? Is this just a reference to other code values?

@robertleeplummerjr (Contributor Author)

I did mean to keep this; it reflects the same/similar function from the RNN class. This code will be deprecated in v2, so changing it would either require changing it in both locations or just leaving it alone. This was originally in recurrent.js, which was used as a foundation to eventually get to v2. I think the less work the better, but for the right reasons, so I think we leave it.

* @param {Number[]} input
* @returns {number}
*/
runInput(input) {

@freddyC (Contributor)

Since this method is not meant to be used by the library user, perhaps an _ before the method name would be beneficial? (It's semicolons, but I think it helps clarity.)

@robertleeplummerjr (Contributor Author)

I'd like to leave it as it is for the time being. This code will be deprecated soon and is just used for baselining the new GPU stuff; changing it means we are going to maintain it, and we've been very vocal that the recurrent classes were beta. Also, the deprecation argument applies here too.

}
const outputs = [];

if (this.inputSize === 1) {

@freddyC (Contributor)

Could we change this from:

if (...) {
  for (<loop A> ) {
    for (<loop B>) {
//      code block A
    }
  }
} else {
  for (<loop A> ) {
    for (<loop B>) {
//      code block B
    }
  }
}

to:

for (<loop A> ) {
  for (<loop B>) {
    if (...) {
//    code block A
    } else {
//    code block B
    }
  }
}

Man, that took way longer than it should have to abstract, lol. Again, it's semicolons, but I think it helps clarity (and it looks like the only line that changes is const error = output.weights[i] - next; vs const error = output.weights[i] - next[i];, unless I am missing something).
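
To make the suggestion concrete, a hedged, self-contained rendering of the merged loops; the names outputs and targets and the loop bounds are stand-ins, and only output.weights, next, errorSum, and inputSize come from the quoted code:

// Hypothetical illustration of the merge; most names here are stand-ins.
function sumErrors(outputs, targets, inputSize) {
  let errorSum = 0;
  for (let t = 0; t < outputs.length; t++) {          // loop A: time steps
    const output = outputs[t];
    const next = targets[t];
    for (let i = 0; i < output.weights.length; i++) { // loop B: output units
      // the single divergent line, expressed as a ternary:
      const error = inputSize === 1
        ? output.weights[i] - next      // scalar series: next is a number
        : output.weights[i] - next[i];  // vector series: next is an array
      errorSum += Math.abs(error);
    }
  }
  return errorSum;
}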

@robertleeplummerjr (Contributor Author)

I'm not going to waste time changing this because it is going to be deprecated.

return errorSum;
}

runBackpropagate() {

@freddyC (Contributor)

Again, I think an _ prefix helps clarity.

@robertleeplummerjr (Contributor Author)

Perhaps, but it's deprecated.

this.bindEquation();
}
let lastOutput;
if (this.inputSize === 1) {

@freddyC (Contributor)

Again, we could combine these loops into one, with the if (this.inputSize === 1) logic just choosing between runInput([input[i]]) and runInput(input[i]).

Again, semicolons.
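
A hedged sketch of that combined loop; net stands in for the RNNTimeStep instance and the loop bound is an assumption, while runInput, inputSize, and lastOutput come from the quoted code:

// Hypothetical stand-alone rendering of the suggested merge.
function runAll(net, input) {
  let lastOutput;
  for (let i = 0; i < input.length; i++) {
    lastOutput = net.inputSize === 1
      ? net.runInput([input[i]]) // wrap the scalar so runInput always gets an array
      : net.runInput(input[i]);
  }
  return lastOutput;
}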

*
* @returns {Function}
*/
toFunction() {

@freddyC (Contributor)

Thank you for including this!! I love that it's stubbed out and throws, so if it is ever called it is obviously clear what went wrong, and when we are ready to implement it we already have it stubbed out and just need to finish it.

@robertleeplummerjr (Contributor Author)

Nice.
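
For readers, what such a stub presumably looks like; only the toFunction name, the @returns {Function} doc comment, and the throwing behavior are described in the thread, so the error message is an assumption:

class RNNTimeStep /* sketch of just this method */ {
  /**
   * @returns {Function}
   */
  toFunction() {
    // Not implemented yet; fail loudly so any misuse is obvious.
    throw new Error('not yet implemented');
  }
}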

}
}

RNNTimeStep.defaults = {

@freddyC (Contributor)

Did we get this represented in the documentation anywhere? (We probably did, but I haven't gotten that far yet.)

@robertleeplummerjr (Contributor Author)

Will document that now.

@robertleeplummerjr (Contributor Author)

done
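
For context, a hedged sketch of what the documented defaults plausibly look like; inputSize, hiddenSizes, and outputSize appear elsewhere in this PR, but every value shown, and the learningRate field, are assumptions borrowed from the RNN class's conventions:

// Hypothetical sketch; the values and the learningRate field are assumptions.
RNNTimeStep.defaults = {
  inputSize: 1,      // scalar time series by default (assumption)
  hiddenSizes: [20], // one hidden layer of 20 units (assumption)
  outputSize: 1,     // scalar prediction (assumption)
  learningRate: 0.01 // assumption, mirroring the RNN class
};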

@@ -110,7 +110,7 @@ describe('lstm', () => {
assert(lastOutput.length > 0);
assert.equal(dataFormatter.toCharacters(net.toFunction()()).join(''), lastOutput);
});
it.only('can include the DataFormatter', () => {

@freddyC (Contributor)

lulz

/* NOTE: TimeStep here is deprecated even though it is being committed as something new. It is the first feature we want using
recurrent.js, because it is simply one of the simplest recurrent neural networks and serves as a baseline for completing
the GPU architecture. This test is written so as to create the baseline we can measure against.
We get this working, we have a baseline, we finish recurrent.js, we change the world.

@freddyC (Contributor)

You are too optimistic here; can you word it a bit more gloomily?
[GIF]

@robertleeplummerjr (Contributor Author)

Could you give me an example of what you are looking for?

@freddyC (Contributor) commented Apr 19, 2018

Oh, but also great job on this! I don't think I actually said it, but I did think it. You have done some awesome stuff here, so sorry to be the rain on your parade.

[GIF]

I also think I might be gloomy because I started reading this book recently and it makes me sad.

@robertleeplummerjr robertleeplummerjr merged commit 865c7c9 into develop Apr 21, 2018
@robertleeplummerjr robertleeplummerjr deleted the 192-time-series-baseline branch April 21, 2018 18:35

@robertleeplummerjr (Contributor Author)

Since the items were suggestions, the actual net worked and closely resembles the existing recurrent neural network, and it will be removed after the GPU implementation from the nn-gpu-layers branch is brought up to specification, I went ahead and merged to keep things moving along.

@freddyC (Contributor) commented Apr 21, 2018

[GIF]
gj man
