
Add recurrent weight init configuration option for all RNN layers #4579

Merged: 3 commits merged into master from ab_4563_lstm_weight_init on Feb 5, 2018

Conversation

@AlexDBlack (Contributor) commented Jan 31, 2018

Partly addresses: #4563

Adds the ability to set the weight init for recurrent weights separately. If it is not set, a single weight init is used for all RNN weights; if it is set, that weight init is applied to the recurrent weights only.
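A minimal sketch of how this option might be used, assuming the new builder method is named weightInitRecurrent and using LSTM as the example layer (method name and values are illustrative, based on the description above):

```java
import org.deeplearning4j.nn.conf.layers.LSTM;
import org.deeplearning4j.nn.weights.WeightInit;

public class RecurrentWeightInitSketch {
    public static void main(String[] args) {
        // Recurrent weight init set explicitly: the (assumed) weightInitRecurrent
        // option applies to the recurrent (hidden-to-hidden) weights only,
        // while the input weights still use the layer-level weight init.
        LSTM separateRecurrentInit = new LSTM.Builder()
                .nIn(10)
                .nOut(20)
                .weightInit(WeightInit.XAVIER)            // input weights
                .weightInitRecurrent(WeightInit.UNIFORM)  // recurrent weights only (assumed method name)
                .build();

        // Recurrent weight init not set: the single weightInit value is used
        // for both the input and recurrent weight matrices, as before.
        LSTM singleInit = new LSTM.Builder()
                .nIn(10)
                .nOut(20)
                .weightInit(WeightInit.XAVIER)
                .build();
    }
}
```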

@maxpumperla left a comment

LGTM, thanks!

@maxpumperla merged commit 04e0a49 into master on Feb 5, 2018

0 of 3 checks passed:
- continuous-integration/jenkins/pr-merge: This commit cannot be built
- codeclimate: 6 issues to fix
- continuous-integration/travis-ci/pr: The Travis CI build failed

@maxpumperla deleted the ab_4563_lstm_weight_init branch on Feb 5, 2018
