
sync with clab/dynet #39

Merged
merged 109 commits into from Feb 15, 2017
Conversation

joelgrus

There don't appear to be any breaking changes, except for a CMake difference that requires a slightly different Eigen path (I don't really understand why, but I updated the README).

yoavg and others added 30 commits November 21, 2016 19:33
* Remove previous dropout behavior.
* set_dropout now indicates dropout_rate, not retention rate.
* weights are scaled when dropout is applied, so no scaling is needed at test time (just set dropout_rate = 0.0)
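The new behavior described above is standard "inverted" dropout: surviving activations are scaled up by 1/(1 - dropout_rate) at training time, so inference needs no correction. A minimal numpy sketch of the idea, assuming nothing about DyNet's actual API (the function name and signature here are illustrative only):

```python
import numpy as np

_rng = np.random.default_rng(0)

def dropout(x, dropout_rate, train=True):
    """Inverted dropout (illustrative, not DyNet's API).

    dropout_rate is the probability of *dropping* a unit, not the
    retention rate. At train time, kept units are scaled by 1/keep
    so the expected activation is unchanged; at test time (or with
    dropout_rate = 0.0) the input passes through untouched.
    """
    if not train or dropout_rate == 0.0:
        return x  # setting dropout_rate = 0.0 disables dropout entirely
    keep = 1.0 - dropout_rate
    mask = _rng.random(x.shape) < keep   # Bernoulli(keep) mask
    return x * mask / keep               # scale survivors by 1/keep

x = np.ones(10000)
y = dropout(x, 0.5)          # roughly half zeros, survivors scaled to 2.0
```

Because of the 1/keep scaling, `y.mean()` stays close to `x.mean()` regardless of the rate, which is why no test-time rescaling is required.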
This is preferable to manually installing Eigen, because that
would make /usr/local a mixture of Homebrew and local packages.
A pull request for clab#242 (greedy decoding and vectorization in attention.py)
@jayantk

jayantk commented Feb 15, 2017

LGTM

@jayantk jayantk merged commit 9cdc5c5 into master Feb 15, 2017
@joelgrus joelgrus deleted the sync branch February 17, 2017 21:27