
Update pretrained_word_embeddings example to use new API. #10537

Merged 1 commit into keras-team:master on Jul 9, 2018

Conversation

yanboliang
Contributor

No description provided.

@fchollet
Member

?

@yanboliang
Contributor Author

yanboliang commented Jun 27, 2018

@fchollet I think we should use embeddings_initializer to initialize the weights of the embedding layer, rather than setting weights directly, because the current API has no explicit argument named weights. It also saves one assignment/initialization: even if we set weights directly, the default initialization (embeddings_initializer='uniform') still runs first. We should encourage users to use embeddings_initializer rather than weights. If I have misunderstood anything, please correct me. Thanks.
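For illustration, a minimal sketch of the pattern this PR advocates: passing the pretrained matrix through embeddings_initializer with a Constant initializer, instead of the old weights=[embedding_matrix] style. The 5x4 embedding_matrix below is a hypothetical stand-in for real pretrained vectors (e.g. GloVe), not data from the actual example:

```python
import numpy as np
import tensorflow as tf

# Hypothetical pretrained matrix: vocabulary of 5 words, 4-dim vectors.
embedding_matrix = np.arange(20, dtype="float32").reshape(5, 4)

# New style: initialize the layer's weights directly from the matrix,
# avoiding the redundant default 'uniform' initialization that a
# post-hoc set_weights/weights=[...] approach would incur.
layer = tf.keras.layers.Embedding(
    input_dim=5,
    output_dim=4,
    embeddings_initializer=tf.keras.initializers.Constant(embedding_matrix),
    trainable=False,  # freeze the pretrained vectors
)

# Looking up token ids 0 and 2 returns rows 0 and 2 of the matrix.
out = layer(np.array([[0, 2]]))
```

This is the same pattern the updated pretrained_word_embeddings example follows: one initialization instead of two, and no reliance on an undocumented weights argument.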

@taehoonlee
Contributor

Thank you for the PR, @yanboliang. LGTM.

@taehoonlee taehoonlee merged commit c469e72 into keras-team:master Jul 9, 2018
@yanboliang yanboliang deleted the emb-api branch July 10, 2018 19:20