
Using pre-trained google word embeddings #33

Closed
haskarb opened this issue Feb 22, 2018 · 3 comments

haskarb commented Feb 22, 2018

Hi @alexander-rakhlin ,

Thanks for sharing your code.
I have a small query; how can I use google pre-trained binary file.

Thanks :)

alexander-rakhlin (Owner) commented:

You can use it in place of the word2vec model in the code, but you need to adjust details like the embedding dimensionality and the interface; I don't remember whether it differs or not.
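For reference, the GoogleNews binary (`GoogleNews-vectors-negative300.bin`) uses the standard word2vec binary layout: a text header `"<vocab_size> <dim>\n"`, then for each word its space-terminated bytes followed by `dim` little-endian float32 values. A minimal stand-alone reader, sketched below on a tiny synthetic file (in practice you would just call gensim's `KeyedVectors.load_word2vec_format(path, binary=True)`; the function and file names here are illustrative):

```python
# Minimal sketch of reading a word2vec .bin file, the format used by
# GoogleNews-vectors-negative300.bin. Hypothetical reader for illustration;
# gensim's KeyedVectors.load_word2vec_format(path, binary=True) does the
# same job robustly and is the usual choice.
import os
import struct
import tempfile

def read_word2vec_bin(path):
    """Return {word: [float, ...]} from a word2vec binary file."""
    vectors = {}
    with open(path, "rb") as f:
        header = f.readline()                      # b"<vocab_size> <dim>\n"
        vocab_size, dim = map(int, header.split())
        for _ in range(vocab_size):
            # Each word is space-terminated (possibly preceded by '\n').
            word_bytes = bytearray()
            while True:
                ch = f.read(1)
                if ch == b" ":
                    break
                word_bytes.extend(ch)
            word = word_bytes.decode("utf-8").strip()
            # Then dim little-endian float32 values follow immediately.
            vec = struct.unpack("<%df" % dim, f.read(4 * dim))
            vectors[word] = list(vec)
    return vectors

if __name__ == "__main__":
    # Build a tiny two-word, 3-dim file in the same layout, then read it back.
    with tempfile.NamedTemporaryFile(delete=False, suffix=".bin") as f:
        f.write(b"2 3\n")
        f.write(b"cat " + struct.pack("<3f", 1.0, 2.0, 3.0) + b"\n")
        f.write(b"dog " + struct.pack("<3f", 4.0, 5.0, 6.0) + b"\n")
        path = f.name
    vecs = read_word2vec_bin(path)
    os.unlink(path)
    print(sorted(vecs))        # ['cat', 'dog']
    print(len(vecs["cat"]))    # 3
```

For the real GoogleNews file, `dim` in the header is 300, which is why the model's embedding size must match.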


haskarb commented Feb 23, 2018

I tried setting the embedding size to 300, but it is not working :(

alexander-rakhlin (Owner) commented:

Then you will need to figure out how to incorporate it yourself, because I don't know either.
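Since the thread ends without a resolution, here is one common way to wire pretrained vectors into a Keras-style model: build an embedding matrix whose row `i` holds the 300-dim vector for the word with tokenizer index `i`, and feed that matrix to the `Embedding` layer as initial weights. This is a hedged sketch, not the repository's actual code; `build_embedding_matrix`, the toy `word_index`, and the stand-in `vectors` dict are all hypothetical:

```python
# Hypothetical sketch: map a tokenizer's word index onto a 300-dim
# embedding matrix, the shape a Keras Embedding layer expects as weights.
import numpy as np

EMBEDDING_DIM = 300  # dimensionality of the GoogleNews vectors

def build_embedding_matrix(word_index, vectors, dim=EMBEDDING_DIM):
    """Row i holds the pretrained vector for the word with index i;
    words missing from `vectors` keep an all-zero row."""
    # Row 0 is reserved for padding, hence len(word_index) + 1 rows.
    matrix = np.zeros((len(word_index) + 1, dim), dtype="float32")
    for word, i in word_index.items():
        if word in vectors:
            matrix[i] = vectors[word]
    return matrix

if __name__ == "__main__":
    word_index = {"cats": 1, "sit": 2, "quietly": 3}          # toy tokenizer output
    vectors = {"cats": np.ones(300), "sit": np.full(300, 0.5)}  # stand-in for word2vec
    m = build_embedding_matrix(word_index, vectors)
    print(m.shape)   # (4, 300)
```

The resulting matrix would then be passed as something like `Embedding(input_dim=m.shape[0], output_dim=300, weights=[m])`; whether to keep it `trainable` is a separate design choice.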
