
Word Embedding #22

Open
aggounix opened this issue Nov 26, 2018 · 3 comments


@aggounix

Is it possible to use another word embedding than GloVe (for other languages)?

@QianhuiWu

> Is it possible to use another word embedding than GloVe (for other languages)?

I think that's fine. You just need to change the path of the embedding file in build_glove.py.
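
For reference, a minimal sketch of what loading a different embedding file looks like, assuming the replacement file uses the standard whitespace-separated text format (word followed by its float components) that both GloVe and fastText `.vec` files share. The file name below is hypothetical, and build_glove.py's actual internals may differ:

```python
import numpy as np

def load_vectors(path, dim=300):
    """Parse a text-format embedding file into a {word: vector} dict."""
    vectors = {}
    with open(path, encoding='utf-8') as f:
        for line in f:
            parts = line.rstrip().split(' ')
            if len(parts) != dim + 1:
                continue  # skip malformed lines or a fastText header line
            vectors[parts[0]] = np.asarray(parts[1:], dtype=np.float32)
    return vectors

# Point this at your own language's embeddings instead of the GloVe file,
# e.g. French fastText vectors (hypothetical path):
vectors = load_vectors('cc.fr.300.vec')
```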

@mraduldubey

Yes, that'd be fine @aggounix. I have used fastText instead of GloVe, and I have also used quantized embeddings to reduce the memory requirement.
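
(The comment above doesn't say which quantization scheme was used; as an illustration, here is a minimal sketch of simple per-row linear quantization to int8, which cuts the table to roughly a quarter of its float32 size. The matrix below is a random placeholder standing in for a loaded embedding table.)

```python
import numpy as np

# Placeholder standing in for a loaded embedding table (vocab_size x dim).
embeddings = np.random.randn(50000, 300).astype(np.float32)

# float16 alone halves the footprint with little accuracy loss.
emb_fp16 = embeddings.astype(np.float16)

# Per-row linear int8 quantization: one float scale per word + int8 codes.
scales = np.abs(embeddings).max(axis=1, keepdims=True) / 127.0
codes = np.round(embeddings / scales).astype(np.int8)

def dequantize(row):
    """Recover an approximate float32 vector for one word."""
    return codes[row].astype(np.float32) * scales[row]
```

Note that fastText also ships a built-in `quantize` command that produces compressed `.ftz` models, which may be what was meant here.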

@ghost commented Apr 16, 2019

Has anyone tried the feature extractor from the BERT repo? I was thinking about taking the last two layers of BERT, concatenating them, and using the resulting 1536-dimensional vector instead of the GloVe ones.
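
As a starting point, here is a sketch of that idea using the Hugging Face transformers library rather than the original repo's extract_features.py script; it illustrates the technique, not what anyone in this thread actually ran. Concatenating two 768-dim BERT-base layers gives 1536 dimensions:

```python
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
model = BertModel.from_pretrained('bert-base-uncased',
                                  output_hidden_states=True)
model.eval()

sentence = "Replace GloVe with contextual features."
inputs = tokenizer(sentence, return_tensors='pt')
with torch.no_grad():
    outputs = model(**inputs)

# hidden_states: tuple of (embedding layer + 12 layers), each (1, seq_len, 768)
hidden_states = outputs.hidden_states
last_two = torch.cat([hidden_states[-1], hidden_states[-2]], dim=-1)
print(last_two.shape)  # (1, seq_len, 1536)
```

One design consideration: unlike GloVe, these vectors are contextual, so they have to be computed per sentence at run time rather than looked up from a static embedding table.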
