
embedding evaluation #283

Closed
jwijffels opened this issue Sep 20, 2018 · 2 comments
Comments

@jwijffels

Hi @dselivanov
I've recently been working on the https://github.com/bnosac/ruimtehol package, which, among other things, produces word/sentence/article embeddings based on the StarSpace C++ library.
I would like to compare the embeddings that come out of that package to GloVe embeddings. The StarSpace paper already does this, but I would like to do it myself. Since the text2vec package already implements GloVe, do you by any chance have a script lying around that evaluates embeddings generated by different runs/toolsets?
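Not from the thread itself, but one common toolset-agnostic way to compare two embedding sets is nearest-neighbour overlap: for each word in the shared vocabulary, check how much the k nearest neighbours agree under cosine similarity. A minimal sketch, assuming each embedding set is a plain dict of word → vector (the function names here are illustrative, not from text2vec or ruimtehol):

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def nearest_neighbours(embeddings, word, k=2):
    """The k words closest to `word` by cosine similarity."""
    target = embeddings[word]
    scored = [(w, cosine(target, vec))
              for w, vec in embeddings.items() if w != word]
    scored.sort(key=lambda p: p[1], reverse=True)
    return [w for w, _ in scored[:k]]

def neighbour_overlap(emb_a, emb_b, k=2):
    """Mean Jaccard overlap of k-nearest-neighbour sets
    over the vocabulary shared by both embedding sets."""
    shared = set(emb_a) & set(emb_b)
    overlaps = []
    for w in shared:
        na = set(nearest_neighbours(emb_a, w, k))
        nb = set(nearest_neighbours(emb_b, w, k))
        overlaps.append(len(na & nb) / len(na | nb))
    return sum(overlaps) / len(overlaps)
```

A score near 1.0 means the two sets induce very similar neighbourhood structure; word-analogy or word-similarity benchmarks (as used in the GloVe and StarSpace papers) are the other standard route.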

@dselivanov
Owner

dselivanov commented Sep 21, 2018 via email

@KafeelBasha

With reference to the text2vec documentation and the link below:
https://cran.r-project.org/web/packages/text2vec/vignettes/glove.html#word_embeddings

I have created a word-vector matrix of dimension (10000, 100), but Keras requires a fixed sequence length in order to use it. I have also tried pre-trained word vectors such as glove.6B.100d.txt, but loading them takes a long time and RStudio terminates abruptly; I am working on a machine with 8 GB of RAM.

Is there a way to use the word vectors created with text2vec inside a Keras embedding layer?
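Not an answer from the thread, but the usual recipe is: build a vocabulary from the pretrained vectors, convert each document to integer ids padded/truncated to a fixed length (the "sequence length" Keras needs), and stack the vectors into a matrix whose row i is the vector for word id i. A minimal sketch in plain Python (the function name and dict-based input are illustrative assumptions, not text2vec's API):

```python
def build_inputs(docs, word_vectors, max_len, dim):
    """docs: list of token lists; word_vectors: dict word -> list[float].
    Returns (integer sequences padded to max_len, embedding matrix)."""
    # Reserve id 0 for padding; a real setup would also reserve an OOV id.
    vocab = {w: i + 1 for i, w in enumerate(sorted(word_vectors))}
    # Fixed-length integer sequences: truncate, then right-pad with 0.
    sequences = []
    for doc in docs:
        ids = [vocab.get(t, 0) for t in doc][:max_len]
        ids += [0] * (max_len - len(ids))
        sequences.append(ids)
    # Embedding matrix: row i holds the pretrained vector for word id i;
    # row 0 (padding) stays all-zero.
    matrix = [[0.0] * dim for _ in range(len(vocab) + 1)]
    for w, i in vocab.items():
        matrix[i] = word_vectors[w]
    return sequences, matrix
```

In the R keras interface this matrix can then typically be supplied to `layer_embedding()` (via its `weights` argument or `set_weights()`) with `trainable = FALSE`, so the layer looks up the pretrained text2vec vectors instead of learning new ones; check the keras documentation for the exact signature of your version.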
