
About the pre-trained BERT ConceptNet embedding #3

Closed
zsun227 opened this issue Jun 12, 2020 · 2 comments

Comments


zsun227 commented Jun 12, 2020

Hello! Thanks for sharing your code and pre-trained embeddings.

I wonder how we can know the corresponding word for every BERT embedding vector in the file "conceptnet_bert_embeddings.pt".
Its shape looks to be 78334 x 1024, but "cn_node_names.txt" contains only 78249 entities.
So I'm not sure how to link each entity to its pre-trained BERT embedding. Could you give me some insight? Thanks!

@chaitanyamalaviya (Contributor)

Hi! The provided embeddings include the precomputed embeddings for the evaluation nodes as well. Here is a list of the 78334 node names, which can be linked to the provided embeddings in the same order.
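
In case it helps, here is a minimal sketch of how the two files can be linked (assuming the `.pt` file loads as a single 78334 x 1024 tensor and the node-name list has one name per line in the same row order; the filename `cn_node_names_with_eval.txt` below is just a placeholder for the linked list):

```python
import torch

# Load the precomputed ConceptNet BERT embeddings (expected shape: 78334 x 1024).
embeddings = torch.load("conceptnet_bert_embeddings.pt")

# Read the node names; the i-th line is assumed to correspond to the i-th embedding row.
# "cn_node_names_with_eval.txt" is a placeholder name for the 78334-name list linked above.
with open("cn_node_names_with_eval.txt", encoding="utf-8") as f:
    node_names = [line.strip() for line in f]

assert len(node_names) == embeddings.size(0)

# Map each ConceptNet node name to its 1024-dimensional BERT embedding vector.
name_to_embedding = {name: embeddings[i] for i, name in enumerate(node_names)}
```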


zsun227 commented Jun 15, 2020


Thank you!
