Hello! Thanks for sharing your code and pre-trained embeddings.
How can we tell which word corresponds to each BERT embedding vector in the file "conceptnet_bert_embeddings.pt"?
Its shape appears to be 78334 x 1024, but "cn_node_names.txt" contains only 78249 entities.
So I'm not sure how to link each entity to its pre-trained BERT embedding. Could you give me some insight? Thanks!
Hi! The provided embeddings also include precomputed embeddings for the evaluation nodes. Here is a list of the 78334 node names, which can be linked to the provided embeddings in the same order.
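Since the names and the embedding rows are aligned by index, linking them is a matter of zipping the two together. Here is a minimal sketch; the tiny stand-in lists below are placeholders for the real data, which (assuming a standard PyTorch workflow) would come from `torch.load("conceptnet_bert_embeddings.pt")` and from reading the 78334-entry node-name list one name per line.

```python
# Stand-in data illustrating the one-to-one, same-order mapping.
# In the real setup:
#   embeddings = torch.load("conceptnet_bert_embeddings.pt")  # 78334 x 1024
#   names = open("<node name list>").read().splitlines()      # 78334 names
names = ["apple", "banana", "cherry"]                    # placeholder names
embeddings = [[0.1, 0.2], [0.3, 0.4], [0.5, 0.6]]        # placeholder vectors

# The counts must match for the index-based alignment to be valid.
assert len(names) == len(embeddings), "names and embedding rows must align"

# Row i of the embedding matrix is the vector for names[i].
name_to_vector = {name: embeddings[i] for i, name in enumerate(names)}

print(name_to_vector["banana"])  # → [0.3, 0.4]
```

With the real files, the same dictionary comprehension gives a direct name-to-vector lookup for all 78334 nodes.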