
Word Embedding

This is an example of word embedding. We implemented Mikolov's Skip-gram and Continuous Bag-of-Words (CBOW) models with hierarchical softmax and negative sampling.
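To illustrate the idea, here is a minimal sketch of skip-gram with negative sampling in numpy. It is not the repository's code; the toy corpus, hyperparameters, and variable names are all illustrative assumptions.

```python
import numpy as np

# Sketch of skip-gram with negative sampling (illustrative, not the repo's code).
rng = np.random.default_rng(0)

corpus = "the quick brown fox jumps over the lazy dog".split()
vocab = sorted(set(corpus))
word2id = {w: i for i, w in enumerate(vocab)}
ids = [word2id[w] for w in corpus]

V, D, window, negatives, lr, epochs = len(vocab), 16, 2, 3, 0.05, 200
W_in = rng.normal(0, 0.1, (V, D))    # "input" (center-word) embeddings
W_out = rng.normal(0, 0.1, (V, D))   # "output" (context-word) embeddings

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

for _ in range(epochs):
    for pos, center in enumerate(ids):
        lo, hi = max(0, pos - window), min(len(ids), pos + window + 1)
        for ctx in ids[lo:pos] + ids[pos + 1:hi]:
            # one positive (center, context) pair plus k random negative samples
            targets = [ctx] + list(rng.integers(0, V, negatives))
            labels = np.array([1.0] + [0.0] * negatives)
            v = W_in[center]
            u = W_out[targets]            # (k+1, D)
            scores = sigmoid(u @ v)       # predicted probability each pair is real
            grad = (scores - labels)[:, None]  # gradient of logistic loss w.r.t. scores
            W_in[center] -= lr * (grad * u).sum(axis=0)
            W_out[targets] -= lr * grad * v

print(W_in.shape)  # the learned embedding matrix, one row per vocabulary word
```

A real implementation would additionally draw negatives from a unigram^0.75 distribution and subsample frequent words, as in Mikolov's papers.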

Run the training script to produce word2vec.model, which contains the embedding data. You can then look up the top-5 nearest embedding vectors for a given word.
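The nearest-neighbor lookup can be sketched with cosine similarity. This is a hypothetical example: the `vocab` list and random `embeddings` stand in for the data that would be loaded from word2vec.model.

```python
import numpy as np

# Hypothetical top-k lookup by cosine similarity; in practice `vocab` and
# `embeddings` would be loaded from the trained word2vec.model file.
rng = np.random.default_rng(1)
vocab = ["king", "queen", "man", "woman", "apple", "orange", "car"]
embeddings = rng.normal(size=(len(vocab), 8))

def top_k_nearest(query, k=5):
    q = embeddings[vocab.index(query)]
    # cosine similarity between the query vector and every embedding row
    sims = embeddings @ q / (np.linalg.norm(embeddings, axis=1) * np.linalg.norm(q))
    order = np.argsort(-sims)                      # highest similarity first
    return [vocab[i] for i in order if vocab[i] != query][:k]

print(top_k_nearest("king"))  # the 5 words closest to "king" in embedding space
```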

This example is based on the following word embedding implementation in C++.