
Skip-Gram-Model-TensorFlow

TensorFlow implementation of the word2vec (skip-gram model)


My PyTorch implementation of the skip-gram model can be found here.
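
For orientation, this is roughly what a skip-gram model with a sampled (NCE-style) loss looks like in TensorFlow 2. It is a minimal sketch; the class name, hyper-parameters, and loss choice are assumptions for illustration and may differ from the actual code in main.py.

```python
import tensorflow as tf

# Illustrative hyper-parameters (assumptions, not the values in config.py).
VOCAB_SIZE = 10_000
EMBED_DIM = 128
NUM_NEG_SAMPLES = 5

class SkipGram(tf.keras.Model):
    """Skip-gram: predict context words from a center word via learned embeddings."""

    def __init__(self, vocab_size=VOCAB_SIZE, embed_dim=EMBED_DIM):
        super().__init__()
        # Input (center-word) embedding matrix -- the vectors we ultimately keep.
        self.embeddings = tf.Variable(
            tf.random.uniform([vocab_size, embed_dim], -1.0, 1.0), name="embeddings")
        # Output (context-word) weights and biases used by the sampled loss.
        self.nce_weights = tf.Variable(
            tf.random.truncated_normal([vocab_size, embed_dim], stddev=embed_dim ** -0.5),
            name="nce_weights")
        self.nce_biases = tf.Variable(tf.zeros([vocab_size]), name="nce_biases")

    def loss(self, center_ids, context_ids):
        # Score each center word against its true context plus a few negative samples.
        embed = tf.nn.embedding_lookup(self.embeddings, center_ids)
        return tf.reduce_mean(
            tf.nn.nce_loss(
                weights=self.nce_weights,
                biases=self.nce_biases,
                labels=tf.expand_dims(tf.cast(context_ids, tf.int64), -1),
                inputs=embed,
                num_sampled=NUM_NEG_SAMPLES,
                num_classes=VOCAB_SIZE))
```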

Requirements

  • tensorflow >= 2.0
  • numpy >= 1.18
  • matplotlib
  • tqdm
  • nltk
  • gensim
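
All of these are available from PyPI; assuming a standard pip setup, something like the line below installs them.

pip install "tensorflow>=2.0" "numpy>=1.18" matplotlib tqdm nltk gensim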

Training

python main.py
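
main.py runs the whole pipeline. As a rough sketch of what skip-gram training involves, assuming the SkipGram model outlined above, it boils down to generating (center, context) pairs within a window and minimizing the sampled loss on batches of those pairs; the function and variable names here are illustrative, not the repository's actual code.

```python
import tensorflow as tf

WINDOW_SIZE = 2  # assumed context window size

def skipgram_pairs(token_ids, window=WINDOW_SIZE):
    """Yield (center, context) id pairs for every position in a token-id sequence."""
    for i, center in enumerate(token_ids):
        lo, hi = max(0, i - window), min(len(token_ids), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                yield center, token_ids[j]

optimizer = tf.keras.optimizers.Adam(1e-3)

@tf.function
def train_step(model, centers, contexts):
    """One gradient step on a batch of (center, context) id tensors."""
    with tf.GradientTape() as tape:
        loss = model.loss(centers, contexts)
    grads = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(grads, model.trainable_variables))
    return loss
```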

Visualizing the training loss in real time with TensorBoard

tensorboard --logdir <PATH_TO_TENSORBOARD_EVENTS_FILE>

NOTE: By default, PATH_TO_TENSORBOARD_EVENTS_FILE is set to SUMMARY_DIR in config.py
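
For reference, the TensorBoard events file is produced by writing scalar summaries during training; in TensorFlow 2 that typically looks like the sketch below (the directory string is an assumption; the actual default is SUMMARY_DIR in config.py).

```python
import tensorflow as tf

summary_dir = "summaries/skipgram"  # assumed path; see SUMMARY_DIR in config.py
writer = tf.summary.create_file_writer(summary_dir)

def log_training_loss(loss, step):
    """Record one scalar so it appears as a real-time curve in TensorBoard."""
    with writer.as_default():
        tf.summary.scalar("training_loss", loss, step=step)
```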

Sharing the training loss for real-time visualization with TensorBoard

tensorboard dev upload --logdir <PATH_TO_TENSORBOARD_EVENTS_FILE>

Testing

python test.py
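
test.py queries the trained embeddings for the most similar words (see the table in the Inference section below). Conceptually this is a cosine-similarity search over the embedding matrix; a minimal sketch, assuming a NumPy `embeddings` matrix of shape (vocab_size, embed_dim) plus `word_to_id` / `id_to_word` lookups:

```python
import numpy as np

def most_similar(word, embeddings, word_to_id, id_to_word, top_k=5):
    """Return the top_k vocabulary words closest to `word` by cosine similarity."""
    normed = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    query_id = word_to_id[word]
    scores = normed @ normed[query_id]   # cosine similarity to every word
    ranked = np.argsort(-scores)         # best match first
    return [id_to_word[i] for i in ranked if i != query_id][:top_k]
```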

Inference

The table below shows the most similar words learned for ten query words (header row):

| war | india | crime | guitar | movies | desert | physics | religion | football | computer |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| invasion | provinces | will | bass | movie | shore | mathematics | judaism | baseball | digital |
| soviet | pakistan | prosecution | drum | albums | hilly | mathematical | islam | championship | computers |
| troop | mainland | accusations | solo | songs | plateau | chemistry | religions | basketball | software |
| army | asian | provoke | quartet | cartoon | basin | theoretical | religious | coach | electronic |
| ally | colonial | prosecute | vocals | animate | highlands | analysis | jewish | wrestler | interface |
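
Since gensim is listed in the requirements, the trained vectors can also be wrapped in a gensim KeyedVectors object for convenient similarity queries. A sketch, assuming gensim 4.x plus the `embeddings` matrix and `id_to_word` mapping used in the Testing sketch above:

```python
from gensim.models import KeyedVectors

kv = KeyedVectors(vector_size=embeddings.shape[1])
kv.add_vectors([id_to_word[i] for i in range(len(id_to_word))], embeddings)
print(kv.most_similar("war", topn=5))  # e.g. invasion, soviet, troop, army, ally
```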

Blog-Post

Check out my blog post on word2vec here.
