
Word2Vec-NLP

Medium Article: A Dummy's Guide to Word2Vec

Word2Vec represents each word in our vocabulary as a vector. Words used in similar contexts, or sharing semantic relationships, end up close together in the vector space: put simply, similar words have similar word vectors. As for its history, word2vec was created, patented, and published in 2013 by a team of researchers led by Tomas Mikolov at Google.

Figure: Hypothetical features to understand word embeddings

We can easily train word2vec word embeddings using Gensim, which is "a free open-source Python library for representing documents as semantic vectors, as efficiently (computer-wise) and painlessly (human-wise) as possible."

In the above notebook, I've demonstrated an implementation of word2vec using the Gensim library.

About

Code accompanying my Medium article explaining word embeddings!
