Implementing Word Embedding Using Keras

Word Embedding:

A word embedding is a learned representation for text where words that have the same meaning have a similar representation.
Word embeddings are in fact a class of techniques where individual words are represented as real-valued vectors in a predefined vector space. Each word is mapped to one vector, and the vector values are learned in a way that resembles training a neural network; hence the technique is often grouped with deep learning methods.
Key to the approach is the idea of using a dense distributed representation for each word.

Each word is represented by a real-valued vector, often with tens or hundreds of dimensions. This contrasts with the thousands or millions of dimensions required for sparse word representations, such as a one-hot encoding.

The distributed representation is learned based on the usage of words. This allows words that are used in similar ways to end up with similar representations, naturally capturing their meaning.
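As a concrete illustration (a minimal sketch, not this repository's code), a Keras Embedding layer maps integer word indices to dense vectors. The vocabulary size, embedding dimension, and word indices below are arbitrary choices for the example:

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding

vocab_size = 50  # assumed vocabulary size for illustration
embed_dim = 8    # each word becomes an 8-dimensional dense vector
                 # (a one-hot encoding would need vocab_size dimensions)

# An Embedding layer is a lookup table of shape (vocab_size, embed_dim)
model = Sequential([Embedding(input_dim=vocab_size, output_dim=embed_dim, input_length=4)])
model.compile('rmsprop', 'mse')

# Four integer word indices -> four 8-dimensional dense vectors
word_ids = np.array([[3, 17, 42, 5]])
vectors = model.predict(word_ids)
print(vectors.shape)  # (1, 4, 8)
```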

Prerequisites:

If you are using the Anaconda Prompt, first create a new environment using (conda create -n myenv python=3.6), then run the following commands.

  1. TensorFlow 2.2 (pip install tensorflow==2.2)
  2. Keras (pip install keras)
    Or
    pip install -r requirement.txt (this will install all the dependencies); a usage sketch follows below.
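Once the dependencies are installed, the sketch below shows an end-to-end example of the technique this repository demonstrates: learning an embedding from labeled text with Keras. The toy documents, labels, vocabulary size, and dimensions are made up for illustration and are not taken from the repository's code.

```python
from numpy import array
from tensorflow.keras.preprocessing.text import one_hot
from tensorflow.keras.preprocessing.sequence import pad_sequences
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, Flatten, Dense

# Toy sentiment data: 1 = positive, 0 = negative (illustrative only)
docs = ['well done', 'good work', 'great effort', 'poor effort', 'weak work']
labels = array([1, 1, 1, 0, 0])

vocab_size = 50                                    # assumed vocabulary size
encoded = [one_hot(d, vocab_size) for d in docs]   # hash each word to an integer id
padded = pad_sequences(encoded, maxlen=4, padding='post')  # equal-length sequences

model = Sequential([
    Embedding(vocab_size, 8, input_length=4),  # learn an 8-d vector per word
    Flatten(),
    Dense(1, activation='sigmoid'),
])
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
model.fit(padded, labels, epochs=50, verbose=0)

# The learned embedding matrix: one 8-dimensional vector per vocabulary index
print(model.layers[0].get_weights()[0].shape)  # (50, 8)
```

After training, the rows of the embedding matrix are the dense word representations described above; words used in similar contexts in the training data end up with similar vectors.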
