Sequence to sequence with attention model implemented in Keras

ShuvenduRoy/seq2seq_attention


seq2seq with attention

This is the implementation of a Neural Machine Translation (NMT) model that translates human-readable dates ("25th of June, 2009") into machine-readable dates ("2009-06-25"). It uses an attention model, a sequence-to-sequence architecture in which the decoder learns to focus on the relevant parts of the input sequence at each output step.

The network takes as input a date written in any of a variety of formats (e.g. "the 29th of August 1958", "03/30/1968", "24 JUNE 1987") and translates it into a standardized, machine-readable date (e.g. "1958-08-29", "1968-03-30", "1987-06-24"). The network learns to output dates in the common machine-readable format YYYY-MM-DD.
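As a reference for what the network must learn, the mapping can be written as a rule-based Python sketch. The format list and cleanup rules below are assumptions that cover only the example strings above, not the full variety of formats a trained model would handle:

```python
from datetime import datetime
import re

# Assumed subset of input formats; the real dataset covers many more.
FORMATS = ["%d %B %Y", "%m/%d/%Y", "%d %b %Y"]

def normalize(date_str):
    """Parse a human-readable date and return it as YYYY-MM-DD.

    This is a hand-written reference for the target mapping, not the
    network itself.  Returns None if no known format matches.
    """
    # Strip ordinal suffixes ("29th" -> "29") and filler words.
    s = re.sub(r"(\d+)(st|nd|rd|th)", r"\1", date_str.lower())
    s = s.replace("the ", "").replace(" of ", " ").replace(",", "").strip()
    for fmt in FORMATS:
        try:
            return datetime.strptime(s, fmt).strftime("%Y-%m-%d")
        except ValueError:
            continue
    return None
```

The seq2seq model learns this same mapping character by character from (input, output) pairs, without any hand-written rules.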

Architecture of the model

(Figure: the attention model)

(Figure: the attention mechanism)
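The attention step shown in the figures can be illustrated numerically. This NumPy sketch computes one context vector: it scores every encoder hidden state against the current decoder state, normalizes the scores into attention weights with a softmax, and takes the weighted sum. As a simplification it uses a bilinear score; the Keras model instead computes scores with a small dense network, and all names and shapes here are illustrative assumptions:

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_context(encoder_states, decoder_state, W):
    """One attention step.

    encoder_states: (Tx, hidden) hidden states of the encoder
    decoder_state:  (hidden,)    previous decoder hidden state
    W:              (hidden, hidden) learned scoring matrix (bilinear
                    score here; the Keras model uses dense layers)
    Returns the context vector and the attention weights.
    """
    scores = encoder_states @ W @ decoder_state   # one score per input step, shape (Tx,)
    alphas = softmax(scores)                      # attention weights, sum to 1
    context = alphas @ encoder_states             # weighted sum, shape (hidden,)
    return context, alphas

# Illustrative shapes: Tx=5 input steps, hidden size 8.
rng = np.random.default_rng(0)
enc = rng.normal(size=(5, 8))
dec = rng.normal(size=8)
W = rng.normal(size=(8, 8))
ctx, alphas = attention_context(enc, dec, W)
```

Because the weights sum to one, the context vector is a convex combination of the encoder states; the decoder consumes it at each output step to decide which input characters matter for the next output character.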
