
# seq2seq with attention

This is an implementation of a Neural Machine Translation (NMT) model that translates human-readable dates ("25th of June, 2009") into machine-readable dates ("2009-06-25"). It uses an attention model, one of the most sophisticated sequence-to-sequence architectures.

The network takes as input a date written in any of a variety of formats (e.g. "the 29th of August 1958", "03/30/1968", "24 JUNE 1987") and translates it into a standardized, machine-readable date (e.g. "1958-08-29", "1968-03-30", "1987-06-24"). The network learns to output dates in the common machine-readable format YYYY-MM-DD.
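Before the model can read a date string, each character must be mapped to an integer index and the sequence padded to a fixed length. The sketch below illustrates one plausible way to do this; the character vocabulary, the `<unk>`/`<pad>` tokens, and the input length of 30 are illustrative assumptions, not necessarily the exact choices made in this repository.

```python
def string_to_int(s, length, vocab):
    """Map a date string to a fixed-length list of vocabulary indices."""
    s = s.lower()
    # Truncate to the model's input length, mapping unknown chars to <unk>
    idx = [vocab.get(c, vocab["<unk>"]) for c in s[:length]]
    # Pad short strings up to the fixed length
    idx += [vocab["<pad>"]] * (length - len(idx))
    return idx

# Assumed character set covering digits, separators, and letters
chars = sorted(set("0123456789-/ abcdefghijklmnopqrstuvwxyz,."))
human_vocab = {c: i for i, c in enumerate(chars)}
human_vocab["<unk>"] = len(human_vocab)
human_vocab["<pad>"] = len(human_vocab)

encoded = string_to_int("25th of june, 2009", 30, human_vocab)
print(len(encoded))  # 30
```

The fixed length lets every example be stacked into one tensor for batched training, regardless of how long the original date string was.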

## Architecture of the model

*(Figure: attention model)*

*(Figure: attention mechanism)*