A Keras+TensorFlow Implementation of the Transformer: Attention Is All You Need (Python, updated Sep 24, 2021)
Multilingual Automatic Speech Recognition with word-level timestamps and confidence
Neural Machine Translation with Keras
Image to LaTeX (Seq2seq + Attention with Beam Search) - Tensorflow
TensorFlow implementation of Match-LSTM and Answer pointer for the popular SQuAD dataset.
Gathers Tensorflow deep learning models.
Fully batched seq2seq example based on practical-pytorch, with additional features.
Attention-based end-to-end ASR on TIMIT in PyTorch
A search system based on image caption generation.
A simple attention deep learning model to answer questions about a given video with the most relevant video intervals as answers.
Generates a summary of a given news article using an attention-based seq2seq encoder-decoder model.
Text Summarizer implemented in PyTorch
Vietnamese-to-English and Chinese-to-English translation.
Generates short descriptions of news articles.
English-Hindi translation with attention. WIP
A T5-based Seq2Seq Model that Generates Titles for Machine Learning Papers using the Abstract
Seq2Seq model that restores punctuation on English input text.
TensorFlow 2.0 implementation of neural machine translation with Bahdanau attention
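The Bahdanau (additive) attention mechanism referenced above scores each encoder state against the decoder state with a small feed-forward network, then forms a context vector as the softmax-weighted sum of encoder states. A minimal NumPy sketch follows; the function name and weight shapes are illustrative, not taken from any of the listed repositories.

```python
import numpy as np

def bahdanau_attention(query, keys, W1, W2, v):
    """Additive (Bahdanau) attention.

    query: (d_q,)    current decoder hidden state
    keys:  (T, d_k)  encoder hidden states for T time steps
    W1:    (d_a, d_q), W2: (d_a, d_k), v: (d_a,)  learned weights
    Returns the context vector (d_k,) and attention weights (T,).
    """
    # score_j = v^T tanh(W1 @ query + W2 @ h_j), computed for all j at once
    scores = np.tanh(W1 @ query + keys @ W2.T) @ v   # shape (T,)
    # softmax over time steps (shift by max for numerical stability)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # context vector: weighted sum of encoder states
    context = weights @ keys                         # shape (d_k,)
    return context, weights
```

In a full model, `query` would be the previous decoder state and the context vector would be concatenated with the decoder input at each step; the weights also serve as a soft alignment between source and target positions.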
Three different implementations for neural machine translation
Convolutional sequence-to-sequence models for handwritten text recognition