Sequence-to-sequence framework with a focus on Neural Machine Translation based on PyTorch
List of efficient attention modules
[TPAMI 2023 ESI Highly Cited Paper] SePiCo: Semantic-Guided Pixel Contrast for Domain Adaptive Semantic Segmentation https://arxiv.org/abs/2204.08808
Implementation of a basic conversational agent (a.k.a. chatbot) using the PyTorch Transformer module
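Several of the repositories above build on PyTorch's built-in `torch.nn.Transformer` module. As a minimal sketch (the dimensions and layer counts here are illustrative, not taken from any of the listed projects), a forward pass looks like this:

```python
import torch
import torch.nn as nn

# Minimal sketch: PyTorch's built-in Transformer module.
# d_model, nhead, and layer counts are illustrative values only.
model = nn.Transformer(
    d_model=64,
    nhead=4,
    num_encoder_layers=2,
    num_decoder_layers=2,
)

# Default layout is (sequence_length, batch_size, d_model).
src = torch.rand(10, 2, 64)  # source sequence of length 10
tgt = torch.rand(7, 2, 64)   # target sequence of length 7

out = model(src, tgt)
print(out.shape)  # output follows the target shape: (7, 2, 64)
```

In a real chatbot or translation system, the random tensors would be replaced by token embeddings plus positional encodings, and causal masks would be passed for the decoder input.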
Implementation of the Transformer Pointer-Critic deep reinforcement learning algorithm
This repository contains my research work on building state-of-the-art next-basket recommendations using techniques such as autoencoders, TF-IDF, attention-based Bi-LSTM, and Transformer networks
Codes and write-up for Red Dragon AI Advanced NLP Course.
Implementation of Transformer, BERT, and GPT models in both TensorFlow 2.0 and PyTorch.
A PyTorch implementation of a transformer network trained using back-translation