Natural language generation using Reformer, a Transformer model for longer sequences.
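Reformer handles longer sequences by replacing full softmax attention with locality-sensitive hashing (LSH): similar query/key vectors are hashed into the same bucket, and attention is computed only within buckets. A minimal NumPy sketch of the bucketing idea (an illustration of angular LSH, not any repository's actual implementation):

```python
import numpy as np

def lsh_hash(vectors, n_buckets, seed=0):
    """Angular LSH sketch: project onto random rotations, bucket = argmax.

    Concatenating projections with their negations covers all n_buckets
    half-spaces, so vectors pointing in similar directions share a bucket.
    """
    rng = np.random.default_rng(seed)
    d = vectors.shape[-1]
    rotations = rng.normal(size=(d, n_buckets // 2))
    projected = vectors @ rotations                      # (n, n_buckets // 2)
    projected = np.concatenate([projected, -projected], axis=-1)
    return np.argmax(projected, axis=-1)                 # one bucket id per vector

v = np.array([1.0, 0.0, 0.0, 0.0])
near = 1.01 * v          # same direction: guaranteed same bucket
far = -v                 # opposite direction: lands in a different bucket
buckets = lsh_hash(np.stack([v, near, far]), n_buckets=8)
print(buckets[0] == buckets[1])  # True
```

Because attention is then restricted to each bucket, cost drops from quadratic to roughly O(n log n) in sequence length, which is what makes the long-sequence repositories below practical.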
A Chinese version of reformer-pytorch: a simple, efficient generative model with GPT-2-like results.
An adaptation of Reformer: The Efficient Transformer for the text-to-speech task.
Code for a chatbot built on the Reformer model, trained on the MultiWOZ dataset.
Scientific Guide AI Notebooks: a collection of machine learning and deep learning notebooks prepared by Salem Messoud.
A dedicated convenient repo for different Music Transformers implementations (Reformer/XTransformer/Sinkhorn/etc)
Grammatical Error Correction at the character level using Reformers.
A Music AI implementation based on Google's Reformer transformer, with code and a Colab notebook.
An implementation of multiple notable attention mechanisms using TensorFlow 2
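Most of the attention variants such repositories implement share one building block: scaled dot-product attention. A NumPy sketch for reference (an illustration, not the TensorFlow 2 code from any particular repo):

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the row max for numerical stability before exponentiating
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(q, k, v):
    d_k = q.shape[-1]
    scores = q @ k.swapaxes(-1, -2) / np.sqrt(d_k)  # query/key similarity
    weights = softmax(scores)                        # each row sums to 1
    return weights @ v                               # weighted sum of values

q = np.random.randn(2, 4, 8)   # (batch, seq_len, d_k)
out = scaled_dot_product_attention(q, q, q)
print(out.shape)  # (2, 4, 8)
```

Variants like Reformer keep this core formula but change which query/key pairs get scored, which is where the efficiency gains come from.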
NLP code snippets and conference-related materials.
Symbolic music generation taking inspiration from NLP and human composition process
Google Colab (Jupyter) notebooks for training Music AI models and for generating music with Transformer architectures (Google XLNet/Transformer-XL).
A list of efficient attention modules.
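One trick that recurs in such lists is linear attention: replacing softmax(QKᵀ)V with φ(Q)(φ(K)ᵀV) reorders the matrix products so cost grows linearly rather than quadratically in sequence length. A NumPy sketch (illustrative only; the feature map φ(x) = elu(x) + 1 follows the linear-transformers formulation, and is an assumption here):

```python
import numpy as np

def elu_plus_one(x):
    # Positive feature map so the normalizer below is never zero
    return np.where(x > 0, x + 1.0, np.exp(x))

def linear_attention(q, k, v):
    q, k = elu_plus_one(q), elu_plus_one(k)
    kv = k.T @ v                   # (d_k, d_v) summary, independent of n
    z = q @ k.sum(axis=0)          # per-query normalizer, shape (n,)
    return (q @ kv) / z[:, None]   # O(n * d_k * d_v) overall

n, d = 16, 4
q = np.random.randn(n, d)
out = linear_attention(q, q, q)
print(out.shape)  # (16, 4)
```

The key design choice is computing the small (d_k, d_v) matrix `kv` first, so no (n, n) attention matrix is ever materialized.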