Open Source Pre-training Model Framework in PyTorch & Pre-trained Model Zoo
Updated May 9, 2024 · Python
Tencent Pre-training framework in PyTorch & Pre-trained Model Zoo
Multilingual/multidomain question generation datasets, models, and python library for question generation.
MinT: Minimal Transformer Library and Tutorials
Code for the AAAI 2022 paper "Open Vocabulary Electroencephalography-To-Text Decoding and Zero-shot Sentiment Classification"
Calculate perplexity on a text with pre-trained language models. Supports masked LMs (e.g. DeBERTa), autoregressive LMs (e.g. GPT-3), and encoder-decoder LMs (e.g. Flan-T5).
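For intuition, perplexity over a token sequence is the exponentiated negative mean of the per-token log-probabilities a model assigns. A minimal pure-Python sketch of that definition (the `perplexity` helper and its inputs are illustrative, not this library's API):

```python
import math

def perplexity(token_logprobs):
    """Perplexity from per-token natural-log probabilities:
    exp of the negative mean log-probability."""
    assert token_logprobs, "need at least one token"
    return math.exp(-sum(token_logprobs) / len(token_logprobs))

# A uniform model over a 4-word vocabulary assigns log(1/4) to every
# token, so its perplexity is (up to rounding) exactly 4.
uniform = [math.log(0.25)] * 10
print(perplexity(uniform))
```

In practice the log-probabilities come from a model's scores over the text, but the reduction to a single perplexity number is just this formula.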
Code associated with the "Data Augmentation using Pre-trained Transformer Models" paper
NAACL 2021 - Progressive Generation of Long Text
Code for our paper "JointGT: Graph-Text Joint Representation Learning for Text Generation from Knowledge Graphs" (ACL 2021 Findings)
Official implementation of the paper "IteraTeR: Understanding Iterative Revision from Human-Written Text" (ACL 2022)
Automated Categorization: uses neural networks to categorize bank transaction descriptions automatically, reducing manual effort and improving efficiency while preserving privacy.
Code for the EMNLP 2021 paper "Topic-Aware Contrastive Learning for Abstractive Dialogue Summarization"
The first-ever vast natural language generation benchmark for Indonesian, Sundanese, and Javanese. We provide multiple downstream tasks, pre-trained IndoGPT and IndoBART models, and a starter code! (EMNLP 2021)
A tool to abstractively summarize documents using the BART or PreSumm machine learning models.
Source code and dataset for "Call for Customized Conversation: Customized Conversation Grounding Persona and Knowledge"
TrAVis: Visualise BERT attention in your browser
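Attention visualisers like this display, per head, the matrix softmax(QKᵀ/√d_k) that the transformer computes internally. A toy pure-Python sketch of that matrix (illustrative dimensions and names, not TrAVis code):

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention_weights(Q, K):
    """Row i holds how strongly query token i attends to each key token:
    softmax(Q K^T / sqrt(d_k)), the matrix attention visualisers render."""
    d_k = len(K[0])
    scores = [[sum(q * k for q, k in zip(qi, kj)) / math.sqrt(d_k) for kj in K]
              for qi in Q]
    return [softmax(row) for row in scores]

# Two toy tokens with 2-dimensional queries/keys.
Q = [[1.0, 0.0], [0.0, 1.0]]
K = [[1.0, 0.0], [0.0, 1.0]]
W = attention_weights(Q, K)
# Each row of W sums to 1; token 0 attends most to key 0.
```

In a real BERT these weights are extracted per layer and per head from the model's hidden states and plotted as heatmaps.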
An English-to-Cantonese machine translation model