
GPT2 Collection #13

Open
daviddwlee84 opened this issue Nov 10, 2020 · 1 comment
daviddwlee84 commented Nov 10, 2020

| Code | Supports Chinese | Framework | Remark |
| --- | --- | --- | --- |
| openai/gpt-2: Code for the paper "Language Models are Unsupervised Multitask Learners" | No | TensorFlow 1.x | Official (OpenAI) one; "Better Language Models and Their Implications" |
| minimaxir/gpt-2-simple: Python package to easily retrain OpenAI's GPT-2 text-generating model on new texts | No | TensorFlow 1.x | minimaxir/textgenrnn: Easily train your own text-generating neural network of any size and complexity on any text dataset with a few lines of code; "The simplest Python implementation of OpenAI's 'fake news' generator GPT-2" - Zhihu |
| yangjianxin1/GPT2-chitchat: GPT2 for Chinese chitchat (implements the MMI idea from DialoGPT) | Yes | PyTorch | Very cool project based on Huggingface; "A GPT2 model for Chinese chitchat: GPT2-chitchat" - Zhihu |
| rish-16/gpt2client: ✍🏻 gpt2-client: Easy-to-use TensorFlow Wrapper for GPT-2 117M, 345M, 774M, and 1.5B Transformer Models 🤖 📝 | No | TensorFlow 1.x | "Singaporean high-school student open-sources a lightweight GPT-2 'client': play with GPT-2 in five lines of code" - Zhihu |
| Morizeyao/GPT2-Chinese: Chinese version of GPT2 training code, using BERT tokenizer | Yes | PyTorch | Based on Huggingface |

Huggingface

`GPT2LMHeadModel`

TODO: WWM (whole-word masking)?!
