- Data Science and Analytic Thrust, Information Hub, HKUST(GZ)
- Guangzhou
- https://www.zhihu.com/people/peijieDong
- https://pprp.github.io
- https://scholar.google.com/citations?user=TqS6s4gAAAAJ
GPT
Awesome ChatGPT Prompts: share, discover, and collect prompts from the community. Free and open source; self-host for your organization with complete privacy.
Chinese version of the GPT-2 training code, using a BERT tokenizer.
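A hedged sketch of that pairing, a GPT-2 architecture sized to a BERT-style Chinese vocabulary, using Hugging Face `transformers` (an assumption; GPT2-Chinese may wire this up differently):

```python
# Sketch: a GPT-2 model whose vocabulary matches a Chinese BERT tokenizer.
# Assumes the Hugging Face `transformers` package is installed.
from transformers import BertTokenizerFast, GPT2Config, GPT2LMHeadModel

tokenizer = BertTokenizerFast.from_pretrained("bert-base-chinese")
config = GPT2Config(vocab_size=tokenizer.vocab_size)  # match tokenizer vocab
model = GPT2LMHeadModel(config)  # randomly initialized, ready for training

ids = tokenizer("今天天气不错", return_tensors="pt").input_ids
loss = model(ids, labels=ids).loss  # standard next-token LM loss
```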
A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training
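The objective such re-implementations train with is plain next-token cross-entropy; a minimal sketch with stand-in tensors (not minGPT's actual API):

```python
import torch
import torch.nn.functional as F

# Sketch of the autoregressive objective: predict token t+1 from tokens up
# to t. Random tensors stand in for a real model and batch.
batch, seq_len, vocab = 2, 8, 50257
tokens = torch.randint(vocab, (batch, seq_len + 1))
inputs, targets = tokens[:, :-1], tokens[:, 1:]   # shift labels by one

logits = torch.randn(batch, seq_len, vocab)       # model(inputs) in practice
loss = F.cross_entropy(logits.reshape(-1, vocab), targets.reshape(-1))
print(loss.item())
```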
A simple prompt-chatting AI based on wechaty and a fine-tuned NLP model.
tiktoken is a fast BPE tokeniser for use with OpenAI's models.
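Its round-trip API is small; a quick usage example:

```python
import tiktoken

# Round-trip a string through the GPT-2 byte-pair encoding.
enc = tiktoken.get_encoding("gpt2")
tokens = enc.encode("hello world")   # list of integer token ids
assert enc.decode(tokens) == "hello world"
print(len(tokens), tokens)
```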
Curated list of ChatGPT-related resources, tools, prompts, and apps.
Essential "incantations" for commanding your AI familiar: a set of verified, working prompts for operating ChatGPT.
A Chinese guide to coaching ChatGPT: usage guides for a variety of scenarios, and how to get it to follow your instructions.
Central place for the engineering/scaling working group: documentation, SLURM scripts and logs, compute environment, and data.
Toolkit for creating, sharing and using natural language prompts.
Pretrained language model with 100B parameters
A trend starts from "Chain of Thought Prompting Elicits Reasoning in Large Language Models".
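The technique itself is just prompt construction: prepend worked examples whose answers spell out intermediate reasoning steps. A minimal sketch (the exemplar text is illustrative, not taken verbatim from the paper):

```python
# Sketch: few-shot chain-of-thought prompt assembly.
exemplar = (
    "Q: Roger has 5 tennis balls. He buys 2 more cans of 3 balls each. "
    "How many tennis balls does he have now?\n"
    "A: Roger started with 5 balls. 2 cans of 3 balls is 6 balls. "
    "5 + 6 = 11. The answer is 11.\n\n"
)
question = "Q: A baker makes 4 trays of 6 muffins. How many muffins in total?\nA:"
prompt = exemplar + question  # send `prompt` to any LLM completion endpoint
print(prompt)
```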
Cramming the training of a (BERT-type) language model into limited compute.
The simplest, fastest repository for training/finetuning medium-sized GPTs.
A repo for distributed training of language models with Reinforcement Learning via Human Feedback (RLHF)
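A heavily simplified sketch of the underlying signal, REINFORCE-style rather than the PPO used in practice; `logprob` and `reward` are stand-ins for policy log-probabilities and reward-model scores:

```python
import torch

# Stand-ins: log-probs of two sampled continuations under the policy, and
# scalar scores a learned reward model would assign to them.
logprob = torch.tensor([-12.3, -9.8], requires_grad=True)
reward = torch.tensor([0.7, -0.2])

# Push up the likelihood of high-reward continuations (REINFORCE).
loss = -(reward * logprob).mean()
loss.backward()  # gradients would flow into the policy parameters
```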
OpenAssistant is a chat-based assistant that understands tasks, can interact with third-party systems, and retrieve information dynamically to do so.
ICML'2022: Black-Box Tuning for Language-Model-as-a-Service & EMNLP'2022: BBTv2: Towards a Gradient-Free Future with Large Language Models
Multimodal AI Story Teller, built with Stable Diffusion, GPT, and neural text-to-speech
ICML'2022: NLP From Scratch Without Large-Scale Pretraining: A Simple and Efficient Framework
LlamaIndex (GPT Index) provides a central interface to connect your LLMs with external data.
Official repository of OFA (ICML 2022). Paper: OFA: Unifying Architectures, Tasks, and Modalities Through a Simple Sequence-to-Sequence Learning Framework
[ACL 2023] Reasoning with Language Model Prompting: A Survey
An autoregressive character-level language model for making more things
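The core of such a model can be tiny; a pure-Python bigram sketch (count character transitions, then sample), in the spirit of makemore's simplest baseline rather than its actual code:

```python
import random
from collections import defaultdict

# Train: count character bigrams over a toy corpus ('.' marks start/end).
words = ["emma", "olivia", "ava", "isabella", "sophia"]
counts = defaultdict(lambda: defaultdict(int))
for w in words:
    chars = ["."] + list(w) + ["."]
    for a, b in zip(chars, chars[1:]):
        counts[a][b] += 1

# Sample: walk the bigram distribution until the end marker is drawn.
def sample(rng=random.Random(0)):
    out, ch = [], "."
    while True:
        nxt, weights = zip(*counts[ch].items())
        ch = rng.choices(nxt, weights=weights)[0]
        if ch == ".":
            return "".join(out)
        out.append(ch)

print(sample())
```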