Stars
ArcticTraining is a framework designed to simplify and accelerate the post-training process for large language models (LLMs).
OpenAI-compatible API adapter supporting Baidu Qianfan, iFlytek Spark, Tencent Hunyuan, MiniMax, DeepSeek, and other OpenAI-compatible APIs. Ships as a single executable with minimal configuration: one-click deployment, ready to use out of the box.
Local & Open Source Alternative to CharacterAI
[EMNLP 2023] This is the repository of the Harry Potter Dialogue Dataset.
Reference implementation for DPO (Direct Preference Optimization)
[ACL2023] We introduce LLM-Blender, an innovative ensembling framework to attain consistently superior performance by leveraging the diverse strengths of multiple open-source LLMs. LLM-Blender cut …
✨ RepoBench: Benchmarking Repository-Level Code Auto-Completion Systems - ICLR 2024
Code for "Small Models are Valuable Plug-ins for Large Language Models"
Let ChatGPT teach your own chatbot in hours with a single GPU!
Fine-tuning 6-Billion GPT-J (& other models) with LoRA and 8-bit compression
[EMNLP 2022] Language Model Pre-Training with Sparse Latent Typing
Mirror: Plug-and-Play Data Query, Summarization and Visualization with Natural Language Interface
arXiv LaTeX Cleaner: Easily clean the LaTeX code of your paper to submit to arXiv
🤗 Diffusers: State-of-the-art diffusion models for image, video, and audio generation in PyTorch and FLAX.
Training and serving large-scale neural networks with auto parallelization.
Implementation of Denoising Diffusion Probabilistic Model in PyTorch
OneFlow is a deep learning framework designed to be user-friendly, scalable and efficient.
Data and code for our paper "Exploring and Predicting Transferability across NLP Tasks", to appear at EMNLP 2020.
Implementation of paper "Towards a Unified View of Parameter-Efficient Transfer Learning" (ICLR 2022)
Efficient Training of Audio Transformers with Patchout