Unified Efficient Fine-Tuning of 100+ LLMs (ACL 2024)
Updated Oct 11, 2024 - Python
ChatGLM2-6B: An Open Bilingual Chat LLM
A Chinese guide to ChatGPT 🔥: prompt-crafting tips, instruction guides, application development guides, and a curated resource list to help you use ChatGPT more effectively and boost your productivity! 🚀
Firefly: a training toolkit for large language models, supporting Qwen2.5, Qwen2, Yi1.5, Phi-3, Llama3, Gemma, MiniCPM, Yi, Deepseek, Orion, Xverse, Mixtral-8x7B, Zephyr, Mistral, Baichuan2, Llama2, Llama, Qwen, Baichuan, ChatGLM2, InternLM, Ziya2, Vicuna, Bloom, and other large models
Replace OpenAI GPT with another LLM in your app by changing a single line of code. Xinference gives you the freedom to use any LLM you need. With Xinference, you're empowered to run inference with any open-source language models, speech recognition models, and multimodal models, whether in the cloud, on-premises, or even on your laptop.
GLM-4 series: Open Multilingual Multimodal Chat LMs
Fine-tuning ChatGLM-6B with PEFT
OpenAI-style API for open large language models; use LLMs just like ChatGPT! Supports LLaMA, LLaMA-2, BLOOM, Falcon, Baichuan, Qwen, Xverse, SqlCoder, CodeLLaMA, ChatGLM, ChatGLM2, ChatGLM3, etc. A unified backend interface for open-source large models
ChatGLM-6B fine-tuning and Alpaca fine-tuning
An elegant PyTorch implementation of transformers
[AI Agent Application Development Framework] - 🚀 Build AI-agent-native applications with very little code 💬 Interact with AI agents in code easily using structured data and chained-call syntax 🧩 Enhance AI agents with plugins instead of rebuilding a whole new agent
[NeurIPS 2023] LLM-Pruner: On the Structural Pruning of Large Language Models. Support Llama-3/3.1, Llama-2, LLaMA, BLOOM, Vicuna, Baichuan, TinyLlama, etc.
A collection of one-click self-hosted AI applications
An LLM-powered repository agent designed to assist developers and teams in generating documentation and understanding repositories quickly.