Starred repositories
Awesome Knowledge Distillation
Graph-based Knowledge Distillation: A Survey
Welcome to the "LLM Tutorial 一起学大模型" GitHub repository! Here you will find all the code, documentation, and resources for this course series, covering the core topics of large language model (LLM) development. The focus is on the LangChain and LangGraph frameworks, along with LLM fine-tuning, theoretical foundations, and agent development. Whether you are a beginner or an experienced developer, this is an ideal space for learning and sharing. Feel free to download the materials, join the discussion, and grow together on the LLM journey!…
《开源大模型食用指南》 ("Open-Source LLM Cookbook"): a tutorial series tailored for Chinese users on quickly fine-tuning (full-parameter/LoRA) and deploying open-source LLMs and multimodal large models (MLLMs), both domestic and international, in a Linux environment.
A curated collection of open-source Chinese large language models, focusing on smaller models that can be privately deployed at low training cost, covering base models, vertical-domain fine-tuned models and applications, datasets, and tutorials.
Companion code for the book 《深入浅出图神经网络:GNN原理解析》 (a Chinese introduction to graph neural networks and GNN principles).
21 Lessons, Get Started Building with Generative AI 🔗 https://microsoft.github.io/generative-ai-for-beginners/
A PyTorch implementation of the graph convolutional network (Kipf et al., 2017) with a vanilla teacher-student knowledge distillation architecture (Hinton et al., 2015).
Source code for the AAAI 2023 paper "T2-GNN: Graph Neural Networks for Graphs with Incomplete Features and Structure via Teacher-Student Distillation"
PyTorch implementations of various knowledge distillation (KD) methods.
FCC China's open-source codebase and curriculum. Learn to code and help nonprofits.
freeCodeCamp.org's open-source codebase and curriculum. Learn to code for free.