Stars
Research Code for Multimodal-Cognition Team in Ant Group
myazi / Bert-Chinese-Text-Classification-Pytorch
Forked from 649453932/Bert-Chinese-Text-Classification-Pytorch. Chinese text classification using BERT and ERNIE.
The official gpt4free repository | a collection of powerful language models, including o3, DeepSeek R1, and GPT-4.5
Pytorch implementation of JointBERT: "BERT for Joint Intent Classification and Slot Filling"
An open source implementation of CLIP.
Easily compute clip embeddings and build a clip retrieval system with them
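The retrieval idea behind a tool like clip-retrieval can be sketched without the CLIP model itself: embed items, L2-normalize, and rank by cosine similarity. This is a minimal NumPy sketch under those assumptions; the function names, array shapes, and toy data are illustrative, not clip-retrieval's actual API.

```python
import numpy as np

def build_index(embeddings: np.ndarray) -> np.ndarray:
    """L2-normalize embeddings so a dot product equals cosine similarity."""
    norms = np.linalg.norm(embeddings, axis=1, keepdims=True)
    return embeddings / np.clip(norms, 1e-12, None)

def search(index: np.ndarray, query: np.ndarray, k: int = 3) -> np.ndarray:
    """Return indices of the top-k most similar items to the query."""
    q = query / max(np.linalg.norm(query), 1e-12)
    scores = index @ q
    return np.argsort(-scores)[:k]

# Toy index: 4 items with 8-dimensional embeddings.
rng = np.random.default_rng(0)
items = rng.normal(size=(4, 8))
index = build_index(items)
# Query with a slightly noisy copy of item 2: it should rank first.
hits = search(index, items[2] + 0.01 * rng.normal(size=8))
print(hits[0])  # → 2
```

In a real system the embeddings come from a CLIP image/text encoder and the brute-force dot product is replaced by an approximate nearest-neighbor index.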
Chinese version of CLIP, which achieves Chinese cross-modal retrieval and representation generation.
The official repo of Qwen (通义千问), the chat and pretrained large language models proposed by Alibaba Cloud.
A series of large language models developed by Baichuan Intelligent Technology
Chinese LLaMA-2 & Alpaca-2 large language models (phase two of the project), including 64K ultra-long-context models.
ChatGLM2-6B: An Open Bilingual Chat LLM
myazi / ann-benchmarks
Forked from erikbern/ann-benchmarks. Benchmarks of approximate nearest neighbor libraries in Python.
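ann-benchmarks ultimately scores each library by recall against exact brute-force neighbors. The core measurement can be sketched as follows; the helper names and toy data here are hypothetical, not the repo's API.

```python
import numpy as np

def exact_neighbors(data: np.ndarray, query: np.ndarray, k: int) -> set:
    """Ground truth: indices of the k nearest points by Euclidean distance."""
    dists = np.linalg.norm(data - query, axis=1)
    return set(np.argsort(dists)[:k].tolist())

def recall_at_k(approx: set, exact: set) -> float:
    """Fraction of the true neighbors that the approximate index returned."""
    return len(approx & exact) / len(exact)

rng = np.random.default_rng(1)
data = rng.normal(size=(100, 16))
query = data[0] + 0.05 * rng.normal(size=16)
truth = exact_neighbors(data, query, k=10)
# Simulate an ANN index that found 8 of the 10 true neighbors plus 2 misses.
misses = [i for i in range(100) if i not in truth][:2]
approx = set(sorted(truth)[:8]) | set(misses)
print(recall_at_k(approx, truth))  # → 0.8
```

Real benchmarks sweep each library's parameters and plot this recall against queries per second.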
Implementation of the LLaMA language model based on nanoGPT. Supports flash attention, Int8 and GPTQ 4bit quantization, LoRA and LLaMA-Adapter fine-tuning, pre-training. Apache 2.0-licensed.
🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning.
Using Low-rank adaptation to quickly fine-tune diffusion models.
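Low-rank adaptation, the technique behind both the PEFT library and the diffusion-model LoRA repo above, freezes a pretrained weight matrix W and trains only a rank-r update B @ A. A minimal NumPy sketch of the forward pass, with the zero-init of B and the alpha/r scaling as described in the LoRA paper (all names and shapes here are illustrative):

```python
import numpy as np

d_out, d_in, r, alpha = 6, 8, 2, 4  # rank r is much smaller than d_out, d_in

rng = np.random.default_rng(42)
W = rng.normal(size=(d_out, d_in))     # frozen pretrained weight
A = rng.normal(size=(r, d_in)) * 0.01  # trainable down-projection
B = np.zeros((d_out, r))               # trainable up-projection, zero-init

def lora_forward(x: np.ndarray) -> np.ndarray:
    """y = W x + (alpha / r) * B (A x); only A and B receive gradients."""
    return W @ x + (alpha / r) * (B @ (A @ x))

x = rng.normal(size=d_in)
# With B zero-initialized, the adapted model matches the base model exactly,
# so fine-tuning starts from the pretrained behavior.
print(np.allclose(lora_forward(x), W @ x))  # → True
```

The trainable parameter count is r * (d_in + d_out) = 28 here, versus 48 for full fine-tuning of W; the savings grow dramatically at transformer scale.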
🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
An optimized deep prompt tuning strategy comparable to fine-tuning across scales and tasks
ChatGLM-6B: An Open Bilingual Dialogue Language Model
《代码随想录》 (Code Thoughts) LeetCode problem-solving guide: a recommended order for 200 classic problems, 600k+ words of detailed illustrated explanations, video breakdowns of the hard parts, 50+ mind maps, and solutions in C++, Java, Python, Go, JavaScript, and more, so algorithm study is no longer confusing! 🔥🔥 Check it out; you'll wish you'd found it sooner! 🚀
Code associated with the "Don't Stop Pretraining" ACL 2020 paper
Chinese version of GPT-2 training code, using a BERT tokenizer.
🚀 RocketQA, dense retrieval for information retrieval and question answering, including both Chinese and English state-of-the-art models.