Chinese LLaMA & Alpaca large language models with local CPU/GPU training and deployment (Chinese LLaMA & Alpaca LLMs)
Updated Apr 30, 2024 · Python
The official GitHub page for the survey paper "A Survey of Large Language Models".
An Open-Source Framework for Prompt-Learning.
Top2Vec learns jointly embedded topic, document and word vectors.
RoBERTa Chinese pre-trained models: RoBERTa for Chinese
An open-source knowledgeable large language model framework.
A novel method to tune language models. Code and datasets for the paper "GPT Understands, Too".
Keyphrase or keyword extraction: a Chinese keyphrase extraction method based on pre-trained models (Chinese implementation of the paper "SIFRank: A New Baseline for Unsupervised Keyphrase Extraction Based on Pre-trained Language Model")
A PyTorch-based model pruning toolkit for pre-trained language models
HugNLP is a unified and comprehensive NLP library based on HuggingFace Transformers. Start hugging NLP now! 😊 HugNLP will be released to @HugAILab
[ICLR 2022] Differentiable Prompt Makes Pre-trained Language Models Better Few-shot Learners
The code of our paper "SIFRank: A New Baseline for Unsupervised Keyphrase Extraction Based on Pre-trained Language Model"
[ICLR 2024] Domain-Agnostic Molecular Generation with Chemical Feedback
We tackle a company-name recognition task with small-scale, low-quality training data, then apply techniques to improve training speed and prediction performance with minimal manual effort. The methods we use involve lightweight pre-trained models such as ALBERT-small or ELECTRA-small with a financial corpus, knowledge distillation an…
[ICLR 2023] Multimodal Analogical Reasoning over Knowledge Graphs
VaLM: Visually-augmented Language Modeling. ICLR 2023.
[ACL'23] Open KG Completion with PLM (Bridging Text Mining and Prompt Engineering)
ChatCell: Facilitating Single-Cell Analysis with Natural Language
Code for EMNLP 2021 main conference paper "Dynamic Knowledge Distillation for Pre-trained Language Models"
Zero-shot Transfer Learning from English to Arabic