[NeurIPS'23 Oral] Visual Instruction Tuning (LLaVA) built towards GPT-4V level capabilities and beyond.
Chinese LLaMA & Alpaca large language models, with local CPU/GPU training and deployment (Chinese LLaMA & Alpaca LLMs)
Running Llama 2 and other Open-Source LLMs on CPU Inference Locally for Document Q&A
Run any Llama 2 locally with gradio UI on GPU or CPU from anywhere (Linux/Windows/Mac). Use `llama2-wrapper` as your local llama2 backend for Generative Agents/Apps.
[NeurIPS 2023] LLM-Pruner: On the Structural Pruning of Large Language Models. Support LLaMA, Llama-2, BLOOM, Vicuna, Baichuan, etc.
Improve Llama-2's proficiency in the comprehension, generation, and translation of Chinese.
InsightSolver: Colab notebooks for exploring and solving operational issues using deep learning, machine learning, and related models.
kani (カニ) is a highly hackable microframework for chat-based language models with tool use/function calling. (NLP-OSS @ EMNLP 2023)
📚 Local PDF-Integrated Chat Bot: Secure Conversations and Document Assistance with LLM-Powered Privacy
Examples of RAG using Llamaindex with local LLMs - Gemma, Mixtral 8x7B, Llama 2, Mistral 7B, Orca 2, Phi-2, Neural 7B
Docker image for LLaVA: Large Language and Vision Assistant
This package simplifies your interaction with various GPT models, removing the need for tokens or other methods to access GPT
Like grep but for natural language questions. Based on Mistral 7B or Mixtral 8x7B.
[KO-Platy🥮] KO-Platypus model: llama-2-ko fine-tuned on Korean-Open-platypus
An Offline Document Enquiry LLM for Everyone