fastchat: Integrate Langchain / Create Private Knowledge Base (Python, updated Aug 18, 2023).
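For context, the usual pattern behind a FastChat + LangChain private knowledge base is to point LangChain at FastChat's OpenAI-compatible API server. The sketch below assumes such a server is already running at http://localhost:8000/v1 and serving a model named vicuna-7b-v1.5 for both chat and embeddings; the file path, model name, and question are placeholder assumptions, not values from the listing above.

```python
# Minimal sketch: LangChain retrieval QA backed by a local FastChat
# OpenAI-compatible server (assumed at http://localhost:8000/v1, serving
# "vicuna-7b-v1.5" for both /v1/chat/completions and /v1/embeddings).
# The document path, model name, and question are placeholders.
from langchain.chains import RetrievalQA
from langchain.chat_models import ChatOpenAI
from langchain.document_loaders import TextLoader
from langchain.embeddings import OpenAIEmbeddings
from langchain.text_splitter import CharacterTextSplitter
from langchain.vectorstores import FAISS

api_base = "http://localhost:8000/v1"   # fastchat.serve.openai_api_server
api_key = "EMPTY"                        # the local server does not check keys

# Load the private documents and split them into overlapping chunks.
docs = TextLoader("my_notes.txt").load()
chunks = CharacterTextSplitter(chunk_size=1000, chunk_overlap=100).split_documents(docs)

# Embed the chunks through the local server and index them in FAISS.
embeddings = OpenAIEmbeddings(model="vicuna-7b-v1.5",
                              openai_api_base=api_base, openai_api_key=api_key)
index = FAISS.from_documents(chunks, embeddings)

# Answer questions with the locally served chat model over retrieved chunks.
llm = ChatOpenAI(model_name="vicuna-7b-v1.5",
                 openai_api_base=api_base, openai_api_key=api_key)
qa = RetrievalQA.from_chain_type(llm=llm, retriever=index.as_retriever())
print(qa.run("What do these notes say about the project deadline?"))
```

Nothing leaves the machine in this setup: embeddings, the vector index, and generation all go through the local server, which is what makes the knowledge base "private".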
KATI-LLAMA is an AI desktop chat application built on large language models, with voice interaction and visual emotion feedback from the AI. Development is heading in the direction of J.A.R.V.I.S. or HAL 9000: an application that is simple to set up and costs nothing. Just download, launch, and use.
Node-RED Flow (and web page example) for the Vicuna AI model
A speech-to-speech talking bot (in development)
The repo for Vicuna Chemical Expert, a model that helps answer chemistry questions.
[Work In Progress] Server/Cloud-ready FastChat Docker images.
Vicuna 7B is a large language model that runs in the browser. Exposes programmatic access with minimal configuration.
Concepts and examples for using and training LLMs
An RL approach to enable cost-effective, intelligent interactions between a local agent and a remote LLM
A full pipeline to fine-tune Vicuna LLM with LoRA and RLHF on consumer hardware. Implementation of RLHF (Reinforcement Learning from Human Feedback) on top of the Vicuna architecture. Basically ChatGPT, but with Vicuna.
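As a rough illustration of the LoRA stage of such a pipeline (the reward-model and PPO stages are omitted), the sketch below uses Hugging Face transformers, peft, and datasets. The base checkpoint, dataset file, text field, and hyperparameters are assumptions for illustration, not values taken from the repository.

```python
# Sketch of the LoRA supervised fine-tuning stage only; the RLHF stages
# (reward model, PPO) are not shown. Model name, dataset file, text field,
# and hyperparameters are placeholder assumptions.
import torch
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

base_model = "lmsys/vicuna-7b-v1.5"          # placeholder base checkpoint
tokenizer = AutoTokenizer.from_pretrained(base_model)
tokenizer.pad_token = tokenizer.eos_token    # LLaMA tokenizers ship without a pad token

model = AutoModelForCausalLM.from_pretrained(
    base_model, torch_dtype=torch.float16, device_map="auto")

# Attach low-rank adapters to the attention projections; the frozen fp16
# base weights are what keep memory within consumer-GPU limits.
model = get_peft_model(model, LoraConfig(
    r=8, lora_alpha=16, lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"], task_type="CAUSAL_LM"))

# Keep the small set of trainable adapter weights in fp32 so mixed-precision
# training (fp16=True below) can scale/unscale their gradients safely.
for param in model.parameters():
    if param.requires_grad:
        param.data = param.data.float()

# Tokenize a small instruction dataset (placeholder file with a "text" field).
data = load_dataset("json", data_files="instructions.json")["train"]
data = data.map(lambda row: tokenizer(row["text"], truncation=True, max_length=512))

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="vicuna-7b-lora", per_device_train_batch_size=1,
        gradient_accumulation_steps=8, num_train_epochs=1,
        learning_rate=2e-4, fp16=True, logging_steps=10),
    train_dataset=data,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
model.save_pretrained("vicuna-7b-lora")      # writes only the adapter weights
```

Saving only the adapter weights keeps checkpoints small; at inference the adapter is loaded on top of the unchanged base model.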