
🦜️🔗 LangChain Chat

(preview screenshot)

This project shows how to build a chat with an LLM using LangChain and Blazor.

Features

  • Chat with any local or paid model
  • Run multiple models with any chain configuration (agents, RAG, custom tools, etc.)
  • Conversation history and multiple conversations
  • Automatic conversation name generation
  • Code syntax highlighting

Installation

To run the chat with the default configuration, you will need Docker and the Ollama container. Follow the installation steps for the Ollama container. When everything is ready, pull the latest Mistral model:

docker exec -it ollama ollama pull mistral:latest

This may take a few minutes, depending on your internet speed.
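If you have not created the Ollama container yet, the setup can be sketched as follows. The image name, volume, and port come from Ollama's official Docker instructions (CPU-only variant shown); adjust the mapping to your environment:

```shell
# Start the Ollama container, exposing the default API port 11434
docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama

# After pulling, verify that the model was downloaded
docker exec -it ollama ollama list
```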

Now clone the project into any folder you like:

git clone https://github.com/TesAnti/LangChainChat.git

Run it with the editor of your choice.
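Alternatively, if you prefer the command line, a typical .NET workflow would look like this (assuming the .NET SDK is installed; the exact project layout inside the repo may differ):

```shell
cd LangChainChat
dotnet restore   # fetch NuGet dependencies
dotnet run       # build and launch the Blazor app
```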

Configuration

Open the LangChainConfigExtensions.cs file. There you can change the model provider, adjust the chain, and add more models.

You can change the provider from local Ollama to ChatGPT or pretty much any model supported by LangChain. For more information, see the LangChain wiki.
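Before wiring the local Ollama provider into a chain, a quick sanity check that the server is reachable can help. The endpoint paths below come from the Ollama REST API documentation and assume the container from the installation step is running on port 11434:

```shell
# List the models known to the local Ollama server
curl http://localhost:11434/api/tags

# Ask the mistral model for a one-off, non-streaming completion
curl http://localhost:11434/api/generate \
  -d '{"model": "mistral:latest", "prompt": "Say hello", "stream": false}'
```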

About

LangChain Serve + Blazor chat
