🦜️🔗 Chat LangChain.js

This repo implements a locally hosted chatbot focused on question answering over the LangChain JS/TS documentation. Built with LangChain and Next.js.

Deployed version: chatjs.langchain.com

Looking for the Python version? See langchain-ai/chat-langchain.

The app uses LangChain's streaming API to update the page in real time for multiple users, following the pattern sketched below.
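For reference, a minimal streaming sketch in LangChain JS (the model name and prompt are illustrative, not this repo's actual chain, which streams a full retrieval pipeline):

```ts
import { ChatOpenAI } from "@langchain/openai";

// Minimal sketch: stream tokens from a chat model as they are generated.
const model = new ChatOpenAI({ modelName: "gpt-3.5-turbo" });

// .stream() returns an async iterable of message chunks.
const stream = await model.stream("What is LangChain?");
for await (const chunk of stream) {
  process.stdout.write(chunk.content as string);
}
```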

✅ Running locally

  1. Install dependencies with yarn install.
  2. Set the required environment variables listed in backend/.env.example for the backend and frontend/.env.example for the frontend; a rough sketch of what the backend needs follows this list.
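The authoritative variable names live in the two .env.example files. As an illustration only (the names below are assumptions based on the OpenAI + Weaviate stack described under Technical description), a backend startup check might look like:

```ts
// Hypothetical variable names for illustration; see backend/.env.example for the real list.
const required = ["OPENAI_API_KEY", "WEAVIATE_HOST", "WEAVIATE_API_KEY"];

for (const name of required) {
  if (!process.env[name]) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
}
```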

Ingest

  1. From the repo root, build the backend with yarn build --filter=backend.
  2. Navigate into ./backend and run yarn ingest to run the ingestion script.

Frontend

  1. Navigate into ./frontend and run yarn dev to start the frontend.
  2. Open http://localhost:3000 in your browser.

📚 Technical description

There are two components: ingestion and question-answering.

Ingestion has the following steps (sketched in code after the list):

  1. Pull HTML from the documentation site as well as the GitHub codebase
  2. Load the HTML with LangChain's RecursiveUrlLoader and SitemapLoader
  3. Split the documents with LangChain's RecursiveCharacterTextSplitter
  4. Create a vectorstore of embeddings using LangChain's Weaviate vectorstore wrapper (with OpenAI's embeddings).
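A minimal sketch of that pipeline, assuming the community loaders and the @langchain/weaviate wrapper; the URLs, chunk sizes, index name, and env var names here are illustrative, not the repo's actual values:

```ts
import weaviate, { ApiKey } from "weaviate-ts-client";
import { RecursiveUrlLoader } from "@langchain/community/document_loaders/web/recursive_url";
import { SitemapLoader } from "@langchain/community/document_loaders/web/sitemap";
import { RecursiveCharacterTextSplitter } from "@langchain/textsplitters";
import { WeaviateStore } from "@langchain/weaviate";
import { OpenAIEmbeddings } from "@langchain/openai";

// 1–2. Pull and load HTML from the docs site (illustrative URLs).
const urlLoader = new RecursiveUrlLoader("https://js.langchain.com/docs/", {
  maxDepth: 2,
});
const sitemapLoader = new SitemapLoader("https://js.langchain.com/sitemap.xml");
const docs = [...(await urlLoader.load()), ...(await sitemapLoader.load())];

// 3. Split into chunks small enough to embed and retrieve.
const splitter = new RecursiveCharacterTextSplitter({
  chunkSize: 1000,
  chunkOverlap: 200,
});
const chunks = await splitter.splitDocuments(docs);

// 4. Embed with OpenAI and index into Weaviate.
const client = weaviate.client({
  scheme: "https",
  host: process.env.WEAVIATE_HOST!, // hypothetical env var
  apiKey: new ApiKey(process.env.WEAVIATE_API_KEY!), // hypothetical env var
});
await WeaviateStore.fromDocuments(chunks, new OpenAIEmbeddings(), {
  client,
  indexName: "LangChainDocs", // illustrative
});
```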

Question answering has the following steps (steps 1–3 are sketched in code after the list):

  1. Given the chat history and the new user input, use GPT-3.5 to rephrase the input as a standalone question.
  2. Given that standalone question, look up relevant documents from the vectorstore.
  3. Pass the standalone question and relevant documents to the model to generate and stream the final answer.
  4. Generate a trace URL for the current chat session, as well as the endpoint to collect feedback.
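Sketched in code, again with illustrative prompts and model names rather than the repo's actual chain; step 4 (the trace URL and feedback endpoint) is omitted here:

```ts
import { ChatOpenAI } from "@langchain/openai";
import { ChatPromptTemplate } from "@langchain/core/prompts";
import { StringOutputParser } from "@langchain/core/output_parsers";
import type { VectorStore } from "@langchain/core/vectorstores";

export async function* answer(
  vectorstore: VectorStore, // the Weaviate store built during ingestion
  chatHistory: string,
  question: string
) {
  // 1. Condense chat history + follow-up into a standalone question.
  const condense = ChatPromptTemplate.fromTemplate(
    "Given the chat history:\n{chat_history}\n\nRephrase this follow-up as a standalone question: {question}"
  )
    .pipe(new ChatOpenAI({ modelName: "gpt-3.5-turbo", temperature: 0 }))
    .pipe(new StringOutputParser());
  const standalone = await condense.invoke({
    chat_history: chatHistory,
    question,
  });

  // 2. Look up relevant documents in the vectorstore.
  const docs = await vectorstore.asRetriever().getRelevantDocuments(standalone);

  // 3. Generate and stream the final answer over the retrieved context.
  const respond = ChatPromptTemplate.fromTemplate(
    "Answer using only this context:\n{context}\n\nQuestion: {question}"
  )
    .pipe(new ChatOpenAI({ modelName: "gpt-3.5-turbo", streaming: true }))
    .pipe(new StringOutputParser());
  yield* await respond.stream({
    context: docs.map((d) => d.pageContent).join("\n\n"),
    question: standalone,
  });
}
```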

🚀 Deployment

Deploy the frontend Next.js app as a serverless Edge function on Vercel.
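Assuming the Next.js App Router, an API route opts into the Edge runtime with a single segment-config export (a minimal sketch; the route path and handler body are illustrative):

```ts
// app/api/chat/route.ts (illustrative path)
export const runtime = "edge"; // run this route as a Vercel Edge Function

export async function POST(req: Request): Promise<Response> {
  const { question } = await req.json();
  // ...invoke the retrieval chain and stream the answer back...
  return new Response(`You asked: ${question}`);
}
```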
