josiahcoad/chat-quickstart

Getting Started with Chat Agents

Quick Start

curl -LsSf https://astral.sh/uv/install.sh | sh # install uv
brew install make # install make
make setup-env && source .venv/bin/activate # setup venv
cp .env-example .env # then edit `.env`
make demo # run the agent with a single input to make sure it works
make dev # run the dev server for the chat UI

This repository demonstrates how to create a chatbot (think ChatGPT) using LangGraph, extended with tools, memory, and a GUI.

It uses a "ReAct agent" (short for Reasoning and Action agent), which is essentially an LLM that can call tools.

While simple, it is incredibly powerful and can be extended to do many complex tasks.
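To make the ReAct idea concrete, here is a toy sketch of the loop in plain Python: a stand-in `fake_llm` function either "acts" by naming a tool or returns a final answer. This is purely conceptual (the repo's real agent uses LangGraph, and `get_time`, `fake_llm`, and the `call:`/`FINAL:` conventions are invented for illustration):

```python
# Conceptual sketch of a ReAct loop with a stubbed "LLM".
# The model output is either a tool call ("Action") or a final answer.

def get_time() -> str:
    """A toy tool the agent can call."""
    return "12:00"

TOOLS = {"get_time": get_time}

def fake_llm(messages: list[str]) -> str:
    """Stand-in for a real LLM: first decides to call a tool, then answers."""
    if not any(m.startswith("tool:") for m in messages):
        return "call:get_time"          # "Action" step: pick a tool
    return "FINAL: the time is 12:00"   # "Answer" step: respond to the user

def react_agent(question: str) -> str:
    messages = [question]
    while True:
        reply = fake_llm(messages)
        if reply.startswith("call:"):               # model chose a tool
            tool_name = reply.removeprefix("call:")
            result = TOOLS[tool_name]()             # execute the tool
            messages.append(f"tool:{result}")       # feed the result back in
        else:
            return reply                            # model produced an answer

print(react_agent("What time is it?"))  # FINAL: the time is 12:00
```

The real agent works the same way, except the LLM call, tool schemas, and message history are all managed by LangGraph.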

If you prefer to use Docker:

cp .env-example .env # then edit `.env`
docker compose up

Then navigate to https://smith.langchain.com/studio/thread?baseUrl=http://127.0.0.1:2024

Dev Notes

  • we use uv (a Rust-based package manager) for super fast installs
  • we use ruff for linting and formatting

Glossary

The LangX Stack

  • LangSmith: a platform for tracing and monitoring your LLM apps
  • LangChain: a library for building LLM-based apps (a chain is essentially a sequence of LLM calls)
  • LangGraph: a library for building stateful, multi-step LLM workflows (built on top of LangChain but extends it with graphs)
  • LangGraph Studio: a UI for viewing and interacting with LangGraph graphs
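The "graph" idea behind LangGraph can be sketched in a few lines of plain Python: nodes are functions that read and update shared state, and edges decide which node runs next. This is not the real LangGraph API (which uses `StateGraph` and compiled graphs), just an illustration of the concept; the node names here are made up:

```python
# Minimal sketch of a stateful graph: nodes transform shared state,
# edges pick the next node. LangGraph provides this (plus checkpointing,
# streaming, branching, etc.) via its StateGraph API.

def draft(state: dict) -> dict:
    state["text"] = f"Hello, {state['name']}"
    return state

def polish(state: dict) -> dict:
    state["text"] = state["text"] + "!"
    return state

NODES = {"draft": draft, "polish": polish}
EDGES = {"draft": "polish", "polish": None}  # linear graph: draft -> polish -> end

def run_graph(start: str, state: dict) -> dict:
    node = start
    while node is not None:           # walk the graph until there is no next node
        state = NODES[node](state)    # run the current node on the shared state
        node = EDGES[node]            # follow the edge to the next node
    return state

print(run_graph("draft", {"name": "world"})["text"])  # Hello, world!
```

An agent is one particular shape of such a graph: an LLM node and a tools node connected in a loop.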

Path to get started

1. Create a LangSmith account and add your API key to the .env file

2. Create an agent (an agent is a particular type of graph)

  • this has been done in the agent/assistant.py file

3. Serve that agent

4. Connect to that agent via the UI

5. Add a tool

6. Deploy your agent on the cloud

  • LangGraph offers a managed deployment (LangGraph Cloud/Platform)
    • go to https://smith.langchain.com/ and click the LangGraph Platform tab
    • click new deployment in top right
    • fork this repo into your own github account
    • point the deployment to your fork
    • click deploy
  • Deploy the frontend using... (TODO)
  • You can also deploy the LangGraph server yourself
    • using Docker and EC2
    • invoking the graph manually behind your own API
      • check out LangServe for that (although it has been soft-deprecated in favor of LangGraph Platform)
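Step 5 above ("Add a tool") is usually the smallest change: in LangGraph a tool can be a plain Python function whose name, signature, and docstring tell the LLM when to call it. Here is a standalone, stubbed example (the function name, its fake data, and the wiring comment are illustrative; it is not taken from this repo's agent/assistant.py):

```python
# A tool is just a typed Python function with a docstring.
# The docstring doubles as the description the LLM sees when deciding
# whether to call it.

def get_weather(city: str) -> str:
    """Return the current weather for a city (stubbed for the example)."""
    fake_data = {"Austin": "sunny, 31C", "London": "rainy, 14C"}
    return fake_data.get(city, "unknown")

print(get_weather("Austin"))  # sunny, 31C
```

To use it, you would pass the function in the agent's tools list when the agent is created, alongside the tools already defined there.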

Build more complex agents...

Extra Helpful Links

General LangChain docs

Check out MCP servers

Tools are great, but each one is just a Python function. MCP servers are an abstraction on top of tools: they provide a language-agnostic way to communicate with tools over stdin/stdout, and a way to group tools together into packages.
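The core of that idea is simple enough to sketch: tool calls travel as JSON messages over a pipe, so the client and server need not share a language. The real MCP protocol is JSON-RPC with its own message schema; this toy sketch (the `tool`/`args` message shape is invented) only shows the general shape of the exchange:

```python
# Toy illustration of the MCP idea: a "server" exposes tools addressed by
# name, and requests/responses are JSON lines that would travel over
# stdin/stdout. Not the actual MCP wire format.
import json

def handle_message(raw: str) -> str:
    """A toy server exposing a single 'add' tool."""
    request = json.loads(raw)
    if request["tool"] == "add":
        result = request["args"]["a"] + request["args"]["b"]
        return json.dumps({"result": result})
    return json.dumps({"error": "unknown tool"})

# A client in any language would write this line to the server's stdin...
request_line = json.dumps({"tool": "add", "args": {"a": 2, "b": 3}})
# ...and read the response line back from the server's stdout.
print(handle_message(request_line))  # {"result": 5}
```

Because the boundary is just JSON over pipes, a Python client can use tools written in TypeScript, Rust, or anything else.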

Add a GUI to your chat app
