A minimal conversational AI agent built with LangGraph, intended as an agent scaffold and playground: a practical starting point for learning, building, and experimenting with advanced agent technologies.
- Python: Programming language environment
- pip: Python package manager
- Miniconda: Lightweight conda environment manager
- VSCode: Recommended development environment
Create a conda environment:

```bash
conda create -n clean_graph python=3.12
conda activate clean_graph
```

Install the dependencies:

```bash
pip install -r requirements.txt
```

Edit the `.env` file with your configuration:
```
# Required: LLM API Configuration (supports any OpenAI-compatible API)
LLM_API_BASE=http://localhost:1234/v1   # LM Studio, Ollama, or other OpenAI-compatible API endpoint
LLM_MODEL=qwen/qwen3-next-80b           # Your model name
LLM_API_KEY=your-api-key-here           # API key for authentication

# Optional: LangSmith Tracing
LANGSMITH_TRACING=false                 # Set to 'true' to enable
LANGSMITH_API_KEY=your-langsmith-key    # Your LangSmith API key
```

Start the development server:

```bash
langgraph dev --no-reload
```

Access the LangGraph Studio interface at http://localhost:2024.
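The environment variables above can be read from `.env` with `python-dotenv`, as this project does. A minimal sketch of the loading step follows; the helper name `load_llm_settings` is illustrative and not part of the project's source, and the defaults mirror the sample `.env` values.

```python
import os

def load_llm_settings() -> dict:
    # In the project, python-dotenv's load_dotenv() would populate
    # os.environ from the .env file before this runs.
    settings = {
        "base_url": os.environ.get("LLM_API_BASE", "http://localhost:1234/v1"),
        "model": os.environ.get("LLM_MODEL", "qwen/qwen3-next-80b"),
        "api_key": os.environ.get("LLM_API_KEY", ""),
    }
    # Fail early with a clear message instead of a confusing auth error later.
    if not settings["api_key"]:
        raise ValueError("LLM_API_KEY is not set; add it to your .env file")
    return settings
```

Failing fast on a missing key keeps misconfiguration errors close to their cause rather than surfacing as an opaque HTTP 401 from the provider.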
This project supports multiple LLM providers, with LM Studio recommended for local development:
- Download and install LM Studio
- Select and download suitable models in LM Studio
- Start the local server (typically at http://localhost:1234)
For the best development experience, register for a LangSmith account:
- Visit LangSmith and create an account
- Get your API key
- Configure `LANGSMITH_API_KEY` in your `.env` file
Note: Even when using local graphs with LangGraph Studio, you still need to register for a (free) LangSmith API key and be logged in to LangSmith (as of November 2025).
| Variable | Required | Description | Default |
|---|---|---|---|
| `LLM_API_BASE` | Yes | Base URL for LLM API endpoint | - |
| `LLM_MODEL` | Yes | Model name to use | - |
| `LLM_API_KEY` | Yes | API key for authentication | - |
| `LANGSMITH_TRACING` | No | Enable LangSmith tracing | `false` |
| `LANGSMITH_API_KEY` | No | LangSmith API key (required if tracing is enabled) | - |
This project is compatible with any OpenAI-compatible API:
- Local LLMs: LM Studio, Ollama, LocalAI
- Cloud Providers: OpenAI, Together AI, Groq, etc.
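Because all of these providers expose the same OpenAI-compatible API shape, switching between them is a matter of changing the base URL and model name; the request itself stays the same. A minimal sketch, where the helper name `chat_request` and the model name are illustrative:

```python
def chat_request(base_url: str, model: str, user_message: str) -> tuple[str, dict]:
    # The /chat/completions path is standard across OpenAI-compatible APIs.
    url = base_url.rstrip("/") + "/chat/completions"
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }
    return url, payload

# Same call against a local LM Studio server or a cloud provider;
# only base_url and model differ.
url, payload = chat_request("http://localhost:1234/v1", "qwen/qwen3-next-80b", "Hello")
# url == "http://localhost:1234/v1/chat/completions"
```

This is why a single `LLM_API_BASE` / `LLM_MODEL` / `LLM_API_KEY` trio is enough configuration to cover every provider listed above.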
clean_graph/
├── src/
│ ├── __init__.py # Package initialization
│ ├── graph.py # Main LangGraph application logic
│ └── llms.py # LLM configuration and setup
├── .env # Environment configuration
├── langgraph.json # LangGraph application definition
├── requirements.txt # Python dependencies
├── README.md # English documentation
└── README_CN.md # Chinese documentation
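The `langgraph.json` file tells the LangGraph CLI where to find the application. A plausible sketch of its contents, assuming `src/graph.py` exports the compiled graph under the name `graph` (an assumption; check the actual export name in the source):

```json
{
  "dependencies": ["."],
  "graphs": {
    "clean_graph": "./src/graph.py:graph"
  },
  "env": ".env"
}
```

With this in place, `langgraph dev` discovers the graph and loads the `.env` configuration automatically.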
- `langchain~=1.0` - Core LangChain framework
- `langgraph~=1.0` - Graph-based AI application framework
- `langchain-core~=1.0` - Core LangChain components
- `langchain_openai` - OpenAI API client
- `python-dotenv~=1.0` - Environment variable management
- `langgraph-checkpoint>=2.1.0` - State checkpointing
- `langgraph-cli[inmem]` - Development tools (in-memory storage)
- `pydantic~=2.0` - Data validation and settings management