A template implementation of a conversational agent built with LangGraph and GPT-4. The agent demonstrates how LangGraph supports sophisticated workflow-based AI agents with tool integration, state management, and graph-based execution patterns.
- Features
- Quick Start
- Prerequisites
- Installation
- Usage
- Project Structure
- Troubleshooting
- Contributing
- Support
- License
- Interactive conversational interface with workflow orchestration
- Graph-based agent execution with state management
- Tool integration support (including weather and search capabilities)
- Streaming responses for real-time interaction
- Built on LangGraph for sophisticated agent workflows and decision-making
- Conditional routing and multi-step reasoning capabilities
- Easy deployment and integration with Blaxel platform
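To make the graph-based pattern concrete, here is a minimal sketch of a LangGraph agent with one tool. It is illustrative only: the tool, model name, and node names are assumptions, not this template's actual code (see src/agent.py for the real implementation).

```python
from typing import Annotated

from langchain_core.tools import tool
from langchain_openai import ChatOpenAI
from langgraph.graph import StateGraph, START
from langgraph.graph.message import add_messages
from langgraph.prebuilt import ToolNode, tools_condition
from typing_extensions import TypedDict


@tool
def get_weather(city: str) -> str:
    """Return a (stubbed) weather report for a city."""
    return f"It is sunny in {city}."


class State(TypedDict):
    # add_messages appends new messages instead of overwriting state.
    messages: Annotated[list, add_messages]


llm = ChatOpenAI(model="gpt-4o").bind_tools([get_weather])


def chatbot(state: State) -> dict:
    # One reasoning step: the model either answers or requests a tool call.
    return {"messages": [llm.invoke(state["messages"])]}


builder = StateGraph(State)
builder.add_node("chatbot", chatbot)
builder.add_node("tools", ToolNode([get_weather]))
builder.add_edge(START, "chatbot")
# Conditional routing: run tools if the model asked for them, otherwise end.
builder.add_conditional_edges("chatbot", tools_condition)
builder.add_edge("tools", "chatbot")
graph = builder.compile()
```

Compiling the builder produces a runnable graph; the loop between the chatbot and tools nodes is what enables multi-step reasoning.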
For those who want to get up and running quickly:
# Clone the repository
git clone https://github.com/blaxel-ai/template-langgraph-py.git
# Navigate to the project directory
cd template-langgraph-py
# Install dependencies
uv sync
# Start the server
bl serve --hotreload
# In another terminal, test the agent
bl chat --local blaxel-agent
- Python: 3.10 or later
- UV: An extremely fast Python package and project manager, written in Rust
- Blaxel Platform Setup: Complete Blaxel setup by following the quickstart guide
- Blaxel CLI: Ensure you have the Blaxel CLI installed. If not, install it globally:
curl -fsSL https://raw.githubusercontent.com/blaxel-ai/toolkit/main/install.sh | BINDIR=/usr/local/bin sudo -E sh
- Blaxel login: Log in to the Blaxel platform
bl login YOUR-WORKSPACE
Clone the repository and install dependencies:
git clone https://github.com/blaxel-ai/template-langgraph-py.git
cd template-langgraph-py
uv sync
Start the development server with hot reloading:
bl serve --hotreload
Note: This command starts the server and enables hot reload so that changes to the source code are automatically reflected.
You can test your agent using the chat interface:
bl chat --local blaxel-agent
Or run it directly with specific input:
bl run agent blaxel-agent --local --data '{"input": "What is the weather in Paris?"}'
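If you want to exercise the graph directly from Python (for example in a test), streaming the response might look like the following, assuming a compiled graph like the sketch in the Features section:

```python
# Stream state updates as the graph executes; each event is keyed by node name.
for event in graph.stream({"messages": [("user", "What is the weather in Paris?")]}):
    for update in event.values():
        print(update["messages"][-1].content)
```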
When you are ready to deploy your application:
bl deploy
This command uses your code and the configuration files under the .blaxel directory to deploy your application.
- src/main.py - Application entry point
- src/agent.py - Core agent implementation with LangGraph integration
- src/server/ - Server implementation and routing
- router.py - API route definitions
- middleware.py - Request/response middleware
- pyproject.toml - UV package manager configuration
- blaxel.toml - Blaxel deployment configuration
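As a rough idea of how the server pieces fit together, a minimal router.py could expose the agent over HTTP like this. This is a sketch assuming FastAPI and an importable compiled graph; the template's actual routes, schemas, and module paths may differ.

```python
from fastapi import APIRouter
from pydantic import BaseModel

from src.agent import graph  # hypothetical import of the compiled LangGraph

router = APIRouter()


class AgentRequest(BaseModel):
    input: str


@router.post("/")
async def run_agent(request: AgentRequest) -> dict:
    # Run the graph to completion and return the final assistant message.
    result = await graph.ainvoke({"messages": [("user", request.input)]})
    return {"output": result["messages"][-1].content}
```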
- Blaxel Platform Issues:
- Ensure you're logged in to your workspace:
bl login MY-WORKSPACE
- Verify models are available:
bl get models
- Check that functions exist:
bl get functions
- Tool Integration Failures:
  - Ensure tool functions are properly registered in LangGraph (see the sketch after this list)
  - Check tool parameter validation through the Blaxel platform
  - Verify tool execution permissions and access
  - Review tool response formats and error handling
- Dependency and Environment Issues:
  - Make sure you have Python 3.10+
  - Try uv sync --upgrade to update dependencies
  - Check for conflicting package versions
  - Verify virtual environment activation
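On the tool-registration point above: a frequent LangGraph pitfall is registering a tool in only one of the two places it is needed. The tool must be bound to the model (so it can emit tool calls) and passed to the ToolNode (so the graph can execute them). Reusing names from the earlier sketch:

```python
tools = [get_weather]

# 1. Bind tools to the model so it knows their schemas and can request them.
llm = ChatOpenAI(model="gpt-4o").bind_tools(tools)

# 2. Register the same list with the ToolNode so the graph can run them.
tool_node = ToolNode(tools)
```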
For more help, please submit an issue on GitHub.
Contributions are welcome! Here's how you can contribute:
- Fork the repository
- Create a feature branch:
git checkout -b feature/amazing-feature
- Commit your changes:
git commit -m 'Add amazing feature'
- Push to the branch:
git push origin feature/amazing-feature
- Submit a Pull Request
Please make sure to update tests as appropriate and follow the code style of the project.
If you need help with this template:
- Submit an issue for bug reports or feature requests
- Visit the Blaxel Documentation for platform guidance
- Check the LangGraph Documentation for framework-specific help
- Join our Discord Community for real-time assistance
This project is licensed under the MIT License. See the LICENSE file for more details.