A sophisticated conversational AI chatbot built with LangGraph that demonstrates advanced conversation management, tool integration, memory persistence, and human-in-the-loop functionality for complex query handling.
- 🧠 Memory Persistence: Maintains conversation context across sessions
- 🔍 Web Search Integration: Real-time information retrieval via Tavily Search API
- 🤖 Google Gemini 2.0 Flash: Fast, efficient AI responses with experimental model support
- 🔄 State Management: Robust conversation flow with conditional routing
- 📝 Interactive CLI: User-friendly command-line interface
- 🛡️ Error Handling: Comprehensive error management and graceful exits
- 📊 Real-time Streaming: Live response generation and display
- 🤝 Human-in-the-Loop: AI can request human assistance for complex queries
- ⏸️ Smart Interrupts: Graph execution pauses for human intervention when needed
- 🔄 Seamless Resumption: Conversation flow continues naturally after human input
The chatbot is built on a modern, scalable architecture (a minimal wiring sketch follows this list):
- LangGraph StateGraph: Structured conversation flow management
- Tool-based Architecture: Conditional routing with external tool integration
- Memory Checkpointer: Conversation persistence across sessions
- Interrupt-based Human Assistance: Seamless human intervention integration
- Stream-based Processing: Real-time response generation and display
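The sketch below shows roughly how these pieces fit together. It follows the standard LangGraph tutorial pattern; the node names, tool list, and streaming loop are illustrative and may differ from what `bot.py` actually does.

```python
# Illustrative wiring of the architecture above (not copied from bot.py).
from typing import Annotated
from typing_extensions import TypedDict

from langchain_google_genai import ChatGoogleGenerativeAI
from langchain_tavily import TavilySearch  # needs TAVILY_API_KEY in the environment
from langgraph.checkpoint.memory import MemorySaver
from langgraph.graph import StateGraph, START
from langgraph.graph.message import add_messages
from langgraph.prebuilt import ToolNode, tools_condition


class State(TypedDict):
    # add_messages appends new messages instead of overwriting the history
    messages: Annotated[list, add_messages]


llm = ChatGoogleGenerativeAI(model="gemini-2.0-flash-exp", temperature=0.7)
tools = [TavilySearch(max_results=3)]
llm_with_tools = llm.bind_tools(tools)


def chatbot(state: State):
    # One LLM call per turn; tool calls are routed by tools_condition below
    return {"messages": [llm_with_tools.invoke(state["messages"])]}


builder = StateGraph(State)
builder.add_node("chatbot", chatbot)
builder.add_node("tools", ToolNode(tools))
builder.add_conditional_edges("chatbot", tools_condition)  # chatbot -> tools or END
builder.add_edge("tools", "chatbot")
builder.add_edge(START, "chatbot")

# The checkpointer persists state per thread_id, giving cross-turn memory
graph = builder.compile(checkpointer=MemorySaver())

config = {"configurable": {"thread_id": "demo"}}
for event in graph.stream(
    {"messages": [{"role": "user", "content": "Hi!"}]}, config, stream_mode="values"
):
    event["messages"][-1].pretty_print()
```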
- Python 3.10 or higher
- Google AI API key (for Gemini access)
- Internet connection (for web search functionality)
- Clone the repository:

  ```bash
  git clone <repository-url>
  cd "LangGraph Tutorial Chatbot"
  ```

- Create and activate a virtual environment:

  ```bash
  python -m venv .venv
  source .venv/bin/activate  # On Windows: .venv\Scripts\activate
  ```

- Install dependencies:

  ```bash
  pip install -r requirements.txt
  ```

- Set up environment variables:

  ```bash
  cp .env.example .env
  ```

  Edit the `.env` file and add your API keys:

  ```env
  GOOGLE_API_KEY=your_google_api_key_here
  LANGSMITH_API_KEY=your_langsmith_api_key_here  # Optional
  LANGSMITH_TRACING=true                         # Optional
  ```
- Visit Google AI Studio
- Create a new API key
- Add it to your `.env` file as `GOOGLE_API_KEY`
- Visit LangSmith
- Create an account and get your API key
- Add it to your `.env` file as `LANGSMITH_API_KEY`
- Set `LANGSMITH_TRACING=true` to enable tracing
Run the chatbot:

```bash
python bot.py
```
Simple Query:

```
User: What's the weather like today?
Assistant: I'll search for current weather information for you...
```

Complex Query with Human Assistance:

```
User: Help me plan a detailed itinerary for a 2-week trip to Japan
Assistant: This is a complex request that would benefit from human expertise. Let me get some assistance...
[Human assistance requested - you can provide additional context]
```

Web Search Integration:

```
User: What are the latest developments in AI?
Assistant: [Searches the web and provides current information with sources]
```
- Exit: Type `quit`, `exit`, or `q` to end the conversation
- Help: The chatbot will guide you through its capabilities
The chatbot uses Google's Gemini 2.0 Flash model by default. You can modify the model settings in `bot.py`:

```python
LLM_MODEL = "gemini-2.0-flash-exp"  # Model name
temperature=0.7                     # Creativity level (0.0-1.0)
max_tokens=2048                     # Maximum response length
```
Tavily search settings can be adjusted:

```python
MAX_SEARCH_RESULTS = 10          # Maximum search results
MAX_RESULT_CONTENT_LENGTH = 500  # Content length per result
TOP_RESULTS_LIMIT = 3            # Top results to display
```
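As a rough illustration, these settings could feed a small wrapper like the one below; the wrapper itself is hypothetical, and only the constant names come from `bot.py`.

```python
# Hypothetical helper showing how the search settings might be applied
from langchain_tavily import TavilySearch

MAX_SEARCH_RESULTS = 10
MAX_RESULT_CONTENT_LENGTH = 500
TOP_RESULTS_LIMIT = 3

search_tool = TavilySearch(max_results=MAX_SEARCH_RESULTS)


def top_results(query: str) -> list[dict]:
    # Keep only the top results and trim each result's content
    response = search_tool.invoke({"query": query})
    return [
        {
            "title": r.get("title"),
            "url": r.get("url"),
            "content": (r.get("content") or "")[:MAX_RESULT_CONTENT_LENGTH],
        }
        for r in response.get("results", [])[:TOP_RESULTS_LIMIT]
    ]
```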
```
LangGraph Tutorial Chatbot/
├── bot.py            # Main chatbot application
├── requirements.txt  # Python dependencies
├── .env.example      # Environment variables template
├── .env              # Your environment variables (create this)
├── CHANGELOG.md      # Project changelog
├── README.md         # This file
├── chatbot.log       # Application logs
└── .venv/            # Virtual environment (created after setup)
```
- langgraph: Graph-based conversation flow management
- langchain: Core LangChain functionality
- langchain-google-genai: Google Gemini integration
- langchain-tavily: Web search capabilities
- langsmith: Optional tracing and monitoring
- python-dotenv: Environment variable management
The application logs all activity to `chatbot.log` with timestamps and detailed information for debugging and monitoring purposes.
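A minimal logging setup consistent with that behavior could look like this; only the `chatbot.log` file name comes from this README, and the level and format string are assumptions.

```python
# Assumed logging configuration; only the chatbot.log file name is from the README
import logging

logging.basicConfig(
    filename="chatbot.log",
    level=logging.INFO,
    format="%(asctime)s %(levelname)s %(name)s: %(message)s",
)
logger = logging.getLogger(__name__)
logger.info("Chatbot session started")
```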
The chatbot can request human assistance for complex queries; a code sketch of this pattern follows the steps below. When this happens:
- The AI identifies a query that would benefit from human expertise
- Graph execution pauses and requests human input
- You can provide additional context or guidance
- The conversation resumes with enhanced information
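This follows LangGraph's interrupt pattern. Here is a minimal sketch, assuming a `human_assistance` tool along the lines of the LangGraph documentation; the tool name and resume payload are illustrative rather than taken from `bot.py`.

```python
# Illustrative human-in-the-loop tool using LangGraph's interrupt()
from langchain_core.tools import tool
from langgraph.types import Command, interrupt


@tool
def human_assistance(query: str) -> str:
    """Request guidance from a human for a complex query."""
    # interrupt() pauses graph execution and surfaces the payload to the caller;
    # execution resumes with whatever value the human supplies
    human_reply = interrupt({"query": query})
    return human_reply["data"]


# Resuming after the pause looks roughly like this:
# graph.stream(Command(resume={"data": "Here is my guidance..."}), config)
```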
API Key Errors:
- Ensure your Google API key is correctly set in the `.env` file
- Verify the API key has the necessary permissions
Network Errors:
- Check your internet connection
- Verify firewall settings allow outbound connections
Import Errors:
- Ensure all dependencies are installed: `pip install -r requirements.txt`
- Verify you're using Python 3.10 or higher
The application provides detailed error messages and suggestions for resolution. Check the console output and `chatbot.log` for detailed error information.
See CHANGELOG.md for detailed version history and updates.
- Author: Ebube Imoh
- Version: 1.3.0
- License: MIT
- Fork the repository
- Create a feature branch
- Make your changes
- Add tests if applicable
- Update documentation
- Submit a pull request
This project is licensed under the MIT License - see the LICENSE file for details.
- Built with LangGraph for conversation flow management
- Powered by Google Gemini for AI responses
- Web search provided by Tavily
- Monitoring and tracing via LangSmith