A production-ready starter template for building multi-agent AI workflows with IBM watsonx.ai and LangFlow. Features comprehensive documentation, cross-platform support, and modern Python tooling with UV package manager.
This project demonstrates how to integrate IBM watsonx.ai Granite models with LangFlow for visual AI workflow building. It includes:
- Visual Flow Builder - Build AI workflows with drag-and-drop
- Multi-Agent Support - CrewAI integration examples
- Production Ready - Complete with Makefile, UV support, testing
- Comprehensive Docs - 2,000+ lines of documentation
- Cross-Platform - Works on Linux, macOS, and Windows
- Modern Tooling - UV package manager for 10-100x faster installs
- Native watsonx.ai Integration - Direct connection to IBM Granite models
- Interactive Chat Demo - Command-line interface with conversation history
- LangFlow Visual Builder - Drag-and-drop workflow creation
- Environment Management - Automated setup with Make or scripts
- Code Quality Tools - Black, Ruff, mypy configured
- Windows Native - First-class Windows support with run.bat
- UV Compatible - Fast package installation and dependency management
- Complete Examples - RAG, chatbots, multi-agent systems
- Python 3.11+ (Download)
- IBM watsonx.ai account (Sign up)
- API Key (Get one)
- Project ID (From your watsonx.ai project settings)
- UV package manager for faster installs (Install UV)
- Make for build automation (comes with most Unix systems)
# Clone the repository
git clone https://github.com/ruslanmv/hello-watsonx-langflow.git
cd hello-watsonx-langflow
# One-command setup
make install && make setup && make demo

# Clone the repository
git clone https://github.com/ruslanmv/hello-watsonx-langflow.git
cd hello-watsonx-langflow
# Same as Unix
make install && make setup && make demo

REM Clone the repository
git clone https://github.com/ruslanmv/hello-watsonx-langflow.git
cd hello-watsonx-langflow
REM Setup and run
run.bat install
run.bat setup
run.bat demo

# Clone the repository
git clone https://github.com/ruslanmv/hello-watsonx-langflow.git
cd hello-watsonx-langflow
# Install UV if not already installed
curl -LsSf https://astral.sh/uv/install.sh | sh
# Super fast setup with UV
make install # Automatically uses UV if available
make setup
make demo

Choose the appropriate URL for your region:

| Region | URL |
|---|---|
| Dallas (US South) | https://us-south.ml.cloud.ibm.com |
| Frankfurt (EU) | https://eu-de.ml.cloud.ibm.com |
| London (UK) | https://eu-gb.ml.cloud.ibm.com |
| Tokyo (Japan) | https://jp-tok.ml.cloud.ibm.com |
| Sydney (Australia) | https://au-syd.ml.cloud.ibm.com |
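If you script your setup, a small lookup table keeps the endpoint choice explicit. This is an illustrative helper, not part of the repo; `endpoint_for` and `WATSONX_ENDPOINTS` are hypothetical names:

```python
# Illustrative helper: map a region code to its watsonx.ai endpoint.
WATSONX_ENDPOINTS = {
    "us-south": "https://us-south.ml.cloud.ibm.com",
    "eu-de": "https://eu-de.ml.cloud.ibm.com",
    "eu-gb": "https://eu-gb.ml.cloud.ibm.com",
    "jp-tok": "https://jp-tok.ml.cloud.ibm.com",
    "au-syd": "https://au-syd.ml.cloud.ibm.com",
}

def endpoint_for(region: str) -> str:
    """Return the watsonx.ai URL for a region code such as "us-south"."""
    try:
        return WATSONX_ENDPOINTS[region]
    except KeyError:
        raise ValueError(
            f"unknown region {region!r}; expected one of {sorted(WATSONX_ENDPOINTS)}"
        )
```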
hello-watsonx-langflow/
├── Documentation (7 files)
│   ├── README.md             # This file
│   ├── QUICKSTART.md         # 5-minute getting started
│   ├── WINDOWS_SETUP.md      # Complete Windows guide
│   ├── ENVIRONMENT_SETUP.md  # Environment configuration
│   ├── PROJECT_SUMMARY.md    # Project overview
│   └── UPDATE_SUMMARY.md     # Latest updates
│
├── Code (1 file)
│   └── agent_langflow.py     # Main demo application
│
├── Scripts (2 files)
│   ├── run.sh                # Unix/Linux/macOS script
│   └── run.bat               # Windows batch script
│
└── Configuration (6 files)
    ├── Makefile              # Build automation (30+ commands)
    ├── pyproject.toml        # Modern Python config
    ├── setup.cfg             # Additional config
    ├── requirements.txt      # Dependencies
    ├── .env.example          # Environment template
    └── .gitignore            # Git ignore rules
# Installation & Setup
make install # Install dependencies (auto-detects UV)
make install-dev # Install with dev dependencies
make setup # Interactive credential setup
# Running
make demo # Run interactive chat demo
make simple # Run simple demo
make ui # Start LangFlow UI (http://localhost:7860)
# Development
make format # Format code with Black
make lint # Run linting (Ruff/flake8)
make type-check # Type checking with mypy
make test # Run tests
make check # Run all quality checks
# Utilities
make clean # Clean temporary files
make update # Update dependencies
make show-env # Show environment info
make help # Show all commands

./run.sh install # Install dependencies
./run.sh setup # Setup credentials
./run.sh demo # Run demo
./run.sh ui # Start LangFlow UI
./run.sh test # Test connection
./run.sh help # Show help

run.bat install # Install dependencies
run.bat setup # Setup credentials
run.bat demo # Run demo
run.bat ui # Start LangFlow UI
run.bat test # Test connection
run.bat help # Show help

Start a conversation with watsonx.ai:
make demo
# or
python agent_langflow.py

Available commands in chat:

- `/help` - Show help
- `/clear` - Clear conversation history
- `/history` - Show conversation
- `/model` - Show model info
- `/exit` - Quit
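The command handling can be pictured as a small dispatch function. This is an illustrative sketch, not the actual `agent_langflow.py` code; `handle_command` and its signature are hypothetical:

```python
def handle_command(cmd, history, model_id="ibm/granite-3-8b-instruct"):
    """Dispatch one slash command; return a reply string, or None for /exit."""
    if cmd == "/help":
        return "Commands: /help /clear /history /model /exit"
    if cmd == "/clear":
        history.clear()
        return "History cleared."
    if cmd == "/history":
        return "\n".join(history) if history else "(empty)"
    if cmd == "/model":
        return f"Model: {model_id}"
    if cmd == "/exit":
        return None  # signals the chat loop to stop
    return f"Unknown command: {cmd}"
```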
Run without conversation history:
make simple
# or
python agent_langflow.py --simple

Launch the visual workflow builder:
make ui
# or
langflow run

Then open http://localhost:7860 in your browser.
Use a different watsonx.ai model:
python agent_langflow.py --model ibm/granite-13b-instruct-v2

| Model | Model ID | Size | Best For |
|---|---|---|---|
| Granite 3 8B | `ibm/granite-3-8b-instruct` | 8B | Balanced (recommended) |
| Granite 3 2B | `ibm/granite-3-2b-instruct` | 2B | Fast, efficient |
| Granite 13B | `ibm/granite-13b-instruct-v2` | 13B | Complex tasks |
| Llama 3 70B | `meta-llama/llama-3-70b-instruct` | 70B | High capability |
| Llama 3 8B | `meta-llama/llama-3-8b-instruct` | 8B | Efficient |
| Mixtral 8x7B | `mistralai/mixtral-8x7b-instruct-v01` | 8x7B | Mixture of experts |
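Wiring up a `--model` flag like the one above is straightforward with `argparse`. This is a sketch of how such flag handling might look, not the repo's actual implementation:

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    """Sketch of a CLI accepting --model and --simple, as in the demo."""
    parser = argparse.ArgumentParser(description="watsonx.ai chat demo")
    parser.add_argument(
        "--model",
        default="ibm/granite-3-8b-instruct",
        help="watsonx.ai model ID (see the table above)",
    )
    parser.add_argument(
        "--simple",
        action="store_true",
        help="run without conversation history",
    )
    return parser
```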
- Start Here: QUICKSTART.md - 5-minute setup
- Environment Setup: ENVIRONMENT_SETUP.md
- Windows Users: WINDOWS_SETUP.md
- Run Demo: `make demo`
- Read this README completely
- Explore `make help` for all commands
- Review the `agent_langflow.py` source code
- Try custom models and parameters
- Build visual flows in LangFlow UI
- Study the Makefile for build automation
- Review pyproject.toml for tooling config
- Extend with custom agents
- Integrate with CrewAI
- Build production deployments
1. Start LangFlow UI: `make ui`
2. Create a New Flow:
   - Click "New Flow"
   - Search for the "ChatWatsonx" component
   - Drag it to the canvas
3. Configure watsonx.ai:
   - Model ID: `ibm/granite-3-8b-instruct`
   - API Key: Your watsonx.ai API key
   - URL: Your regional endpoint
   - Project ID: Your project ID
4. Build and Test:
   - Add components (prompts, parsers, etc.)
   - Connect them visually
   - Test in real-time
from langchain_ibm import ChatWatsonx
from langchain.schema import HumanMessage

# Initialize watsonx.ai (the API key is read from the environment,
# e.g. WATSONX_APIKEY, when not passed explicitly)
chat = ChatWatsonx(
    model_id="ibm/granite-3-8b-instruct",
    url="https://us-south.ml.cloud.ibm.com",
    project_id="your_project_id",
)

# Simple chat
response = chat.invoke([HumanMessage(content="Hello!")])
print(response.content)

See agent_langflow.py for complete examples.
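For multi-turn chats, the message list can be rebuilt from stored history on every call. LangChain chat models also accept `(role, content)` tuples, so a plain-Python helper is enough to sketch the idea; `build_messages` is illustrative, not taken from `agent_langflow.py`:

```python
def build_messages(system_prompt, history, max_turns=10):
    """Assemble the message list for chat.invoke().

    history is a list of ("human" | "ai", text) tuples; only the most
    recent max_turns exchanges are kept, to bound the prompt size.
    """
    recent = history[-2 * max_turns:] if max_turns else []
    return [("system", system_prompt)] + recent
```

Usage sketch: append `("human", text)` before calling `chat.invoke(build_messages(...))`, then append `("ai", reply.content)` afterwards.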
- Never commit credentials to version control
- Use environment variables (.env file)
- Rotate API keys regularly
- Add .env to .gitignore (already configured)
- Limit API key permissions in IBM Cloud IAM
- Use separate keys for dev/staging/production
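A quick startup check keeps the environment-variable rule honest in code, failing fast with a clear message instead of a cryptic auth error later. An illustrative sketch; `missing_vars` is a hypothetical helper, not from this repo:

```python
import os

REQUIRED_VARS = ("WATSONX_API_KEY", "WATSONX_URL", "WATSONX_PROJECT_ID")

def missing_vars(env=None):
    """Return the names of required watsonx.ai variables that are unset."""
    env = os.environ if env is None else env
    return [name for name in REQUIRED_VARS if not env.get(name)]
```

Call `missing_vars()` early (after loading .env) and abort if the returned list is non-empty.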
Create .env file:
# Copy template
cp .env.example .env
# Edit with your credentials
nano .env # or use your favorite editor

Required variables:
WATSONX_API_KEY=your_api_key_here
WATSONX_URL=https://us-south.ml.cloud.ibm.com
WATSONX_PROJECT_ID=your_project_id_here

Python 3.11 not found
Solution:
# Check available Python versions
python --version
python3 --version
python3.11 --version
# Install Python 3.11
# Ubuntu/Debian:
sudo apt install python3.11
# macOS:
brew install python@3.11
# Windows:
# Download from https://www.python.org/downloads/

LangFlow won't start
Solution:
# Reinstall LangFlow
pip uninstall langflow
pip install langflow --upgrade
# Or with UV
uv pip install langflow --upgrade

watsonx.ai connection errors
Solution:
# Verify credentials
python -c "from dotenv import load_dotenv; load_dotenv(); import os; print('API Key:', 'SET' if os.getenv('WATSONX_API_KEY') else 'MISSING')"
# Test connection
make test-connection

Port 7860 already in use
Solution:
# Use different port
langflow run --port 8080
# Or kill existing process
lsof -ti:7860 | xargs kill -9 # Unix/macOS
netstat -ano | findstr :7860 # Windows (find PID)
taskkill /PID <PID> /F # Windows (kill process)

UV not installing packages
Solution:
# Upgrade UV
uv self update
# Clear cache
rm -rf ~/.cache/uv
# Reinstall
uv pip install -r requirements.txt --force-reinstall

See the platform-specific guides for more help:
- Windows: WINDOWS_SETUP.md
- Environment: ENVIRONMENT_SETUP.md
- QUICKSTART.md - 5-minute getting started guide
- WINDOWS_SETUP.md - Complete Windows setup (450+ lines)
- ENVIRONMENT_SETUP.md - Environment management (500+ lines)
- PROJECT_SUMMARY.md - Project overview and statistics
- agent_langflow.py - Well-commented main application
- Makefile - Documented build automation
- pyproject.toml - Modern Python configuration
Contributions are welcome! Here's how to get started:
- Fork the repository
# Click "Fork" on GitHub, then:
git clone https://github.com/YOUR_USERNAME/hello-watsonx-langflow.git

- Create a branch
git checkout -b feature/amazing-feature

- Make changes and test
make dev-setup # Install dev dependencies
make check # Run all checks

- Commit and push
git commit -m "Add amazing feature"
git push origin feature/amazing-feature

- Open a Pull Request on GitHub
# Install with development dependencies
make dev-setup
# Run quality checks before committing
make format # Format code
make lint # Check style
make type-check # Check types
make test # Run tests
make check # All checks

- Files: 16 total
- Lines: 4,000+ lines of code and documentation
- Documentation: 2,500+ lines
- Commands: 30+ Makefile targets
- Platforms: Linux, macOS, Windows, WSL
- Python: 3.11+ required
- Speed: 10-100x faster installs with UV
Build a basic chatbot with conversation memory.
Create retrieval-augmented generation workflows visually.
Integrate with CrewAI for collaborative agent workflows.
Build document analysis pipelines with vector stores.
Deploy production AI applications with watsonx.ai.
- Complete the Quick Start - Get the demo running
- Explore the LangFlow UI - Build visual workflows
- Try Different Models - Experiment with Granite, Llama, Mixtral
- Build a RAG Pipeline - Add document retrieval
- Create Custom Agents - Extend with your own logic
- Deploy to Production - Use LangFlow's export features
This project is licensed under the MIT License - see the LICENSE file for details.
- IBM watsonx.ai - Enterprise AI platform with Granite foundation models
- LangFlow - Visual flow builder for LangChain applications
- LangChain - Framework for developing LLM applications
- UV - Ultra-fast Python package manager by Astral
- Issues: GitHub Issues
- Discussions: GitHub Discussions
- Email: Contact

If you find this project helpful, please consider giving it a star!

Built with ❤️ by ruslanmv

Happy Building with watsonx.ai + LangFlow!