Hello watsonx + LangFlow πŸš€


A production-ready starter template for building multi-agent AI workflows with IBM watsonx.ai and LangFlow. Features comprehensive documentation, cross-platform support, and modern Python tooling with UV package manager.

🎯 What is This?

This project demonstrates how to integrate IBM watsonx.ai Granite models with LangFlow for visual AI workflow building. It includes:

  • 🎨 Visual Flow Builder - Build AI workflows with drag-and-drop
  • πŸ€– Multi-Agent Support - CrewAI integration examples
  • πŸ”§ Production Ready - Complete with Makefile, UV support, testing
  • πŸ“š Comprehensive Docs - 2,000+ lines of documentation
  • πŸͺŸ Cross-Platform - Works on Linux, macOS, and Windows
  • ⚑ Modern Tooling - UV package manager for 10-100x faster installs

✨ Features

  • Native watsonx.ai Integration - Direct connection to IBM Granite models
  • Interactive Chat Demo - Command-line interface with conversation history
  • LangFlow Visual Builder - Drag-and-drop workflow creation
  • Environment Management - Automated setup with Make or scripts
  • Code Quality Tools - Black, Ruff, mypy configured
  • Windows Native - First-class Windows support with run.bat
  • UV Compatible - Fast package installation and dependency management
  • Complete Examples - RAG, chatbots, multi-agent systems

πŸ“‹ Prerequisites

  • Python 3.11+ (Download)
  • IBM watsonx.ai account (Sign up)
  • API Key (Get one)
  • Project ID (From your watsonx.ai project settings)

Optional (Recommended)

  • UV package manager for faster installs (Install UV)
  • Make for build automation (comes with most Unix systems)

πŸš€ Quick Start

Method 1: Clone and Make (Unix/macOS/Linux)

# Clone the repository
git clone https://github.com/ruslanmv/hello-watsonx-langflow.git
cd hello-watsonx-langflow

# One-command setup
make install && make setup && make demo

Method 2: Clone and Make (Windows with Git Bash)

# Clone the repository
git clone https://github.com/ruslanmv/hello-watsonx-langflow.git
cd hello-watsonx-langflow

# Same as Unix
make install && make setup && make demo

Method 3: Windows Native (run.bat)

REM Clone the repository
git clone https://github.com/ruslanmv/hello-watsonx-langflow.git
cd hello-watsonx-langflow

REM Setup and run
run.bat install
run.bat setup
run.bat demo

Method 4: With UV (Fastest!)

# Clone the repository
git clone https://github.com/ruslanmv/hello-watsonx-langflow.git
cd hello-watsonx-langflow

# Install UV if not already installed
curl -LsSf https://astral.sh/uv/install.sh | sh

# Super fast setup with UV
make install  # Automatically uses UV if available
make setup
make demo

🌐 Regional Endpoints

Choose the appropriate URL for your region:

| Region | URL |
| --- | --- |
| πŸ‡ΊπŸ‡Έ Dallas (US South) | https://us-south.ml.cloud.ibm.com |
| πŸ‡©πŸ‡ͺ Frankfurt (EU) | https://eu-de.ml.cloud.ibm.com |
| πŸ‡¬πŸ‡§ London (UK) | https://eu-gb.ml.cloud.ibm.com |
| πŸ‡―πŸ‡΅ Tokyo (Japan) | https://jp-tok.ml.cloud.ibm.com |
| πŸ‡¦πŸ‡Ί Sydney (Australia) | https://au-syd.ml.cloud.ibm.com |

πŸ“ Project Structure

hello-watsonx-langflow/
β”œβ”€β”€ πŸ“„ Documentation (7 files)
β”‚   β”œβ”€β”€ README.md                    # This file
β”‚   β”œβ”€β”€ QUICKSTART.md                # 5-minute getting started
β”‚   β”œβ”€β”€ WINDOWS_SETUP.md             # Complete Windows guide
β”‚   β”œβ”€β”€ ENVIRONMENT_SETUP.md         # Environment configuration
β”‚   β”œβ”€β”€ PROJECT_SUMMARY.md           # Project overview
β”‚   └── UPDATE_SUMMARY.md            # Latest updates
β”‚
β”œβ”€β”€ πŸ’» Code (1 file)
β”‚   └── agent_langflow.py            # Main demo application
β”‚
β”œβ”€β”€ πŸ”§ Scripts (2 files)
β”‚   β”œβ”€β”€ run.sh                       # Unix/Linux/macOS script
β”‚   └── run.bat                      # Windows batch script
β”‚
β”œβ”€β”€ βš™οΈ Configuration (6 files)
β”‚   β”œβ”€β”€ Makefile                     # Build automation (30+ commands)
β”‚   β”œβ”€β”€ pyproject.toml               # Modern Python config
β”‚   β”œβ”€β”€ setup.cfg                    # Additional config
β”‚   β”œβ”€β”€ requirements.txt             # Dependencies
β”‚   β”œβ”€β”€ .env.example                 # Environment template
β”‚   └── .gitignore                   # Git ignore rules

πŸ› οΈ Available Commands

Using Makefile (Unix/macOS/Linux/Windows with Git Bash)

# Installation & Setup
make install          # Install dependencies (auto-detects UV)
make install-dev      # Install with dev dependencies
make setup            # Interactive credential setup

# Running
make demo             # Run interactive chat demo
make simple           # Run simple demo
make ui               # Start LangFlow UI (http://localhost:7860)

# Development
make format           # Format code with Black
make lint             # Run linting (Ruff/flake8)
make type-check       # Type checking with mypy
make test             # Run tests
make check            # Run all quality checks

# Utilities
make clean            # Clean temporary files
make update           # Update dependencies
make show-env         # Show environment info
make help             # Show all commands

Using run.sh (Unix/macOS/Linux)

./run.sh install      # Install dependencies
./run.sh setup        # Setup credentials
./run.sh demo         # Run demo
./run.sh ui           # Start LangFlow UI
./run.sh test         # Test connection
./run.sh help         # Show help

Using run.bat (Windows)

run.bat install       # Install dependencies
run.bat setup         # Setup credentials
run.bat demo          # Run demo
run.bat ui            # Start LangFlow UI
run.bat test          # Test connection
run.bat help          # Show help

πŸ’‘ Usage Examples

1. Interactive Chat Demo

Start a conversation with watsonx.ai:

make demo
# or
python agent_langflow.py

Available commands in chat:

  • /help - Show help
  • /clear - Clear conversation history
  • /history - Show conversation
  • /model - Show model info
  • /exit - Quit
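
The full interactive demo lives in agent_langflow.py; the sketch below is only a minimal illustration of such a loop (not the project's exact code) and assumes the credentials are already exported as the WATSONX_API_KEY, WATSONX_URL, and WATSONX_PROJECT_ID environment variables described later in this README.

import os

from langchain.schema import AIMessage, HumanMessage
from langchain_ibm import ChatWatsonx

# Illustrative chat loop with conversation history and a few slash commands
chat = ChatWatsonx(
    model_id="ibm/granite-3-8b-instruct",
    url=os.environ["WATSONX_URL"],
    apikey=os.environ["WATSONX_API_KEY"],
    project_id=os.environ["WATSONX_PROJECT_ID"],
)

history = []
while True:
    user_input = input("you> ").strip()
    if user_input == "/exit":
        break
    if user_input == "/clear":
        history = []
        continue
    if user_input == "/history":
        for message in history:
            print(f"{type(message).__name__}: {message.content}")
        continue
    history.append(HumanMessage(content=user_input))
    reply = chat.invoke(history)
    history.append(AIMessage(content=reply.content))
    print(f"watsonx> {reply.content}")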

2. Simple Stateless Demo

Run without conversation history:

make simple
# or
python agent_langflow.py --simple

3. LangFlow Visual Builder

Launch the visual workflow builder:

make ui
# or
langflow run

Then open http://localhost:7860 in your browser.

4. Custom Model

Use a different watsonx.ai model:

python agent_langflow.py --model ibm/granite-13b-instruct-v2
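
A rough idea of how such a flag can be parsed, shown as a hypothetical argparse sketch; the option names mirror the commands used in this README, not necessarily the exact implementation in agent_langflow.py.

import argparse

# Hypothetical CLI wiring for the demo script
parser = argparse.ArgumentParser(description="watsonx.ai + LangFlow demo")
parser.add_argument("--model", default="ibm/granite-3-8b-instruct",
                    help="watsonx.ai model ID to use")
parser.add_argument("--simple", action="store_true",
                    help="run a single stateless prompt instead of the chat loop")
args = parser.parse_args()
print(f"model={args.model} simple={args.simple}")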

πŸ“Š Available watsonx.ai Models

| Model | Model ID | Size | Best For |
| --- | --- | --- | --- |
| Granite 3 8B | ibm/granite-3-8b-instruct | 8B | ⭐ Balanced (Recommended) |
| Granite 3 2B | ibm/granite-3-2b-instruct | 2B | Fast, efficient |
| Granite 13B | ibm/granite-13b-instruct-v2 | 13B | Complex tasks |
| Llama 3 70B | meta-llama/llama-3-70b-instruct | 70B | High capability |
| Llama 3 8B | meta-llama/llama-3-8b-instruct | 8B | Efficient |
| Mixtral 8x7B | mistralai/mixtral-8x7b-instruct-v01 | 8x7B | Mixture of experts |

πŸŽ“ Learning Path

For Beginners

  1. Start Here: QUICKSTART.md - 5-minute setup
  2. Environment Setup: ENVIRONMENT_SETUP.md
  3. Windows Users: WINDOWS_SETUP.md
  4. Run Demo: make demo

For Intermediate Users

  1. Read this README completely
  2. Explore make help for all commands
  3. Review agent_langflow.py source code
  4. Try custom models and parameters
  5. Build visual flows in LangFlow UI

For Advanced Users

  1. Study the Makefile for build automation
  2. Review pyproject.toml for tooling config
  3. Extend with custom agents
  4. Integrate with CrewAI
  5. Build production deployments

πŸ”§ LangFlow Integration Methods

Method 1: Visual Builder (LangFlow UI)

  1. Start LangFlow UI:
   make ui
  2. Create a New Flow:

    • Click "New Flow"
    • Search for the "ChatWatsonx" component
    • Drag it onto the canvas
  3. Configure watsonx.ai:

    • Model ID: ibm/granite-3-8b-instruct
    • API Key: your watsonx.ai API key
    • URL: your regional endpoint
    • Project ID: your project ID
  4. Build and Test:

    • Add components (prompts, parsers, etc.)
    • Connect them visually
    • Test in real time

Method 2: Python API (Programmatic)

from langchain_ibm import ChatWatsonx
from langchain.schema import HumanMessage

# Initialize watsonx.ai (the API key can be supplied via the apikey parameter
# or the WATSONX_APIKEY environment variable)
chat = ChatWatsonx(
    model_id="ibm/granite-3-8b-instruct",
    url="https://us-south.ml.cloud.ibm.com",
    project_id="your_project_id",
)

# Simple chat
response = chat.invoke([HumanMessage(content="Hello!")])
print(response.content)
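
For multi-turn use, invoke accepts the whole message history, so earlier turns provide context for later ones. A minimal sketch, using the same placeholder credentials as above:

from langchain.schema import AIMessage, HumanMessage
from langchain_ibm import ChatWatsonx

chat = ChatWatsonx(
    model_id="ibm/granite-3-8b-instruct",
    url="https://us-south.ml.cloud.ibm.com",
    project_id="your_project_id",
)

# Keep the conversation as a growing list of messages
history = [HumanMessage(content="Explain LangFlow in one sentence.")]
first = chat.invoke(history)
history.append(AIMessage(content=first.content))

# The follow-up question relies on the previous turn staying in the history
history.append(HumanMessage(content="Now rephrase that for a complete beginner."))
print(chat.invoke(history).content)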

See agent_langflow.py for complete examples.

πŸ”’ Security Best Practices

  1. βœ… Never commit credentials to version control
  2. βœ… Use environment variables (.env file)
  3. βœ… Rotate API keys regularly
  4. βœ… Add .env to .gitignore (already configured)
  5. βœ… Limit API key permissions in IBM Cloud IAM
  6. βœ… Use separate keys for dev/staging/production

Setting Up Credentials

Create .env file:

# Copy template
cp .env.example .env

# Edit with your credentials
nano .env  # or use your favorite editor

Required variables:

WATSONX_API_KEY=your_api_key_here
WATSONX_URL=https://us-south.ml.cloud.ibm.com
WATSONX_PROJECT_ID=your_project_id_here
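
One way a script can consume these variables, as a minimal sketch using python-dotenv; the variable names match the .env template above, while the model ID and prompt are just examples:

import os

from dotenv import load_dotenv
from langchain_ibm import ChatWatsonx

load_dotenv()  # read .env from the current directory

chat = ChatWatsonx(
    model_id="ibm/granite-3-8b-instruct",
    url=os.getenv("WATSONX_URL"),
    apikey=os.getenv("WATSONX_API_KEY"),
    project_id=os.getenv("WATSONX_PROJECT_ID"),
)
print(chat.invoke("Say hello in one short sentence.").content)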

πŸ› Troubleshooting

Common Issues

Python 3.11 not found

Solution:

# Check available Python versions
python --version
python3 --version
python3.11 --version

# Install Python 3.11
# Ubuntu/Debian:
sudo apt install python3.11

# macOS:
brew install python@3.11

# Windows:
# Download from https://www.python.org/downloads/

LangFlow won't start

Solution:

# Reinstall LangFlow
pip uninstall langflow
pip install langflow --upgrade

# Or with UV
uv pip install langflow --upgrade

watsonx.ai connection errors

Solution:

# Verify credentials
python -c "from dotenv import load_dotenv; load_dotenv(); import os; print('API Key:', 'SET' if os.getenv('WATSONX_API_KEY') else 'MISSING')"

# Test connection
make test-connection

Port 7860 already in use

Solution:

# Use different port
langflow run --port 8080

# Or kill existing process
lsof -ti:7860 | xargs kill -9  # Unix/macOS
netstat -ano | findstr :7860    # Windows (find PID)
taskkill /PID <PID> /F          # Windows (kill process)

UV not installing packages

Solution:

# Upgrade UV
uv self update

# Clear cache
rm -rf ~/.cache/uv

# Reinstall
uv pip install -r requirements.txt --force-reinstall

See the platform-specific guides (WINDOWS_SETUP.md and ENVIRONMENT_SETUP.md) for more help.

πŸ“š Documentation

Complete Guides

  • QUICKSTART.md - 5-minute getting started
  • WINDOWS_SETUP.md - Complete Windows guide
  • ENVIRONMENT_SETUP.md - Environment configuration
  • PROJECT_SUMMARY.md - Project overview
  • UPDATE_SUMMARY.md - Latest updates

Code Documentation

  • agent_langflow.py - Main demo application

🀝 Contributing

Contributions are welcome! Here's how to get started:

  1. Fork the repository
   # Click "Fork" on GitHub, then:
   git clone https://github.com/YOUR_USERNAME/hello-watsonx-langflow.git
  2. Create a branch
   git checkout -b feature/amazing-feature
  3. Make changes and test
   make dev-setup  # Install dev dependencies
   make check      # Run all checks
  4. Commit and push
   git commit -m "Add amazing feature"
   git push origin feature/amazing-feature
  5. Open a Pull Request on GitHub

Development Setup

# Install with development dependencies
make dev-setup

# Run quality checks before committing
make format      # Format code
make lint        # Check style
make type-check  # Check types
make test        # Run tests
make check       # All checks

πŸ“Š Project Statistics

  • πŸ“ Files: 16 total
  • πŸ“ Lines: 4,000+ lines of code and documentation
  • πŸ“š Documentation: 2,500+ lines
  • πŸ”§ Commands: 30+ Makefile targets
  • 🌍 Platforms: Linux, macOS, Windows, WSL
  • 🐍 Python: 3.11+ required
  • ⚑ Speed: 10-100x faster with UV

🎯 Use Cases

1. Simple Chatbot

Build a basic chatbot with conversation memory.

2. RAG Pipeline

Create retrieval-augmented generation workflows visually.

3. Multi-Agent Systems

Integrate with CrewAI for collaborative agent workflows.

4. Document Processing

Build document analysis pipelines with vector stores.

5. Enterprise AI

Deploy production AI applications with watsonx.ai.

πŸš€ Next Steps

  1. βœ… Complete Quick Start - Get the demo running
  2. 🎨 Explore LangFlow UI - Build visual workflows
  3. πŸ€– Try Different Models - Experiment with Granite, Llama, Mixtral
  4. πŸ“Š Build RAG Pipeline - Add document retrieval
  5. πŸ”§ Create Custom Agents - Extend with your own logic
  6. πŸš€ Deploy to Production - Use LangFlow's export features

πŸ“– Additional Resources

Official Documentation

Tutorials

Community

πŸ“„ License

This project is licensed under the MIT License - see the LICENSE file for details.

πŸ™ Acknowledgments

  • IBM watsonx.ai - Enterprise AI platform with Granite foundation models
  • LangFlow - Visual flow builder for LangChain applications
  • LangChain - Framework for developing LLM applications
  • UV - Ultra-fast Python package manager by Astral

πŸ’¬ Support

⭐ Star History

If you find this project helpful, please consider giving it a star! ⭐



Built with ❀️ by ruslanmv

Happy Building with watsonx.ai + LangFlow! πŸš€


