A streaming AI chat interface for Tableau data analysis, featuring real-time agent thinking steps and enhanced user experience. Built on the Tableau MCP (Model Context Protocol) with LangChain integration.
Credits: This project is derived from Will Sutton's excellent tableau_mcp_starter_kit created at The Information Lab. The original work provided the foundation for Tableau MCP integration with LangChain.
- Real-time Streaming: Watch the AI agent think through your questions step-by-step
- Multiple LLM Providers: Support for OpenAI, AWS Bedrock, and easily extensible to other providers
- Personalized Interface: Custom cat favicon and friendly UI included, easily updated
- Natural Language Queries: Ask questions about your Tableau data in plain English
- Smart Error Handling: Improved schema validation and error recovery
- Responsive Design: Works on desktop and mobile devices
- Dashboard Extension: Embed directly into Tableau dashboards (under continued construction)
- Flexible Callbacks: Choose between FileCallbackHandler, Langfuse, or no callbacks
Architecture Note: This application connects to the Tableau MCP server over its streamable-http interface rather than running a local instance. The Tableau MCP server uses Direct Trust with Connected Apps to handle the connection and authentication.
- Tableau Server 2025.1+ or Tableau Cloud (Free trial available)
- Python 3.11+ - Download Python
- Node.js 22.15.0 LTS - Download Node.js
- Git - Download Git
- LLM Provider Credentials:
- OpenAI: API key from OpenAI Platform
- AWS Bedrock: AWS Access Key ID, Secret Access Key, and Region (requires Bedrock access enabled)
This application sends Tableau data to external AI models. For production use with sensitive data:
- Use the sample dataset for testing
- Consider configuring AWS Bedrock for private model hosting, or other on-premise AI solutions
- Review your organization's data governance policies
- All queries and results are sent to the configured LLM provider
node -v
git clone https://github.com/tableau/tableau-mcp.git
cd tableau-mcp
# Build the .env file
npm install
npm run build
git clone https://github.com/yourusername/tableau_mcp_tabby.git
cd tableau_mcp_tabby
# Create virtual environment
python -m venv .venv
# Activate it
# Windows:
.venv\Scripts\activate
# macOS/Linux:
source .venv/bin/activate
# Install dependencies
pip install -r requirements.txt
Copy the template and configure your settings:
cp .env_template .env
Edit .env with your credentials:
# Tableau MCP Configuration
TABLEAU_MCP_HTTP_URL=http://localhost:3927/tableau-mcp
# Model Provider Configuration
MODEL_PROVIDER=openai # Options: "openai" or "aws"
MODEL_USED=gpt-5 # Model name (e.g., "gpt-5", "gpt-4-turbo" for OpenAI)
MODEL_TEMPERATURE=0 # Temperature setting (0-2)
# OpenAI Configuration (required if MODEL_PROVIDER=openai)
OPENAI_API_KEY=your-openai-api-key
# AWS Bedrock Configuration (required if MODEL_PROVIDER=aws)
AWS_ACCESS_KEY_ID=your-aws-access-key-id
AWS_SECRET_ACCESS_KEY=your-aws-secret-access-key
AWS_REGION=us-east-1 # AWS region where Bedrock is available
# AWS_SESSION_TOKEN=optional # Only needed for temporary credentials
# Optional: Langfuse Observability
# Set to "true" to enable Langfuse tracing false for local file tracing or none for no tracing
USE_LANGFUSE=none
LANGFUSE_PUBLIC_KEY=your-public-key
LANGFUSE_SECRET_KEY=your-secret-key
LANGFUSE_HOST=https://us.cloud.langfuse.com
Choose your LLM provider by setting MODEL_PROVIDER:
- OpenAI (`MODEL_PROVIDER=openai`): Requires `OPENAI_API_KEY`
  - Popular models: `gpt-5`, `gpt-4-turbo`, `gpt-3.5-turbo`
- AWS Bedrock (`MODEL_PROVIDER=aws`): Requires AWS credentials
  - Popular models: `anthropic.claude-3-sonnet-20240229-v1:0`, `anthropic.claude-3-5-sonnet-20241022-v2:0`
  - Ensure Bedrock is enabled in your AWS account and region
Control tracing and logging behavior (a minimal selection sketch follows this list):
- FileCallbackHandler (default when `USE_LANGFUSE=false`): Writes traces to `.logs/agent_trace.jsonl`
- Langfuse (`USE_LANGFUSE=true`): Sends traces to Langfuse cloud for observability
- None (`USE_LANGFUSE=none`): Disables callbacks entirely
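As a rough illustration, the three modes reduce to a small factory. This is a minimal sketch rather than the actual code in utilities/chat.py; `get_callbacks` is a hypothetical name, and the import paths are assumptions that vary across langchain/langfuse versions:
import os

def get_callbacks() -> list:
    """Pick callback handlers based on USE_LANGFUSE (true / false / none)."""
    mode = os.getenv("USE_LANGFUSE", "none").lower()
    if mode == "true":
        # Lazy import; the handler reads the LANGFUSE_* variables from the environment
        from langfuse.callback import CallbackHandler  # assumption: langfuse v2 SDK
        return [CallbackHandler()]
    if mode == "false":
        # Module path is an assumption; older releases expose it from langchain.callbacks
        from langchain_core.callbacks import FileCallbackHandler
        return [FileCallbackHandler(".logs/agent_trace.jsonl")]
    return []  # "none": no tracing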
Important: The MCP server must be running before starting the web application.
In the tableau-mcp directory:
# HTTP mode (required for this application)
npm run serve:http
The MCP server will start on port 3927 by default. Ensure it's accessible at the URL configured in your .env file (default: http://localhost:3927/tableau-mcp).
Prerequisites: Ensure the Tableau MCP server is running
# Activate virtual environment (if not already active)
source .venv/bin/activate # macOS/Linux
# or
.venv\Scripts\activate # Windows
# Start the application
python web_app.py
Open your browser to http://localhost:8000 and start chatting with your data!
Option 1: Single EC2 Instance (Recommended for simpler deployments)
- Both the web application and MCP server run on the same EC2 instance
- Simpler setup and deployment
- Lower network latency (localhost communication)
- Single machine to manage
- Requires both Node.js and Python installed
Option 2: Separate Instances
- Web application and MCP server on different EC2 instances
- Better isolation and security boundaries
- Can scale components independently
- Requires network configuration between instances
- Install prerequisites:
# Install Node.js (required for MCP server if running on same instance)
curl -fsSL https://rpm.nodesource.com/setup_22.x | sudo bash -
sudo yum install -y nodejs
# Install Python 3.11+ (usually pre-installed on Amazon Linux 2023)
python3 --version
# If needed: sudo yum install -y python3.11 python3.11-pip python3.11-venv
- Set up Tableau MCP Server (if running on same instance):
# Clone and build MCP server
git clone https://github.com/tableau/tableau-mcp.git
cd tableau-mcp
# Set up MCP server environment (create .env file with Tableau credentials)
# Note: MCP server has its own configuration requirements
npm install
npm run build
- Create a dedicated user for the web application:
sudo useradd -m -s /bin/bash tabby-user
sudo su - tabby-user
- Clone and set up the web application:
git clone https://github.com/vizBrewer/tableau_mcp_tabby.git
cd tableau_mcp_tabby
python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt
cp .env_template .env
# Edit .env with your configuration
# If MCP server is on same instance, use: TABLEAU_MCP_HTTP_URL=http://localhost:3927/tableau-mcp
- Set up MCP server as a systemd service (if running on same instance):
Create /etc/systemd/system/tableau-mcp.service:
[Unit]
Description=Tableau MCP Server
After=network.target
[Service]
Type=simple
User=tabby-user
WorkingDirectory=/opt/tableau-mcp
ExecStart=/usr/bin/node build/index.js serve:http
Restart=always
Environment=NODE_ENV=production
[Install]
WantedBy=multi-user.target
Enable and start the MCP server:
sudo systemctl daemon-reload
sudo systemctl enable tableau-mcp
sudo systemctl start tableau-mcp
sudo systemctl status tableau-mcp
- Create systemd service file for web application:
Create /etc/systemd/system/tabby.service:
[Unit]
Description=Tableau Chatbot
After=network.target tableau-mcp.service
[Service]
User=tabby-user
WorkingDirectory=/home/tabby-user/tableau_mcp_tabby
ExecStart=/home/tabby-user/tableau_mcp_tabby/venv/bin/gunicorn \
--workers 1 \
--threads 4 \
--worker-class uvicorn.workers.UvicornWorker \
--bind 0.0.0.0:8000 \
--worker-connections 1000 \
web_app:app
Restart=always
[Install]
WantedBy=multi-user.target
- Enable and start the web application service:
sudo systemctl daemon-reload
sudo systemctl enable tabby
sudo systemctl start tabby
sudo systemctl status tabby
- View logs:
# Web application logs
sudo journalctl -u tabby -f
# MCP server logs (if running on same instance)
sudo journalctl -u tableau-mcp -f
Important Notes:
- The web application service uses `--workers 1` because session state is stored in-memory. For multi-worker support, you would need to implement shared session storage (Redis, SQLite, etc.); see the sketch after this list.
- If running both services on the same instance, ensure the MCP server starts before the web application (service dependency is configured in the systemd unit files).
- Adjust instance size based on expected load - both services running together will require more CPU and memory.
- If the MCP server is on a different instance, remove `tableau-mcp.service` from the `After=` line in `tabby.service` and configure the appropriate network URL in `.env`.
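For reference, shared session storage along the following lines would unlock multiple workers. This is a minimal sketch, not code from the repo; `save_session`/`load_session` are hypothetical helpers standing in for the in-memory dict, and it assumes the redis package and a local Redis instance:
import json
import redis

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

def save_session(session_id: str, state: dict, ttl_seconds: int = 3600) -> None:
    # Expire idle sessions automatically instead of growing forever
    r.set(f"session:{session_id}", json.dumps(state), ex=ttl_seconds)

def load_session(session_id: str) -> dict | None:
    raw = r.get(f"session:{session_id}")
    return json.loads(raw) if raw else None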
- Run the web app (above)
- Open Tableau Desktop/Server
- Create or open a dashboard
- Add an Extension object
- Choose "Local Extension" and select
dashboard_extension/tableau_langchain.trex
Try these natural language questions:
- "Show me the top 10 customers by sales"
- "What are the sales trends over the last 12 months?"
- "Which regions have negative profit?"
- "Compare Q1 vs Q2 performance"
- "Find outliers in the customer data"
The application uses Server-Sent Events (SSE) to stream AI agent thoughts (a minimal endpoint sketch follows this list):
- Backend: FastAPI with streaming endpoints
- Frontend: JavaScript EventSource for real-time updates
- Agent: LangGraph with custom streaming handlers
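In outline, the backend side looks roughly like the sketch below. It is a simplified stand-in for the real web_app.py: `run_agent_steps` is a hypothetical placeholder for the LangGraph agent stream, and only the `/chat/stream` path and the SSE framing reflect the actual app:
from fastapi import FastAPI
from fastapi.responses import StreamingResponse

app = FastAPI()

async def run_agent_steps(question: str):
    # Hypothetical stand-in for the LangGraph agent's step stream
    for step in ("thinking...", "querying Tableau...", "done"):
        yield step

@app.get("/chat/stream")
async def chat_stream(q: str):
    async def event_source():
        async for step in run_agent_steps(q):
            yield f"data: {step}\n\n"  # each SSE frame ends with a blank line
    return StreamingResponse(event_source(), media_type="text/event-stream")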
The application uses a flexible model provider system (utilities/model_provider.py) that makes it easy to add new LLM providers (a sketch follows this list):
- Extensible Design: Add new providers by implementing a `_get_<provider>_llm()` function
- Environment-based Configuration: All provider settings are managed via environment variables
- Lazy Loading: Provider libraries are only imported when that provider is selected
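For example, a hypothetical Ollama provider could be added with one function and a registry entry. A minimal sketch assuming the langchain-ollama package; everything here other than the `_get_<provider>_llm()` convention and the environment variable names is illustrative:
import os

def _get_ollama_llm():
    # Lazy import: the dependency is only needed when this provider is selected
    from langchain_ollama import ChatOllama
    return ChatOllama(
        model=os.getenv("MODEL_USED", "llama3"),
        temperature=float(os.getenv("MODEL_TEMPERATURE", "0")),
    )

_PROVIDERS = {"ollama": _get_ollama_llm}  # merged with the built-in openai/aws entries

def get_llm():
    return _PROVIDERS[os.getenv("MODEL_PROVIDER", "openai")]()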
Enhanced error handling for common Tableau MCP issues (a retry sketch follows this list):
- Schema validation errors
- Authentication timeouts (401 errors)
- Improved query parameter validation
- Provider-specific error messages
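The 401 case in particular benefits from a retry, since Connected Apps tokens are short-lived. A sketch of the idea only; `call_tool` is a hypothetical stand-in for whatever invokes the MCP tool, and the repo's actual handling may differ:
import httpx

async def call_with_reauth(call_tool, *args, retries: int = 1):
    """Retry once on 401 so an expired token triggers re-authentication."""
    for attempt in range(retries + 1):
        try:
            return await call_tool(*args)
        except httpx.HTTPStatusError as err:
            if err.response.status_code == 401 and attempt < retries:
                continue  # next call re-establishes the MCP session
            raise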
- Animated thinking indicators
- Cat favicon integration
- Responsive design with mobile support
- Smooth transitions between thinking steps and results
tableau_mcp_tabby/
├── web_app.py                  # Main FastAPI application
├── dashboard_app.py            # Dashboard extension version
├── static/                     # Frontend assets
│   ├── index.html              # Main UI
│   ├── script.js               # Streaming chat logic with SSE
│   ├── style.css               # Custom styling
│   └── favicon.ico             # Cat favicon 🐱
├── utilities/
│   ├── chat.py                 # Streaming response handlers
│   ├── prompt.py               # Agent system prompts and instructions
│   ├── model_provider.py       # LLM provider abstraction and initialization
│   └── logging_config.py       # Logging setup and configuration
├── dashboard_extension/        # Tableau extension files
│   └── tableau_langchain.trex  # Extension manifest
├── .env_template               # Environment variable template
├── requirements.txt            # Python dependencies
└── .logs/                      # Application logs (auto-created)
    ├── web_app.log             # Application logs
    └── agent_trace.jsonl       # Agent execution traces (if FileCallbackHandler enabled)
401 Authentication Errors (a quick reachability check follows this list):
- Verify the Tableau MCP server is running and accessible at the configured URL
- Check that the MCP server has valid Tableau credentials configured
- Ensure Direct Trust with Connected Apps authentication is properly set up
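If in doubt, a quick reachability probe rules out networking problems first. A throwaway check assuming the default URL from .env; any HTTP response at all, even an error status, means the server is listening:
import httpx

try:
    resp = httpx.get("http://localhost:3927/tableau-mcp", timeout=5)
    print(f"MCP server reachable, status {resp.status_code}")
except httpx.ConnectError:
    print("Nothing listening - is the MCP server running?")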
Model Provider Initialization Errors:
- OpenAI: Verify `OPENAI_API_KEY` is set and valid
- AWS Bedrock:
  - Verify `AWS_ACCESS_KEY_ID`, `AWS_SECRET_ACCESS_KEY`, and `AWS_REGION` are set
  - Ensure Bedrock is enabled in your AWS account for the specified region
  - Check that your AWS credentials have `bedrock:InvokeModel` permissions
  - Verify the model ID matches a Bedrock-available model in your region
Schema Validation Errors:
- The app includes improved error handling for invalid Tableau functions
- Check the logs in `.logs/web_app.log` for detailed error information
- Review the agent's query attempts in `.logs/agent_trace.jsonl` (if FileCallbackHandler enabled)
Streaming Not Working:
- Ensure you're using a modern browser with EventSource support
- Check browser console for JavaScript errors
- Verify the `/chat/stream` endpoint is accessible
- Check that the MCP server is running before starting the web app
Callback Handler Warnings:
- If you see `FileCallbackHandler without a context manager` warnings, ensure you're using the latest version
- The application now properly manages callback handlers as context managers
Service Won't Start (systemd):
- Verify the `tabby-user` exists and has correct permissions
- Check that the virtual environment path is correct in `tabby.service`
- Verify the working directory path matches your deployment location
Contributions welcome! This project builds on the excellent foundation from The Information Lab. Please:
- Fork the repository
- Create a feature branch
- Make your changes
- Add tests if applicable
- Submit a pull request
MIT License - see LICENSE file for details.
- Will Sutton and The Information Lab for the original tableau_mcp_starter_kit
- Tableau MCP Team for the core MCP implementation
- LangChain for the AI framework
- Tableau for the analytics platform
- Tableau MCP - Core MCP server
- Original Starter Kit - Foundation project
- Tableau MCP Experimental - Advanced MCP tools
⭐ If this project helps you analyze data more effectively, please give it a star!
Made with 🐱 and ❤️ for the Tableau community
