Emrys73/BotGeneratorMarkI

MCP Telegram Bot Generator

Create custom Telegram bots through natural conversation, powered by AI.

Turn your ideas into fully functional Telegram bots in minutes - just describe what you want in plain English, and our AI will generate production-ready code for you!


Features

Core Capabilities

  • Conversational Bot Creation - Describe your bot in natural language
  • AI-Powered Code Generation - Real, production-ready code (not templates!)
  • Multiple Bot Types - Expense trackers, quizzes, reminders, and more
  • Instant Deployment - Get a ready-to-run ZIP file
  • Iterative Refinement - Request changes and improvements

Technical Features

  • LLM-Powered - Uses Ollama (local), Gemini, OpenAI, or Claude
  • Modern Framework - Built with aiogram 3.x
  • FSM State Management - Proper conversation flow handling
  • Input Validation - Secure string sanitization and validation
  • Error Recovery - Robust partial-success handling
  • MCP Architecture - Modular microservice design

Bot Types Supported

  • Expense Trackers - With categories, stats, CSV export
  • Todo Lists - Task management with priorities
  • Quiz Bots - Trivia games with scoring
  • Reminder Bots - Schedule notifications
  • Echo Bots - Simple message responses
  • Custom Bots - Anything you can describe!

Quick Start

Prerequisites

  • Python 3.10+
  • uv package manager
  • Ollama (for local LLM) OR API key for Gemini/OpenAI

Installation

1. Clone the repository:

git clone <repository-url>
cd mcp-telegram-bot-v1

2. Install dependencies:

uv sync

3. Set up Ollama (Recommended - Free & Unlimited!):

# Install Ollama (macOS)
brew install ollama

# Pull the code generation model
ollama pull qwen2.5-coder:latest

# Ollama will auto-start in the background

Alternative: Use Gemini or OpenAI instead (see Configuration below)

4. Configure environment:

cp .env.example .env
# Edit .env with your settings

Required configuration:

# Your BotGenerator bot token from @BotFather
BOT_TOKEN=your_telegram_bot_token

# LLM Configuration (Ollama - Local & Free!)
LLM_PROVIDER=ollama
LLM_MODEL=qwen2.5-coder:latest

# Code generation LLM
CODEGEN_LLM_PROVIDER=ollama
CODEGEN_LLM_MODEL=qwen2.5-coder:latest

5. Run the bot:

cd bot_generator
uv run python main.py

6. Start creating bots!

  • Open Telegram
  • Find your bot
  • Send /create
  • Follow the conversation!

Usage Example

Conversation with BotGenerator:

You: /create

Bot: Let's create your Telegram bot!
     What kind of bot would you like to build?

You: I want an expense tracking bot with categories

Bot: Great! A few questions:
     1. Should users be able to export data (like CSV)?
     2. Do you need statistics/reports?
     3. Any specific commands you want?

You: Yes to export, yes to stats, and I want /add, /list, and /stats commands

Bot: Perfect! Here's what I understand:
     - Expense tracker bot
     - Category support
     - CSV export
     - Statistics/reports
     - Commands: /add, /list, /stats
     
     Does this sound right?

You: Yes!

Bot: Excellent! Now I need your bot token from @BotFather.
     (If you don't have one, message @BotFather and create a new bot)

You: 123456789:ABCdefGHIjklMNOpqrsTUVwxyz123456789

Bot: Token validated for @my_expense_bot!
     
     Generating your bot now...
     This will take about 10-15 seconds. I'm:
     1. Analyzing your requirements
     2. Designing the architecture
     3. Generating code files
     4. Validating everything
     
     Please wait...

Bot: Your bot is ready!
     
     Package: expense_tracker_bot.zip (12 files generated)
     
     [ZIP file attachment]
     
     **Next steps:**
     1. Extract the ZIP file
     2. Install: pip install -r requirements.txt
     3. Run: python main.py
     
     Your bot will be live! Need any changes?

Architecture

System Overview

┌─────────────────────────────────────────────────┐
│          User (Telegram)                        │
└───────────────────┬─────────────────────────────┘
                    │
┌───────────────────▼─────────────────────────────┐
│     BotGenerator (Main Application)             │
│   ┌─────────────────────────────────────────┐   │
│   │  Conversational Orchestrator (Brain)    │   │
│   │  • LLM-powered conversation             │   │
│   │  • State management (FSM)               │   │
│   │  • Workflow coordination                │   │
│   └───────────────┬─────────────────────────┘   │
└───────────────────┼─────────────────────────────┘
                    │
        ┌───────────┼───────────┐
        │           │           │
┌───────▼────┐ ┌───▼────┐ ┌───▼─────────┐
│ LLM Service│ │MCP     │ │ Session     │
│            │ │Client  │ │ Manager     │
│ Ollama/    │ │        │ │             │
│ Gemini/    │ │        │ │             │
│ OpenAI     │ │        │ │             │
└────────────┘ └───┬────┘ └─────────────┘
                   │
    ┌──────────────┼──────────────┐
    │              │              │
┌───▼───┐  ┌──────▼─────┐  ┌────▼────┐
│Parser │  │Architecture│  │Generator│
│       │  │  Designer  │  │(LLM-    │
│       │  │            │  │powered) │
└───────┘  └────────────┘  └─────────┘

MCP Servers (Microservices)

  1. Parser - Extract structured intent from natural language
  2. Architecture Designer - Design bot architecture
  3. Template Library - Find matching code patterns
  4. Code Generator - Generate production code with LLM
  5. Validator - Check syntax, logic, and security

Detailed Architecture: See Architecture Guide
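The five servers form a pipeline: intent in, validated files out. A hypothetical end-to-end sketch of that flow is below — the function names and dict fields are illustrative stand-ins, not the repo's actual MCP interfaces, and the LLM call is replaced by a stub:

```python
import ast

def parse_intent(description: str) -> dict:
    # Stand-in for the Parser server: extract structured intent.
    return {"bot_type": "expense_tracker", "description": description}

def design_architecture(intent: dict) -> dict:
    # Stand-in for the Architecture Designer: plan the file layout.
    return {"files": ["main.py", "handlers/add.py"], **intent}

def generate_code(design: dict) -> dict[str, str]:
    # Stand-in for the LLM-powered Generator: emit a stub per planned file.
    return {path: f"# generated for {design['bot_type']}\n" for path in design["files"]}

def validate(files: dict[str, str]) -> bool:
    # Syntax gate only; the real Validator also checks logic and security.
    for source in files.values():
        try:
            ast.parse(source)
        except SyntaxError:
            return False
    return True

files = generate_code(design_architecture(parse_intent("expense bot with categories")))
assert validate(files)
```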


Configuration

LLM Providers

Ollama (Recommended - Local & Free)

LLM_PROVIDER=ollama
LLM_MODEL=qwen2.5-coder:latest
CODEGEN_LLM_PROVIDER=ollama
CODEGEN_LLM_MODEL=qwen2.5-coder:latest

Advantages:

  • Unlimited usage
  • No API costs
  • Fast inference
  • Privacy (runs locally)
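Under the hood, the Ollama provider talks to a local HTTP API (`/api/generate` on port 11434 by default). A minimal sketch of assembling such a call with only the standard library — the helper `build_request` is illustrative; the project's `llm_service.py` may use a client library instead:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build (but do not send) a request to Ollama's /api/generate endpoint."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False})
    return urllib.request.Request(
        OLLAMA_URL,
        data=payload.encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

req = build_request("qwen2.5-coder:latest", "Write a /start handler")
# urllib.request.urlopen(req) would send it -- requires a running Ollama server.
```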

Google Gemini

LLM_PROVIDER=gemini
LLM_API_KEY=your_gemini_api_key
LLM_MODEL=gemini-2.5-flash
CODEGEN_LLM_PROVIDER=gemini
CODEGEN_LLM_API_KEY=your_gemini_api_key
CODEGEN_LLM_MODEL=gemini-2.5-flash

Limits: 20 requests/day (free tier)

OpenAI

LLM_PROVIDER=openai
LLM_API_KEY=sk-proj-...
LLM_MODEL=gpt-4o-mini
CODEGEN_LLM_PROVIDER=openai
CODEGEN_LLM_API_KEY=sk-proj-...
CODEGEN_LLM_MODEL=gpt-4o

Note: Requires paid API access


Project Structure

mcp-telegram-bot-v1/
├── bot_generator/              # Main application
│   ├── main.py                # Entry point
│   ├── handlers/              # Telegram message handlers
│   │   ├── start.py          # /start, /create commands
│   │   └── messages.py       # Message processing
│   ├── services/             # Business logic
│   │   ├── conversational_orchestrator.py  # AI conversation brain
│   │   ├── generator_service.py            # Bot generation workflow
│   │   ├── llm_service.py                  # LLM provider abstraction
│   │   └── session_manager.py              # User session management
│   ├── models/               # Data models
│   └── utils/                # Utilities
│       ├── mcp_client.py    # MCP server communication
│       └── states.py        # FSM states
│
├── mcp_servers/              # MCP microservices
│   ├── parser/              # Intent extraction
│   ├── architecture/        # Architecture design
│   ├── templates_lib/       # Code templates
│   ├── generator/           # Code generation
│   │   ├── llm_codegen.py  # LLM-powered code gen
│   │   └── prompts/        # Generation prompts
│   └── validator/           # Code validation
│
├── shared/                   # Shared utilities
│   ├── models.py            # Shared data models
│   ├── utils.py             # Helper functions
│   └── validation.py        # Input validation & sanitization
│
├── .env                      # Configuration (create from .env.example)
└── requirements.txt          # Python dependencies

Security Features

Input Validation

  • Prompt sanitization - Prevents DoS attacks
  • Bot name sanitization - Prevents path traversal
  • File name validation - Secure ZIP creation
  • Token validation - Verifies with Telegram API
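The path-traversal defense boils down to rejecting anything in a bot name that could escape the output directory. A minimal sketch, assuming an allow-list approach (the function `sanitize_bot_name` is illustrative; see `shared/validation.py` for the project's actual implementation):

```python
import re

def sanitize_bot_name(name: str, max_len: int = 64) -> str:
    """Keep only filesystem-safe characters so the name cannot
    escape the output directory (e.g. via ../ or absolute paths)."""
    name = name.strip()
    name = re.sub(r"[^A-Za-z0-9_-]", "_", name)  # replace /, .., spaces, etc.
    name = name.strip("._") or "bot"             # never return an empty name
    return name[:max_len]

print(sanitize_bot_name("../../etc/passwd"))  # etc_passwd
```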

Code Quality

  • Syntax validation - Python syntax checking
  • Security scanning - Vulnerability detection
  • Logic testing - Bot functionality verification
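Both syntax checking and basic vulnerability detection can be built on Python's `ast` module. A simplified sketch of what the Validator server might do (helper names are illustrative, and a real scanner would cover far more than `eval`/`exec`):

```python
import ast

def check_syntax(source: str) -> list[str]:
    """Return a list of syntax errors (empty list means the file parses)."""
    try:
        ast.parse(source)
    except SyntaxError as exc:
        return [f"line {exc.lineno}: {exc.msg}"]
    return []

def find_dangerous_calls(source: str) -> list[str]:
    """Naive security scan: flag direct calls to eval/exec."""
    hits = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Call) and isinstance(node.func, ast.Name):
            if node.func.id in {"eval", "exec"}:
                hits.append(node.func.id)
    return hits

assert check_syntax("x = 1") == []
assert find_dangerous_calls("eval('2+2')") == ["eval"]
```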

Development

Running Tests

# Test suite is still in progress (see Roadmap); once added:
pytest tests/

Code Quality

# Format code
black bot_generator/ mcp_servers/ shared/

# Lint
flake8 bot_generator/ mcp_servers/ shared/

# Type checking
mypy bot_generator/ mcp_servers/ shared/

Debug Mode

Enable detailed logging:

# In main.py
logging.basicConfig(level=logging.DEBUG)

Generated Bot Structure

When you create a bot, you'll receive a ZIP file with this structure:

your_bot_name/
├── main.py                    # Bot entry point
├── .env                       # Configuration (with your token)
├── requirements.txt           # Dependencies
├── handlers/                  # Command handlers
│   ├── add.py                # /add command (with FSM)
│   ├── list.py               # /list command
│   └── stats.py              # /stats command
├── services/                  # Business logic
│   └── expense_service.py    # Data management
├── models/                    # Data models
│   └── expense.py            # Expense dataclass
└── data/                      # Storage directory
    └── expenses.json         # Data file

Generated bots include:

  • Complete FSM state management
  • Input validation
  • Error handling
  • Database integration
  • Logging
  • Professional code structure
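As a flavor of the generated code, here is what a `models/expense.py` plus JSON persistence could look like — an illustrative sketch only; the actual generated files depend on your description and the LLM's output:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class Expense:
    """Illustrative model; the generated models/expense.py may differ."""
    amount: float
    category: str
    note: str = ""

def to_json(expenses: list[Expense]) -> str:
    """Serialize for storage in data/expenses.json."""
    return json.dumps([asdict(e) for e in expenses], indent=2)

def from_json(data: str) -> list[Expense]:
    return [Expense(**item) for item in json.loads(data)]

records = [Expense(12.5, "food", "lunch"), Expense(40.0, "transport")]
assert from_json(to_json(records)) == records  # lossless round-trip
```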

Examples

Expense Tracker Bot Features

# Generated handler includes:
- FSM states (AddExpense.amount, AddExpense.category, etc.)
- Input validation (amount must be numeric)
- Category selection with inline keyboard
- Error messages in user-friendly format
- Database persistence
- Logging for debugging
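The FSM flow above can be sketched framework-agnostically: each incoming message is handled according to the current state, and invalid input keeps the user in place. This stand-in uses a plain `Enum` instead of aiogram's `StatesGroup`, and all names are illustrative:

```python
from enum import Enum, auto

class AddExpense(Enum):
    """Framework-agnostic stand-in for aiogram's StatesGroup."""
    AMOUNT = auto()
    CATEGORY = auto()
    DONE = auto()

def handle(state: AddExpense, text: str, draft: dict):
    """Advance the /add conversation one step; returns (new_state, reply)."""
    if state is AddExpense.AMOUNT:
        try:
            draft["amount"] = float(text)
        except ValueError:
            return state, "Amount must be numeric, try again."  # stay in state
        return AddExpense.CATEGORY, "Pick a category:"
    if state is AddExpense.CATEGORY:
        draft["category"] = text
        return AddExpense.DONE, f"Saved {draft['amount']:.2f} under {text}."
    return state, "Nothing to do."

draft = {}
state, reply = handle(AddExpense.AMOUNT, "12.5", draft)
state, reply = handle(state, "food", draft)
print(reply)  # Saved 12.50 under food.
```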

Quiz Bot Features

# Generated features:
- Question database management
- Score tracking
- Timer functionality
- Leaderboard
- Multiple choice questions
- Answer validation

Troubleshooting

Ollama not responding

# Check if Ollama is running
ollama ps

# Restart Ollama
ollama serve

Bot token invalid

# Get a new token from @BotFather
# Make sure to copy the entire token (numbers:letters)
# Token format: 123456789:ABCdefGHIjklMNOpqrsTUVwxyz
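A quick local shape check can catch truncated tokens before the real validation runs. This is a rough sketch only — the generator still verifies the token against the Telegram API (as listed under Security Features), and the exact regex is illustrative:

```python
import re

# Rough shape: digits, a colon, then a long run of base64-url-ish characters.
TOKEN_RE = re.compile(r"^\d+:[A-Za-z0-9_-]{25,}$")

def looks_like_token(token: str) -> bool:
    """Local sanity check only; real validation asks the Telegram API."""
    return bool(TOKEN_RE.match(token.strip()))

assert looks_like_token("123456789:ABCdefGHIjklMNOpqrsTUVwxyz")
assert not looks_like_token("123456789:ABC")  # truncated paste
```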

Import errors

# Reinstall dependencies
uv sync --reinstall

# Or use pip
pip install -r requirements.txt

LLM generation fails

# Check your .env configuration
# Verify API keys are correct
# Try switching providers (Ollama is most reliable)

Roadmap

Completed

  • LLM-powered code generation
  • Multi-provider LLM support (Ollama, Gemini, OpenAI)
  • Conversational bot creation
  • Input validation and security
  • Token validation with Telegram API
  • Error recovery with partial success
  • FSM state management

In Progress

  • Comprehensive test suite
  • Rate limiting for LLM calls
  • Metrics and monitoring
  • True MCP protocol implementation

Planned

  • Web interface
  • Bot template marketplace
  • Code iteration and refinement
  • Multi-language support
  • Bot hosting service
  • Visual bot builder

Contributing

Contributions welcome! Please:

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes
  4. Add tests if applicable
  5. Submit a pull request

Areas needing help:

  • Additional bot type templates
  • LLM prompt optimization
  • Test coverage
  • Documentation improvements
  • Bug fixes

License

[Add your license here]


Acknowledgments

  • aiogram - Modern Telegram Bot framework
  • Ollama - Local LLM inference
  • Google Gemini - AI model
  • OpenAI - GPT models
  • MCP Protocol - Microservice architecture inspiration


Star History

If you find this project useful, please consider giving it a star!


Made with care by [Your Name]

Empowering everyone to create Telegram bots, no coding required!

About

Generates Telegram bot Python files from a natural-language prompt.
