A Python-based Telegram bot that uses the Groq API to provide intelligent, context-aware responses. The bot acts as a "Mad Dog" persona (or any persona you define), using a local text file to guide its behavior and knowledge.
- **LLM Integration**: Powered by Groq for lightning-fast inference using models like Llama 3 or Mixtral.
- **Contextual Memory**: Loads a `SUMMARY.txt` file to act as the system prompt, giving the bot a consistent personality or knowledge base.
- **Custom Commands**:
  - `/bhaw`: Sends a dog photo and a playful LLM-generated caption.
  - `/biscuits`: A quick way to ask the bot about its favorite treats.
  - `/bones`: Interactive response about bones.
- **Group Chat Ready**: Responds to direct messages or mentions in group chats.
- **Asynchronous**: Built with `python-telegram-bot` and `httpx` for efficient, non-blocking performance.
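The payoff of the async design is that a slow LLM call for one chat never blocks replies in another. A minimal, library-free sketch of that pattern (here `fake_llm_call` is a hypothetical stand-in for the real Groq request made with `httpx`):

```python
import asyncio

async def fake_llm_call(text):
    """Stand-in for the Groq request; the sleep simulates network latency."""
    await asyncio.sleep(0.1)
    return f"Mad Dog says: {text}!"

async def handle_message(text):
    # Each incoming message is handled as its own coroutine.
    return await fake_llm_call(text)

async def main():
    # Two chats handled concurrently: total wait is ~0.1 s, not 0.2 s.
    return await asyncio.gather(
        handle_message("biscuits"),
        handle_message("bones"),
    )
```

In the real bot, `python-telegram-bot` schedules each handler coroutine the same way, so the event loop stays responsive while `httpx` awaits the API response.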
- Python 3.10+
- A Telegram Bot Token (from @BotFather)
- A Groq API Key (from Groq Console)
Clone the repository and install the required dependencies:
```shell
pip install python-telegram-bot httpx python-dotenv
```
Create a `.env` file in the root directory and add your credentials:

```
TELEGRAM_TOKEN=your_telegram_bot_token_here
GROQ_API_KEY=your_groq_api_key_here
GROQ_MODEL=llama3-8b-8192
```
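At startup the bot reads these values into Python. A minimal sketch of how that works, with a hypothetical `load_env` helper mirroring what `python-dotenv`'s `load_dotenv()` does (parse `KEY=VALUE` lines into the environment):

```python
import os

def load_env(path=".env"):
    """Hypothetical stand-in for dotenv.load_dotenv: parse KEY=VALUE lines."""
    if not os.path.exists(path):
        return
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            # setdefault: real environment variables win over .env entries
            os.environ.setdefault(key.strip(), value.strip())

load_env()
TELEGRAM_TOKEN = os.environ.get("TELEGRAM_TOKEN")
GROQ_API_KEY = os.environ.get("GROQ_API_KEY")
GROQ_MODEL = os.environ.get("GROQ_MODEL", "llama3-8b-8192")  # default model
```

In practice you would simply call `load_dotenv()` from `python-dotenv`; the sketch just makes the mechanism explicit.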
Create a file named `SUMMARY.txt` in the same directory. Write a description of how the bot should behave. For example:

> You are a helpful but slightly chaotic dog named Mad Dog. You love treats, hate mailmen, and speak in a friendly, energetic tone. Use dog puns often.
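The contents of `SUMMARY.txt` are sent as the `system` message on every request, which is what keeps the persona consistent. A sketch of how the request body can be assembled (the endpoint URL follows Groq's OpenAI-compatible API; `load_persona` and `build_payload` are illustrative names, not necessarily what `bot.py` uses):

```python
# Groq exposes an OpenAI-compatible chat-completions endpoint (assumption:
# URL as documented by Groq at the time of writing).
GROQ_URL = "https://api.groq.com/openai/v1/chat/completions"

def load_persona(path="SUMMARY.txt"):
    """Read the persona file; fall back to a neutral prompt if it's missing."""
    try:
        with open(path, encoding="utf-8") as fh:
            return fh.read().strip()
    except FileNotFoundError:
        return "You are a helpful assistant."

def build_payload(user_message, model="llama3-8b-8192"):
    """Build the JSON body: persona as system prompt, then the user's text."""
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": load_persona()},
            {"role": "user", "content": user_message},
        ],
    }
```

The payload is then POSTed to `GROQ_URL` with `httpx`, passing the API key in an `Authorization: Bearer ...` header.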
```shell
python bot.py
```
| File | Description |
|---|---|
| `bot.py` | The main application logic and Telegram handlers. |
| `SUMMARY.txt` | The context/personality file used by the LLM. |
| `.env` | (Hidden) Stores sensitive API keys and configuration. |
- **Texting the Bot**: Simply send any message to get an LLM-generated response based on your `SUMMARY.txt`.
- `/bhaw`: Triggers a photo response plus text.
- `/biscuits`: Ask about snacks.
- `/bones`: Talk about bones.
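Conceptually, the bot routes known commands to canned handlers and sends everything else to the LLM. A toy dispatch sketch (the command table and `dispatch` function are hypothetical; in `bot.py` these would be registered as `CommandHandler`s with `python-telegram-bot`):

```python
# Hypothetical command table; replies here are illustrative, not the bot's.
COMMAND_REPLIES = {
    "/biscuits": "Biscuits?! My favourite! *tail wags furiously*",
    "/bones": "Bones are the best chew toys. Got a spare one?",
}

def dispatch(text):
    """Return a canned reply for a known command, else None (LLM fallback)."""
    command = text.split()[0] if text else ""
    return COMMAND_REPLIES.get(command)
```

A `None` result means the message falls through to the Groq-backed text handler.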
- **API Costs/Limits**: Be mindful of your Groq API usage limits.
- **Security**: Never commit your `.env` file to version control (GitHub/GitLab).
- **Model Names**: Ensure the `GROQ_MODEL` in your `.env` matches a valid model ID from the Groq documentation (e.g., `llama3-70b-8192`).
Looking for a way to contribute? Check out our Issues labeled good first issue.
- Fork the Project
- Create your Feature Branch
- Commit your Changes
- Open a Pull Request
This was just a fun project, so a "memory" feature was never added: the bot does not remember the last few messages of a conversation.