TigerGPT is a web-based AI chatbot application that supports multiple LLMs (Large Language Models) and MCP (Model Context Protocol) servers. It provides a user-friendly interface for interacting with AI models and for configuring the system.
- Chat with AI language models (OpenAI or Azure OpenAI)
- Support for MCP servers for extending AI capabilities
- Streamlit-based UI for easy interaction
- FastAPI backend for robust API support
- Configurable LLM settings
- Conversation management
- Python 3.12 or later
- Required Python libraries (listed in requirements.txt)
- Access to OpenAI or Azure OpenAI services
- Clone the repository:
  git clone https://github.com/yourusername/tigergpt.git
  cd tigergpt
- Create a virtual environment:
  python -m venv .venv
- Activate the virtual environment:
  - Windows:
    .venv\Scripts\activate
  - Linux/Mac:
    source .venv/bin/activate
- Install dependencies:
  pip install -r requirements.txt
- Configure environment variables (optional):
  # For Azure OpenAI
  set AZURE_OPENAI_API_KEY=your_api_key
  set AZURE_OPENAI_ENDPOINT=https://your-endpoint.openai.azure.com
  set AZURE_OPENAI_API_VERSION=2024-09-01-preview
  set AZURE_OPENAI_DEPLOYMENT=your-deployment-name
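The `set` commands above are Windows syntax. On Linux/macOS, the equivalent (using the same variable names and placeholder values) would be:

```shell
# Azure OpenAI configuration -- Linux/macOS equivalent of the Windows `set` commands
export AZURE_OPENAI_API_KEY=your_api_key
export AZURE_OPENAI_ENDPOINT=https://your-endpoint.openai.azure.com
export AZURE_OPENAI_API_VERSION=2024-09-01-preview
export AZURE_OPENAI_DEPLOYMENT=your-deployment-name
```

These persist only for the current shell session; add them to your shell profile (e.g. `~/.bashrc`) to make them permanent.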
Run the FastAPI backend server:
python main.py

The backend will start on http://localhost:8000. You can access the API documentation at http://localhost:8000/docs.
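Besides the interactive docs page, FastAPI serves its raw OpenAPI schema at /openapi.json, which is handy for a scripted health check. A minimal sketch using only the standard library (assumes the default port above; the server must already be running for a successful check):

```python
import json
import urllib.error
import urllib.request


def check_backend(base_url="http://localhost:8000", timeout=2.0):
    """Fetch the FastAPI-generated OpenAPI schema to confirm the backend is up.

    Returns the parsed schema as a dict, or None if the server is unreachable.
    """
    try:
        with urllib.request.urlopen(f"{base_url}/openapi.json", timeout=timeout) as resp:
            return json.loads(resp.read())
    except (urllib.error.URLError, OSError):
        # Connection refused, DNS failure, or timeout: backend is not reachable.
        return None


if __name__ == "__main__":
    schema = check_backend()
    if schema is None:
        print("Backend not reachable -- is `python main.py` running?")
    else:
        print("Backend up:", schema.get("info", {}).get("title"))
```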
In a separate terminal, run the Streamlit UI:
cd ui
streamlit run streamlit_app.py

The UI will be available at http://localhost:8501.
- Navigate to the Streamlit UI at http://localhost:8501
- Use the "Chat" tab to interact with the AI
- Type your message in the input box and press Enter
- View the AI's response in the chat window
- Go to the "LLM Config" tab in the UI
- Select the provider (OpenAI or Azure)