mAIdAI is a personal open-source chatbot designed to be your smart assistant within Google Chat.
It leverages Google's Generative AI (Gemini via Vertex AI or Google AI Studio) to understand your specific context and automate responses.
_Screenshots: the bot's About card, a message response, and a slash command in Google Chat._
The primary goal of mAIdAI is to provide a context-aware conversational agent that knows about your projects, preferences, and documentation. It can:
- Automate common answers: shortcuts to links, standard procedures, or FAQs using a simple command system.
- Provide contextual assistance: answer complex questions by relying on a custom "context" file (e.g., your personal docs, project specs).
- Integrate deeply: works directly within Google Chat as a bot.
mAIdAI is a Python application built with FastAPI. It functions as a webhook service for Google Chat.
- Context: The bot is initialized with a `context.md` file (System Instructions), giving it a persona and knowledge base.
- Commands: It checks incoming messages against a `commands.json` file for exact-match shortcuts.
- GenAI: If no command matches, it sends the user's prompt to a Google Gemini model to generate a helpful response based on the provided context (see the sketch after the file overview below).

The repository is organized as follows:

- `main.py`: The core application logic (FastAPI app, GenAI client).
- `context.md`: The "memory" of the AI. A Markdown file defining system instructions and knowledge.
- `commands.json`: Key-value pairs for instant, deterministic prompts and responses (shortcuts).
- `justfile`: Task runner for setup, testing, running, and deployment.
- `pyproject.toml`: Project configuration and dependency management (using `uv`).
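To make the flow concrete, here is a minimal sketch of what the webhook logic could look like, assuming FastAPI and the `google-genai` SDK. The endpoint path, handler name, and event parsing below are illustrative assumptions and may differ from the actual `main.py`:

```python
# Minimal sketch of the webhook flow (illustrative, not the actual implementation).
import json
import os
from pathlib import Path

from fastapi import FastAPI, Request
from google import genai
from google.genai import types

app = FastAPI()
client = genai.Client()  # reads Vertex AI / API key settings from the environment
COMMANDS = json.loads(Path("commands.json").read_text())
SYSTEM_INSTRUCTIONS = Path("context.md").read_text()


@app.post("/")
async def handle_chat_event(request: Request) -> dict:
    """Google Chat sends MESSAGE events as JSON; reply with a simple text message."""
    event = await request.json()
    prompt = event.get("message", {}).get("text", "").strip()

    # 1. Deterministic shortcut: exact match against commands.json, no model call.
    if prompt in COMMANDS:
        return {"text": COMMANDS[prompt]}

    # 2. Fallback: ask Gemini, using context.md as system instructions.
    response = client.models.generate_content(
        model=os.getenv("MODEL_NAME", "gemini-2.0-flash"),
        contents=prompt,
        config=types.GenerateContentConfig(system_instruction=SYSTEM_INSTRUCTIONS),
    )
    return {"text": response.text}
```

The key design point: shortcut lookups are instant and never hit the model, while everything else is answered by Gemini grounded in `context.md`.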
Before you begin, ensure you have the following installed:
- Python 3.13+
- uv: An extremely fast Python package installer and resolver.
- just: A handy command runner.
- Google Cloud SDK: Required for authentication and deployment to Google Cloud.
Create a `.env` file in the root directory. You can use the following template:

```
# Google Cloud Configuration
GOOGLE_CLOUD_PROJECT="your-project-id"
GOOGLE_CLOUD_LOCATION="us-central1"
GOOGLE_GENAI_USE_VERTEXAI="true" # Set to false if using API Key / Google AI Studio

# Model Configuration
MODEL_NAME="gemini-2.0-flash"
LOGGING_LEVEL="INFO"
```

For authentication:
- If using Vertex AI, run `just auth` to authenticate locally with your Google credentials.
- If using Google AI Studio (API key), you might need to adapt the client initialization in `main.py` or set the standard authentication environment variables, as sketched below.
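As a rough illustration (not the actual `main.py` code), the client could be initialized for either backend along these lines; `GOOGLE_API_KEY` is an assumed variable name for the Google AI Studio case:

```python
import os

from google import genai

# Assumption: branch on the same GOOGLE_GENAI_USE_VERTEXAI flag used in .env.
if os.getenv("GOOGLE_GENAI_USE_VERTEXAI", "true").lower() == "true":
    # Vertex AI: uses Application Default Credentials set up via `just auth`.
    client = genai.Client(
        vertexai=True,
        project=os.environ["GOOGLE_CLOUD_PROJECT"],
        location=os.getenv("GOOGLE_CLOUD_LOCATION", "us-central1"),
    )
else:
    # Google AI Studio: authenticate with an API key instead.
    client = genai.Client(api_key=os.environ["GOOGLE_API_KEY"])
```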
To customize the bot:

- `context.md`: Write your specific context here. Tell the AI who it is, what it knows, and how it should behave.
- `commands.json`: Add key-value shortcuts (see `commands.sample.json` for an example):

  ```json
  {
    "1": "Here is the link to the documentation: https://...",
    "2": "I can help you with..."
  }
  ```
Install Python dependencies:

```sh
just install
```

Run linting and formatting checks:

```sh
just check
```

The project is configured to deploy easily to Google Cloud Run.
- Prepare Production Config: Create a `.env.prod.yaml` file for Cloud Run environment variables. Note that this must be a YAML file, unlike the local `.env`:

  ```yaml
  GOOGLE_CLOUD_PROJECT: "your-project-id"
  GOOGLE_CLOUD_LOCATION: "us-central1"
  GOOGLE_GENAI_USE_VERTEXAI: "true"
  MODEL_NAME: "gemini-2.0-flash-exp"
  LOGGING_LEVEL: "INFO"
  ```

- Authenticate: Ensure you have authenticated with gcloud:

  ```sh
  just auth
  ```

- Deploy:

  ```sh
  just deploy
  ```

  This command will:

  - Check code quality.
  - Build and deploy the source code to Cloud Run.
  - Configure the service with the necessary environment variables (ensure the reference to `.env.prod.yaml` in the `justfile` exists or is configured correctly).
Once your service is deployed and has a public URL (e.g., https://maidai-xyz.a.run.app):
- Go to the Google Cloud Console > APIs & Services.
- Enable the Google Chat API.
- Go to the Google Chat API configuration page.
- Set up the App URL to point to your deployed Cloud Run endpoint.
- In the Connection settings, select "HTTP Endpoint".
- Configure permissions and visibility (who can call the bot).
- Configure the commands (each number is associated with a shortcut).
If your Cloud Run service requires authentication, grant the Google Chat service account permission to invoke it.
```sh
gcloud run services add-iam-policy-binding [SERVICE_NAME] \
  --member="serviceAccount:[SERVICE_ACCOUNT_EMAIL]" \
  --role="roles/run.invoker" \
  --region [REGION] \
  --project [PROJECT_ID]
```

Now you can mention @mAIdAI in your Google Chat spaces or DM it directly!


