This guide provides step-by-step instructions for installing and running the Bahmni AI Assistant module integrated with Bahmni EMR using the Groq LLM API.
- Prerequisites
- Getting a Groq API Key
- Installation
- Configuration
- Starting Bahmni
- Verifying the Installation
- Accessing the AI Assistant
- Troubleshooting
- Updating & Maintaining
- Uninstalling
Before starting, verify you have:
- Docker - Container runtime (version 20.10+)
- Docker Compose - Container orchestration (version 1.29+)
To verify installation:
```shell
docker --version          # Should show version >= 20.10
docker-compose --version  # Should show version >= 1.29
```

Your project must have both directories at the same level:
```
your-project/
├── bahmni-docker/              # Bahmni Docker Compose setup
│   ├── bahmni-standard/        # Standard installation (or bahmni-lite)
│   │   ├── docker-compose.yml
│   │   ├── .env
│   │   └── run-bahmni.sh
│   └── ...
├── ai-module/                  # AI Assistant module
│   ├── setup-ai-integration.sh
│   ├── install.sh
│   ├── verify-ai-integration.sh
│   ├── config/
│   ├── files/
│   └── ...
```
You need a free Groq API key to use the AI Assistant. See Getting a Groq API Key below.
The AI Assistant uses the Groq Cloud API for AI inference. Follow these steps to get a free key:
- Visit https://console.groq.com
- Click Sign Up (or Sign In if you have an account)
- Complete the registration with your email
- Verify your email address
- In the Groq console, navigate to API Keys (usually in sidebar menu)
- Click Create New API Key
- Give it a name like "Bahmni-AI" for easy identification
- Click Generate or Create
- Copy the key immediately - you won't be able to see it again
The key will look like:
```
gsk_aBcDeFgHiJkLmNoPqRsT123uVwXyZ456
```
- Never commit this key to Git - it gives access to your Groq account
- Store it securely in your `.env` file only
- If you accidentally expose it, regenerate a new one in the Groq console
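A quick shape check can catch truncated or mangled pastes before they reach `.env`. This is a hypothetical sketch: `looks_like_groq_key` is not part of the module, and it does not verify the key with Groq.

```shell
# Hypothetical helper: rough format check — keys from the console start with
# "gsk_" followed by alphanumerics. This does NOT validate the key against
# the API; it only catches copy-paste accidents such as truncation.
looks_like_groq_key() {
  case "$1" in
    gsk_[A-Za-z0-9]*) return 0 ;;
    *) return 1 ;;
  esac
}

if looks_like_groq_key "gsk_aBcDeFgHiJkLmNoPqRsT123uVwXyZ456"; then
  echo "format looks OK"
fi
```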
Follow these steps in order. Run all commands from your project root directory.
```shell
cd ai-module
```

For Bahmni Standard (recommended for full features):

```shell
./setup-ai-integration.sh install --bahmni-standard
```

For Bahmni Lite only:

```shell
./setup-ai-integration.sh install --bahmni-lite
```

For both Standard and Lite:

```shell
./setup-ai-integration.sh install --all
```

If your bahmni-docker directory is in a different location, specify it explicitly:

```shell
./setup-ai-integration.sh install --bahmni-standard --bahmni-docker-path /path/to/bahmni-docker
```

The installation script should complete with success messages like:
```
✓ All required files found
✓ Bahmni Standard installation found at: /path/to/bahmni-docker/bahmni-standard
✓ Backed up Bahmni Standard/.env
✓ Merged AI configuration into Bahmni Standard/.env
✓ Normalized AI runtime defaults in Bahmni Standard/.env
✓ Created Bahmni Standard/docker-compose.override.yml
✓ Injected custom display control into Bahmni Standard
✓ Docker Compose configuration is valid for Bahmni Standard
✓ Post-install verification passed for Bahmni Standard
```
The installation script automatically:
- ✓ Creates `docker-compose.override.yml` with AI services
- ✓ Merges AI configuration into `.env`
- ✓ Sets up the UI extension for the AI Assistant button
- ✓ Configures Docker volumes and environment defaults
- ✓ Creates backups of original configurations
- ✓ Validates the Docker Compose configuration
Backups: Original configurations are backed up at:

```
ai-module/backups/backup_YYYYMMDD_HHMMSS/Bahmni Standard/
```
Edit the `.env` file in your Bahmni Standard directory:

```shell
nano bahmni-docker/bahmni-standard/.env
```

Or use your preferred editor (VSCode, vim, etc.).

Find or add this line:

```
GROQ_API_KEY=gsk_your_actual_key_here
```

Replace `gsk_your_actual_key_here` with your actual key from Groq (from the step above).

Example:

```
GROQ_API_KEY=gsk_aBcDeFgHiJkLmNoPqRsT123uVwXyZ456
```

Save the file (Ctrl+O, Enter, Ctrl+X in nano).
Check that these important settings are present in `.env`:

```
# Must be present and enabled
COMPOSE_PROFILES=emr,ai
OPENMRS_SESSION_AUTH_ENABLED=true

# Recommended security settings
OPENMRS_ALLOW_INSECURE_TLS=false

# AI service configuration (auto-set by installer)
AI_FRONTEND_BASE_PATH=/openmrs/ai-assistant
AI_API_BASE_PATH=/openmrs/ai-api
AI_BACKEND_INTERNAL_URL=http://ai-backend:3001
GROQ_MODEL=llama-3.3-70b-versatile
EMR_BASE_URL=http://openmrs:8080/openmrs/ws/rest/v1
```

All of these should be set automatically. If any are missing, add them manually.
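To confirm these settings in one pass, a small loop over the expected variable names works. `check_env` is a hypothetical sketch; the key list mirrors the settings shown above.

```shell
# Hypothetical helper: report which of the expected AI settings are absent
# from a .env file.
check_env() {
  envfile="$1"
  for key in COMPOSE_PROFILES OPENMRS_SESSION_AUTH_ENABLED GROQ_API_KEY \
             AI_FRONTEND_BASE_PATH AI_API_BASE_PATH AI_BACKEND_INTERNAL_URL \
             GROQ_MODEL EMR_BASE_URL; do
    grep -q "^${key}=" "$envfile" || echo "MISSING: $key"
  done
}

# Usage: check_env bahmni-docker/bahmni-standard/.env
```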
You can customize AI behavior with these optional environment variables:

```
# AI model to use (default: llama-3.3-70b-versatile)
# Other options: mixtral-8x7b-32768, gemma-7b-it
GROQ_MODEL=llama-3.3-70b-versatile

# Cache settings for better performance
CACHE_TTL_MS=3600000        # Cache duration in milliseconds
CACHE_MAX_ENTRIES=500       # Maximum cached responses

# Timeout limits (milliseconds)
AI_ROUTE_TIMEOUT_MS=2500         # HTTP timeout
AI_RETRIEVAL_TIMEOUT_MS=4000     # Data retrieval timeout
AI_GENERATION_TIMEOUT_MS=8000    # AI generation timeout
AI_RETRY_ATTEMPTS=2              # Retry failed requests

# Security and logging
LOG_PHI_MODE=strict    # strict, permissive, or disabled
ALLOWED_ORIGINS=https://localhost,http://localhost:8080    # add your browser origin here too
# If you are using Codespaces or a remote URL, add that exact origin here as well.
# Example:
# ALLOWED_ORIGINS=https://your-origin,https://localhost,http://localhost:8080
```

With configuration complete, start Bahmni:
```shell
cd bahmni-docker/bahmni-standard
./run-bahmni.sh
```

Then confirm the containers are up:

```shell
cd bahmni-docker/bahmni-standard
docker-compose ps

# Look for these containers with status "Up":
# NAME            STATUS
# openmrs         Up X minutes
# bahmni-config   Up X minutes
# ai-backend      Up X minutes
# ai-frontend     Up X minutes
# proxy           Up X minutes
```

Navigate to:

```
https://localhost
```

Note: You may get a security warning because localhost uses a self-signed certificate. Click "Advanced", then "Proceed" or "Accept Risk" (varies by browser).
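Rather than eyeballing the `docker-compose ps` table, the status check can be scripted. `flag_unhealthy` is a hypothetical sketch over the plain-text output; the state keywords are assumptions based on common Docker status strings, not a documented format.

```shell
# Hypothetical helper: read `docker-compose ps`-style text on stdin and print
# the first column of any row whose status looks unhealthy.
flag_unhealthy() {
  awk 'tolower($0) ~ /exit|restarting|unhealthy|dead/ { print $1 }'
}

# Example:
#   docker-compose ps | flag_unhealthy
```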
- Username: `admin`
- Password: `Admin123` (use your actual credentials if changed)
- Click Clinical in the main menu
- Search for and select a patient
- View the patient's clinical information
Look for the AI Assistant button:
- Desktop: Floating button in the bottom-right corner (looks like a chat bubble) OR toolbar button
- Mobile: Chat icon in the clinical toolbar
Click it to open the AI chat panel.
The AI Assistant is context-aware and knows about the current patient. You can ask:
- "Summarize this patient's history"
- "What medications is this patient on?"
- "What are the recent lab results?"
- "Generate a clinical note"
- Or any other medical questions in the context of this patient
Symptoms: Error when running scripts
Solution:
```shell
# Check if Docker Compose is installed
docker-compose --version

# If not installed, install it:
# On Ubuntu/Debian:
sudo apt-get install docker-compose

# Or try the newer syntax:
docker compose --version
```

Symptoms:
```
Error: Please set GROQ_API_KEY in your env file
```
Solution:

- Verify the key is in `.env`:

  ```shell
  grep GROQ_API_KEY bahmni-docker/bahmni-standard/.env
  ```

- If not present, add it:

  ```shell
  echo "GROQ_API_KEY=gsk_your_key_here" >> bahmni-docker/bahmni-standard/.env
  ```

- Restart services:

  ```shell
  cd bahmni-docker/bahmni-standard
  ./run-bahmni.sh   # Select option 3 (Restart)
  ```
Symptoms:

```
ai-backend    Unhealthy
ai-frontend   Exit code 1
```

Debugging:

- Check service logs:

  ```shell
  cd bahmni-docker/bahmni-standard
  docker-compose logs ai-backend
  docker-compose logs ai-frontend
  ```

- Common causes:
  - Invalid Groq API key → Verify key is correct and active in Groq console
  - Network connectivity → Check Docker network: `docker network ls`
  - Port conflicts → Ensure port 3001 is available: `lsof -i :3001`

- Restart services:

  ```shell
  cd bahmni-docker/bahmni-standard
  ./run-bahmni.sh   # Select option 3 (Restart)
  ```
Symptoms: No chat button visible in the Clinical section

Solution:

- Clear the browser cache (Ctrl+Shift+Delete)
- Do a hard refresh (Ctrl+Shift+R or Cmd+Shift+R)
- Verify the UI extension was installed:

  ```shell
  ls bahmni-docker/bahmni-standard/bahmni_config/openmrs/apps/customDisplayControl/js/
  # Should contain: customControl.js
  ```

- Restart bahmni-config:

  ```shell
  cd bahmni-docker/bahmni-standard
  docker-compose restart bahmni-config
  ```
Symptoms:

```
curl: (7) Failed to connect to localhost port 443: Connection refused
```

Solution:

- Verify the proxy is running:

  ```shell
  cd bahmni-docker/bahmni-standard
  docker-compose ps proxy
  # Should show: proxy ... Up
  ```

- Check if port 443 is in use:

  ```shell
  lsof -i :443
  ```

- Restart the proxy:

  ```shell
  docker-compose restart proxy
  ```
Symptoms: AI takes >10 seconds to respond or times out
Possible causes & solutions:
- Groq API rate limiting → Wait a few minutes and retry
- Network latency → Test connection: `curl https://api.groq.com`
- Increase timeout limits in `.env`, then restart services:

  ```
  AI_GENERATION_TIMEOUT_MS=15000   # Increase from 8000
  ```
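If you adjust the timeout often, the edit itself can be scripted. `bump_timeout` is a hypothetical sketch, not part of the module.

```shell
# Hypothetical helper: set AI_GENERATION_TIMEOUT_MS in a .env file in place.
# `sed -i.bak` works on both GNU and BSD sed and leaves a .bak backup.
bump_timeout() {
  envfile="$1"
  value="$2"
  sed -i.bak "s/^AI_GENERATION_TIMEOUT_MS=.*/AI_GENERATION_TIMEOUT_MS=${value}/" "$envfile"
}

# Usage: bump_timeout bahmni-docker/bahmni-standard/.env 15000
```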
To safely update to the latest AI module version:
```shell
cd ai-module
git pull origin master                               # Update module code
./setup-ai-integration.sh install --bahmni-standard
```

The install script will:

- Preserve your `.env` configuration (including API keys)
- Update only the AI files and configurations
- Create a new backup
- Validate the setup
Monitor AI services in real time:

```shell
cd bahmni-docker/bahmni-standard

# View all logs
docker-compose logs -f

# View specific service logs
docker-compose logs -f ai-backend    # AI API
docker-compose logs -f ai-frontend   # AI UI
docker-compose logs -f proxy         # Web proxy
```

Press Ctrl+C to stop viewing logs.
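When the full log stream is too noisy, a simple filter helps. `only_errors` is a hypothetical sketch; the keyword list is a generic assumption, not a documented log format.

```shell
# Hypothetical filter: keep only log lines that look like problems.
only_errors() {
  grep -iE 'error|warn|exception|timeout'
}

# Example: docker-compose logs ai-backend | only_errors
```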
To disable AI without uninstalling:
Edit `.env` and change:

```
# FROM:
COMPOSE_PROFILES=emr,ai

# TO:
COMPOSE_PROFILES=emr
```

Then restart Bahmni. Services will work normally without the AI Assistant.
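The profile switch can also be scripted. `toggle_ai` is a hypothetical sketch that assumes the `COMPOSE_PROFILES` line has exactly one of the two forms shown above.

```shell
# Hypothetical helper: switch the "ai" compose profile on or off in a .env file.
toggle_ai() {
  envfile="$1"
  mode="$2"   # "on" or "off"
  if [ "$mode" = "off" ]; then
    sed -i.bak 's/^COMPOSE_PROFILES=emr,ai$/COMPOSE_PROFILES=emr/' "$envfile"
  else
    sed -i.bak 's/^COMPOSE_PROFILES=emr$/COMPOSE_PROFILES=emr,ai/' "$envfile"
  fi
}

# Usage: toggle_ai bahmni-docker/bahmni-standard/.env off
```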
To remove the AI module completely:
```shell
cd ai-module

# Uninstall without restoring backups (creates new backups)
./setup-ai-integration.sh uninstall --bahmni-standard

# Uninstall and restore original configuration
./setup-ai-integration.sh uninstall --bahmni-standard --restore
```

This will:
- Remove docker-compose.override.yml
- Remove AI environment variables from .env
- Remove UI extensions
- Optionally restore original backups
- Create dated backups for reference
- Groq API Docs: https://console.groq.com/docs
- Bahmni Docs: https://bahmni.atlassian.net/wiki
- Docker Compose Docs: https://docs.docker.com/compose
- Issue Reporting: Submit issues with verification output from:
  ```shell
  ./setup-ai-integration.sh verify --json > report.json
  ```