# Echo

Production-grade autonomous email assistant with Gmail integration, powered by OpenAI GPT-4o.
## Tech Stack

- FastAPI - Async REST API
- PostgreSQL with pgvector - Multi-tenant database with vector similarity search
- Redis + Celery - Background task processing
- SQLAlchemy 2.0 - Async ORM
- OpenAI GPT-4o - LLM routing and email understanding
- Google OAuth 2.0 - Secure Gmail/Calendar integration
## Features

- Gmail inbox monitoring
- Email classification (urgent, needs response, category)
- Learning from user behavior
- Style profile building from sent emails
- AI-powered reply drafting
- User feedback loop (accept/edit/reject)
- Context-aware suggestions using thread history and similar emails
- Confidence-based automation
- High-confidence replies auto-send (optional)
- Safety verification for all drafts
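Confidence-based automation can be sketched as a simple routing function. This is an illustration only: the threshold values mirror the `.env` defaults shown later, and the action names are hypothetical, not the project's actual API.

```python
# Illustrative routing of a drafted reply by model confidence.
# Thresholds mirror the .env defaults; action names are hypothetical.
AUTO_SEND_THRESHOLD = 0.95
APPROVAL_THRESHOLD = 0.70
SUGGESTION_THRESHOLD = 0.50

def route_draft(confidence: float) -> str:
    """Map a confidence score to a handling action."""
    if confidence >= AUTO_SEND_THRESHOLD:
        return "auto_send"         # still passes safety verification first
    if confidence >= APPROVAL_THRESHOLD:
        return "request_approval"  # surfaced for explicit user approval
    if confidence >= SUGGESTION_THRESHOLD:
        return "suggest"           # shown as a low-priority suggestion
    return "discard"               # too uncertain to surface

print(route_draft(0.97))  # auto_send
```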
## Prerequisites

- Python 3.11+
- Docker & Docker Compose
- Google Cloud Project with OAuth 2.0 credentials
- OpenAI API key
## Quick Start

### 1. Clone and configure

```bash
cd echo
cp .env.example .env
```

Edit `.env` with your credentials:
```bash
# Google OAuth
GOOGLE_CLIENT_ID=your-client-id
GOOGLE_CLIENT_SECRET=your-client-secret

# OpenAI
OPENAI_API_KEY=sk-your-key

# Security (generate secure keys)
SECRET_KEY=$(openssl rand -hex 32)
ENCRYPTION_KEY=$(python -c "from cryptography.fernet import Fernet; print(Fernet.generate_key().decode())")
```

### 2. Start with Docker Compose

```bash
docker-compose up -d
```

This starts:
- PostgreSQL with pgvector
- Redis
- FastAPI application (port 8000)
- Celery worker
- Celery beat scheduler
### 3. Run database migrations

```bash
docker-compose exec api alembic upgrade head
```

### 4. Access the application

- API: http://localhost:8000
- Docs: http://localhost:8000/docs
- Health: http://localhost:8000/health
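Once the stack is up, authenticated requests carry a JWT in the `Authorization` header. A minimal client sketch using only the standard library (the token value is a placeholder, and this assumes the API is running locally):

```python
import urllib.request

# Build an authenticated request against the local API.
# The token is a placeholder; obtain a real JWT via the OAuth callback.
def build_request(path: str, token: str) -> urllib.request.Request:
    return urllib.request.Request(
        f"http://localhost:8000{path}",
        headers={"Authorization": f"Bearer {token}"},
    )

req = build_request("/emails/", "example-jwt-token")
print(req.full_url)  # http://localhost:8000/emails/
```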
## Local Development

### Install dependencies

```bash
pip install -r requirements.txt
```

### Start infrastructure

```bash
# PostgreSQL with pgvector
docker run -d --name echo-db -p 5432:5432 \
  -e POSTGRES_PASSWORD=postgres \
  -e POSTGRES_DB=echo \
  pgvector/pgvector:pg16

# Redis
docker run -d --name echo-redis -p 6379:6379 redis:7-alpine
```

### Run migrations and start services

```bash
alembic upgrade head
uvicorn main:app --reload
celery -A workers.celery_app worker --loglevel=info
celery -A workers.celery_app beat --loglevel=info
```

## API Endpoints

### Authentication

```
GET /auth/google/login
```

Returns authorization URL. User completes OAuth flow.
```
GET /auth/google/callback?code=<code>
```

Returns JWT token.
### Emails

```
GET /emails/
Authorization: Bearer <token>
```

### Suggestions

```
GET /suggestions/
Authorization: Bearer <token>
```

```
POST /suggestions/{id}/feedback
Authorization: Bearer <token>

{
  "feedback_type": "accepted",
  "final_text": "optional edited text"
}
```

### User Data

```
GET /users/me/export
Authorization: Bearer <token>
```

```
DELETE /users/me
Authorization: Bearer <token>
```

## Background Tasks

Celery handles:
- Email Fetching: Every 5 minutes (configurable)
- Email Classification: On new email
- Embedding Generation: For similarity search
- Suggestion Creation: For emails needing response
- Style Profile Updates: On demand
- Token Budget Reset: Daily at midnight
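The schedule above would typically live in a Celery beat configuration. A sketch of what that might look like (task names and module paths are assumptions, not the project's actual ones):

```python
from celery import Celery
from celery.schedules import crontab

app = Celery("echo", broker="redis://localhost:6379/0")

# Hypothetical task names; the real tasks live under workers/.
app.conf.beat_schedule = {
    "fetch-emails": {
        "task": "workers.tasks.fetch_emails",
        "schedule": 5 * 60.0,  # every 5 minutes (EMAIL_FETCH_INTERVAL_MINUTES)
    },
    "reset-token-budgets": {
        "task": "workers.tasks.reset_token_budgets",
        "schedule": crontab(hour=0, minute=0),  # daily at midnight
    },
}
```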
## Security

- OAuth tokens encrypted at rest (AES-128-CBC + HMAC-SHA256 via Fernet)
- Multi-tenant data isolation (all queries scoped by user_id)
- JWT authentication for API access
- Safety verification for all AI-generated content
- Audit logging for all tool actions
- GDPR-compliant data export and deletion
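Token encryption at rest can be illustrated with Fernet. This is a sketch: the helper names are hypothetical, and the real service presumably loads `ENCRYPTION_KEY` from `.env` rather than generating one per run. Requires the `cryptography` package.

```python
from cryptography.fernet import Fernet

# In production the key comes from ENCRYPTION_KEY; generated here for the demo.
key = Fernet.generate_key()
fernet = Fernet(key)

def encrypt_token(token: str) -> bytes:
    """Encrypt an OAuth token before persisting it (hypothetical helper)."""
    return fernet.encrypt(token.encode())

def decrypt_token(blob: bytes) -> str:
    """Decrypt a stored OAuth token for use against the Gmail API."""
    return fernet.decrypt(blob).decode()

stored = encrypt_token("example-access-token")
assert decrypt_token(stored) == "example-access-token"
```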
## Configuration

Key settings in `.env`:

```bash
# LLM Models
CLASSIFICATION_MODEL=gpt-4o-mini   # Cost-effective classification
DRAFTING_MODEL=gpt-4o              # High-quality drafting
SUMMARIZATION_MODEL=gpt-4o

# Confidence Thresholds
AUTO_SEND_THRESHOLD=0.95     # Auto-send only at very high confidence
APPROVAL_THRESHOLD=0.70      # Show for approval
SUGGESTION_THRESHOLD=0.50    # Show as suggestion

# Processing
EMAIL_FETCH_INTERVAL_MINUTES=5
MAX_EMAILS_PER_FETCH=50
STYLE_PROFILE_SAMPLE_SIZE=200
```

## Database Schema

- `users` - User accounts and OAuth tokens
- `emails` - Ingested Gmail messages
- `embeddings` - Vector embeddings for similarity search
- `suggestions` - AI-generated reply drafts
- `style_profiles` - User writing style analysis
- `feedback_logs` - User feedback for ML improvement
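A similar-email lookup over the `embeddings` table can be expressed with pgvector's cosine-distance operator `<=>`. The column names below are assumptions for illustration; note the `user_id` filter enforcing multi-tenant scoping:

```python
# Hypothetical SQL for finding the k most similar prior emails for a user.
# Assumes embeddings(email_id, vector) and emails(id, user_id, subject).
SIMILAR_EMAILS_SQL = """
SELECT e.id, e.subject
FROM embeddings emb
JOIN emails e ON e.id = emb.email_id
WHERE e.user_id = :user_id              -- multi-tenant scoping
ORDER BY emb.vector <=> :query_vector   -- pgvector cosine distance
LIMIT :k
"""
```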
## Deployment

- Hosting: AWS ECS / Google Cloud Run / Railway
- Database: Managed PostgreSQL with pgvector support
- Redis: Managed Redis (ElastiCache / Cloud Memorystore)
- Monitoring: Sentry + CloudWatch / Stackdriver
Ensure all production secrets are set:

```bash
DATABASE_URL=postgresql+asyncpg://user:pass@host:5432/echo
REDIS_URL=redis://host:6379/0
SECRET_KEY=<strong-random-key>
ENCRYPTION_KEY=<fernet-key>
GOOGLE_CLIENT_ID=<production-client-id>
GOOGLE_CLIENT_SECRET=<production-secret>
OPENAI_API_KEY=<production-key>
ENVIRONMENT=production
LOG_LEVEL=INFO
```

### Health Check

```
GET /health
```

Returns `{"status": "healthy"}` when operational.
## Project Structure

```
echo/
├── core/        # Configuration, security, logging
├── api/         # FastAPI routes
├── models/      # SQLAlchemy models
├── schemas/     # Pydantic schemas
├── services/    # Business logic
├── tools/       # External integrations (Gmail, Calendar)
├── workers/     # Celery tasks
├── db/          # Database session management
└── migrations/  # Alembic migrations
```
### Adding a New Model

- Create model in `models/`
- Import in `models/__init__.py`
- Create migration: `alembic revision --autogenerate -m "description"`
- Apply: `alembic upgrade head`

### Adding a New Service

- Create service in `services/`
- Inject `AsyncSession` via constructor
- Use in routes via dependency injection
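The constructor-injection pattern can be sketched as follows. Class and method names here are illustrative, not the project's actual code; the stand-in session shows why injection makes services easy to test without a database.

```python
import asyncio

# Illustrative service: it receives its session (AsyncSession in the real
# code) through the constructor instead of creating one itself.
class SuggestionService:
    def __init__(self, session):
        self.session = session

    async def list_for_user(self, user_id: int):
        # The real implementation would issue a user_id-scoped SELECT.
        return await self.session.fetch_suggestions(user_id)

# Hypothetical in-memory stand-in for the session, for demonstration.
class InMemorySession:
    def __init__(self, rows):
        self.rows = rows

    async def fetch_suggestions(self, user_id):
        return [r for r in self.rows if r["user_id"] == user_id]

service = SuggestionService(InMemorySession([{"user_id": 1, "draft": "Hi"}]))
print(asyncio.run(service.list_for_user(1)))  # [{'user_id': 1, 'draft': 'Hi'}]
```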
## License

Proprietary