An intelligent drink mixing system that uses AI to generate custom drink recipes based on voice prompts. The system manages port-drink pairings and leverages Claude AI to create optimal drink combinations.
The project consists of three main components:
AWS Lambda functions that provide CRUD operations for managing port-drink pairs in an AWS PostgreSQL database. The Lambda handles all database interactions and exposes a RESTful API through AWS API Gateway.
Key Features:
- Create, Read, Update, Delete operations for port-drink pairs
- PostgreSQL database integration using `pg8000`
- RESTful API endpoints via API Gateway
- Automatic table creation on cold start
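For orientation, the cold-start table creation and request routing could be wired up roughly as follows. This is a sketch, not the actual `lambda_function.py`: `get_conn` and `respond` are hypothetical helpers, while the environment variable names and the `port_drinks` schema match those documented later in this README.

```python
import json

_conn = None  # module-level handle, reused across warm invocations


def get_conn():
    """Open a pg8000 connection once per cold start and ensure the table exists."""
    global _conn
    if _conn is None:
        import os
        import pg8000.native
        _conn = pg8000.native.Connection(
            user=os.environ["DB_USER"],
            password=os.environ["DB_PASSWORD"],
            host=os.environ["DB_HOST"],
            database=os.environ["DB_NAME"],
            port=int(os.environ.get("DB_PORT", "5432")),
        )
        _conn.run(
            "CREATE TABLE IF NOT EXISTS port_drinks ("
            "port INTEGER PRIMARY KEY, drink TEXT NOT NULL)"
        )
    return _conn


def respond(status, payload):
    """Build an API Gateway proxy-integration response envelope."""
    return {
        "statusCode": status,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(payload),
    }


def lambda_handler(event, context):
    method = event.get("httpMethod", "GET")
    if method == "GET":
        rows = get_conn().run("SELECT port, drink FROM port_drinks ORDER BY port")
        return respond(200, {"ok": True,
                             "data": [{"port": p, "drink": d} for p, d in rows]})
    if method == "POST":
        body = json.loads(event.get("body") or "{}")
        get_conn().run(
            "INSERT INTO port_drinks (port, drink) VALUES (:p, :d) "
            "ON CONFLICT (port) DO UPDATE SET drink = EXCLUDED.drink",
            p=body["port"], d=body["drink"],
        )
        return respond(200, {"ok": True})
    return respond(405, {"ok": False, "error": f"unsupported method {method}"})
```

The connection is cached at module level so warm invocations skip both the connect and the `CREATE TABLE IF NOT EXISTS`, which only runs on cold start.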
A FastAPI server that provides two main services:
- `server.py`: Main API server with endpoints for:
  - Storing drink creation prompts (`/set_prompt`)
  - Generating drink recipes using Claude AI (`/mix`)
  - Integration with the MCP server to query available ingredients
- `mcp_server.py`: Model Context Protocol (MCP) server that:
  - Exposes the Lambda API to query the database
  - Provides a `list_drinks()` tool for Claude AI to discover available ingredients
  - Acts as a bridge between Claude AI and the database
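A minimal sketch of how such a bridge could look, assuming the `fastmcp` package and a `LAMBDA_URL` placeholder for the API Gateway endpoint (both assumptions here; the real `mcp_server.py` may be organized differently):

```python
import os

import requests

# Assumed placeholder for the API Gateway endpoint exposed by the Lambda.
LAMBDA_URL = os.environ.get("LAMBDA_URL", "https://your-api-gateway-url/Test")


def extract_drinks(payload: dict) -> list:
    """Pull the pair list out of the Lambda response envelope."""
    return payload.get("data", []) if payload.get("ok") else []


def list_drinks() -> list:
    """Tool body: query GET /pairs on the Lambda API and return the pairs."""
    resp = requests.get(f"{LAMBDA_URL}/pairs", timeout=10)
    resp.raise_for_status()
    return extract_drinks(resp.json())


if __name__ == "__main__":
    # Register list_drinks as an MCP tool and serve it (fastmcp API assumed).
    from fastmcp import FastMCP

    mcp = FastMCP("pour")
    mcp.tool()(list_drinks)
    mcp.run()
```

Keeping the envelope handling in `extract_drinks` lets the tool return a plain ingredient list to Claude regardless of how the Lambda wraps its responses.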
A React Native mobile application built with Expo that provides:
- Voice prompt recording and submission
- Port-drink pair management (CRUD operations)
- Real-time synchronization with the backend
- Modern, intuitive UI for managing the drink mixing system
```
Pour/
├── aws_lambda/          # AWS Lambda functions for database CRUD
│   ├── lambda_function.py
│   └── requirements.txt
├── server/              # FastAPI server and MCP server
│   ├── server.py        # Main API server
│   └── mcp_server.py    # MCP server for Claude AI integration
└── client/              # React Native frontend
    ├── app/             # App screens and navigation
    ├── components/      # Reusable UI components
    ├── hooks/           # Custom React hooks
    ├── lib/             # API client and utilities
    └── types/           # TypeScript type definitions
```
- Python 3.8+
- Node.js 18+
- AWS Account with RDS PostgreSQL instance
- Anthropic API key for Claude AI
- Expo CLI (`npm install -g expo-cli`)
- Navigate to the `aws_lambda` directory:

  ```bash
  cd aws_lambda
  ```

- Install dependencies (dependencies are already bundled, but for local development):

  ```bash
  pip install -r requirements.txt
  ```

- Configure environment variables in AWS Lambda:
  - `DB_HOST`: Your RDS PostgreSQL host
  - `DB_USER`: Database username
  - `DB_PASSWORD`: Database password
  - `DB_NAME`: Database name
  - `DB_PORT`: Database port (default: 5432)
- Deploy the Lambda function to AWS (zip the entire `aws_lambda` directory, including dependencies)
- Configure API Gateway to proxy requests to the Lambda function
- Navigate to the `server` directory:

  ```bash
  cd server
  ```

- Install dependencies:

  ```bash
  pip install fastapi uvicorn fastmcp httpx requests
  ```

- Set environment variables:

  ```bash
  export ANTHROPIC_API_KEY="your-anthropic-api-key"
  ```

- Update `MCP_URL` in `server.py` to point to your MCP server endpoint (or use ngrok for local development)
- Start the MCP server:

  ```bash
  python mcp_server.py
  ```

- Start the main API server:

  ```bash
  uvicorn server:app --host 0.0.0.0 --port 8081
  ```

- Navigate to the `client` directory:

  ```bash
  cd client
  ```

- Install dependencies:

  ```bash
  npm install
  ```

- Create a `.env` file in the `client` directory:

  ```bash
  EXPO_PUBLIC_BASE_URL=https://your-api-gateway-url/Test
  ```

- Start the development server:

  ```bash
  npm start
  ```

- Run on your preferred platform:

  ```bash
  npm run ios      # For iOS
  npm run android  # For Android
  npm run web      # For web
  ```

Base URL: `https://your-api-gateway-url/Test`
GET /pairs

Response:

```json
{
  "ok": true,
  "data": [
    {"port": 1, "drink": "Coca Cola"},
    {"port": 2, "drink": "Sprite"}
  ]
}
```

GET /pairs/{port}

POST /pairs

Content-Type: application/json

```json
{
  "port": 1,
  "drink": "Coca Cola"
}
```

PUT /pairs/{port}

Content-Type: application/json

```json
{
  "drink": "Pepsi"
}
```

DELETE /pairs/{port}

Base URL: `http://your-server-url:8081`
POST /set_prompt

Content-Type: application/json

```json
{
  "prompt": "make a pepsi-forward drink around 300 ml total"
}
```

POST /mix

Returns:

- `{"status": 0}` - No prompt stored
- `{"status": 1, "recipe": [...]}` - Success, recipe generated
- `{"status": 2}` - Error generating recipe
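A caller might drive these two endpoints like the sketch below. The function names are hypothetical, and whether `/mix` accepts a request body is not documented here, so the sketch posts none:

```python
import requests

SERVER_URL = "http://your-server-url:8081"  # placeholder base URL from above


def describe_mix(result: dict) -> str:
    """Map the /mix status envelope to a human-readable outcome."""
    status = result.get("status")
    if status == 0:
        return "no prompt stored - call /set_prompt first"
    if status == 1:
        return f"recipe generated with {len(result['recipe'])} step(s)"
    return "error generating recipe"


def request_drink(prompt: str) -> dict:
    """Store a prompt via /set_prompt, then ask /mix to generate a recipe."""
    resp = requests.post(f"{SERVER_URL}/set_prompt",
                         json={"prompt": prompt}, timeout=10)
    resp.raise_for_status()
    resp = requests.post(f"{SERVER_URL}/mix", timeout=60)  # Claude call can be slow
    resp.raise_for_status()
    return resp.json()
```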
The MCP server exposes a `list_drinks()` tool that returns all available port-drink pairs from the database. This is used by Claude AI to discover available ingredients when generating recipes.
1. Port-Drink Management: Users can add, edit, or delete port-drink pairs through the React Native app. These are stored in the PostgreSQL database via AWS Lambda.
2. Voice Prompt: Users record a voice prompt describing the drink they want (e.g., "make a pepsi-forward drink around 300 ml total").
3. Prompt Storage: The voice prompt is transcribed and sent to the FastAPI server's `/set_prompt` endpoint.
4. Recipe Generation: When `/mix` is called, the server:
   - Retrieves available ingredients from the database via the MCP server
   - Sends the prompt to Claude AI along with the ingredient list
   - Receives a JSON recipe from Claude with port numbers and volumes
   - Returns the recipe for execution
5. Drink Mixing: The recipe is executed by the physical mixing machine using the specified ports and volumes.
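Before handing a generated recipe to the machine, it is worth sanity-checking it against the stored pairs. A sketch, assuming the recipe JSON looks like `[{"port": 1, "ml": 200}, ...]` (the README only says "port numbers and volumes", so the exact shape is an assumption):

```python
def validate_recipe(recipe, pairs, max_total_ml=500):
    """Reject recipes that reference unknown ports or exceed a volume cap.

    `recipe` is assumed to be [{"port": ..., "ml": ...}, ...] and `pairs`
    the [{"port": ..., "drink": ...}, ...] list returned by list_drinks().
    """
    known_ports = {pair["port"] for pair in pairs}
    total_ml = 0
    for step in recipe:
        if step["port"] not in known_ports:
            return False  # no ingredient loaded on that port
        total_ml += step["ml"]
    return total_ml <= max_total_ml
```

Validating on the server keeps a hallucinated port number or an oversized pour from ever reaching the hardware.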
AWS Lambda:

- `DB_HOST`: PostgreSQL database host
- `DB_USER`: Database username
- `DB_PASSWORD`: Database password
- `DB_NAME`: Database name
- `DB_PORT`: Database port (default: 5432)

Server:

- `ANTHROPIC_API_KEY`: Your Anthropic API key (required)
- `SERVO_URL`: URL for servo control (optional)

Client:

- `EXPO_PUBLIC_BASE_URL`: Base URL for the AWS Lambda API Gateway endpoint
The `port_drinks` table has the following structure:

```sql
CREATE TABLE port_drinks (
    port INTEGER PRIMARY KEY,
    drink TEXT NOT NULL
);
```

For local development:

- Start the MCP server on port 8080
- Start the FastAPI server on port 8081
- Use ngrok or similar to expose the MCP server to Claude AI
- Update `MCP_URL` in `server.py` with your ngrok URL
- Run the React Native app with `npm start` in the `client` directory
The client includes mock API functionality when `EXPO_PUBLIC_BASE_URL` is not set, allowing for frontend development without a backend connection.