
Pour - Intelligent Drink Mixing Machine

An intelligent drink mixing system that uses AI to generate custom drink recipes based on voice prompts. The system manages port-drink pairings and leverages Claude AI to create optimal drink combinations.

Architecture

The project consists of three main components:

1. AWS Lambda (aws_lambda/)

AWS Lambda functions that provide CRUD operations for managing port-drink pairs in an Amazon RDS PostgreSQL database. The Lambda handles all database interactions and exposes a RESTful API through AWS API Gateway.

Key Features:

  • Create, Read, Update, Delete operations for port-drink pairs
  • PostgreSQL database integration using pg8000
  • RESTful API endpoints via API Gateway
  • Automatic table creation on cold start
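The Lambda's request routing can be sketched as follows. This is a minimal, hypothetical illustration: an in-memory dict stands in for the port_drinks table, and the handler only shows how API Gateway proxy events might be dispatched (the real function uses pg8000 and the environment variables listed later in this README):

```python
import json

# Hypothetical in-memory stand-in for the port_drinks table;
# the real Lambda queries PostgreSQL via pg8000.
PAIRS = {1: "Coca Cola", 2: "Sprite"}

def respond(status, payload):
    """Shape a response the way API Gateway proxy integration expects."""
    return {"statusCode": status, "body": json.dumps(payload)}

def lambda_handler(event, context):
    """Route an API Gateway proxy event to a CRUD operation on port-drink pairs."""
    method = event.get("httpMethod", "GET")
    port = (event.get("pathParameters") or {}).get("port")
    body = json.loads(event.get("body") or "{}")

    if method == "GET" and port is None:          # GET /pairs
        data = [{"port": p, "drink": d} for p, d in sorted(PAIRS.items())]
        return respond(200, {"ok": True, "data": data})
    if method == "GET":                           # GET /pairs/{port}
        drink = PAIRS.get(int(port))
        if drink is None:
            return respond(404, {"ok": False, "error": "not found"})
        return respond(200, {"ok": True, "data": {"port": int(port), "drink": drink}})
    if method == "POST":                          # POST /pairs
        PAIRS[int(body["port"])] = body["drink"]
        return respond(201, {"ok": True})
    if method == "PUT":                           # PUT /pairs/{port}
        PAIRS[int(port)] = body["drink"]
        return respond(200, {"ok": True})
    if method == "DELETE":                        # DELETE /pairs/{port}
        PAIRS.pop(int(port), None)
        return respond(200, {"ok": True})
    return respond(405, {"ok": False, "error": "method not allowed"})
```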

2. Server (server/)

A FastAPI server that provides two main services:

  • server.py: Main API server with endpoints for:

    • Storing drink creation prompts (/set_prompt)
    • Generating drink recipes using Claude AI (/mix)
    • Querying available ingredients by integrating with the MCP server
  • mcp_server.py: Model Context Protocol (MCP) server that:

    • Exposes the Lambda API to query the database
    • Provides a list_drinks() tool for Claude AI to discover available ingredients
    • Acts as a bridge between Claude AI and the database
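The bridging logic can be sketched like this: the list_drinks() tool fetches the Lambda's GET /pairs endpoint and reshapes the response into a plain ingredient list for Claude. The base URL is a placeholder and the helper names are illustrative; the actual implementation lives in mcp_server.py:

```python
import json
from urllib.request import urlopen  # the real server uses httpx/requests

LAMBDA_BASE = "https://your-api-gateway-url/Test"  # placeholder, matches the API docs

def format_drinks(payload):
    """Reshape the Lambda's /pairs response into the port/drink list Claude sees."""
    if not payload.get("ok"):
        raise RuntimeError("Lambda returned an error")
    return [{"port": row["port"], "drink": row["drink"]} for row in payload["data"]]

def list_drinks():
    """MCP tool body: fetch all port-drink pairs from the database via the Lambda API."""
    with urlopen(f"{LAMBDA_BASE}/pairs") as resp:
        return format_drinks(json.load(resp))
```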

3. Client (client/)

A React Native mobile application built with Expo that provides:

  • Voice prompt recording and submission
  • Port-drink pair management (CRUD operations)
  • Real-time synchronization with the backend
  • Modern, intuitive UI for managing the drink mixing system

Project Structure

Pour/
├── aws_lambda/          # AWS Lambda functions for database CRUD
│   ├── lambda_function.py
│   └── requirements.txt
├── server/              # FastAPI server and MCP server
│   ├── server.py        # Main API server
│   └── mcp_server.py    # MCP server for Claude AI integration
└── client/              # React Native frontend
    ├── app/             # App screens and navigation
    ├── components/      # Reusable UI components
    ├── hooks/           # Custom React hooks
    ├── lib/             # API client and utilities
    └── types/           # TypeScript type definitions

Setup Instructions

Prerequisites

  • Python 3.8+
  • Node.js 18+
  • AWS Account with RDS PostgreSQL instance
  • Anthropic API key for Claude AI
  • Expo CLI (npm install -g expo-cli)

AWS Lambda Setup

  1. Navigate to the aws_lambda directory:
cd aws_lambda
  2. Install dependencies (they are bundled for deployment, but install locally for development):
pip install -r requirements.txt
  3. Configure environment variables in AWS Lambda:

    • DB_HOST: Your RDS PostgreSQL host
    • DB_USER: Database username
    • DB_PASSWORD: Database password
    • DB_NAME: Database name
    • DB_PORT: Database port (default: 5432)
  4. Deploy the Lambda function to AWS (zip the entire aws_lambda directory, including dependencies)

  5. Configure API Gateway to proxy requests to the Lambda function

Server Setup

  1. Navigate to the server directory:
cd server
  2. Install dependencies:
pip install fastapi uvicorn fastmcp httpx requests
  3. Set environment variables:
export ANTHROPIC_API_KEY="your-anthropic-api-key"
  4. Update the MCP_URL in server.py to point to your MCP server endpoint (or use ngrok for local development)

  5. Start the MCP server:
python mcp_server.py
  6. Start the main API server:
uvicorn server:app --host 0.0.0.0 --port 8081

Client Setup

  1. Navigate to the client directory:
cd client
  2. Install dependencies:
npm install
  3. Create a .env file in the client directory:
EXPO_PUBLIC_BASE_URL=https://your-api-gateway-url/Test
  4. Start the development server:
npm start
  5. Run on your preferred platform:
npm run ios      # For iOS
npm run android  # For Android
npm run web      # For web

API Documentation

AWS Lambda API (via API Gateway)

Base URL: https://your-api-gateway-url/Test

Get All Port-Drink Pairs

GET /pairs

Response:

{
  "ok": true,
  "data": [
    {"port": 1, "drink": "Coca Cola"},
    {"port": 2, "drink": "Sprite"}
  ]
}

Get Single Port-Drink Pair

GET /pairs/{port}

Create Port-Drink Pair

POST /pairs
Content-Type: application/json

{
  "port": 1,
  "drink": "Coca Cola"
}

Update Port-Drink Pair

PUT /pairs/{port}
Content-Type: application/json

{
  "drink": "Pepsi"
}

Delete Port-Drink Pair

DELETE /pairs/{port}

FastAPI Server Endpoints

Base URL: http://your-server-url:8081

Set Drink Creation Prompt

POST /set_prompt
Content-Type: application/json

{
  "prompt": "make a pepsi-forward drink around 300 ml total"
}

Mix Drink

POST /mix

Returns:

  • {"status": 0} - No prompt stored
  • {"status": 1, "recipe": [...]} - Success, recipe generated
  • {"status": 2} - Error generating recipe
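A caller can map these status codes to actions like so; a hypothetical sketch (the function name and return shape are illustrative, not part of the project's API):

```python
def interpret_mix(response):
    """Map a /mix response's status code to an action for the caller."""
    status = response.get("status")
    if status == 0:
        return ("no_prompt", None)           # nothing stored via /set_prompt yet
    if status == 1:
        return ("pour", response["recipe"])  # recipe: list of port/volume steps
    if status == 2:
        return ("error", None)               # recipe generation failed
    raise ValueError(f"unexpected status: {status!r}")
```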

MCP Server

The MCP server exposes a list_drinks() tool that returns all available port-drink pairs from the database. This is used by Claude AI to discover available ingredients when generating recipes.

How It Works

  1. Port-Drink Management: Users can add, edit, or delete port-drink pairs through the React Native app. These are stored in the PostgreSQL database via AWS Lambda.

  2. Voice Prompt: Users record a voice prompt describing the drink they want (e.g., "make a pepsi-forward drink around 300 ml total").

  3. Prompt Storage: The voice prompt is transcribed and sent to the FastAPI server's /set_prompt endpoint.

  4. Recipe Generation: When /mix is called, the server:

    • Retrieves available ingredients from the database via the MCP server
    • Sends the prompt to Claude AI with the ingredient list
    • Claude generates a JSON recipe with port numbers and volumes
    • Returns the recipe for execution
  5. Drink Mixing: The recipe is executed by the physical mixing machine using the specified ports and volumes.
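Before the recipe reaches the physical machine, it can be sanity-checked against the configured ports. A hypothetical validator sketch; the per-step keys port and ml are assumed from the flow above, not a confirmed schema:

```python
def validate_recipe(recipe, known_ports, max_total_ml=500):
    """Check that every step uses a configured port and the total volume is sane.

    Assumes each step is a dict like {"port": 1, "ml": 150}; the key names
    are an illustration, not the project's confirmed recipe schema.
    """
    total = 0
    for step in recipe:
        if step["port"] not in known_ports:
            raise ValueError(f"unknown port {step['port']}")
        if step["ml"] <= 0:
            raise ValueError(f"non-positive volume on port {step['port']}")
        total += step["ml"]
    if total > max_total_ml:
        raise ValueError(f"total volume {total} ml exceeds limit {max_total_ml} ml")
    return total
```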

Environment Variables

AWS Lambda

  • DB_HOST: PostgreSQL database host
  • DB_USER: Database username
  • DB_PASSWORD: Database password
  • DB_NAME: Database name
  • DB_PORT: Database port (default: 5432)

Server

  • ANTHROPIC_API_KEY: Your Anthropic API key (required)
  • SERVO_URL: URL for servo control (optional)

Client

  • EXPO_PUBLIC_BASE_URL: Base URL for the AWS Lambda API Gateway endpoint

Database Schema

The port_drinks table has the following structure:

CREATE TABLE port_drinks (
    port  INTEGER PRIMARY KEY,
    drink TEXT NOT NULL
);
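The schema's behavior can be demonstrated self-containedly with sqlite3 (the Lambda itself uses pg8000 against PostgreSQL). The CREATE TABLE IF NOT EXISTS form is assumed from the "automatic table creation on cold start" feature, and the upsert merely illustrates how the port primary key enforces one drink per port:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Idempotent DDL, in the spirit of the Lambda's cold-start table creation.
conn.execute("""
    CREATE TABLE IF NOT EXISTS port_drinks (
        port  INTEGER PRIMARY KEY,
        drink TEXT NOT NULL
    )
""")

# PRIMARY KEY on port means one drink per port; an upsert replaces the pairing.
conn.execute("INSERT INTO port_drinks (port, drink) VALUES (1, 'Coca Cola')")
conn.execute(
    "INSERT INTO port_drinks (port, drink) VALUES (1, 'Pepsi') "
    "ON CONFLICT(port) DO UPDATE SET drink = excluded.drink"
)

rows = conn.execute("SELECT port, drink FROM port_drinks ORDER BY port").fetchall()
```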

Development

Running Locally

  1. Start the MCP server on port 8080
  2. Start the FastAPI server on port 8081
  3. Use ngrok or similar to expose the MCP server to Claude AI
  4. Update MCP_URL in server.py with your ngrok URL
  5. Run the React Native app with npm start in the client directory

Testing

The client includes mock API functionality when EXPO_PUBLIC_BASE_URL is not set, allowing for frontend development without a backend connection.

About

ECE-COMPSCI-655 Intelligent Drink Mixing Machine
