techmengg/wastesortos

WasteSortOS

An AI-powered waste sorting app. Point your phone at any waste item and get an instant disposal recommendation: Recycling, Compost, Landfill, Hazardous waste, E-waste drop-off, or Special collection.

How it works

  1. Capture — the mobile app takes a photo
  2. Vision — Gemini identifies the item and its materials
  3. Rules engine — a campus-specific ruleset maps the item to a bin
  4. AI fallback — if the ruleset doesn't match, Gemini classifies it directly
  5. Explanation — a plain-language reason is generated and shown to the user

The app also supports text search — type an item name to get its disposal method without using the camera.

Project structure

wastesortos/
├── backend/                  # FastAPI server (Python)
│   ├── main.py               # API routes: /classify, /search, /detect, /health
│   ├── models.py             # Pydantic response models
│   ├── pipeline/
│   │   ├── flow.py           # Orchestrates the full image classification pipeline
│   │   ├── vision_agent.py   # Gemini vision: identifies item + materials from image
│   │   ├── ai_classifier.py  # Gemini fallback: classifies items the ruleset doesn't know
│   │   ├── rules_engine.py   # Campus ruleset lookup (no AI, fully offline)
│   │   ├── decision_resolver.py  # Merges vision + rules into final response shape
│   │   ├── explanation_agent.py  # Generates human-readable disposal explanation
│   │   ├── text_flow.py      # Pipeline for text-based search queries
│   │   └── image_utils.py    # Image resizing before API calls
│   └── ruleset/
│       └── queens_campus.json  # Disposal rules for Queen's University campus
└── mobile/                   # React Native app (Expo)
    └── App.tsx               # Single-file app: camera, result sheet, search, history

Prerequisites

  • Python 3.11+
  • Node.js 18+
  • Google Cloud project with Vertex AI enabled and access to gemini-2.5-flash
  • Xcode (for iOS builds — macOS only)
  • Physical iPhone (react-native-vision-camera does not work in simulators or Expo Go)

Backend setup

cd backend

# Create and activate virtual environment (first time only)
python3 -m venv .venv
source .venv/bin/activate       # macOS/Linux
# .venv\Scripts\activate        # Windows

# Install dependencies
pip install -r requirements.txt

# Configure environment
cp .env.example .env
# Edit .env and set GOOGLE_CLOUD_PROJECT to your GCP project ID

# Authenticate with Google Cloud
gcloud auth application-default login

# Start the server
uvicorn main:app --reload

The API will be available at http://localhost:8000. Visit http://localhost:8000/docs for interactive API docs.

Environment variables

| Variable | Required | Description |
| --- | --- | --- |
| `GOOGLE_CLOUD_PROJECT` | Yes | GCP project ID to bill Vertex AI usage against |
| `GOOGLE_CLOUD_LOCATION` | No | Gemini region (default: `us-central1`) |
| `GOOGLE_APPLICATION_CREDENTIALS` | No | Path to a service account JSON key. If omitted, uses the credentials from `gcloud auth application-default login` |

Exposing the backend to your phone

The mobile app reads EXPO_PUBLIC_API_URL from mobile/.env. To expose your local backend over the internet:

# Option A — cloudflared (recommended)
brew install cloudflared
cloudflared tunnel --url http://localhost:8000
# Copy the printed https://xxxx.trycloudflare.com URL

# Option B — localtunnel
npx localtunnel --port 8000 --subdomain wastesort
# URL will be https://wastesort.loca.lt

Update mobile/.env with the tunnel URL:

EXPO_PUBLIC_API_URL=https://xxxx.trycloudflare.com

Restart the Expo dev server after changing this file.

Mobile setup

Note: This app uses react-native-vision-camera, which does not work with Expo Go. You must build a custom dev client and install it on your device.

First-time device setup

  1. Add your Apple ID to Xcode (free account is enough)

    • Xcode → Settings → Accounts → add your Apple ID
    • Click Manage Certificates, then + → Apple Development to create a signing certificate
  2. Enable Developer Mode on your iPhone

    • Settings → Privacy & Security → Developer Mode → ON → restart
  3. Connect your iPhone via USB and trust the computer when prompted

  4. Build and install the dev client (one-time, ~5–10 min):

    cd mobile
    npm install
    npx expo run:ios --device

    Select your iPhone when prompted.

  5. If you hit build errors, clear Xcode's DerivedData cache and retry:

    rm -rf ~/Library/Developer/Xcode/DerivedData

Running the dev server (after setup)

cd mobile
npx expo start --dev-client --tunnel

Open the WasteSortOS app on your iPhone and connect to the dev server. Hot reload is active: code changes appear instantly without rebuilding.

Re-run npx expo run:ios --device only if you add new native modules or change app.json.

API reference

| Method | Path | Description |
| --- | --- | --- |
| GET | `/health` | Health check |
| POST | `/classify` | Classify a waste item from an image (multipart file) |
| POST | `/search` | Classify a waste item from a text query (`{ "query": "..." }`) |
| GET | `/search/suggestions?q=` | Autocomplete suggestions for the search bar |
| POST | `/detect` | Detect objects in an image and return bounding boxes |
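For example, calling `/search` from Python with only the standard library (the base URL is a placeholder; swap in your tunnel URL):

```python
import json
import urllib.request

API_URL = "http://localhost:8000"  # or your tunnel URL

def build_search_request(query: str, api: str = API_URL) -> urllib.request.Request:
    """Build the POST /search request; the body matches the table above."""
    return urllib.request.Request(
        f"{api}/search",
        data=json.dumps({"query": query}).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_search_request("aa battery")
print(req.get_method(), req.full_url)  # POST http://localhost:8000/search
# To actually send it (requires a running backend):
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read())["bin"])
```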

Classification response

{
  "item_label": "aluminum soda can",
  "materials": ["aluminum"],
  "contamination_risk": false,
  "bin": "Recycling",
  "explanation": "Empty aluminum cans are accepted in the blue recycling bin...",
  "confidence": 0.95,
  "warning": null
}

Possible bin values: Recycling, Compost, Landfill, Hazardous waste, E-waste drop-off, Special collection
