An AI-powered waste sorting app. Point your phone at any waste item and get an instant disposal recommendation — Recycling, Compost, Landfill, Hazardous Waste, E-Waste, or Special Collection.
- Capture — the mobile app takes a photo
- Vision — Gemini identifies the item and its materials
- Rules engine — a campus-specific ruleset maps the item to a bin
- AI fallback — if the ruleset doesn't match, Gemini classifies it directly
- Explanation — a plain-language reason is generated and shown to the user
The app also supports text search — type an item name to get its disposal method without using the camera.
```
wastesortos/
├── backend/                     # FastAPI server (Python)
│   ├── main.py                  # API routes: /classify, /search, /detect, /health
│   ├── models.py                # Pydantic response models
│   ├── pipeline/
│   │   ├── flow.py              # Orchestrates the full image classification pipeline
│   │   ├── vision_agent.py      # Gemini vision: identifies item + materials from image
│   │   ├── ai_classifier.py     # Gemini fallback: classifies items the ruleset doesn't know
│   │   ├── rules_engine.py      # Campus ruleset lookup (no AI, fully offline)
│   │   ├── decision_resolver.py # Merges vision + rules into final response shape
│   │   ├── explanation_agent.py # Generates human-readable disposal explanation
│   │   ├── text_flow.py         # Pipeline for text-based search queries
│   │   └── image_utils.py       # Image resizing before API calls
│   └── ruleset/
│       └── queens_campus.json   # Disposal rules for Queen's University campus
└── mobile/                      # React Native app (Expo)
    └── App.tsx                  # Single-file app: camera, result sheet, search, history
```
- Python 3.11+
- Node.js 18+
- Google Cloud project with Vertex AI enabled and access to `gemini-2.5-flash`
- Xcode (for iOS builds — macOS only)
- Physical iPhone (react-native-vision-camera does not work in simulators or Expo Go)
```bash
cd backend

# Create and activate virtual environment (first time only)
python3 -m venv .venv
source .venv/bin/activate   # macOS/Linux
# .venv\Scripts\activate    # Windows

# Install dependencies
pip install -r requirements.txt

# Configure environment
cp .env.example .env
# Edit .env and set GOOGLE_CLOUD_PROJECT to your GCP project ID

# Authenticate with Google Cloud
gcloud auth application-default login

# Start the server
uvicorn main:app --reload
```

The API will be available at http://localhost:8000. Visit http://localhost:8000/docs for interactive API docs.
| Variable | Required | Description |
|---|---|---|
| `GOOGLE_CLOUD_PROJECT` | Yes | GCP project ID to bill Vertex AI usage against |
| `GOOGLE_CLOUD_LOCATION` | No | Gemini region (default: `us-central1`) |
| `GOOGLE_APPLICATION_CREDENTIALS` | No | Path to service account JSON. If omitted, uses `gcloud auth application-default login` |
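A minimal sketch of how the backend might resolve these variables at startup. The variable names and defaults come from the table above; the code itself is illustrative, not the actual implementation:

```python
import os

# Demo value so the sketch runs standalone; in the real app this comes from .env.
os.environ.setdefault("GOOGLE_CLOUD_PROJECT", "demo-project")

# Required: the GCP project billed for Vertex AI usage.
project = os.environ["GOOGLE_CLOUD_PROJECT"]

# Optional: Gemini region, defaulting to us-central1 as in the table.
location = os.environ.get("GOOGLE_CLOUD_LOCATION", "us-central1")

# Optional: explicit service-account credentials. When unset, the Google
# client libraries fall back to Application Default Credentials.
credentials_path = os.environ.get("GOOGLE_APPLICATION_CREDENTIALS")

print(project, location, credentials_path)
```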
The mobile app reads EXPO_PUBLIC_API_URL from mobile/.env. To expose your local backend over the internet:
```bash
# Option A — cloudflared (recommended)
brew install cloudflared
cloudflared tunnel --url http://localhost:8000
# Copy the printed https://xxxx.trycloudflare.com URL

# Option B — localtunnel
npx localtunnel --port 8000 --subdomain wastesort
# URL will be https://wastesort.loca.lt
```

Update mobile/.env with the tunnel URL:

```
EXPO_PUBLIC_API_URL=https://xxxx.trycloudflare.com
```

Restart the Expo dev server after changing this file.
Note: This app uses `react-native-vision-camera`, which does not work with Expo Go. A custom dev client must be built and installed on your device.
1. Add your Apple ID to Xcode (free account is enough)
   - Xcode → Settings → Accounts → add your Apple ID
   - Click Manage Certificates → + → Apple Development
2. Enable Developer Mode on your iPhone
   - Settings → Privacy & Security → Developer Mode → ON → restart
3. Connect your iPhone via USB and trust the computer when prompted
4. Build and install the dev client (one-time, ~5–10 min):

   ```bash
   cd mobile
   npm install
   npx expo run:ios --device
   ```

   Select your iPhone when prompted.
5. Clear DerivedData if you hit build errors:

   ```bash
   rm -rf ~/Library/Developer/Xcode/DerivedData
   ```
```bash
cd mobile
npx expo start --dev-client --tunnel
```

Open the WasteSort OS app on your iPhone and connect to the dev server. Hot reload is active — code changes appear instantly without rebuilding.

Re-run `npx expo run:ios --device` only if you add new native modules or change `app.json`.
| Method | Path | Description |
|---|---|---|
| GET | `/health` | Health check |
| POST | `/classify` | Classify a waste item from an image (multipart file) |
| POST | `/search` | Classify a waste item from a text query (`{ "query": "..." }`) |
| GET | `/search/suggestions?q=` | Autocomplete suggestions for the search bar |
| POST | `/detect` | Detect objects in an image and return bounding boxes |
Example `/classify` response:

```json
{
  "item_label": "aluminum soda can",
  "materials": ["aluminum"],
  "contamination_risk": false,
  "bin": "Recycling",
  "explanation": "Empty aluminum cans are accepted in the blue recycling bin...",
  "confidence": 0.95,
  "warning": null
}
```

Possible `bin` values: `Recycling`, `Compost`, `Landfill`, `Hazardous waste`, `E-waste drop-off`, `Special collection`
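A client can defensively check the `bin` field against the enumerated values before rendering. A minimal sketch, using a trimmed copy of the response above as sample data:

```python
import json

# The closed set of bin values the API documents.
ALLOWED_BINS = {"Recycling", "Compost", "Landfill",
                "Hazardous waste", "E-waste drop-off", "Special collection"}

# Trimmed sample of the /classify response shown above.
raw = '{"item_label": "aluminum soda can", "bin": "Recycling", "confidence": 0.95, "warning": null}'
resp = json.loads(raw)

# Fail loudly on an unexpected bin rather than rendering a broken result card.
assert resp["bin"] in ALLOWED_BINS, f"unknown bin: {resp['bin']}"
print(resp["item_label"], "->", resp["bin"])
```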