KTH-Sys/Skaler_HackDavis26

Skaler

AI-matched volunteer opportunities for college campuses

Built at HackDavis 2026 · UC Davis


The pitch

Volunteer signups are dropping because discovery feels like a 2005 job board. Skaler is a campus volunteering app that uses an LLM to rank opportunities for each student, write a one-line reason why each one fits, and surface social proof from your network — so the right cause finds the right person without anyone having to scroll a list.

"Plays to your Design and Photography skills."
"Tyler from your network volunteered here."
                                   ↑ what every card shows

What it does

  • Personalized feed — every opportunity is scored 0–100 and gets a one-line AI explanation tied to your actual skills + interests
  • Social proof — "Maya and 2 others from your network volunteered here" pulled from a real follow graph
  • Save / unsave with optimistic UI and a dedicated saved tab
  • AI Coach — chat overlay that knows your profile and gives specific advice (not generic platitudes)
  • Two roles — same login page, toggle between Volunteer and Organizer. Organizers land on a dashboard showing their listings + save counts.
  • Graceful degradation — when the LLM rate-limits, a deterministic stub matcher takes over so the feed never breaks
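The fallback matcher in the last bullet can be sketched roughly like this — a hypothetical shape, not the repo's actual `matcher.ts`:

```typescript
// Illustrative deterministic skill-overlap matcher, mirroring the
// stub fallback described above. Types and names are assumptions.
type Opportunity = { id: string; title: string; skills: string[] };
type Profile = { skills: string[]; interests: string[] };

function stubMatch(profile: Profile, opps: Opportunity[]) {
  const wanted = new Set(
    [...profile.skills, ...profile.interests].map((s) => s.toLowerCase())
  );
  return opps
    .map((opp) => {
      const overlap = opp.skills.filter((s) => wanted.has(s.toLowerCase()));
      // Scale overlap into the same 0-100 range the LLM returns.
      const score = Math.min(
        100,
        Math.round((overlap.length / Math.max(opp.skills.length, 1)) * 100)
      );
      const reason = overlap.length
        ? `Plays to your ${overlap.join(" and ")} skills.`
        : "A chance to try something new.";
      return { id: opp.id, score, reason };
    })
    .sort((a, b) => b.score - a.score);
}
```

Because the stub returns the same `{ id, score, reason }` shape as the LLM path, the feed UI never has to know which matcher ran.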

Tech stack

| Layer | Tech |
| --- | --- |
| Frontend | React 18, Vite, react-router-dom 6, Tabler Icons, Inter |
| Backend | Next.js 15 (App Router) + TypeScript, runs as a thin API server |
| Database | MongoDB Atlas via Mongoose 8 |
| AI | Google Gemini API — `gemini-flash-latest` (coach), `gemini-2.0-flash` (matcher) |
| Dev proxy | Vite proxies `/api/*` → Next.js so the frontend uses relative URLs |
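The dev-proxy row amounts to a few lines of Vite config — a minimal sketch; the repo's actual config may set more options:

```typescript
// frontend/vite.config.ts — minimal proxy sketch (illustrative).
import { defineConfig } from "vite";
import react from "@vitejs/plugin-react";

export default defineConfig({
  plugins: [react()],
  server: {
    // Forward /api/* to the Next.js API server so the frontend
    // can keep using relative URLs in development.
    proxy: {
      "/api": { target: "http://localhost:3000", changeOrigin: true },
    },
  },
});
```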

Architecture

```mermaid
flowchart LR
    User((User))
    UI["React + Vite<br/>localhost:5173"]
    API["Next.js API<br/>localhost:3000"]
    DB[("MongoDB Atlas")]
    AI["Google Gemini"]

    User --> UI
    UI -->|/api/* proxy| API
    API <-->|Mongoose| DB
    API <-->|generateContent| AI
```

The frontend is a static React SPA. The backend is Next.js API routes only — no SSR pages — which means it can deploy as a serverless function on Vercel and the React bundle on any CDN.

How AI matching works

```mermaid
sequenceDiagram
    participant U as User
    participant F as React UI
    participant API as Next.js
    participant DB as MongoDB
    participant G as Gemini

    U->>F: Open Dashboard
    F->>API: GET /api/feed?userId=...
    par Concurrent reads
        API->>DB: Opportunities (35)
        API->>DB: Network users
        API->>DB: Saved IDs
    end
    DB-->>API: ...
    API->>G: 1 batched call: rank all 35 + write reasons
    G-->>API: [{ id, score, reason }, ...]
    API->>API: Compute social proof from network graph
    API-->>F: 35 cards w/ score, reason, social proof, saved flag
    F-->>U: Personalized feed
```

One LLM call ranks all 35 opportunities in a single batched request, using `responseSchema` to force structured JSON. Results are cached per-user for 5 minutes so swiping through cards doesn't re-burn quota. If Gemini rate-limits, `geminiMatcher.ts` falls back to the deterministic skill-overlap matcher in `matcher.ts`, which produces templated reasons — same return shape, no UI change.
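The cache-plus-fallback pattern can be sketched as follows — names and types are assumptions, not the repo's actual `geminiMatcher.ts` exports:

```typescript
// Illustrative per-user TTL cache with a deterministic fallback.
type Ranked = { id: string; score: number; reason: string };

const TTL_MS = 5 * 60 * 1000; // 5-minute per-user cache
const cache = new Map<string, { at: number; results: Ranked[] }>();

async function rankFeed(
  userId: string,
  callGemini: () => Promise<Ranked[]>, // one batched LLM call
  stubMatcher: () => Ranked[]          // deterministic fallback
): Promise<Ranked[]> {
  const hit = cache.get(userId);
  if (hit && Date.now() - hit.at < TTL_MS) return hit.results;

  let results: Ranked[];
  try {
    results = await callGemini();
  } catch {
    // Rate limit or any Gemini error: same return shape, no UI change.
    results = stubMatcher();
  }
  cache.set(userId, { at: Date.now(), results });
  return results;
}
```

Note that an in-process `Map` only survives as long as the serverless instance does — which is why the "what we'd build next" list below mentions moving the cache to Redis.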

Project structure

```
.
├── src/                          # Next.js backend
│   ├── app/api/
│   │   ├── users/                # GET list, GET :id
│   │   ├── feed/                 # GET ranked feed for a user
│   │   ├── saves/                # GET list, POST save|unsave
│   │   ├── opportunities/        # GET list (with save counts), GET :id
│   │   └── coach/                # POST chat (Gemini)
│   ├── lib/
│   │   ├── db.ts                 # Mongoose connection (cache-safe)
│   │   ├── geminiMatcher.ts      # LLM matcher + cache + stub fallback
│   │   ├── matcher.ts            # Deterministic stub matcher
│   │   └── socialProof.ts        # Network → "X from your network" string
│   └── models/                   # User, Opportunity, Save (Mongoose)
├── scripts/
│   └── seed.ts                   # Wipes + seeds 8 users, 35 opps, prior saves
└── frontend/                     # Vite + React app
    └── src/
        ├── App.jsx               # react-router-dom routes
        ├── lib/
        │   ├── api.js            # fetch wrappers around /api/*
        │   └── session.js        # localStorage user/role helpers
        └── pages/
            ├── auth/             # Login (volunteer | organizer toggle)
            ├── main/             # Dashboard, Discover, Profile, Impact
            ├── secondary/        # OpportunityDetail, OrgProfile, ...
            ├── overlays/         # AICoachChat, Settings, Notifications
            └── org/              # OrgHomePage (organizer dashboard)
```
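The `socialProof.ts` helper's output format can be sketched like this — a hypothetical implementation; the actual wording lives in the repo:

```typescript
// Illustrative: build the "X from your network" line from the
// names of network members who volunteered at a listing.
function socialProofLine(names: string[]): string | null {
  if (names.length === 0) return null;
  if (names.length === 1) {
    return `${names[0]} from your network volunteered here.`;
  }
  const others = names.length - 1;
  return `${names[0]} and ${others} other${others > 1 ? "s" : ""} from your network volunteered here.`;
}
```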

Local setup

You'll need Node 20+, an Atlas free cluster, and a Gemini API key.

1. Backend

```sh
cp .env.example .env.local
# Fill in MONGODB_URI and GEMINI_API_KEY
npm install
npm run seed       # 8 users, 35 opportunities, prior saves
npm run dev        # http://localhost:3000
```

2. Frontend

```sh
cd frontend
npm install
npm run dev        # http://localhost:5173
```

Open http://localhost:5173. To sign in as a specific seeded user, use that user's first name as the email prefix — e.g. maya@anything.com → Maya Chen. Any password 6+ chars. Toggle to Organizer for the org dashboard (e.g. davisreads@anything.com).
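The email-prefix convention amounts to something like the following — an illustrative sketch, not the repo's login code:

```typescript
// Illustrative: resolve a seeded user by matching the email's
// local part against the user's first name, case-insensitively.
type SeedUser = { name: string };

function resolveSeedUser(
  email: string,
  users: SeedUser[]
): SeedUser | undefined {
  const prefix = email.split("@")[0].toLowerCase();
  return users.find(
    (u) => u.name.split(" ")[0].toLowerCase() === prefix
  );
}
```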

API reference

| Method | Path | Description |
| --- | --- | --- |
| GET | `/api/users` | List seeded users |
| GET | `/api/users/:id` | Single user profile |
| GET | `/api/feed?userId=...` | Ranked + AI-explained feed (with saved flag, socialProof line) |
| GET | `/api/opportunities` | All opportunities + per-listing save counts |
| GET | `/api/opportunities/:id` | Single opportunity |
| GET | `/api/saves?userId=...` | A user's saved listings |
| POST | `/api/saves` | Body: `{ userId, opportunityId, action: "save" \| "unsave" }` |
| POST | `/api/coach` | Body: `{ userId, history, message }` → Gemini-backed chat reply |
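A thin client helper over the saves endpoint might look like this — a hypothetical sketch of the kind of wrapper `frontend/src/lib/api.js` contains, not its real code:

```typescript
// Illustrative: build the request for POST /api/saves.
type SaveAction = "save" | "unsave";

function buildSaveRequest(
  userId: string,
  opportunityId: string,
  action: SaveAction
) {
  return {
    path: "/api/saves",
    init: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ userId, opportunityId, action }),
    },
  };
}

// Usage (in the browser, where the Vite proxy resolves /api/*):
//   const { path, init } = buildSaveRequest("u1", "opp1", "save");
//   await fetch(path, init);
```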

What's not in v1

  • Real auth — login is a UI-driven user-picker. Production needs OAuth + sessions.
  • Real org accounts — organizers map to seeded org names by email-prefix match. No org owners in the schema yet.
  • Posting opportunities — the org "Post new" button is a stub.
  • Apply / scheduling flow — the volunteer "Apply" button just navigates; no real shift booking.
  • Notifications — the bell icon is decorative.

What we'd build next

  • Switch to real auth (NextAuth + email/Google) and add an Organization model with owner refs
  • Move the LLM cache to Redis so it survives serverless cold starts
  • Stream coach responses (currently we wait for the full reply)
  • Add PostHog so we can see whether the AI reasons actually drive higher save-through rates

Team

KTH-Sys · UC Davis · HackDavis 2026
