
SwiPears

Decentralized dating powered by Federated Learning — your swipe data never leaves your device.

Built for HackUPC 2026 · Powered by Pear / Holepunch


Overview

Traditional dating apps silo your behavioral data on their servers — building recommendation engines from raw swipe histories they own indefinitely. SwiPears flips this model entirely.

Each device trains its own local neural network from your swipes. The only data shared across the P2P network is a weight update — a compact array of floats encoding what patterns you prefer, not who you swiped on. Every peer in the swarm runs sample-weighted Federated Averaging (FedAvg) independently, building a collectively smarter recommendation model with no central server and no raw data exchange.


How It Works

```
User swipes on a profile
        │
        ▼
Local neural network trains on the swipe
(Adam optimizer, binary cross-entropy)
        │
        ▼
Weight tensors extracted as a JSON payload
{ layers, numSamples }
        │
        ▼
Broadcast to Hyperswarm peers
(no raw swipe data — weights only)
        │
        ▼
Epidemic relay propagates weights to peers
that cannot hole-punch directly
        │
        ▼
Each peer runs sample-weighted FedAvg
W_new = Σ (n_k / N) · W_k
        │
        ▼
Profile rankings update immediately
```
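The relay step in the pipeline is essentially gossip flooding with duplicate suppression. A minimal self-contained sketch (illustrative names like `makeRelayPeer` and `inbox`, not the actual swarm.js API):

```javascript
// Minimal epidemic-relay sketch (illustrative names, not the swarm.js API).
// Each message carries a unique id. A peer rebroadcasts anything it has not
// seen before, so updates reach peers that cannot hole-punch directly.
function makeRelayPeer(name) {
  return {
    name,
    seen: new Set(),       // message ids already processed
    links: [],             // directly connected peers
    inbox: [],             // delivered payloads (stand-in for FedAvg handling)
    receive(msg, from) {
      if (this.seen.has(msg.id)) return;   // drop duplicates
      this.seen.add(msg.id);
      this.inbox.push(msg.payload);
      for (const peer of this.links) {
        if (peer !== from) peer.receive(msg, this); // flood onward
      }
    },
  };
}

// Star topology: b and c cannot reach each other, but a relays for both.
const a = makeRelayPeer('a'), b = makeRelayPeer('b'), c = makeRelayPeer('c');
a.links = [b, c]; b.links = [a]; c.links = [a];
b.receive({ id: 'w1', payload: { numSamples: 12 } }, null);
// c received b's update via a, even with no direct connection
```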

Features

| Feature | Description |
| --- | --- |
| Swipe & Learn | Each swipe trains your local model in real time |
| Federated Learning | Weight updates sync across the swarm — every peer improves together |
| P2P Chat | Direct encrypted messaging between connected peers |
| User Profiles | Persistent local identity shared with peers via epidemic relay |
| Zero-config join | All devices auto-join a shared room on startup — no setup required |
| Network resilience | Works across NAT, hotspot AP isolation, and CGNAT via gossip relay |

Architecture

```
dfl-pears/
├── index.js              # Pear app entry point (Bare runtime)
│                         # Owns: Hyperswarm, ML model, WebSocket server, identity
├── ui/
│   ├── index.html        # Electron renderer — three-tab UI
│   ├── app.js            # Renderer logic (swipe, chat, profile)
│   └── assets/           # Images, avatars, icons
├── src/
│   ├── ml/
│   │   ├── model-bare.js # Pure-JS neural network (Pear/Bare runtime)
│   │   ├── model.js      # TF.js neural network (Node.js / tests / viz)
│   │   ├── fedavg.js     # Sample-weighted Federated Averaging
│   │   └── dataset.js    # Profile dataset (feature vectors)
│   └── network/
│       ├── swarm.js      # Hyperswarm peer discovery + epidemic relay
│       └── multiwriter.js # Autobase/Hypercore persistent weight logs
├── viz/
│   ├── server.js         # N-peer in-process DFL simulator (WebSocket)
│   └── dashboard.html    # Live Chart.js dashboard
├── demo-peer.js          # CLI peer simulator
└── test/                 # Unit + integration tests
```

Dual ML Implementation

The Pear runtime uses Bare — a minimal JavaScript runtime where native addons like @tensorflow/tfjs-node cannot load. SwiPears ships two compatible implementations:

| Context | File | Backend |
| --- | --- | --- |
| Pear app | model-bare.js | Pure-JS NN (Adam, backprop from scratch) |
| Node.js / tests / viz | model.js | TF.js sequential model |

Both share the same weight serialization format ({ layers: [{shape, data}], numSamples }) so FedAvg works transparently across both runtimes.
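As a sketch, producing and consuming a payload in that shape takes only plain JSON, since typed arrays flatten to number arrays (all values below are made up):

```javascript
// Round-trip the shared weight format: { layers: [{ shape, data }], numSamples }.
// Typed arrays are flattened to plain number arrays so both runtimes
// (pure-JS on Bare, TF.js on Node) can parse the same JSON. Values are made up.
const weights = new Float32Array([0.1, -0.3, 0.7, 0.2, 0.05, -0.9]);

const payload = JSON.stringify({
  layers: [{ shape: [2, 3], data: Array.from(weights) }],
  numSamples: 42,                  // how many swipes trained these weights
});

const parsed = JSON.parse(payload);
const restored = new Float32Array(parsed.layers[0].data);
```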

Neural Network

Input (6) → Dense(16, ReLU) → Dense(8, ReLU) → Dense(1, sigmoid)

6 input features: catLover, dogLover, age, activityLevel, indoorness, creativity — all normalized to [0, 1].
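The forward pass for this topology fits in a few lines of plain JS. The sketch below uses random weights purely to show the shapes; it is not the actual code in model-bare.js:

```javascript
// Forward pass: Input(6) → Dense(16, ReLU) → Dense(8, ReLU) → Dense(1, sigmoid).
// Random weights for illustration only, not the trained model in model-bare.js.
const relu = x => Math.max(0, x);
const sigmoid = x => 1 / (1 + Math.exp(-x));

// Dense layer: y[j] = act(Σ_i x[i] · W[i][j] + b[j])
function dense(x, W, b, act) {
  return b.map((bj, j) =>
    act(x.reduce((sum, xi, i) => sum + xi * W[i][j], bj)));
}

const rand = (rows, cols) =>
  Array.from({ length: rows }, () =>
    Array.from({ length: cols }, () => Math.random() * 0.2 - 0.1));
const zeros = n => new Array(n).fill(0);

const [W1, W2, W3] = [rand(6, 16), rand(16, 8), rand(8, 1)];
const features = [1, 0, 0.24, 0.8, 0.3, 0.6]; // catLover..creativity in [0, 1]

const h1 = dense(features, W1, zeros(16), relu);
const h2 = dense(h1, W2, zeros(8), relu);
const [score] = dense(h2, W3, zeros(1), sigmoid); // like-probability in (0, 1)
```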

Why These Pear Primitives?

| Primitive | Role in SwiPears |
| --- | --- |
| Hyperswarm | Peer discovery via DHT + UDP hole-punching — no central server |
| Epidemic relay | Application-layer broadcast — delivers weights, chat, and identity even when direct hole-punching fails |
| Hypercore | Append-only weight log per peer — immutable, tamper-proof gradient ledger |
| Autobase | Multi-writer linearization — merges weight logs from all peers into one causal timeline |

Tech Stack

| Layer | Technology |
| --- | --- |
| Runtime | Pear / Bare |
| GUI | pear-electron + Chromium |
| P2P Networking | Hyperswarm |
| Persistent Logs | Hypercore + Autobase |
| ML (Pear app) | Pure-JS neural network (model-bare.js) |
| ML (Node / tests) | TensorFlow.js |
| Visualization | WebSocket + Chart.js |

Getting Started

Prerequisites

  • Node.js ≥ 18
  • Pear CLI
```bash
npm install -g pear
pear run pear://runtime   # completes the Pear installation
```

Add Pear to your PATH (macOS / Linux):

```bash
export PATH="$HOME/Library/Application Support/pear/bin:$PATH"
# Add to ~/.zshrc or ~/.bashrc to persist
```

Install from source

```bash
git clone https://github.com/mserra0/dfl-pears.git
cd dfl-pears
npm install
pear run --dev .
```

Run instantly with Pear

No cloning needed — run the published release directly:

```bash
pear run pear://zwyp9gbi84htjjg1nqbji8q3f9bxw36yxsxkde6kptekheg33ooy
```

Usage

The app auto-joins a shared room on startup. Open it on multiple devices and they discover each other automatically — no topic copying required.

  • Swipe tab — Like (♥) or pass (✕) profiles using the buttons or ← / → arrow keys
  • Chat tab — Message any connected peer; unread badge updates in real time
  • Profile tab — Set your display name, avatar, age, bio, and interests

CLI peer simulator

Simulate an auto-swiping peer from the terminal:

```bash
node demo-peer.js <64-char-topic>
```

Auto-swipes every 2.5 seconds and broadcasts weight updates to the GUI app — useful for demo runs without a second device.


Federated Averaging

Local training (per swipe):

Adam optimizer minimizes binary cross-entropy
over the local dataset for E epochs

Aggregation (on receiving peer weights):

```
W_new = Σ_k (n_k / N) · W_k

where  n_k = samples peer k trained on
       N   = Σ n_k  (total samples across all peers)
```

Each peer aggregates independently — there is no central parameter server.
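Over flat weight arrays, the aggregation step can be sketched as follows (illustrative, not the exact code in fedavg.js):

```javascript
// Sample-weighted FedAvg over flat weight arrays:
// W_new[i] = Σ_k (n_k / N) · W_k[i], with N = Σ_k n_k.
// A sketch of the aggregation step, not the exact code in fedavg.js.
function fedAvg(updates) {            // updates: [{ weights, numSamples }]
  const N = updates.reduce((sum, u) => sum + u.numSamples, 0);
  const merged = new Array(updates[0].weights.length).fill(0);
  for (const { weights, numSamples } of updates) {
    const coeff = numSamples / N;     // peers with more samples weigh more
    weights.forEach((w, i) => { merged[i] += coeff * w; });
  }
  return merged;
}

// Two peers: one trained on 30 swipes, one on 10, so a 3:1 blend.
const merged = fedAvg([
  { weights: [1.0, 0.0], numSamples: 30 },
  { weights: [0.0, 1.0], numSamples: 10 },
]);
// merged → [0.75, 0.25]
```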


Privacy

| Shared across the network | Never leaves the device |
| --- | --- |
| Weight tensors (floats) | Which profiles you swiped on |
| Sample count | Like / pass labels |
| Display name, avatar, bio | Raw behavioral data |
| Chat messages (recipient only) | Your cryptographic key |

Weights encode what feature combinations you prefer — not who you swiped on. For production deployments, differential privacy (DP-SGD) and secure aggregation would be layered on top.
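As a sketch of what that layering might look like, a peer could clip each update's L2 norm and add Gaussian noise before broadcast. `CLIP` and `SIGMA` below are arbitrary illustrative values, not parameters of the current app:

```javascript
// Illustrative DP-style hardening: clip the update's L2 norm, then add
// Gaussian noise before broadcast. CLIP and SIGMA are arbitrary example
// values; a real DP-SGD deployment derives them from a privacy budget.
const CLIP = 1.0, SIGMA = 0.1;

const gaussian = () =>                        // Box–Muller transform
  Math.sqrt(-2 * Math.log(1 - Math.random())) *
  Math.cos(2 * Math.PI * Math.random());

function privatize(weights) {
  const norm = Math.sqrt(weights.reduce((s, w) => s + w * w, 0));
  const scale = Math.min(1, CLIP / norm);     // clip to L2 norm ≤ CLIP
  return weights.map(w => w * scale + SIGMA * gaussian());
}

const noisy = privatize([0.8, -0.6, 1.2]);    // same shape, noised values
```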


Visualization Dashboard

Watch N in-process peers train and converge in real time:

```bash
npm run viz -- --peers 4 --rounds 10
# then open viz/dashboard.html in a browser
```
| Option | Default | Description |
| --- | --- | --- |
| `--peers N` | 3 | Number of simulated peers |
| `--rounds N` | 8 | Training rounds |
| `--swipes-per-round N` | 50 | Local samples per round |
| `--delay MS` | 1500 | Milliseconds between rounds |

Testing

```bash
npm test                # all tests
npm run test:model      # TF.js model unit tests
npm run test:fedavg     # FedAvg correctness tests
npm run test:dfl        # end-to-end 3-peer DFL over real Hyperswarm
npm run test:mesh       # 4-peer full-mesh formation on loopback
npm run test:gossip     # star-topology relay propagation proof
```

Connectivity diagnostics

```bash
npm run check     # DHT bootstrap + topic announce + peer discovery probe
npm run vanilla   # minimal Hyperswarm mesh — isolates network vs. app issues
```

Roadmap

  • Real distributed profiles — load profiles from an Autobase-backed distributed ledger instead of a local dataset
  • Differential privacy — DP-SGD to prevent weight inference attacks
  • Secure aggregation — cryptographic guarantees on FedAvg inputs
  • Multi-window support — multiple GUI instances per machine for local multi-peer demos

License

MIT
