A browser-based computer vision app that detects facial expressions and hand gestures to match you with viral memes in real-time. Built with React, MediaPipe, and pure chaos.
Brainrot is a CV program that maps human facial expressions (and hand gestures!) to popular meme reactions in real time.
Using your webcam and the MediaPipe library, the system tracks key facial landmarks and displays a corresponding meme when it detects specific expressions.
Supported Expressions (15):
- Shock, Scream, Tongue, Happy, Sad, Wink
- Glare, Suspicious, Sleepy, Eyebrow raise
- Confused, Pout, Disgust, Kissy, Neutral
Supported Gestures (9):
- Middle Finger, Peace, Thumbs Up/Down
- OK, Rock On, Wave, Fist, Pointing
- Your webcam feed is processed in real time using MediaPipe Tasks Vision.
- Facial landmarks and hand gestures are extracted directly in the browser.
- Heuristics determine which expression is active.
- A matching meme is displayed instantly.
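The heuristic step can be sketched roughly like this, assuming normalized `{ x, y }` landmarks (in `[0, 1]`) from MediaPipe's face landmarker. The `Landmark` type, function names, and landmark choice are illustrative; the threshold mirrors the `mouthOpen` value from the config shown further down, but the project's actual heuristics may differ.

```typescript
// Rough sketch of one detection heuristic. Names and landmark choice
// are illustrative, not the project's actual implementation.
type Landmark = { x: number; y: number };

const MOUTH_OPEN_THRESHOLD = 0.025; // mirrors CONFIG.thresholds.mouthOpen

// Vertical lip gap in normalized image coordinates.
function mouthGap(upperLip: Landmark, lowerLip: Landmark): number {
  return Math.abs(lowerLip.y - upperLip.y);
}

function classifyMouth(upperLip: Landmark, lowerLip: Landmark): string {
  return mouthGap(upperLip, lowerLip) > MOUTH_OPEN_THRESHOLD
    ? "mouth-open"
    : "neutral";
}
```

Each supported expression gets a similar distance- or ratio-based check against its threshold, and the winning label selects which meme to show.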
```bash
git clone https://github.com/monuit/brainrot-cv.git
cd brainrot-cv
```

Make sure you have Node.js 18+ or Bun installed.

```bash
# using bun (recommended)
bun install

# using npm
npm install
```

```bash
# using bun
bun run dev

# using npm
npm run dev
```

Open http://localhost:3000 in your browser.
You can customize sensitivity and thresholds in `src/lib/config.ts`.
Adjust how easily expressions are triggered:
```ts
export const CONFIG = {
  thresholds: {
    eyeOpening: 0.03,  // Shock detection
    mouthOpen: 0.025,  // Tongue/mouth open detection
    squinting: 0.018,  // Glare detection
    smile: 0.012,      // Smile detection
    // ...
  },
  // ...
};
```

Control how fast memes switch:
```ts
transitions: {
  holdTime: 300,          // Must hold expression for 300ms
  debounce: 500,          // Wait 500ms before next switch
  crossfadeDuration: 300, // Visual fade duration
},
```

- Fork the repository
- Create a new branch (`git checkout -b feat/new-meme`)
- Add new memes to `src/assets/expressions/`
- Commit your changes
- Push your branch and open a pull request
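The `holdTime` and `debounce` values in the transitions config above can be pictured as a small gate in front of the meme display. This is an illustrative sketch only; the class and method names are hypothetical and the project's real switching logic may differ.

```typescript
// Illustrative hold-time + debounce gate for meme switching.
// MemeGate and update() are hypothetical names, not from the project.
const HOLD_TIME_MS = 300; // expression must persist this long
const DEBOUNCE_MS = 500;  // minimum time between switches

class MemeGate {
  private candidate: string | null = null;
  private candidateSince = 0;
  private lastSwitch = -Infinity;
  current = "neutral";

  // Feed the latest detected expression; returns true when the meme switches.
  update(expression: string, nowMs: number): boolean {
    if (expression !== this.candidate) {
      // New candidate: restart the hold timer.
      this.candidate = expression;
      this.candidateSince = nowMs;
    }
    const heldLongEnough = nowMs - this.candidateSince >= HOLD_TIME_MS;
    const debounced = nowMs - this.lastSwitch >= DEBOUNCE_MS;
    if (heldLongEnough && debounced && expression !== this.current) {
      this.current = expression;
      this.lastSwitch = nowMs;
      return true;
    }
    return false;
  }
}
```

Calling `update()` once per processed frame yields stable switches: flickery single-frame detections never survive the hold window, and rapid back-and-forth is absorbed by the debounce.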
This project is licensed under the MIT License. See the LICENSE file for details.
Have fun 🧠⬛
