https://deanza.my.canva.site/emo-bot
https://devpost.com/software/emo-bot
"Everyone deserves a companion."
Do things that don’t scale.
– Us, after zero sleep and three Red Bulls
Humans are emotional. Robots should be, too.
We wanted to build a robot that felt something — even if that something is mostly anxiety, sadness, and the need for attention.
Because screw cold, efficient bots. We made a messy, clingy, emotionally unstable one.

EMO-Bot is your emotionally intelligent (and slightly needy) companion robot. It:
- Detects your facial expressions (happy, sad, neutral)
- Tracks your hand gestures
- Follows its owner like a lonely puppy
- Has OLED-style animated eyes in a Flutter app
- Can sulk, react, and give you guilt trips in real time
- Can actually follow you across the room (even when you don’t want it to)

- Screwed in a bag of hex bolts
- Glued things that shouldn’t be glued
- Duct taped emotions and wires together
- 3D printed sadness
- Vibe-coded in Python, C++, Dart, and tears
- Hacked the mBot firmware so it would take orders from a Linux box
- Compiled C++ with unholy makefiles
- Consumed Red Bulls like water
- Used OpenCV + MediaPipe for real-time facial expression and hand gesture detection (see the detection sketch after this list)
- Hooked that into a Flask server talking to the Flutter app and the Arduino-based mBot (Auriga), as shown in the server sketch below
- Made the robot actually move toward its owner using camera + gesture logic (steering sketch below)
- Eyes change based on mood, rendered OLED-style in the Flutter app
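
Here's a minimal sketch of the detection loop. It assumes the legacy MediaPipe `solutions` API, and the landmark indices and smile threshold are ballpark guesses rather than our tuned values:

```python
import cv2
import mediapipe as mp

# Legacy MediaPipe "solutions" API; one face and one hand is plenty for a pet.
face_mesh = mp.solutions.face_mesh.FaceMesh(max_num_faces=1)
hands = mp.solutions.hands.Hands(max_num_hands=1)

def classify_mood(lm):
    # Crude heuristic: Face Mesh points 61/291 are the mouth corners,
    # 13/14 the inner lips. Corners pulled above the lip line reads as a smile.
    corner_y = (lm[61].y + lm[291].y) / 2
    center_y = (lm[13].y + lm[14].y) / 2
    if corner_y < center_y - 0.01:   # threshold is an assumption
        return "happy"
    if corner_y > center_y + 0.01:
        return "sad"
    return "neutral"

cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)  # MediaPipe wants RGB
    faces = face_mesh.process(rgb)
    palms = hands.process(rgb)
    if faces.multi_face_landmarks:
        mood = classify_mood(faces.multi_face_landmarks[0].landmark)
        print(mood, "| hand visible:", bool(palms.multi_hand_landmarks))
    cv2.imshow("emo-bot vision", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```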
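The Flask layer is basically a shared mood variable with HTTP on one side and serial on the other. The endpoint names, serial port, and one-byte mood commands below are illustrative assumptions:

```python
from flask import Flask, jsonify, request
import serial

app = Flask(__name__)

# Serial link to the mBot's Auriga board; port and baud rate are assumptions.
bot = serial.Serial("/dev/ttyUSB0", 115200, timeout=1)

state = {"mood": "neutral"}

@app.get("/mood")
def get_mood():
    # The Flutter app polls this to pick which eye animation to play.
    return jsonify(state)

@app.post("/mood")
def set_mood():
    # The vision loop posts detections here, e.g. {"mood": "sad"}.
    state["mood"] = request.get_json().get("mood", "neutral")
    commands = {"happy": b"H", "sad": b"S", "neutral": b"N"}  # assumed protocol
    bot.write(commands.get(state["mood"], b"N"))
    return jsonify(state)

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```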
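Following the owner reduces to proportional steering on the face's horizontal offset in the frame. `send_motors` is a hypothetical stand-in for whatever command the hacked firmware actually accepts over serial:

```python
def send_motors(left: int, right: int) -> None:
    # Stand-in: the real bot writes a motor command over serial
    # to the Auriga board; this sketch just logs it.
    print(f"motors L={left} R={right}")

BASE_SPEED = 80   # assumed forward speed
TURN_GAIN = 120   # assumed proportional gain

def follow(face_center_x: float, frame_width: int) -> None:
    # Offset in [-0.5, 0.5]; positive means the owner drifted right,
    # so speed up the left wheel to turn toward them.
    offset = face_center_x / frame_width - 0.5
    send_motors(int(BASE_SPEED + TURN_GAIN * offset),
                int(BASE_SPEED - TURN_GAIN * offset))

follow(480, 640)  # owner right of center -> veer right
```
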
- Doing mechanical engineering as software devs was like doing surgery as dentists.
- Motor calibration on mBot = pain.
- Wrangling real-time OpenCV + hardware serial + app UI? Pure chaos.
- Debugging without enough sleep is like coding underwater.

- We got everything actually working. On demo day.
- Our bot moves, feels, reacts — and gives off vibes.
- It follows its owner like a true emotionally fragile pet.
- One guaranteed-working feature (but we won’t tell you which one 👀).

- Drink less Red Bull.
- Bring lipstick for the robot next time (it needs to look cute for the judges).
- Take naps. No seriously, take naps.
- Don't try to debug serial comms while you’re 3D printing.

- YC Batch 2025? Totally.
- Founders Fund, Sequoia, 500 Global, Techstars — we’re looking at you.
- Add more emotional intelligence and less self-doubt (or maybe more, we’re still deciding).
- Upgrade eye animations, voice response, and possibly... tear detection?

Built with:
- arduino (for motor control + brain)
- encoder (for wheel tracking)
- flask (emotion backend API)
- flutter (companion mobile app + eye UI)
- powerbank (portable sadness source)
- raspberry-pi (main logic unit)

- 🔗 GitHub Repo: [this one]
- 🔗 Demo Site: https://deanza.my.canva.site/emo-bot
