🖤 EMO-Bot: The Emotionally Broken Companion Robot

https://deanza.my.canva.site/emo-bot

https://devpost.com/software/emo-bot

"Everyone deserves a companion."
Do things that don’t scale.
– Us, after zero sleep and three Red Bulls


🧠 Inspiration

Humans are emotional. Robots should be, too.
We wanted to build a robot that felt something — even if that something is mostly anxiety, sadness, and the need for attention.

Because screw cold, efficient bots. We made a messy, clingy, emotionally unstable one.


🤖 What It Does

EMO-Bot is your emotionally intelligent (and slightly needy) companion robot. It:

  • Detects your facial expressions (happy, sad, neutral)
  • Tracks your hand gestures
  • Follows its owner like a lonely puppy
  • Has OLED-style animated eyes in a Flutter app
  • Can sulk, react, and give you guilt trips in real time
  • Can actually follow you across the room (even when you don’t want it to)
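Under the hood, "following you across the room" is just steering toward wherever the camera last saw your face. A minimal proportional-control sketch of that idea — the function name, gain, and speed range here are our illustration, not the actual firmware code:

```python
def steer_toward_face(frame_width, face_center_x, base_speed=80, gain=0.5):
    """Map the horizontal offset of a detected face to (left, right) wheel speeds.

    A face right of center speeds up the left wheel, turning the robot
    toward its owner. Pure proportional control, no smoothing.
    """
    # Offset in [-1, 1]: 0 means the face is dead center.
    offset = (face_center_x - frame_width / 2) / (frame_width / 2)
    left = base_speed + gain * base_speed * offset
    right = base_speed - gain * base_speed * offset

    def clamp(v):
        # mBot-style motor speeds live roughly in -255..255.
        return max(-255, min(255, int(v)))

    return clamp(left), clamp(right)
```

With the face centered in a 640-px frame, both wheels get `base_speed`; as the face drifts to one side, the speeds split and the robot turns to chase it.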

🔧 How We Built It

  • Screwed in a bag of hex bolts
  • Glued things that shouldn’t be glued
  • Duct taped emotions and wires together
  • 3D printed sadness
  • Vibe-coded in Python, C++, Dart, and tears
  • Hacked mBot firmware to run with Linux
  • Compiled C++ with unholy makefiles
  • Consumed Red Bulls like water

🎥 OpenCV + Emotion Detection

  • Used OpenCV + Mediapipe for real-time facial expression + hand gesture detection
  • Hooked that into a Flask server talking to a Flutter app and Arduino-based mBot (Auriga)
  • Made the robot actually move toward its owner using camera + gesture logic
  • Eyes change based on mood — OLED-style in app
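Expression detection boils down to reducing Mediapipe FaceMesh landmarks to a happy/sad/neutral label. A simplified sketch of the geometry — comparing mouth-corner height to mouth-center height — where the specific landmarks and threshold are illustrative, not the exact heuristic we shipped:

```python
def classify_expression(mouth_left, mouth_right, mouth_center, margin=0.01):
    """Classify a face as 'happy', 'sad', or 'neutral' from three mouth
    landmarks given as (x, y) tuples in normalized image coordinates
    (y grows downward, as in Mediapipe's FaceMesh output).
    """
    corner_y = (mouth_left[1] + mouth_right[1]) / 2
    # Mouth corners noticeably above the mouth center -> smile.
    if corner_y < mouth_center[1] - margin:
        return "happy"
    # Mouth corners noticeably below the mouth center -> frown.
    if corner_y > mouth_center[1] + margin:
        return "sad"
    return "neutral"
```

In the real loop, the landmark tuples would come from `mp.solutions.face_mesh.FaceMesh` running on each OpenCV frame, and the resulting label drives both the eyes in the app and the sulking.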

🧩 Challenges We Ran Into

  • Doing mechanical engineering as software devs was like doing surgery as dentists.
  • Motor calibration on mBot = pain.
  • Wrangling real-time OpenCV + hardware serial + app UI? Pure chaos.
  • Debugging without enough sleep is like coding underwater.
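One way to tame real-time OpenCV plus hardware serial is the classic producer/consumer split: the vision loop never touches the port, it just drops the freshest command on a queue for a dedicated writer thread. A stripped-down sketch of the pattern, with the serial write stubbed out by a plain callback (the `FWD 80` command format is hypothetical):

```python
import queue
import threading

commands = queue.Queue(maxsize=1)  # keep only the freshest motor command

def push_command(cmd):
    """Called from the vision loop: discard any stale command, keep the new one."""
    try:
        commands.get_nowait()
    except queue.Empty:
        pass
    commands.put(cmd)

def serial_writer(send, stop):
    """Runs in its own thread; on the robot, `send` would wrap the serial write."""
    while not stop.is_set():
        try:
            cmd = commands.get(timeout=0.1)
        except queue.Empty:
            continue
        send(cmd)
        commands.task_done()

# Wiring it up with a stubbed-out serial port:
sent = []
stop = threading.Event()
writer = threading.Thread(target=serial_writer, args=(sent.append, stop))
writer.start()
push_command(b"FWD 80")   # hypothetical command framing
commands.join()           # block until the writer has drained the queue
stop.set()
writer.join()
```

The `maxsize=1` queue is the key trick: if the wheels can't keep up with the camera, stale commands get dropped instead of piling up and lagging the robot behind its owner.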

✅ Accomplishments We're Proud Of

  • We got everything actually working. On demo day.
  • Our bot moves, feels, reacts — and gives off vibes.
  • It follows its owner like a true emotionally fragile pet.
  • One guaranteed-working feature (but we won’t tell you which one 👀).

💡 What We Learned

  • Drink less Red Bull.
  • Bring lipstick for the robot next time (it needs to look cute for the judges).
  • Take naps. No seriously, take naps.
  • Don't try to debug serial comms while you’re 3D printing.

🚀 What’s Next for EMO-Bot

  • YC Batch 2025? Totally.
  • Founders Fund, Sequoia, 500 Global, Techstars — we’re looking at you.
  • Add more emotional intelligence and less self-doubt (or maybe more, we’re still deciding).
  • Upgrade eye animations, voice response, and possibly... tear detection?

🛠️ Built With

  • arduino (for motor control + brain)
  • encoder (for wheel tracking)
  • flask (emotion backend API)
  • flutter (companion mobile app + eye UI)
  • powerbank (portable sadness source)
  • raspberry-pi (main logic unit)

🎮 Try It Out

  • Landing page: https://deanza.my.canva.site/emo-bot
  • Devpost: https://devpost.com/software/emo-bot
