Community-Powered AI for Blind Accessibility
A fully autonomous, community-driven object identification system using GitHub Actions, Bluesky, and Grove Vision AI V2.
OurIdentify is an open-source, portable device that helps blind and low-vision users identify objects in their daily lives. What makes it unique is that the community directly improves the AI model through natural conversation on Bluesky.
- You use the device - Point at an object, press the power button, and hear what it is
- Community helps improve it - When the device is uncertain, press a button to ask the community on Bluesky
- AI learns automatically - Every day, the model trains on community feedback and updates your device
- Everyone benefits - As the community helps, all devices get smarter together
- ✅ Portable & Battery-Powered - Take it anywhere (kitchen, bathroom, office, travel)
- ✅ 3-Minute Sessions - Continuous identification mode with auto-power-off
- ✅ Privacy-First - You control what gets posted; images only shared when you press a button
- ✅ Community-Powered - Blind users and allies directly improve the model through Bluesky
- ✅ Fully Autonomous - GitHub Actions automatically trains and deploys new models
- ✅ Zero Cost Forever - Runs entirely on free GitHub infrastructure ($0/month)
- ✅ Transparent & Auditable - Everything is open source, no black boxes
- ✅ Built to Last - Survives individual developers leaving; community-owned forever
Status: Currently in Phase 0 (Hardware Validation)
We're validating the hardware and building the initial prototype. If you're interested in being an early tester:
- Star this repository to stay updated
- Read the Complete Guide to understand the system
- Join the conversation on Bluesky: @ouridentify.bsky.social
- Open a GitHub Issue to express interest in testing
We need your help with:
- Hardware validation (ESP32-S3 + Grove Vision AI V2)
- YOLOv8 training pipeline development
- ESP32 firmware development
- GitHub Actions workflow creation
- Bluesky API integration
- Documentation and accessibility guides
- 3D-printed enclosure design
See CONTRIBUTING.md for how to get started.
| Phase | Status | Description |
|---|---|---|
| Phase 0 | 🟡 In Progress | Hardware validation & initial dataset |
| Phase 1 | 🔴 Not Started | Hardware assembly & firmware MVP |
| Phase 2 | 🔴 Not Started | Full automation (GitHub Actions + Bluesky) |
| Phase 3 | 🔴 Not Started | Optimization & production-ready |
| Phase 4 | 🔴 Not Started | Community growth & expansion |
Current Milestone: Validating ESP32-S3 ↔ Grove Vision AI V2 communication at 921,600 baud
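One way to sanity-check that milestone from a host machine is to read the Grove module's replies over a USB-serial adapter at the same baud rate. A minimal sketch, assuming the `pyserial` package and an illustrative JSON reply shape (the actual Grove Vision AI V2 wire protocol may differ):

```python
import json

BAUD = 921_600  # must match the rate configured on the ESP32-S3 side


def parse_detection(raw: str) -> list[str]:
    """Pull class labels out of one JSON reply line (assumed message shape)."""
    try:
        msg = json.loads(raw)
    except json.JSONDecodeError:
        return []
    return [d.get("label", "?") for d in msg.get("detections", [])]


def read_one_detection(port_name: str = "/dev/ttyUSB0") -> list[str]:
    """Read a single reply from the Grove module over a USB-serial adapter.

    Requires `pip install pyserial` and real hardware; not invoked here.
    """
    import serial

    with serial.Serial(port_name, BAUD, timeout=2) as port:
        line = port.readline().decode("utf-8", errors="replace").strip()
        return parse_detection(line)
```

If `parse_detection` returns a non-empty label list, the link is carrying intact frames at 921,600 baud; garbage or empty lines point to a baud-rate or wiring mismatch.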
- ESP32-S3 - Main processor with WiFi
- Grove Vision AI V2 - Camera with on-device AI inference
- 4 Buttons - Power, "Too Generic", "Incorrect", Spare
- Speaker - Audio feedback via text-to-speech
- Portable - Battery-powered, ~3-hour runtime per charge
1. User presses the Power button
↓
2. Device runs for 3 minutes, identifying objects every 5 seconds
↓
3. If the device is wrong or too generic, the user presses a button
↓
4. Device posts the image to Bluesky, asking the community for help
↓
5. Community replies with the correct identification
↓
6. GitHub Actions collects replies daily (2 AM Pacific)
↓
7. When 10+ images across 3+ categories are collected, the model retrains
↓
8. Devices auto-update at midnight (local time)
↓
9. Everyone's device is now smarter!
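The retrain gate in step 7 boils down to a small threshold check. A sketch with illustrative names (`should_retrain` and the label-list input are not the project's actual code):

```python
from collections import Counter

MIN_IMAGES = 10      # step 7: at least 10 community-confirmed images...
MIN_CATEGORIES = 3   # ...spread across at least 3 object categories


def should_retrain(labels: list[str]) -> bool:
    """labels holds one confirmed label per image gathered by the daily job."""
    counts = Counter(labels)
    return len(labels) >= MIN_IMAGES and len(counts) >= MIN_CATEGORIES
```

For example, 12 images of only 2 categories keeps collecting, while 12 images spread across 3 categories triggers a retrain; requiring category spread keeps the model from overfitting to a single object.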
Everything runs on free, public infrastructure:
- GitHub Actions - Automated training (2,000 min/month free)
- GitHub Releases - Model distribution (free CDN)
- Bluesky - Community feedback (free, decentralized protocol)
- ESP32 - Local inference (no cloud costs)
Total Cost: $0/month. Forever.
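The Bluesky half of this loop (step 4 above) could look like the following sketch using the `atproto` Python SDK; the helper names, post wording, and credentials handling are assumptions, not the project's actual integration:

```python
def help_request_text(guess: str, reason: str) -> str:
    """Compose the post text for an uncertain detection (wording illustrative)."""
    return (
        f"Help needed! The device guessed '{guess}' but the user marked it as "
        f"{reason}. What is this object? Replies train the next model."
    )


def post_help_request(jpeg_path: str, handle: str, app_password: str) -> None:
    """Post the captured image to Bluesky asking for an identification.

    Requires `pip install atproto`; credentials would come from device config
    or CI secrets, never hard-coded. Not invoked here.
    """
    from atproto import Client

    client = Client()
    client.login(handle, app_password)
    with open(jpeg_path, "rb") as f:
        client.send_image(
            text=help_request_text("cup", "too generic"),
            image=f.read(),
            image_alt="Photo captured by an OurIdentify device",
        )
```

Note the `image_alt` field: since the community includes blind users, every posted image should carry alt text describing what is known about the capture.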
- Independence - Identify items in your kitchen, bathroom, office without assistance
- Privacy - On-device processing; you control what gets shared
- Trust - Open source means no exploitation, no hidden agendas
- Ownership - Community-driven means it's built for you, by you
- Sustainable - Designed to outlive any single developer or organization
- Accessible - Built specifically with accessibility in mind, not retrofitted
- Transparent - Every decision, every line of code, every model update is public
- Empowering - Blind users directly improve the AI through natural interaction
- Novel Architecture - Reference implementation for sustainable, community-driven AI
- Zero-Cost ML Ops - Demonstrates how to run ML training/deployment on free infrastructure
- Ethical AI - Model where users have agency and control over their data
- Extensible - Easy to fork for other use cases (education, specialized workplaces, etc.)
| Document | Description |
|---|---|
| ouridentify_guide.md | Complete technical guide (architecture, hardware, workflows) |
| docs/SETUP.md | Hardware assembly and firmware flashing guide |
| docs/CONTRIBUTING.md | How to contribute to the project |
| docs/FIRMWARE.md | ESP32 firmware development guide |
| docs/ARCHITECTURE.md | System design details |
| docs/BLUESKY_WORKFLOW.md | How Bluesky integration works |
Total Cost Per Device: ~$65
| Component | Purpose | Cost |
|---|---|---|
| ESP32-S3-DevKitC-1 | Main processor | $5 |
| Grove Vision AI V2 | Camera + ML accelerator | $30 |
| 4 Buttons | User input | $5 |
| Speaker with Amp | Audio feedback | $5 |
| USB-C Cable | Power + data | $5 |
| Power Bank | Portable power | $15 |
See docs/SETUP.md for detailed assembly instructions.
- ESP32-S3 - Main MCU with WiFi/Bluetooth
- Grove Vision AI V2 - Himax WE2 chip with on-device inference
- UART @ 921,600 baud - Communication between ESP32 and Grove
- YOLOv8 Nano - Object detection model
- TensorFlow Lite - Inference on embedded devices
- Ethos-U Vela - Model optimization for Grove's NPU
- Arduino/ESP-IDF - ESP32 firmware framework
- GitHub Actions - Automated training and deployment
- GitHub Releases - Model distribution CDN
- Bluesky ATProto - Community feedback protocol
- Python 3.11+ - Training scripts
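The model path through this stack (YOLOv8 Nano → int8 TensorFlow Lite → Ethos-U Vela for the Grove's NPU) can be sketched as follows; the image size and accelerator configuration are illustrative assumptions, not validated settings from this repo:

```python
def vela_command(tflite_path: str, output_dir: str = "build") -> list[str]:
    """Build a Vela CLI invocation targeting an Ethos-U NPU.

    The accelerator config is an assumed value; check it against the
    Himax WE2's actual NPU variant before use.
    """
    return [
        "vela", tflite_path,
        "--accelerator-config", "ethos-u55-64",
        "--output-dir", output_dir,
    ]


def export_for_grove() -> None:
    """Export YOLOv8 Nano to int8 TFLite, then optimize it with Vela.

    Requires `pip install ultralytics ethos-u-vela`; not invoked here.
    """
    import subprocess

    from ultralytics import YOLO

    model = YOLO("yolov8n.pt")
    # int8 quantization shrinks the model enough for on-device inference
    tflite_path = model.export(format="tflite", int8=True, imgsz=192)
    subprocess.run(vela_command(str(tflite_path)), check=True)
```

Keeping the Vela step as a plain CLI invocation makes the same command easy to reproduce inside a GitHub Actions job.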
- Bluesky: @ouridentify.bsky.social - Community engagement
- GitHub Issues: Report bugs & request features
- GitHub Discussions: Architecture & design questions
- Email: your-email@example.com - Direct contact
We are committed to providing a welcoming and inclusive environment. Please read our Code of Conduct before contributing.
- ✅ Hardware validated
- ✅ UART communication working
- ✅ YOLOv8 exports to UF2
- ✅ ESP32 can flash models
- 🟡 50+ kitchen items trained
- 🟡 End-to-end automation working
- Medications and toiletries
- Safety-critical items (allergens, expiration dates)
- Builds on kitchen success
- Office supplies, documents, equipment
- Workplace accessibility
- Community-maintained forks for specialized use cases
- User-trained personal collections
- Multi-language support
This project is designed to run forever, even if all original developers leave.
- No Personal Servers - Everything runs on GitHub (free tier) and Bluesky (free protocol)
- Open Source - Anyone can fork, maintain, and improve
- Community-Owned - Blind users directly control model improvements
- Graceful Degradation - Devices work offline; updates optional
- Fork-Friendly - Modular architecture makes extending/adapting easy
If GitHub ever becomes unviable:
- Move to GitLab (similar free tier)
- All code/models are portable
- No vendor lock-in
- Community maintains independence
The goal: Build something that outlives us.
This project is open source under the GNU Affero General Public License v3.0 (AGPL-3.0).
What this means:
- ✅ Use it for any purpose
- ✅ Study and modify the source code
- ✅ Distribute copies and modifications
- ✅ Strong copyleft: Modifications must also be AGPL-3.0
- ✅ Network use = distribution: If you run a modified version on a server, you must make the source available
- ✅ Protects community ownership forever
- Built for and with the blind and low-vision community
- Thanks to Seeed Studio for Grove modules and documentation
- Thanks to Ultralytics for YOLOv8 and making ML accessible
- Thanks to Bluesky for the open ATProto protocol
- Thanks to the open source community for TensorFlow, Ethos-U, and all supporting tools
- Project Lead: [Your Name]
- Email: [your-email@example.com]
- Bluesky: @ouridentify.bsky.social
- GitHub: yourusername
"Our community, our tool, our independence"
Star this repo to support accessible AI · Share with anyone who might benefit · Join the conversation on Bluesky