SyntheticSense is an open-source assistive technology project designed to provide deaf-blind individuals with integrated spatial awareness and communication capabilities. This project combines AI-powered computer vision with haptic feedback in a wearable system that anyone can build, modify, and improve.
This project is freely available for everyone to use, study, modify, and distribute. We believe assistive technology should be:
- Accessible - No licensing fees or proprietary restrictions
- Collaborative - Built by and with the community it serves
- Educational - A learning platform for students and researchers
- Adaptable - Customizable for individual needs and preferences
No patents. No restrictions. Pure innovation for social good.
The system provides two essential features in a unified, wearable platform.

Spatial awareness and obstacle alerts:
- Sony IMX500 AI camera with embedded neural processing detects objects in real time
- Directional haptic feedback alerts users to obstacles (left, center, right positioning) - see the zone-mapping sketch after this list
- Edge AI processing ensures low-latency responses without cloud dependency
- Privacy-first design - all processing happens locally on the device
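As a rough illustration of how directional alerts can be derived from detections, the sketch below maps a bounding box's horizontal position in the camera frame to a left/center/right haptic zone. The function name, bounding-box format, and one-third zone boundaries are assumptions for illustration, not the project's actual detection pipeline.

```python
# Sketch: map a detection's horizontal position to a haptic zone.
# The 1/3 zone boundaries and bbox format are illustrative assumptions.

def haptic_zone(bbox, frame_width):
    """Return 'left', 'center', or 'right' for a box (x_min, y_min, x_max, y_max)."""
    x_center = (bbox[0] + bbox[2]) / 2
    position = x_center / frame_width   # 0.0 = far left, 1.0 = far right
    if position < 1 / 3:
        return "left"
    if position > 2 / 3:
        return "right"
    return "center"

# Example: an object detected on the right side of a 640-px-wide frame
print(haptic_zone((500, 120, 620, 300), 640))  # -> "right"
```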

Haptic Braille communication:
- Grid of vibration motors arranged to represent Braille characters
- Wireless messaging via MQTT/Wi-Fi for remote communication
- NumPy-based encoding converts text to haptic Braille sequences - see the encoding sketch after this list
- Customizable patterns - users can define their own tactile codes
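To make the NumPy-based encoding concrete, here is a minimal sketch that converts text into 3x2 Braille dot matrices a motor grid could play back. The dot patterns are the standard Braille cells for the letters shown; the function names and array layout are illustrative assumptions rather than the project's final encoding.

```python
# Sketch: encode text as 3x2 Braille dot matrices with NumPy.
# Each cell is a 3-row x 2-column array of 0/1 values; a 1 means the
# corresponding vibration motor pulses. Dot numbering follows standard
# Braille (1-2-3 down the left column, 4-5-6 down the right column).
import numpy as np

# Standard Braille dot patterns for a few letters (subset for illustration).
BRAILLE_DOTS = {
    "a": {1},
    "b": {1, 2},
    "c": {1, 4},
    "h": {1, 2, 5},
    "i": {2, 4},
}

# Dot number -> (row, column) in the 3x2 motor grid.
DOT_POSITIONS = {1: (0, 0), 2: (1, 0), 3: (2, 0),
                 4: (0, 1), 5: (1, 1), 6: (2, 1)}

def encode(text):
    """Return a list of 3x2 NumPy arrays, one per character."""
    cells = []
    for char in text.lower():
        cell = np.zeros((3, 2), dtype=np.uint8)
        for dot in BRAILLE_DOTS.get(char, set()):
            cell[DOT_POSITIONS[dot]] = 1
        cells.append(cell)
    return cells

for cell in encode("hi"):
    print(cell)
```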
Hardware Components:
- Primary Computing: Raspberry Pi Zero 2W (analyzes camera data, generates alerts)
- Haptic Control: Raspberry Pi Pico (drives vibration motor arrays with precise sequencing; see the MicroPython sketch after this list)
- Vision Processing: Sony IMX500 (provides on-sensor AI inference)
- Communication: MQTT/Wi-Fi (enables remote message transmission)
- Power: Modular, low-power design for extended battery operation
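For the Pico side of the haptic control, a minimal MicroPython sketch like the one below can pulse a single vibration motor with PWM. The GPIO pin, PWM frequency, and intensity values are placeholder assumptions and would depend on the actual motor driver wiring.

```python
# MicroPython sketch for the Raspberry Pi Pico: pulse one vibration motor.
# GPIO pin, PWM frequency, and duty cycle are illustrative assumptions.
from machine import Pin, PWM
import time

motor = PWM(Pin(15))       # motor driver input on GPIO 15 (assumed wiring)
motor.freq(1000)           # 1 kHz PWM carrier

def pulse(intensity=0.8, duration_ms=200):
    """Vibrate at the given intensity (0.0-1.0) for duration_ms."""
    motor.duty_u16(int(intensity * 65535))
    time.sleep_ms(duration_ms)
    motor.duty_u16(0)      # stop the motor

pulse()  # single short buzz
```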
Software Stack:
- Python 3.x for main control logic
- NumPy for message encoding/storage
- OpenCV/IMX500 SDK for vision processing
- MicroPython for Pico firmware
- MQTT client for wireless communication
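As an example of the wireless messaging path, the snippet below sends a text message over MQTT using paho-mqtt's one-shot publish helper; the broker address and topic name are assumptions for illustration, not the project's documented configuration.

```python
# Sketch: send a remote text message over MQTT (paho-mqtt one-shot helper).
# Broker address and topic name are illustrative assumptions.
import paho.mqtt.publish as publish

publish.single(
    topic="syntheticsense/message",        # hypothetical topic name
    payload="meet me at the front door",
    hostname="192.168.1.10",               # local broker on the home network (assumed)
    qos=1,
)
```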
Design Philosophy:
- Low-power edge computing
- Privacy-preserving (no cloud required)
- Modular architecture (easy to modify/extend)
- Affordable components (~$150-200 BOM)
- Well-documented for learning
SyntheticSense addresses a critical gap in assistive technology. Existing devices typically focus on either:
- Navigation (ultrasonic canes, guide systems), or
- Communication (electronic Braille displays)
SyntheticSense integrates both in a single wearable platform, enabling:
- Safer navigation with real-time obstacle alerts
- Tactile communication without external displays
- Independence through combined awareness and messaging
- Affordable, hackable design anyone can build

Use cases:
- Assistive navigation for deaf-blind individuals in daily environments
- Emergency communication systems for sensory impairments
- Augmented spatial awareness for rehabilitation and training
- Edge AI demonstrations for computer science students
- Haptic interface studies in HCI research
- Accessibility technology curriculum development
- Multi-modal feedback system experiments
- Virtual reality accessibility interfaces
- Industrial safety alert systems
- Silent communication in specialized environments
- Prototype platform for custom assistive devices
# Hardware (estimated cost: $150-200)
- Raspberry Pi Zero 2W
- Raspberry Pi Pico
- Sony IMX500 AI Camera
- Vibration motors (6-8x)
- Motor driver board
- Battery pack
- Miscellaneous wiring/components
# Software
- Python 3.8+
- NumPy
- OpenCV (or IMX500 SDK)
- MQTT client library

# Installation

```bash
# Clone the repository
git clone https://github.com/creativeacttech/SyntheticSense.git
cd SyntheticSense

# Install dependencies
pip install -r requirements.txt

# Follow setup guide in /docs
```

Note: Detailed setup instructions, hardware assembly guides, and code documentation coming soon!
This project is in active development. Current status:
Completed:
- System architecture design
- Component selection and testing
- Conceptual diagrams and documentation
- NumPy-based Braille encoding system
In Progress:
- Hardware integration testing
- Python control software
- Haptic feedback calibration
- User interface refinement
Planned:
- Full code release (Q2 2026)
- Assembly instructions with photos
- Video demonstrations
We welcome contributions from everyone, whether you're:
- An accessibility advocate with user insights
- A hardware engineer with design improvements
- A software developer with code contributions
- A researcher with testing data
- Someone who wants to build and test the system

Ways to contribute:
- Report Issues: Found a bug or have a suggestion? Open an issue
- Improve Documentation: Help make this more accessible to builders
- Code Contributions: Submit pull requests with improvements
- Test & Provide Feedback: Build the system and share your experience
- Spread the Word: Share with accessibility communities and researchers

Contribution guidelines:
- Check existing issues before creating new ones
- Document your changes clearly
- Test your contributions when possible
- Be respectful and inclusive
This project is ideal for:
- University courses in assistive technology, embedded systems, or AI
- Hackathons focused on accessibility or social good
- Capstone projects for engineering students
- Research platforms for haptic interface studies
- Maker spaces and accessibility workshops
Educators: Feel free to use and adapt this project for your curriculum. Contact us if you'd like educational support materials.

Documentation:
- Hardware Bill of Materials - Component list with suppliers (coming soon)
- Assembly Guide - Step-by-step build instructions (coming soon)
- Software Setup - Installation and configuration (coming soon)
- API Reference - Code documentation (coming soon)
- Discussions: GitHub Discussions
- Issues: Report bugs or request features

Acknowledgments:
- Inspired by the deaf-blind community's resilience and feedback
- Built with open-source tools and libraries
- Special thanks to accessibility advocates
If you use SyntheticSense in your research or project, please cite:
```bibtex
@software{syntheticsense2025,
  author = {Gonzalez, Julian Anival},
  title  = {SyntheticSense: Open-Source Haptic Navigation and Communication System},
  year   = {2025},
  url    = {https://github.com/creativeacttech/SyntheticSense}
}
```

This project is licensed under the Apache License 2.0 - see the LICENSE file for details.
What this means:
- Free to use, modify, and distribute
- Commercial use allowed
- Patent grant included
- Changes must be documented
- Original attribution required
No patents held or claimed on this design. This is intentionally open for everyone to use and improve.

Roadmap:
- Complete Python control software
- Hardware assembly guide with photos
- Basic haptic Braille library
- MQTT communication implementation
- Battery optimization
- Machine learning-enhanced object detection
- Customizable haptic patterns
- Mobile app for remote messaging
- Multi-language Braille support
- Voice-to-haptic conversion
- Integration with smart home systems
- Fall detection and emergency alerts
- Location tracking for caregivers
- Your ideas here!
This is an assistive aid, not a replacement for:
- Mobility training
- White canes or guide animals
- Professional orientation services
- Medical devices
Always prioritize safety and consult with accessibility professionals when implementing assistive technology.
This is a research concept and educational platform. While we strive for quality and safety:
- System reliability needs extensive real-world testing
- Users should conduct their own safety assessments
- No warranties or guarantees are provided
- Community feedback is essential for improvement
Assistive technology should be open, accessible, and community-driven.
Whether you're building this for yourself, contributing code, sharing feedback, or simply spreading awareness - you're helping make technology more inclusive for everyone.
Star this repository ⭐ to follow our progress and show your support for open assistive technology!
This diagram illustrates a wearable haptic interface that helps users detect obstacles through vibration. When the AI camera spots an object, the Raspberry Pi Zero 2W activates the motors on the side closest to the object, vibrating to alert the user.
This is the core configuration; additional features and configurations can be added in future iterations.
This diagram presents a multi-directional haptic navigation interface designed to assist deaf-blind individuals with basic spatial awareness. When an object is detected, specific motors vibrate in directional patterns (forward, back, left, right, or multi-directional), alerting the user to the obstacle's location. The tactile feedback helps guide navigation without relying on sight or sound.
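One simple way to realize the directional patterns the diagram describes is a lookup table from detected direction to the set of motors that fire; the motor indices below are placeholder assumptions, not a documented pin map.

```python
# Sketch: map a detected direction to the motors that should vibrate.
# Motor indices are placeholder assumptions; the real layout depends on
# where the motors sit on the wearable.
DIRECTION_PATTERNS = {
    "forward":       [0],      # front motor
    "back":          [3],      # rear motor
    "left":          [1],
    "right":         [2],
    "forward-left":  [0, 1],   # multi-directional: fire both motors
    "forward-right": [0, 2],
}

def motors_for(direction):
    """Return the motor indices to activate for a detected direction."""
    return DIRECTION_PATTERNS.get(direction, [])

print(motors_for("forward-left"))  # -> [0, 1]
```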
These designs can be modified as needed!
Built with ❤️ for the accessibility community
Made by Julian Anival Gonzalez | IBM Champion 2025


