AccessiComm

AccessiComm is a comprehensive Android application designed to enable seamless communication among deaf, blind, and speech-impaired individuals, and between them and the general public. The app leverages machine learning, computer vision, and multimodal interaction to break down communication barriers.

🎯 Mission

To create an inclusive communication platform serving the 466M+ people with disabling hearing loss and 285M+ people with visual impairment worldwide, establishing AccessiComm as the leading accessibility communication tool.

✨ Key Features

🔤 Sign Language Translation

  • Real-time sign language recognition using advanced ML models
  • Text/Speech to Sign Language animation with 3D avatars
  • Interactive learning mode with accuracy feedback
  • Support for multiple sign languages (ASL, ISL, BSL)
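
As a rough illustration of how real-time recognition could map onto the TensorFlow Lite runtime listed in the tech stack, the sketch below classifies a single frame's landmark vector. The model file, input shape, and label list are placeholders, not the app's actual assets.

import org.tensorflow.lite.Interpreter
import java.io.File

// Illustrative only: classify one frame's hand-landmark vector with a
// TensorFlow Lite model. Model file, input size, and labels are placeholders.
class SignClassifier(modelFile: File, private val labels: List<String>) {
    private val interpreter = Interpreter(modelFile)

    fun classify(landmarks: FloatArray): String {
        val input = arrayOf(landmarks)                // [1, N] input tensor
        val output = arrayOf(FloatArray(labels.size)) // [1, labels.size] scores
        interpreter.run(input, output)
        val best = output[0].indices.maxByOrNull { output[0][it] } ?: 0
        return labels[best]
    }
}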

👁️ Visual Assistance

  • Object & Scene Recognition with spatial audio feedback
  • OCR Text Reading for documents, menus, and signs
  • Face Recognition for familiar people
  • AI-powered scene description with natural language
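
A minimal sketch of how object recognition with spoken feedback might be wired together, assuming the TensorFlow Lite Task Library and Android's TextToSpeech; the model path under assets/ is illustrative.

import android.content.Context
import android.graphics.Bitmap
import android.speech.tts.TextToSpeech
import org.tensorflow.lite.support.image.TensorImage
import org.tensorflow.lite.task.vision.detector.ObjectDetector

// Illustrative only: detect objects in a camera frame and speak their labels.
class SceneAnnouncer(context: Context) {
    private val detector =
        ObjectDetector.createFromFile(context, "ml_models/object_detector.tflite")
    private val tts = TextToSpeech(context) { /* init status ignored in this sketch */ }

    fun announce(frame: Bitmap) {
        val detections = detector.detect(TensorImage.fromBitmap(frame))
        detections.forEach { detection ->
            val label = detection.categories.firstOrNull()?.label ?: return@forEach
            tts.speak(label, TextToSpeech.QUEUE_ADD, null, label)
        }
    }
}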

💬 Multi-Modal Communication

  • Text, Voice, Sign Video, and Image messaging
  • Real-time video calls with live translation
  • Quick communication mode with preset phrases
  • Group communication with mixed accessibility needs
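
One possible shape for a multi-modal message stored in Firestore, covering the text, voice, sign-video, and image types above; the collection layout and field names are assumptions, not the app's actual schema.

import com.google.firebase.firestore.ktx.firestore
import com.google.firebase.ktx.Firebase

// Illustrative only: one possible document shape for multi-modal messages.
enum class MessageType { TEXT, VOICE, SIGN_VIDEO, IMAGE }

data class Message(
    val senderId: String = "",
    val type: MessageType = MessageType.TEXT,
    val body: String = "",                        // text, or a Storage URL for media
    val sentAt: Long = System.currentTimeMillis()
)

fun sendMessage(conversationId: String, message: Message) {
    Firebase.firestore
        .collection("conversations")
        .document(conversationId)
        .collection("messages")
        .add(message)
}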

🚨 Emergency Features

  • SOS Emergency Alert with location sharing
  • Medical ID & Information display
  • Emergency contact management
  • Quick emergency phrases
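
A sketch of an SOS alert that attaches the last known location, assuming Google Play services' fused location provider and Firestore; the collection name is illustrative and the location permission is assumed to be granted already.

import android.annotation.SuppressLint
import android.content.Context
import com.google.android.gms.location.LocationServices
import com.google.firebase.firestore.ktx.firestore
import com.google.firebase.ktx.Firebase

// Illustrative only: publish an SOS alert with the last known location.
// Assumes ACCESS_FINE_LOCATION has already been granted at runtime.
@SuppressLint("MissingPermission")
fun sendSosAlert(context: Context, userId: String) {
    LocationServices.getFusedLocationProviderClient(context)
        .lastLocation
        .addOnSuccessListener { location ->
            Firebase.firestore.collection("sos_alerts").add(
                mapOf(
                    "userId" to userId,
                    "lat" to location?.latitude,
                    "lng" to location?.longitude,
                    "sentAt" to System.currentTimeMillis()
                )
            )
        }
}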

♿ Accessibility Features

  • WCAG 2.1 Level AA compliance
  • High contrast mode and large text support
  • Screen reader compatibility (TalkBack)
  • Voice commands for all features
  • Customizable haptic feedback
  • Offline mode for critical situations
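
Screen-reader support in Compose largely comes down to semantics; the sketch below labels a button for TalkBack. The wording and callback are illustrative, not the app's actual code.

import androidx.compose.material3.Button
import androidx.compose.material3.Text
import androidx.compose.runtime.Composable
import androidx.compose.ui.Modifier
import androidx.compose.ui.semantics.contentDescription
import androidx.compose.ui.semantics.semantics

// Illustrative only: an action button labelled for TalkBack via Compose semantics.
@Composable
fun SosButton(onSos: () -> Unit) {
    Button(
        onClick = onSos,
        modifier = Modifier.semantics {
            contentDescription = "Send emergency SOS alert"
        }
    ) {
        Text("SOS")
    }
}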

🏗️ Architecture

Tech Stack

  • UI Framework: Jetpack Compose with Material 3
  • Architecture: MVVM with Clean Architecture
  • Dependency Injection: Hilt
  • Database: Room (SQLite)
  • Networking: Retrofit + OkHttp
  • ML/AI: TensorFlow Lite, MediaPipe
  • Backend: Firebase (Auth, Firestore, Storage, Messaging)
  • Navigation: Navigation Compose
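
A minimal sketch of how the MVVM and Hilt pieces above fit together: a repository interface injected into a ViewModel. The names are illustrative, not the app's actual classes.

import androidx.lifecycle.ViewModel
import androidx.lifecycle.viewModelScope
import dagger.hilt.android.lifecycle.HiltViewModel
import kotlinx.coroutines.launch
import javax.inject.Inject

// Illustrative only: a repository abstraction injected into a ViewModel with Hilt.
interface MessageRepository {
    suspend fun send(text: String)
}

@HiltViewModel
class ChatViewModel @Inject constructor(
    private val repository: MessageRepository
) : ViewModel() {
    fun sendMessage(text: String) {
        viewModelScope.launch { repository.send(text) }
    }
}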

Project Structure

app/
├── src/main/java/com/accessicomm/app/
│   ├── data/                    # Data layer
│   │   ├── local/              # Room database
│   │   ├── model/              # Data models
│   │   └── repository/         # Repository implementations
│   ├── di/                     # Dependency injection
│   ├── ui/                     # UI layer
│   │   ├── components/         # Reusable UI components
│   │   ├── navigation/         # Navigation setup
│   │   ├── screens/            # Screen composables
│   │   └── theme/              # App theming
│   └── service/                # Background services

🚀 Getting Started

Prerequisites

  • Android Studio Hedgehog or later
  • Android SDK 26+ (Android 8.0+)
  • Kotlin 2.0.21+
  • Gradle 8.13.0+

Installation

  1. Clone the repository
  2. Open in Android Studio
  3. Sync project with Gradle files
  4. Run on device or emulator

Configuration

  1. Add Firebase configuration (google-services.json)
  2. Configure ML model files in assets/ml_models/
  3. Set up API keys in local.properties
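
As a sketch of step 3, an app/build.gradle.kts fragment can surface a key from local.properties as a BuildConfig field; the property name TRANSLATION_API_KEY is hypothetical.

// Illustrative only: fragment of app/build.gradle.kts.
import java.util.Properties

val localProps = Properties().apply {
    val file = rootProject.file("local.properties")
    if (file.exists()) file.inputStream().use { load(it) }
}

android {
    buildFeatures { buildConfig = true }
    defaultConfig {
        buildConfigField(
            "String",
            "TRANSLATION_API_KEY",
            "\"${localProps.getProperty("TRANSLATION_API_KEY", "")}\""
        )
    }
}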

📱 User Personas

Sarah - Deaf Sign Language User

  • Uses ASL for communication
  • Needs real-time translation for hearing individuals
  • Requires visual alerts for sounds

Rajesh - Blind Technology User

  • Uses screen readers and assistive technology
  • Needs object detection and scene description
  • Requires OCR for reading documents

Maya - Speech-Impaired User

  • Uses text-to-speech communication
  • Needs quick phrase libraries
  • Requires emotion expression in messages

James - Hearing/Sighted Caregiver

  • Learning sign language basics
  • Facilitating communication for family members
  • Managing emergency contacts and medical info

🎨 Design Principles

  • Accessibility First: Every screen usable by all disability types
  • Minimal Complexity: Maximum 3 taps to reach any feature
  • Clear Feedback: Visual + Audio + Haptic for all actions
  • Consistency: Unified design language across all screens
  • Forgiveness: Easy undo, clear confirmations for destructive actions

🔧 Development

Building

./gradlew assembleDebug

Testing

./gradlew test
./gradlew connectedAndroidTest

Linting

./gradlew lintDebug

📊 Performance Targets

  • App launch: <3 seconds (cold start)
  • Feature activation: <1 second
  • ML inference: <500ms per frame
  • Message delivery: <2 seconds
  • RAM usage: <400MB during active use
  • Battery drain: <20% per hour of active use

🔒 Security & Privacy

  • End-to-end encryption for messages
  • Local processing for camera/ML (no image upload by default)
  • Encrypted storage for sensitive data
  • GDPR and CCPA compliant
  • Granular permissions with clear explanations
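
A minimal sketch of encrypted local storage with Jetpack Security (androidx.security:security-crypto), one way to back the "encrypted storage" point above; file and key names are illustrative.

import android.content.Context
import android.content.SharedPreferences
import androidx.security.crypto.EncryptedSharedPreferences
import androidx.security.crypto.MasterKey

// Illustrative only: an encrypted SharedPreferences file for sensitive data
// such as the medical ID. File name is a placeholder.
fun securePrefs(context: Context): SharedPreferences =
    EncryptedSharedPreferences.create(
        context,
        "medical_id_prefs",
        MasterKey.Builder(context)
            .setKeyScheme(MasterKey.KeyScheme.AES256_GCM)
            .build(),
        EncryptedSharedPreferences.PrefKeyEncryptionScheme.AES256_SIV,
        EncryptedSharedPreferences.PrefValueEncryptionScheme.AES256_GCM
    )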

🌍 Internationalization

  • Phase 1: English, Spanish, Hindi, Mandarin
  • Phase 2: +10 additional languages
  • Sign languages: ASL, ISL, BSL (phased rollout)

📈 Success Metrics

  • 50,000+ downloads within 6 months
  • 4.5+ star rating on Google Play Store
  • 60%+ user retention at 30 days
  • Featured in Google Play's accessibility collection
  • Partnership with 3+ major accessibility organizations

🤝 Contributing

We welcome contributions! Please see our Contributing Guidelines for details.

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

📞 Support

🙏 Acknowledgments

  • Google's accessibility team for guidance
  • The deaf and blind communities for feedback
  • Open source ML model contributors
  • Accessibility advocates and organizations

AccessiComm - Breaking down communication barriers with AI-powered accessibility.
