An AI-based app for visually impaired people that helps with object detection, face recognition, and currency recognition.


Overview

A-EYE provides a platform where visually impaired users can carry out their daily tasks without depending on their sighted friends or family members. The app's voice instructions help users navigate to different parts of the app without difficulty. A-EYE provides three main features: Object Detection, Face Recognition, and Currency Recognition (Pakistani currency). Apart from these, the app also has an Emergency Shake feature, which sends the user's location to their emergency contacts when the user shakes their phone.

Key Features

  • Multi-platform Support for Android, iOS, Windows, Mac, Linux.
  • Object Detection.
  • Face Recognition.
  • Currency Recognition.
  • Voice Instructions.
  • Emergency Shake.

(Important) Initial setup: Add dynamic libraries to your app

Android

  1. Place the script install.sh (Linux/Mac) or install.bat (Windows) at the root of your project.

  2. Execute sh install.sh (Linux/Mac) or install.bat (Windows) at the root of your project to automatically download and place the binaries in the appropriate folders.

    Note: The binaries installed will not include support for GpuDelegateV2 and NnApiDelegate; however, InterpreterOptions().useNnApiForAndroid can still be used.

  3. Use sh install.sh -d (Linux/Mac) or install.bat -d (Windows) instead if you wish to use GpuDelegateV2 and NnApiDelegate.

These scripts install pre-built binaries based on the latest stable TensorFlow release. For information about using other TensorFlow versions, follow the instructions in the wiki.
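
For reference, here is a minimal Dart sketch of how the installed binaries are then consumed through the tflite_flutter plugin, using the InterpreterOptions().useNnApiForAndroid flag mentioned in the note above. The asset path and helper name are placeholders, not files from this repository.

```dart
import 'package:tflite_flutter/tflite_flutter.dart';

// Hypothetical helper; 'assets/model.tflite' is a placeholder asset path
// (asset path conventions differ slightly between plugin versions).
Future<Interpreter> loadInterpreter() async {
  final options = InterpreterOptions();

  // Available even with the default (non -d) binaries, as noted above.
  options.useNnApiForAndroid = true;

  // With binaries installed via `install.sh -d` / `install.bat -d`,
  // a GPU delegate could be added as well:
  // options.addDelegate(GpuDelegateV2());

  return Interpreter.fromAsset('assets/model.tflite', options: options);
}
```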

iOS

  1. Download TensorFlowLiteC.framework. To build a custom version of TensorFlow, follow the instructions in the wiki.
  2. Place TensorFlowLiteC.framework in the pub-cache folder of this package.

Pub-Cache folder location:

  • ~/.pub-cache/hosted/pub.dartlang.org/tflite_flutter-<plugin-version>/ios/ (Linux/ Mac)
  • %LOCALAPPDATA%\Pub\Cache\hosted\pub.dartlang.org\tflite_flutter-<plugin-version>\ios\ (Windows)

Examples

Object Detection

The first screen that appears when the app opens is the Object Detection screen. Voice instructions tell the visually impaired user which screen they are on and how to navigate to another screen, and they also explain how to activate the object detection model.

View

When the user taps on the Object Detection screen, a camera view opens, the app starts detecting the objects in front of the camera, and it speaks the results out loud to the visually impaired user.
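
A simplified sketch of what one detection pass could look like with tflite_flutter, assuming an SSD-MobileNet style model with four output tensors (boxes, class indices, scores, detection count) and the flutter_tts package for speech; the actual model, tensor shapes, and score threshold used by the app may differ.

```dart
import 'package:flutter_tts/flutter_tts.dart';
import 'package:tflite_flutter/tflite_flutter.dart';

Future<void> detectAndSpeak(
    Interpreter interpreter, List<String> labels, Object inputImage) async {
  const maxDetections = 10; // assumed maximum number of detections

  // Output buffers shaped like the assumed model's output tensors.
  final boxes = List.generate(
      1, (_) => List.generate(maxDetections, (_) => List.filled(4, 0.0)));
  final classes = List.generate(1, (_) => List.filled(maxDetections, 0.0));
  final scores = List.generate(1, (_) => List.filled(maxDetections, 0.0));
  final count = List.filled(1, 0.0);

  interpreter.runForMultipleInputs(
    [inputImage],
    {0: boxes, 1: classes, 2: scores, 3: count},
  );

  // Collect the labels of confident detections and speak them out loud.
  final found = <String>[];
  for (var i = 0; i < maxDetections; i++) {
    final idx = classes[0][i].toInt();
    if (scores[0][i] > 0.5 && idx < labels.length) {
      found.add(labels[idx]);
    }
  }
  if (found.isNotEmpty) {
    await FlutterTts().speak("I can see ${found.join(', ')}");
  }
}
```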

Currency Recognition

The second screen in the app is the Currency Recognition screen. As on the other screens, voice instructions tell the visually impaired user which screen they are on, how to navigate to another screen, and how to activate the currency recognition model.

View

When the user taps on the Currency Recognition screen, a camera view opens and the app asks the user to tap anywhere on the screen to take a picture of the currency note. Once the picture is taken, the app returns to the home screen with the result and speaks it out loud.
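
The "tap anywhere" interaction could look roughly like the following Flutter sketch, assuming the camera package; the widget and callback names are made up for illustration, and the captured photo would then be passed to the currency model.

```dart
import 'package:camera/camera.dart';
import 'package:flutter/material.dart';

// Hypothetical widget: the whole camera preview is wrapped in a
// GestureDetector so the user can tap anywhere to capture the note.
class CurrencyCaptureView extends StatelessWidget {
  const CurrencyCaptureView(
      {super.key, required this.controller, required this.onCaptured});

  final CameraController controller;           // initialised elsewhere
  final void Function(XFile photo) onCaptured; // hands the photo to the model

  @override
  Widget build(BuildContext context) {
    return GestureDetector(
      // No small buttons to locate: any tap on the preview takes the picture.
      onTap: () async {
        final XFile photo = await controller.takePicture();
        onCaptured(photo);
      },
      child: CameraPreview(controller),
    );
  }
}
```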

Face Recognition

The third and final screen in the app is the Face Recognition screen. Again, voice instructions tell the visually impaired user which screen they are on, how to navigate to another screen, and how to activate the face recognition model.

View

When the user taps on the Face Recognition screen, a camera view opens, the app starts recognizing the faces in front of the camera, and it speaks the results out loud to the visually impaired user.
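
Face recognition on TFLite is commonly done by comparing face embeddings produced by a model such as FaceNet or MobileFaceNet. Below is a small, hedged sketch of that matching step; the distance threshold and the assumption that the app works this way are mine, not taken from the repository.

```dart
import 'dart:math' as math;

// Returns the name of the closest known face, or null if nothing is
// close enough. All embeddings are assumed to have the same length.
String? matchFace(List<double> embedding, Map<String, List<double>> knownFaces,
    {double threshold = 1.0}) {
  String? bestName;
  var bestDistance = double.infinity;

  knownFaces.forEach((name, known) {
    // Euclidean distance between the live embedding and a stored one.
    var sum = 0.0;
    for (var i = 0; i < embedding.length; i++) {
      final d = embedding[i] - known[i];
      sum += d * d;
    }
    final distance = math.sqrt(sum);
    if (distance < bestDistance) {
      bestDistance = distance;
      bestName = name;
    }
  });

  // Below the (assumed) threshold we treat it as the same person.
  return bestDistance < threshold ? bestName : null;
}
```

The matched name would then be announced through the same text-to-speech flow used by the other screens.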

Emergency Shake

The app stores the user's emergency contacts. In an emergency, the user can shake their phone and the app will send the current time and the user's location to those contacts.

The messaging feature is triggered by shake detection so that a visually impaired person only has to shake their phone to send a text, instead of typing a message manually.
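
A rough sketch of a send-on-shake handler, assuming the geolocator and url_launcher packages; this version opens the SMS composer pre-filled with the message, whereas the app may send the text directly. The contact number and message wording are placeholders.

```dart
import 'package:geolocator/geolocator.dart';
import 'package:url_launcher/url_launcher.dart';

// Called from a shake listener (e.g. an accelerometer-magnitude check or a
// shake-detection plugin); `contactNumber` is a placeholder value.
Future<void> sendEmergencyMessage(String contactNumber) async {
  // Make sure location permission has been granted.
  var permission = await Geolocator.checkPermission();
  if (permission == LocationPermission.denied) {
    permission = await Geolocator.requestPermission();
  }

  final Position position = await Geolocator.getCurrentPosition();
  final String body = Uri.encodeComponent(
    'Emergency! Time: ${DateTime.now()} '
    'Location: https://maps.google.com/?q=${position.latitude},${position.longitude}',
  );

  // Open the SMS app pre-filled with the emergency message.
  final Uri smsUri = Uri.parse('sms:$contactNumber?body=$body');
  if (await canLaunchUrl(smsUri)) {
    await launchUrl(smsUri);
  }
}
```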
