This is a project to aid blind and visually impaired people. It uses Deep Learning with no GPU computation: the model is designed to run on lightweight and mobile devices with limited compute power, and it reached 95% accuracy in our tests.
The goal is to detect a human hand while it reads Braille text on an A4 sheet, so that the computer can track the motion of the index fingers of both hands without confusing the two fingers.
I implemented MobileNet (a lightweight convolutional network architecture) on the Apple M1 chip without using any GPU cores. This project is a proof of concept (PoC) to enable the development of CPU-driven Deep Learning for mobile computation. Some salient features are as follows:
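As a minimal sketch of the CPU-only approach described above, the snippet below hides all GPU devices from TensorFlow and runs a MobileNetV2 forward pass on the CPU. The exact model and weights used by the project are not specified here, so `weights=None` is an assumption to keep the example self-contained; the real project would load pretrained or fine-tuned weights.

```python
import numpy as np
import tensorflow as tf

# Hide any GPU devices so inference runs entirely on CPU cores
# (on Apple M1 this exercises the CPU path of TensorFlow).
tf.config.set_visible_devices([], "GPU")

# MobileNetV2 backbone; weights=None keeps this sketch offline --
# the actual project would load its trained weights instead.
model = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), weights=None
)

# One dummy frame stands in for a camera image of the Braille sheet.
frame = np.random.rand(1, 224, 224, 3).astype("float32")
probs = model.predict(frame, verbose=0)
print(probs.shape)  # (1, 1000) -- one score per ImageNet class
```

Because no device placement is requested beyond hiding the GPUs, the same script runs unchanged on any machine with TensorFlow installed.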
- Non-GPU-driven processing.
- The model cleanly disambiguates the index fingers of the two hands.
- Built on the cutting-edge open-source MobileNet and DeepLabCut models.
- Fast computation with better results.
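The disambiguation of the two index fingers mentioned above can be illustrated with a simple tracking step. The function below is a hypothetical sketch (not the project's actual code): it relabels the two fingertip detections in the current frame so each one stays with the track it is nearest to, which prevents left/right labels from swapping between frames.

```python
import numpy as np

def assign_fingers(prev, curr):
    """prev, curr: dicts {'left': (x, y), 'right': (x, y)} of fingertip
    coordinates. Returns curr relabeled so that each detection keeps
    the identity of the nearest track from the previous frame."""
    a, b = np.array(curr["left"]), np.array(curr["right"])
    pl, pr = np.array(prev["left"]), np.array(prev["right"])
    # Total displacement if we keep the labels vs. if we swap them.
    keep = np.linalg.norm(a - pl) + np.linalg.norm(b - pr)
    swap = np.linalg.norm(a - pr) + np.linalg.norm(b - pl)
    if swap < keep:
        return {"left": curr["right"], "right": curr["left"]}
    return curr

prev = {"left": (100, 200), "right": (400, 210)}
# Detector output with the labels accidentally flipped:
curr = {"left": (405, 212), "right": (102, 198)}
print(assign_fingers(prev, curr))
# → {'left': (102, 198), 'right': (405, 212)}  (labels corrected)
```

With only two keypoints, comparing the "keep" and "swap" costs is equivalent to an optimal assignment; more keypoints would call for the Hungarian algorithm.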
The following are the requirements of this project:
- Python 3.7
- CPU cores (Apple M1 silicon was used in this project)
- Tensorflow
- Two cameras for a 3D (stereo) video setup (smartphones work)
- Pose estimation toolbox (DeepLabCut)
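A possible environment setup following the requirements above is sketched below. The package names are assumptions based on the listed dependencies; exact versions depend on your platform (on Apple Silicon, `tensorflow-macos` replaces `tensorflow`).

```shell
# Hypothetical setup for the listed requirements (Python 3.7,
# TensorFlow, DeepLabCut); adjust package names to your platform.
python3.7 -m venv braille-env
source braille-env/bin/activate
pip install tensorflow deeplabcut opencv-python
```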
To install this project you can directly download this repo or use git clone. Here we go:
[1] From the Download button: click Clone or download, then Download ZIP. The download should start.
[2] From the command line: git clone https://github.com/priyank001/Braille_Pose_Estimation
This project was done under the supervision of Prof. Reilly, Executive Vice-Dean of MIEC, Fuzhou University. Thanks!