The project aims to embed a Deep Neural Network that recognizes characters drawn on a touchscreen into a low-power board, specifically the LandTiger NXP LPC1768, based on an ARM Cortex-M3 microcontroller.
- Github link: This repo
- Google Colab link: Google colab notebook
- Mbed repo: Mbed repo
- DNN Accuracy > 95%
- Low degree of redundancy, small DNN size
- Develop with reliability testing in mind
- A pixel flip during transfer counts as a fault
- How many faults can the DNN tolerate?
- Noise injection will be performed
- Occlusion learning (training with partially occluded inputs) has to be considered
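The fault model above (pixel bit flips during transfer) could be exercised with a small injection helper like the following. This is a minimal sketch for the reliability tests, assuming images are stored as `uint8` buffers; the function name and interface are hypothetical, not part of the project code:

```python
import numpy as np

def flip_pixel_bits(image, n_flips, rng):
    """Inject transfer faults by flipping n_flips random bits in a uint8 image."""
    faulty = image.copy()
    flat = faulty.reshape(-1)          # flat view into the copied buffer
    for _ in range(n_flips):
        idx = rng.integers(flat.size)  # which pixel
        bit = rng.integers(8)          # which bit within the byte
        flat[idx] ^= np.uint8(1 << bit)
    return faulty
```

Running the DNN on many such faulty images and counting mispredictions gives a first estimate of how many flips it can tolerate.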
- Retrieve character from the touchscreen
- Provide the user with a mechanism to confirm the input (low priority)
- If the user confirms, the character is compressed and made ready to be sent via the USB/RS-232 port (USB preferred, if feasible)
- The character is sent to the DNN running on a PC
- The DNN makes a prediction, which is displayed on the touchscreen
- The user can then give feedback on the correctness of the predicted character
- Decide how to display a character that passed all steps
- Provide the user with a mechanism to stop the input phase (stop: y/n)
- `y ? stop : goto 1`
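The compression step in the pipeline above is not specified; for a binary touchscreen bitmap, a simple candidate is run-length encoding. The sketch below is an illustration only (the function names and the `(value, run)` pair format are assumptions, not the project's actual wire protocol):

```python
def rle_encode(bits):
    """Run-length encode a flat sequence of 0/1 pixels as (value, run) pairs."""
    if not bits:
        return []
    runs = []
    current, count = bits[0], 1
    for b in bits[1:]:
        if b == current:
            count += 1
        else:
            runs.append((current, count))
            current, count = b, 1
    runs.append((current, count))
    return runs

def rle_decode(runs):
    """Expand (value, run) pairs back into the flat pixel sequence."""
    out = []
    for value, count in runs:
        out.extend([value] * count)
    return out
```

Hand-drawn characters are mostly background, so runs are long and the encoded payload sent over USB/RS-232 is typically much smaller than the raw bitmap.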
- Since no PC is needed, the serial connection is no longer required
- The DNN has to be reduced in size in order to embed it on the board
- With the extra material, try to expand on the concept of 'reducing the DNN'
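One common way to "reduce the DNN" for an embedded target is post-training quantization of the weights from float32 to int8, which cuts storage roughly 4x. A minimal sketch of symmetric per-tensor quantization (helper names are hypothetical, not the project's actual pipeline):

```python
import numpy as np

def quantize_int8(weights):
    """Symmetric per-tensor int8 quantization: w ~= scale * q."""
    scale = np.abs(weights).max() / 127.0
    if scale == 0:
        return np.zeros_like(weights, dtype=np.int8), 1.0
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover an approximate float32 tensor from int8 values and a scale."""
    return q.astype(np.float32) * scale
```

The board then stores only the int8 tensors plus one scale per layer; accuracy after quantization should be re-checked against the >95% target.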