React application built by CFG degree students
- About this project
- Instructions
- Maintainers
able speak is a non-verbal communication application that lets the user drag and drop images to form a basic sentence. Once the images are in place, you can click the microphone and the sentence will be read out to you. The web application should also be accessible on a mobile or tablet device, but making it fully adaptive and responsive on smaller devices is still in development. To keep sentences short and easy to build, we have decided that the app should be used to communicate basic needs in a home setting; however, it could be expanded to other settings, such as a classroom or travel.
What does it do and what does it solve?
We understand how frustrating it is for non-verbal communicators to connect with loved ones and those around them, so we've developed the able-speak app to bring people together and help transform their lives. This app was designed not only to help boost confidence and reduce isolation for non-verbal people, but also to give them a voice. It’s time for everybody to be heard.
At this stage, the user has a limited set of options to choose from, but with further development, the user (or their carer) would be able to create a customizable profile and populate their options based on their likes, dislikes, and so on.
What are the key features of our system?
- Selected images can be dragged and dropped to form a sentence. Once they are placed in the drop zone, clicking the microphone button speaks the titles of the selected images (a minimal code sketch follows this list). A user can choose to select only one image if they want to.
- Using the reset button, the images will go back to their original spots and the user can choose new images to form a new sentence.
- To view the prototype, go to: https://www.figma.com/proto/zpHoU2NPGep09MVKjVV83H/Group5-Project-Able_Speak?node-id=4%3A3&scaling=scale-down&page-id=0%3A1&starting-point-node-id=4%3A3
- Instructions for the prototype:
Any time you need to return to the homepage, press “R”, then navigate to the rest of the pages. In some cases, a page might take a little while to load. A desktop browser is preferred; if you are on a mobile phone, on the food/drinks page you need to press and hold to get to the relevant page, and you can use its navigation system to go back. Also on the food/drinks page, you need to click in order to “scroll down” and see more.
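As a rough illustration of the speech feature described above, here is a minimal sketch using the browser's built-in Web Speech API. The component and prop names (`SpeakButton`, `imageTitles`) are hypothetical and are not taken from the project's actual code:

```jsx
import React from "react";

// Hypothetical button that reads the titles of the dropped images aloud.
// `imageTitles` is an array such as ["I", "want", "water"].
function SpeakButton({ imageTitles }) {
  const handleClick = () => {
    // Join the image titles into one sentence and pass it to the
    // browser's speech synthesis engine.
    const sentence = imageTitles.join(" ");
    const utterance = new SpeechSynthesisUtterance(sentence);
    window.speechSynthesis.speak(utterance);
  };

  return <button onClick={handleClick}>Speak</button>;
}

export default SpeakButton;
```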
To start, clone this repository to your computer:
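For example, from a terminal (the repository URL and folder name below are placeholders, not the actual values):

```bash
git clone <repository-url>
cd <project-directory>
```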
In your chosen IDE, e.g. VS Code, open a new terminal in the project directory and run:
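Assuming Node.js and npm are installed:

```bash
npm install
```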
This will install all the required node_modules.
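Then, assuming the standard Create React App scripts are in place, start the development server:

```bash
npm start
```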
Runs the app in development mode.
Open http://localhost:3000 to view it in your browser.
The page will reload when you make changes.
You may also see any lint errors in the console.
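To run the tests, assuming the standard Create React App test script:

```bash
npm test
```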
Launches the test runner in interactive watch mode.
See the Create React App documentation on running tests for more information.
Current maintainers:
- Abigail Unwin
- Despoina Tsounaka
- Justyna Janiszewska
- Sirad Bihi
- Zeynep Kurugollu