
⛱️ Explain Life is an interactive, AI-driven iOS mobile application that acts as a companion for people on the autism spectrum. Designed and developed by Dylan da Silva for IDV 304


dylandasilva1999/explain-life-ios-app




Dylan da Silva IDV 304


Logo

Explain Life.

Autism Spectrum Disorder Companion App
Explore the docs »

View Demo · Report Bug · Request Feature

Table of Contents

About the Project

image1

Project Description

According to a mental health article, autism spectrum disorder is “an umbrella term that covers everyone with conditions within the spectrum of autism”. The umbrella is a metaphor for shielding autistic people from overstimulation. Explain Life is a tool/companion mobile application which people with ASD can use to communicate, learn, and express themselves.

The name “Explain Life” was chosen for the duality of its meaning. In one sense, the app explains life (such as social context) to autistic people; in the other, it explains what autistic people understand to non-autistic people. The name emphasises how autistic people are viewed, and how this mobile application can help its users (people with ASD) learn and improve conversational skills, emotion expression, emotion detection, and more in their daily lives. Hence it acts as a sort of companion to the user.

Explain Life is an interactive, AI-driven iOS mobile application that acts as a companion for people on the autism spectrum.

Built With


Getting Started

These instructions will get you a copy of the project up and running on your local machine for development and testing purposes.

Prerequisites

Requires iOS 13 or later and Xcode 12 or later

Installation

  1. In Xcode, go to
File -> Swift Packages -> Add Package Dependency
  2. Paste in the repo's URL: https://github.com/dylandasilva1999/explain-life-ios-app
  3. Open the project in Xcode 12 or later.
  4. Install the CocoaPods dependencies:
cd your/directory
pod install
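
If the pods need to be declared manually, a minimal Podfile sketch might look like the following. The target name is hypothetical; the pods are the ones this README lists under Functionality:

```ruby
# Hypothetical Podfile for the dependencies listed under Functionality.
platform :ios, '13.0'
use_frameworks!

target 'ExplainLife' do
  pod 'IBMWatsonToneAnalyzerV3'
  pod 'SwiftyJSON'
  pod 'Alamofire'
end
```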

Artificial Intelligence Integration

Apple Speech-To-Text

Speech-to-text technology converts spoken words into digital text on a screen. In Explain Life, speech-to-text is used to record a sentence in a conversation and display the resulting text in the app for analysis.
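
As a rough illustration, transcribing recorded audio with Apple's Speech framework can be sketched as follows. This is iOS-only and illustrative, not the app's exact implementation; the function name and file-based input are assumptions:

```swift
import Speech

// A hedged sketch of the speech-to-text flow using Apple's Speech framework.
func transcribe(audioFileAt url: URL,
                completion: @escaping (String?) -> Void) {
    SFSpeechRecognizer.requestAuthorization { status in
        guard status == .authorized,
              let recognizer = SFSpeechRecognizer(), recognizer.isAvailable
        else { completion(nil); return }

        let request = SFSpeechURLRecognitionRequest(url: url)
        recognizer.recognitionTask(with: request) { result, error in
            guard let result = result, error == nil else { return }
            if result.isFinal {
                // The final transcript is what the app would pass on for analysis.
                completion(result.bestTranscription.formattedString)
            }
        }
    }
}
```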

Apple Text-To-Speech

Text-to-speech is a type of assistive technology that reads digital text aloud. Text-to-speech is used in Explain Life when a user types how they feel, as well as for emotion expression when clicking on an emotion card.
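
A minimal sketch of this flow with Apple's AVSpeechSynthesizer is shown below. This is iOS-only and illustrative; the voice and rate are assumed defaults, not the app's actual settings:

```swift
import AVFoundation

// A hedged sketch of the text-to-speech flow.
let synthesizer = AVSpeechSynthesizer()

func speak(_ text: String) {
    let utterance = AVSpeechUtterance(string: text)
    utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
    utterance.rate = AVSpeechUtteranceDefaultSpeechRate
    synthesizer.speak(utterance)
}

speak("I am feeling overwhelmed right now.")
```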

IBM Tone Analyzer

The IBM Watson® Tone Analyzer uses linguistic analysis to detect emotional and language tones in written text. An API call is made to IBM with the text captured via speech-to-text; the service analyses that text, and the app then displays the emotion and tone conveyed in the sentence.
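
To make the response shape concrete, here is a minimal sketch of decoding a Tone Analyzer-style JSON payload with Codable. The field names follow the v3 `/tone` response shape (`document_tone`, `tones` with `score`, `tone_id`, `tone_name`); treat the sample values as illustrative:

```swift
import Foundation

// Minimal Codable model for a Tone Analyzer-style response.
struct ToneAnalysis: Codable {
    struct DocumentTone: Codable { let tones: [Tone] }
    struct Tone: Codable {
        let score: Double
        let toneId: String   // decoded from "tone_id"
        let toneName: String // decoded from "tone_name"
    }
    let documentTone: DocumentTone // decoded from "document_tone"
}

let json = """
{"document_tone": {"tones": [
  {"score": 0.88, "tone_id": "joy", "tone_name": "Joy"},
  {"score": 0.61, "tone_id": "confident", "tone_name": "Confident"}
]}}
""".data(using: .utf8)!

let decoder = JSONDecoder()
decoder.keyDecodingStrategy = .convertFromSnakeCase
let analysis = try! decoder.decode(ToneAnalysis.self, from: json)

// The highest-scoring tone is what the app would display to the user.
let dominant = analysis.documentTone.tones.max(by: { $0.score < $1.score })
print(dominant?.toneName ?? "Neutral")  // prints "Joy"
```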

Features and Functionality

Features

Speak

image2 Using the text-to-speech AI, the user can type in the textbox what they want Explain Life to say out loud, expressing their emotions in difficult social interactions. The speak feature assists the user with communication, and minimises anxiety and stress in social interactions.

Record & IBM Tone Analysis

image3 Using the speech-to-text AI, the user can record a sentence during a social conversation/interaction. An API call is then made to the IBM Tone Analyzer service to analyse the transcribed text and report what emotion/tone was conveyed in the user's sentence. The record & IBM tone analysis feature helps autism spectrum disorder users better understand the emotions other people convey in their sentences.
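
The final step of that flow, turning analyzer scores into a message shown to the user, can be sketched as below. The function name, tone names, and the 0.5 relevance cutoff are assumptions, not the app's actual thresholds:

```swift
// A hedged sketch of mapping tone scores to the message shown after recording.
func conveyedEmotion(from scores: [String: Double]) -> String {
    // Scores below 0.5 are treated as weak signals and ignored.
    let relevant = scores.filter { $0.value >= 0.5 }
    guard let top = relevant.max(by: { $0.value < $1.value }) else {
        return "No clear emotion detected."
    }
    return "This sentence mostly conveys: \(top.key)."
}

print(conveyedEmotion(from: ["joy": 0.82, "analytical": 0.55, "fear": 0.12]))
// prints "This sentence mostly conveys: joy."
print(conveyedEmotion(from: ["sadness": 0.3]))
// prints "No clear emotion detected."
```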

Emotion Expression

image4 Using the text-to-speech AI the user can click on any of the emotion cards, which Explain Life then reads out loud how the user is feeling.

Secure Firebase Authentication

image5 Secure log in and registration with email, password, and full name, with forgot-password functionality.

Full Onboarding

image6 A full onboarding upon first time launch, which explains the core features of Explain Life.

Edit Profile Details

image7 Edit your username and the authentication email you use to sign in to Explain Life.

Settings

image8 In the settings, there are links to donate to Autism South Africa, as well as a link to get in contact with Autism South Africa. The user also has the ability to reset settings, which shows the onboarding again and signs the user out.

Functionality

  • Firestore Database for storing user info, which includes the email and full name.
  • Firebase Authentication for secure email & password log in (with forgot-password functionality).
  • CocoaPods for dependency management and adding additional frameworks and SDKs.
  • ScrollView, HStack, VStack, and ZStack were used for creating layouts.
  • IBMWatsonToneAnalyzerV3 was the pod used to access the IBM Tone Analyzer.
  • SwiftyJSON was the pod/library used to read and process JSON data from an API/server.
  • Alamofire was the pod used as an elegant and composable interface for HTTP network requests.
  • @EnvironmentObject, @State, @StateObject, and @ObservedObject for sharing state between views and files.

Concept Process

The Conceptual Process is the set of actions, activities and research that was done when starting this project.

Ideation

image9 image10

Wireframes

image11 image12

Development Process

The Development Process is the technical implementations and functionality done in the backend of the application.

Implementation Process

Design Architecture

The MVC design architecture was used for structuring Explain Life.
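
To make the MVC split concrete, here is a minimal, platform-free sketch using hypothetical names (the app's real types are not shown in this README; in the app itself, the view would be a SwiftUI/UIKit view):

```swift
import Foundation

// Model: the data the app works with.
struct EmotionCard {
    let name: String
    let spokenPhrase: String
}

// View: anything that can present text to the user.
protocol EmotionView {
    func display(_ text: String)
}

// Controller: mediates between the model and the view.
final class EmotionController {
    private let cards: [EmotionCard]
    private let view: EmotionView

    init(cards: [EmotionCard], view: EmotionView) {
        self.cards = cards
        self.view = view
    }

    // Called when the user taps an emotion card.
    func cardTapped(named name: String) {
        guard let card = cards.first(where: { $0.name == name }) else { return }
        view.display(card.spokenPhrase)
    }
}

// A console-backed view keeps this sketch runnable outside iOS.
struct ConsoleView: EmotionView {
    func display(_ text: String) { print(text) }
}

let controller = EmotionController(
    cards: [EmotionCard(name: "Happy", spokenPhrase: "I am feeling happy.")],
    view: ConsoleView()
)
controller.cardTapped(named: "Happy")  // prints "I am feeling happy."
```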

Highlights

  • The biggest highlight for Explain Life was all the research that went into every single aspect of the application 🤩.
  • One major highlight was getting the Firebase database and authentication working.
  • Adding the ability to reset a forgotten password 😅.
  • The UI/UX design was based on a persona for the specific autism spectrum disorder audience.
  • Custom validation on the sign-in and register views.
  • Another major highlight was integrating all the AI functionality, and combining the different services 👏.

Challenges

  • One major challenge was understanding how to integrate three different types of AI.
  • A bug where the user grants access to the microphone, but at random times the app does not record.
  • Working with the JSON data returned by the IBM Tone Analyzer service.

Reviews & Testing

The reviews & testing were done through a video demonstration and a Google Form with questions related to the application.

Feedback from Reviews

Peer reviews were conducted by my fellow students and lecturer. The following feedback was particularly useful:

  • "Yes, the use of light colours and white space makes the application seem very clear and non-distracting".
  • "I think the design is made in such a way that its very easy to understand and take in upon entering the app. The largeness of the shapes just works really well with the whole "easy" part of the app".
  • "Text-to-speech already works well, perhaps have a small animation play as the text is being read back.".
  • "Maybe when the user is recording their speech, the "click to record" could change to "recording.." or something just to indicate even more clearly it is happening".
  • "Love the use of emojis, will be relatable to the users. Maybe just use colours as well to display mood".
  • "The application could remember your name and previous moods, and ask if you are feeling different from previously. It could also prompt you to explain why you are feeling a certain way".

Future Implementation

  • One simple but important future feature would be to publish the application on the iOS App Store.
  • Adding the ability to change the speaking voice/accent in the settings.
  • Creating and integrating a web dashboard for parents to view data tracked within the Explain Life mobile application.

Final Outcome

Mockups

image13 image14

Video Demonstration

To see a run through of the application, click below:

View Demonstration

Promotional Video

To see the promotional video, click below:

View Promotional Video

Roadmap

See the open issues for a list of proposed features (and known issues).

Contributing

Contributions are what makes the open-source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.

  1. Fork the Project
  2. Create your Feature Branch (git checkout -b feature/AmazingFeature)
  3. Commit your Changes (git commit -m 'Add some AmazingFeature')
  4. Push to the Branch (git push origin feature/AmazingFeature)
  5. Open a Pull Request

Authors

License

Distributed under the MIT License. See LICENSE for more information.

Contact

Acknowledgements
