Chatbot for connecting individuals to wellbeing resources, links, data, etc.
Over the CodeTheCity 8 weekend, team ALISS identified three use cases, focused on one of them, and documented the user experience of the human/chatbot interaction. Based on that first use case, a template chatbot (UI and voice) was selected, a fixed or rigid set of rules/logic was coded to produce the workflow, and finally data was retrieved from the ALISS data API by keyword and location. This produced a lot of learning and a basis for freeing up the chatbot to support many more use cases.
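The keyword-and-location retrieval step might look something like the following. This is a minimal sketch: the base URL and parameter names (`q`, `location`) are assumptions and should be checked against the ALISS API documentation.

```python
from urllib.parse import urlencode

# Assumed endpoint -- verify against the ALISS API documentation.
ALISS_SEARCH_URL = "https://api.aliss.org/v4/search/"

def build_search_url(keyword, location):
    """Build an ALISS search URL from a keyword and a location."""
    query = urlencode({"q": keyword, "location": location})
    return f"{ALISS_SEARCH_URL}?{query}"

# Example: the demo's 'anxiety' in 'Aberdeen' search.
url = build_search_url("anxiety", "Aberdeen")
```

The chatbot would fetch this URL and turn the returned services into chat replies.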
The use case demo
- type yes (lower case only), then type yes
- type yes, then type no
- type no, then type anxiety, then type Aberdeen (note the capital A)
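The rigid rule set behind these demo paths can be sketched as a small state machine. The states and prompt texts below are assumptions inferred from the demo steps, not the actual implementation.

```python
class ChatbotWorkflow:
    """Minimal sketch of the fixed rules driving the demo conversation."""

    def __init__(self):
        self.state = "start"
        self.keyword = None
        self.location = None

    def handle(self, message):
        # Rigid matching: 'yes' must be lower case and 'Aberdeen'
        # capitalised, mirroring the demo's notes about exact input.
        if self.state == "start":
            if message == "yes":
                self.state = "confirm"
                return "Would you like another suggestion?"
            if message == "no":
                self.state = "keyword"
                return "What topic do you need help with?"
            return "Please type yes or no (lower case only)."
        if self.state == "confirm":
            self.state = "done"
            return "Thanks for trying the demo."
        if self.state == "keyword":
            self.keyword = message
            self.state = "location"
            return "Which town or city?"
        if self.state == "location":
            self.location = message
            self.state = "done"
            return f"Searching ALISS for '{self.keyword}' in {self.location}..."
        return "The conversation has ended."
```

Hard-coding the flow this way is what makes the bot rigid; the later refactoring notes aim to pull this logic out into its own class.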
Outstanding UI updates and logic workflow changes are listed in GitHub issues.
Freeing up the chatbot
- Review the user experience again: start, interaction, stop, etc.
- Extract the logic/workflow (i.e. the bot's AI engine) into its own class.
- Extract the UI/HTML/CSS building code, e.g. for yes/no prompts, inserting a map widget, etc.
- Extract the code that prepares data from the ALISS REST API; this probably requires a chatbot REST API upgrade to save time in the client's browser.
- Think more about how blind, deaf and otherwise disabled users interact with the chatbot.
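The extraction steps above could split the bot into three collaborating classes. This is a hypothetical sketch; the class names and the canned data are illustrative, not the project's actual code.

```python
class AlissDataClient:
    """Prepares data from the ALISS REST API (stubbed here)."""

    def search(self, keyword, location):
        # A real implementation would call the ALISS API; this stub
        # returns canned data so the engine can be tested in isolation.
        return [{"name": "Example Support Group", "location": location}]


class UiBuilder:
    """Builds the HTML shown in the chat window."""

    def render_results(self, services):
        items = "".join(f"<li>{s['name']}</li>" for s in services)
        return f"<ul>{items}</ul>"


class BotEngine:
    """Holds only the conversation logic, with UI and data code
    injected, as the refactoring list above suggests."""

    def __init__(self, data_client, ui):
        self.data_client = data_client
        self.ui = ui

    def respond(self, keyword, location):
        services = self.data_client.search(keyword, location)
        return self.ui.render_results(services)
```

Separating these concerns would let new use cases swap in different rule sets or widgets without touching the data-fetching or rendering code.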