
12. Intuitive design


12. Create a service that is simple and intuitive enough that users succeed first time.

Link to corresponding item in JIRA: https://jira.informed.com:8443/browse/FCOLOI-165

Questions

  • What evidence can you provide that users are, in the majority of cases, succeeding first time?
  • Were less digitally-minded users and non-subject-area experts able to use the beta service?
  • How were the design and content decisions made in beta?
  • Have you checked that the content used within the service aligns with the content published on relevant GOV.UK pages?
  • Has the current version of the service been tested for accessibility?
  • Are you demonstrating just the happy path - what thought has gone into other paths and can you show us they work?
  • What prototype testing have you done so far?
  • What did and do you plan to test?
  • How did and do you test the prototype with end users?
  • What have you learnt?
  • What did you change?
  • What didn't you change and why?
  • How many other versions of the prototype did you try?
  • Why did you choose this version?

Assisted digital support:

  • Which routes of assisted digital support will you be testing in beta?
  • How do they meet user needs?
  • What are your plans to test your assisted digital support during beta?
  • Can you iterate your assisted digital support across all routes and providers, for the full end-to-end user journey?

Evidence

Service Manager able to:

  • show the majority of users of the service are succeeding first time.
  • explain how the design and content decisions for the service were made, and relate back to user research, usability testing and analytics.
  • show the service is accessible.
  • explain other paths in the service and demonstrate that they work.
  • show videos of usability testing.
  • talk through substantial iteration in the design and content of the service.

Assisted digital support:

Service Manager able to:

  • talk through how the assisted digital support has been designed to meet user needs, including routes and providers. If not providing all types (telephone and face to face, talk through and on behalf of), explain why.
  • explain the end-to-end user journeys for assisted digital support, including identity assurance (e.g. Verify) if required.
  • explain how you will test your assisted digital support in beta, with users with the lowest level of digital skills and access.
  • explain how you will test the end-to-end user journey for each route, including identity assurance (e.g. Verify) if required.
  • explain how you are able to iterate your assisted digital support across all routes and providers, for the full end-to-end journey.


What evidence can you provide that users are, in the majority of cases, succeeding first time? Were less digitally-minded users and non-subject-area experts able to use the beta service?

  • Evidence from usability testing, which we have been doing end-to-end since early in alpha and covering all journeys - so no late surprises. This testing includes first-timers (i.e. non-subject-area experts), users lower on the digital inclusion scale, and users with accessibility needs
  • Direct feedback during private beta - fixed 2 issues with account creation (phone number formats); fixed 1 issue with submitting additional information and 1 with payment, which temporarily stopped applications; no other evidence of failure to use the service
  • Survey feedback during private beta - 379 responses as of 17 June, of which 341 were successfully submitted (89.9%); the remaining 38 are covered by the issues above and/or the period when the premium service was not offered on the private beta
  • The survey shows a 90.49% satisfaction rate and an NPS of 67 (76% promoters). Both figures have increased during the private beta with the addition of further enhancements and features, e.g. account-saving functionality. Satisfaction has gone up 4.5% since our March assessment, and the NPS has increased from 55 (65% promoters)
  • Survey shows a wide range of users from regular experts to first timers - good even spread on frequency of usage
  • Piwik analytics allows us to track journeys and completion rates - 70% successful completion rate to date in June (a tracking sketch follows this list)
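
As an illustration of the kind of instrumentation behind that completion-rate figure, here is a minimal Piwik tracking sketch in TypeScript. The step names and the goal ID are assumptions for illustration, not the service's actual configuration.

```typescript
// Minimal Piwik tracking sketch. Assumes the standard Piwik JS tracker is
// already loaded on the page; step names and the goal ID are illustrative.
declare global {
  interface Window { _paq: unknown[][]; }
}
window._paq = window._paq || [];

// Record each step of the application journey as an event, so that
// drop-off between steps can be read out of Piwik's reports.
export function trackApplicationStep(step: string): void {
  window._paq.push(['trackEvent', 'Application', 'StepReached', step]);
}

// Record a successful submission against a Piwik goal (goal ID assumed).
export function trackSubmissionComplete(): void {
  window._paq.push(['trackGoal', 1]);
}
```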

Survey snapshots for type of applicant and frequency:

[Image: survey snapshot - capacity of applicant]

[Image: survey snapshot - frequency of use]

Free text comments from the survey so far align well with our key goals of speed and simplicity:

[Image: free-text user feedback from the survey]

How were the design and content decisions made in beta?

Summary: a mixture of working collaboratively, referring to the service manual/hackpad/style guide, and user testing

  • Usability testing
  • Prototyping and wireframing - Cyril developed initial ideas working with Mark, then workshopped them with Mike and Robert; we prototyped/built and tested the best options
  • Content - creating prototype pages with text on to actually use them as designed pages
  • Content - using plugins like Chrome Page Editor to edit text in situ and try out variations (a console-based equivalent is sketched after this list)
  • Content - GOV.UK style guide cross referencing
  • Analytics to validate journeys and monitor drop offs
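
For the in-situ editing mentioned above, plugins like Chrome Page Editor rely on editing capabilities the browser already exposes. A minimal sketch of the same trick, runnable from the DevTools console on a prototype page (the selector is an assumption):

```typescript
// Make the whole prototype page editable in place, as Chrome Page Editor does:
document.designMode = 'on';

// Or confine edits to a single element (selector is illustrative):
const intro = document.querySelector<HTMLElement>('.application-intro');
if (intro) {
  intro.contentEditable = 'true'; // edit this block's copy directly in the page
}
```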

Have you checked that the content used within the service aligns with the content published on relevant GOV.UK pages?

Yes - in fact much of the detailed content (about documents) originated in the GOV.UK smart answer. We have built a clearer, step-by-step process based on user feedback on the smart answer at the outset of the project.

Has the current version of the service been tested for accessibility?

Yes, via a combination of methods.

Internal testing:

  • WAVE toolbar - used to structurally evaluate content of pages for accessibility issues
  • NVDA screen reader - used to test the portal as a user with accessibility needs would experience it. Full end-to-end journey followed using just screen reader for navigation
  • Apple VoiceOver - used to test the mobile views of the portal as a user would experience them

External testing:

  • Accessibility audit completed by the Digital Accessibility Centre, judging the portal against A/AA/AAA WCAG 2.0 standards. The report highlighted 6 high-priority A areas and 4 low-priority AAA areas, comprising 9 and 19 specific items respectively, all of which have been addressed. The report is available on request.
  • See 1. User needs for details of the face-to-face usability testing with users with accessibility needs

As our starting point for the production service was the GOV.UK front-end toolkit and associated elements, we had a good baseline of accessibility. The email address and phone number of the Legalisation Office are displayed on the service start page and can be found easily by users who visit the portal with a screen reader.
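Alongside the manual WAVE/NVDA/VoiceOver checks above, this kind of audit can also be scripted. A minimal sketch using the pa11y Node library - not one of the tools named on this page, but a scriptable complement to them; the URL is a placeholder:

```typescript
// Automated WCAG 2.0 AA scan with pa11y; a complement to manual testing
// with screen readers, not a replacement for it.
import pa11y from 'pa11y';

async function auditPage(url: string): Promise<void> {
  const results = await pa11y(url, { standard: 'WCAG2AA' });
  for (const issue of results.issues) {
    console.log(`${issue.code}: ${issue.message} (${issue.selector})`);
  }
}

auditPage('https://www.example.com/start').catch(console.error);
```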

Are you demonstrating just the happy path - what thought has gone into other paths and can you show us they work?

Some examples:

  • If documents are not certified, there is a deliberate 'dead end' page to save the user wasting time and money
  • If you get zero results when searching for documents, we suggest broader category searches and also offer an A-Z browseable list of all documents which can be legalised
  • Payment failure - we have a dedicated page, and have identified some improvements we can make
  • Expired email validation link - if you click a link that has expired, you immediately get informed and sent another one
  • Account locked - if you, or someone else, locks your account through 5 consecutive failed login attempts, an email is immediately sent to the verified account email address with instructions on how to unlock it by resetting the password (a sketch of this rule follows this list)
  • Forgotten password - we have a standard password reset process, and notify users after it has been changed
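
To illustrate the lockout rule above, here is a minimal sketch in TypeScript. The types, in-memory state and notification hook are all assumptions for illustration; the service's actual implementation is not shown here.

```typescript
// Lockout sketch: lock after 5 consecutive failures, notify the owner.
const MAX_FAILED_ATTEMPTS = 5;

interface Account {
  email: string;          // the verified account email address
  failedAttempts: number; // consecutive failed login attempts
  locked: boolean;
}

function recordFailedLogin(account: Account, notify: (to: string) => void): void {
  account.failedAttempts += 1;
  if (account.failedAttempts >= MAX_FAILED_ATTEMPTS && !account.locked) {
    account.locked = true;
    // Email the verified address with password-reset instructions,
    // so the legitimate owner can unlock the account.
    notify(account.email);
  }
}

function recordSuccessfulLogin(account: Account): void {
  account.failedAttempts = 0; // the consecutive count resets on success
}
```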

What prototype testing have you done so far? What did and do you plan to test?

See 1. User needs for detail - we have covered all user types and end-to-end journeys, desktop and mobile

We plan to continue testing iterations, including but not limited to:

  • account sign up
  • using saved details in an application
  • account dashboard and information architecture (IA)
  • improvements to the eligibility checker
  • more mobile testing

How did and do you test the prototype with end users?

See 1. User needs for detail; we use a mixture of methods:

  • In person testing with customers in 3 different office locations
  • In person testing with users with accessibility needs (product not prototype)
  • Remote testing
  • Survey based feedback with dispersed customer groups such as notaries

What have you learnt?

Tons! See summaries of test results in 1. User needs. Too many to repeat in full, but highlights include:

  • The need for speed for regular customers
  • The need for simplicity and step-by-step breakdowns for first-time customers
  • The need to make any additional handholding skippable for the notaries and well-versed users
  • People may use categories simply because they are there, rather than because they provide a better interface than alternatives - a search-based approach outscored categories on both task completion speed and user preference, despite earlier success with category-based testing
  • Card sorting can help generate lots of useful synonyms for search (a search sketch follows this list)
  • As ever, many people will not read the page. They may just read the titles and the field names and go for it. Judicious use of radio and checkbox choices can help focus attention.
  • The check your answers design pattern has some accessibility drawbacks owing to the positioning of the Change links
  • Our payment provider seriously lacks configurability in the UI, and we look forward to moving to GOV Pay later this year
  • The challenge of address and country mapping when you have the FCO country register, an address lookup provider's address and country taxonomy, an internal caseworking address taxonomy, and a despatch service address taxonomy
  • The challenge of dealing with accounts which are at once optional for some users and mandatory for others
  • An assessment gets you looking at the world in different ways
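
To illustrate the card-sorting point above, here is a minimal sketch of synonym-expanded document search. The synonym map and matching logic are illustrative only; the service's real search is not shown here.

```typescript
// Synonym expansion seeded by card-sorting output (entries illustrative).
const synonyms: Record<string, string[]> = {
  'birth certificate': ['birth record', 'certificate of birth'],
  'power of attorney': ['poa', 'lasting power of attorney'],
};

// Expand the user's query into the query itself plus any known synonyms.
function expandQuery(query: string): string[] {
  const q = query.trim().toLowerCase();
  return [q, ...(synonyms[q] ?? [])];
}

// Return documents matching the query or any of its synonyms.
function search(query: string, documents: string[]): string[] {
  const terms = expandQuery(query);
  return documents.filter(doc =>
    terms.some(term => doc.toLowerCase().includes(term)));
}
```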

What did you change?

Most parts of the service have been iterated substantially, which we can explain in the demo. For example:

  • Start page
  • Service choice page
  • Eligibility checker - more than 17 iterations
  • Address input pages
  • Postage choice pages
  • Number of documents page simplification (no countries, no signatories)
  • Additional / optional info on one page
  • Check answers is only about checking answers, declaration separate
  • Payment jump page content - more to come based on latest testing
  • Application confirmation page - several versions until we got one that communicated (a) the online part is over, and (b) what to do next
  • Cover sheet - structure and contents
  • Confirmation email - simplified the content and added a QR code (a generation sketch follows this list)
  • Registration and sign up flow - major changes
  • Applications dashboard simplification
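
For the QR code item above, here is a minimal sketch using the npm 'qrcode' package - an assumption, as this page does not name the library the service actually uses; the reference format is also illustrative.

```typescript
// Generate a QR code for the confirmation email as an inline image.
import QRCode from 'qrcode';

// Returns a data: URI that can be used as the src of an <img> in HTML email,
// encoding the application reference (format illustrative).
async function buildConfirmationQr(applicationRef: string): Promise<string> {
  return QRCode.toDataURL(applicationRef, { errorCorrectionLevel: 'M' });
}
```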

What didn't you change and why?

  • Payment pages - because we can't, much as we'd like to
  • Start page: we have not substantially changed the start page as users reported it had what they needed. We have just updated the instructions to fit the changes, made the start button more visible, and included (and explained) the word 'apostille' as we have evidence users are searching for this
  • Name of signatory: we considered adding this and had it in our first version of the prototype, but it put too much onus on the user
  • Additional data in general: the focus groups gave us a clear steer on the importance of speed and simplicity, which has prevented us from adding additional fields

How many other versions of the prototype did you try?

See above - almost every page has been through several revisions, from minor to major

Why did you choose this version?

A combination of many factors, with validation through user testing being the ultimate one:

  • Standard patterns where possible, deviating only when required by a service need (e.g. Finder UX did not quite work)
  • Guidance such as one thing per page
  • Alpha design feedback
  • Design hackpad discussions and recommendations
  • Expert feedback
  • Usability testing

Assisted digital support:

Which routes of assisted digital support will you be testing in beta? What are your plans to test your assisted digital support during beta?

These are our routes:

  • Telephone talk through - the starting point, talking a user through while they enter their own details
  • Telephone on behalf of - the second level option, for users who do not have access or do not feel able to complete the online application themselves
  • Offline form - we will fall back to this option if the user is not comfortable being assisted over the phone (although all research to date suggests users want a phone-based option). In this scenario, we send the legacy offline application form to them in the post and they return it having filled it in themselves. We then enter the data on receipt.
  • Face to face talk through - this is an option for customers of the premium service and business drop off, where we interact directly with customers. This is not an option for postal customers, and has not been requested in research or in private beta feedback.
  • Face to face on behalf of - this is also an option for customers of the premium service and business drop off, but we have no indications from research and private beta feedback that any users want this

How do they meet user needs?

See 1. User needs for detail on the research and the challenges we had, but the insights we have on our customers indicate a strong preference to be assisted over the phone.

Users without an email will not receive an email confirmation of their assisted applications, but this does not in any way prevent them successfully legalising their documents.

Can you iterate your assisted digital support across all routes and providers, for the full end-to-end user journey?

Yes, and as with all areas of the product we expect to learn and improve over time.

For example:

  • We would like to explore whether any repeat assisted digital (AD) users would like us to create an account on their behalf, which - as with applications - we are able to do over the phone. We have not had any demand for this yet.
  • If volumes were to change, we would explore the role of Serco vs our internal staff in assisting users to make applications