
02. Ongoing research


2. Put a plan in place for ongoing user research and usability testing to continuously seek feedback from users to improve the service.

Link to corresponding item in JIRA: https://jira.informed.com:8443/browse/FCOLOI-155

The main objective is to ensure that you have someone on the team who is dedicated to doing the user research, that there are plans to continue doing user research, and that there is evidence that outcomes from the user research will be fed into the ongoing development/design of the service. Responses should cover both digital and assisted digital support. When doing user research for assisted digital, ensure that research is done specifically with (potential) users of this particular service who have the lowest level of digital skills. Recruitment and research with this audience will need to be done using offline methods. Accessibility testing with people who have particular access needs should be done throughout the service design process and not outsourced as a separate activity at the end of the design process.

Questions

  • Are the resources in place to do regular user research and usability testing?
  • Who in the team is doing user research and usability testing?
  • How often are you doing user research and usability testing?
  • Are you testing with a full range of end users, including those with low or no ability to use the digital service?
  • Are you doing regular usability testing with people who have particular access needs (accessibility testing)?
  • How is the analytics data feeding into the research plan for the service?
  • How do the results feed into the design of the service?
  • What is the user research plan for the next stage and are there resources for user research and usability testing?

Evidence

User Researcher and/or Service manager able to:

  • explain who is doing user research and usability testing and how it is being resourced.
  • talk through the research plan for the next stage of the project.
  • talk through the plan to ensure the research covers the full range of end users including accessibility and assisted digital users.
  • explain how the results from user research and usability testing are incorporated into the design of the service.

Assisted digital research

We will undertake testing to validate whether users with lower digital skills and/or language barriers can use the service.

The testing will cover the different cascading AD needs we have identified to date in our research:

  • Talk a user through on the phone
  • Fill in on behalf of user over phone
  • Send a form in the post

During public beta the FCO will also continue to monitor the actual support numbers. All AD applications will be logged in the legalisation caseworking tool and reported in MI, as well as reported to the performance platform.
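As a purely illustrative sketch of the kind of MI roll-up this logging enables, the snippet below tallies AD applications by month and support channel. The record shape and channel names are hypothetical and do not describe the actual caseworking tool or the performance platform integration.

```python
from collections import Counter
from datetime import date

# Hypothetical AD application records, as they might be logged in the caseworking tool.
# Channels mirror the cascading AD needs above: talked through on the phone,
# completed on the user's behalf over the phone, or sent a form in the post.
ad_applications = [
    {"received": date(2016, 7, 4), "channel": "talked-through-on-phone"},
    {"received": date(2016, 7, 12), "channel": "completed-on-behalf-by-phone"},
    {"received": date(2016, 8, 2), "channel": "postal-form"},
]

def monthly_ad_volumes(records):
    """Count AD applications per (year, month, channel) for an MI report."""
    return Counter((r["received"].year, r["received"].month, r["channel"]) for r in records)

for (year, month, channel), count in sorted(monthly_ad_volumes(ad_applications).items()):
    print(f"{year}-{month:02d}  {channel}: {count}")
```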

We have agreed with John Waterworth at GDS that in a context such as this service, where overall volumes are relatively small, testing does not have to be with proven, actual users of legalisation. Testing with likely users is acceptable as long as the scenarios are plausible. As such, our plan is as follows:

Planned research by user group:

Users with UK documents and low digital skills who travel a lot
We will use the new digital outcomes and specialists framework to recruit users. The following two agencies have been recommended in particular for hard-to-find users with access needs, but the procurement searches are likely to find more options:
  • Acumen Fieldwork
  • Criteria
The Consular Insights Team have existing contracts with recruitment agencies who we can use to recruit users with low digital skills and/or accessibility needs who travel, have homes overseas and have UK documents. We need to establish a budget for this.

Users with UK documents and low digital skills who are planning to get married
We will engage with the following three wedding agencies, who are our largest customers from that sector:
  • AV Ltd
  • Ionian Weddings
  • The Bridal Consultant
We will ask these agencies whether they are prepared to promote our research efforts, e.g. help us identify users with low digital skills and offer them the opportunity to provide feedback or participate in research to help shape the future of the service.
We will also look at opportunities to carry out research at wedding fairs, and will engage with local town halls to see whether they would be happy for us to carry out pop-up research with users when they visit to arrange certificates of no impediment, and to identify peak times for such research.

Foreign language and international users: students
We will arrange with a local university to test with their international students, as discussed with John Waterworth in the review call. Bedford University has been approached to date. We will also contact a school with an international intake to assess the opportunity for user research with their students, and speak to personal contacts who teach foreign students.

Foreign language and international users: other
We will arrange with two MFA Embassies in the UK to test the service with some of their nationals who have to go to the Embassy when they register a birth etc.
  • Lithuania - confirmed (large volumes). Agreed to share an outreach message with the community via the Lithuanian Embassy newsletter
  • Other - tbc

Legalisation users over the phone / post
Ensure logging and feedback for any users who:
  • are talked through applications by Serco
  • are contacted by the LO, who complete the application on their behalf over the phone
  • are sent a form in the post. Only one user in two weeks has requested a phone application

Existing legalisation users
We previously sent 40,000 leaflets out to encourage users with low digital skills to contact us, with a very low response rate (3 users). During the public beta phase we will trial sending a new leaflet with amended wording for one month to see if we get a better response rate. This leaflet will be sent to all customers when their documents are returned in the post. We expect to send up to 20,000 leaflets.

Users with low digital skills and UK documents who are purchasing or considering purchasing a home overseas
We will consider opportunities to attend overseas property shows to carry out research with users, and will contact event managers to assess the feasibility of this.

Accessibility research

Planned research by user group:

Digital Accessibility Centre accessibility audit
An accessibility audit is being completed during the private beta phase, checking compliance with W3C WCAG 2.0 level AA.

Digital Accessibility Centre disabled user testing
Three days of user testing with users with the following disabilities:
  • Blind
  • Low Vision
  • Colour Blind
  • Dyslexia
  • Limited limb mobility
  • Learning disabilities
  • Deaf
  • Asperger's (ASD)
  • Anxiety/Panic disorder

    Testing included the following assistive technologies:
  • JAWS screen reader
  • NVDA screen reader
  • VoiceOver, Mac native screen reader
  • ZoomText screen reader and magnification application
  • MAGic screen magnification application
  • SuperNova magnifier and screen reader
  • Window-Eyes screen reader for Office
  • Dragon NaturallySpeaking speech recognition software
  • Keyboard only input in lieu of mouse or other pointing devices
Disabled members of the public
2-3 days of face-to-face testing with users with a range of impairments and impairment levels.

    End-to-end testing to help identify issues such as unreadable printed information or the need for visual assistance to look up offline information. Home or office based testing is anticipated so that the participants can use their own equipment.

Recruitment will be via specialist agencies as above, and will include potential or likely users of legalisation, e.g. people who have UK documents and travel a lot.
FCO Enable accessibility network
Continue testing with disabled volunteers from the FCO Enable network. John has confirmed that this group of diplomats are legitimate test participants for legalisation: they are both actual and potential users of the legalisation service for their postings abroad.

    Standard postal and business service usability testing

    In-person usability testing

• Continue on-site testing at the premium office in Victoria, using the prototype and/or product depending on the nature of the test, ensuring we continue to reach a representative range of customers by size of business. This will be a combination of:
  • By invitation, targeted recruitment
  • Drop-in testing days
• Continue on-site testing and surveying of the 25 drop-in customers at the office in Milton Keynes:
  • By invitation only
• Following an iteration to legalisation, adding a feature to track application status, we will do more pop-up testing at the premium office in London Victoria to test and obtain feedback about this new feature.

    Remote usability testing

• Using WhatUsersDo's managed service, our target remains an average of 1 test per month, and we have secured additional budget for this resource for the new financial year. This reaches:
  • First time users, tapping into the existing panel to match various demographics and personas
  • Elderly and lower-skilled users matching our "Ethel" persona, via qualifying questions about confidence and support. 19% of the panel are at points 4-6 on the digital inclusion scale, compared to 4% of legalisation customers
  • A private panel of existing legalisation users covering the different services

    Notaries research and testing

    Private beta

    • Notaries are heavily engaged as users of the private beta service, and are the community who provide the most proactive feedback (96 pieces of direct feedback)

    Notaries Society (NS)

    • Meet the Committee on a six-monthly basis, and provide regular email communication - presentation coming up on 28 June
    • Approximately 750 members (out of 900 in total)
    • This route was the first one we used to invite users to try the private beta
    • Andy Hamilton, Service Manager, wrote an article for The Notary magazine on modernisation at the start of the project and contributed an update during beta. Andy has invited comments and is writing a new piece in July
    • Andy has lined up a speaking opportunity at the next Notaries Society event in October
    • Committee members are in regular contact, providing feedback on the service

    Notary Talk Forum

    • Dedicated website/forum for Notaries
    • Andy posts information on the site (through the administrator) about issues that are coming up, inviting views/comments etc

    Notary Training (NT)

    • Group providing training for Notaries
    • Andy spoke at their conference in Cambridge last year, in Birmingham in May this year, and we've been invited to speak again in October
    • Andy approached the Head of NT as part of the prototype testing

    Customer satisfaction tracker, Consular Insights team

    • The FCO’s customer satisfaction tracker survey is a representative survey that gathers feedback from a wide range of consular assistance customers and FCO fee paying customers
    • Feedback from fee paying customers is gathered from a database of legalisation customers who have shared their details for feedback
    • Fee paying customers are contacted monthly, with the aim of gathering feedback from 1,000 fee paying customers per year
    • We assess digital capability as part of the customer satisfaction tracker survey using GDS’ assisted digital scale
    • The results from the fee paying surveys and assistance surveys are combined to produce one overall metric of customer satisfaction. This metric is reported twice a year, in April and October
    • This will help us monitor impact, and ensure we are not adversely impacting the 86% customer satisfaction rate for legalisation
    • The survey is currently being iterated which gives us an opportunity to review the questions and add anything new following the move to live beta. The new survey will go to customers from October, November and December 2016.

    Example report: available on request, confidential


    Summary - ongoing plan

Usability testing and feedback sources (qual / quant) by user group:

Assisted digital
  • Face-to-face testing
  • Depth interviews
  • Capturing feedback from real users using the beta service
  • Digital inclusion scale questions are now in the customer satisfaction tracker so that we can measure responses, and continue to attempt to identify more AD users for usability research as part of the BAU research process

Accessibility
  • Face-to-face testing with recruited participants
  • Face-to-face and remote testing with disabled staff from the FCO Enable community
  • Usability testing covering a wide range of disabilities and assistive technologies via DAC
  • Completed an accessibility audit during beta
  • Qual feedback collected as part of usability testing

First time users
  • WhatUsersDo standard panels, filtered to match representative demographics including lower confidence; additional budget secured for this having proven the value
  • Customer satisfaction tracker
  • Beta feedback survey

Repeat users, non-specialist
  • WhatUsersDo private panel
  • Customer satisfaction tracker
  • Beta feedback survey

Notaries
  • Face-to-face
  • Direct engagement
  • WhatUsersDo private panel
  • Customer satisfaction tracker
  • Beta feedback survey

Businesses
  • Face-to-face testing in the premium service waiting room and the MK office
  • WhatUsersDo private panel
  • Direct engagement
  • Customer satisfaction tracker
  • Beta feedback survey
  • MK drop-off customer survey
  • MK drop-off customers also have the direct e-mail address of Gill Marshall, who runs that part of the service; for the premium service it's the same with Josie Farrell

How is the analytics data feeding into the research plan for the service?

    Analytics: at this stage our focus is on:

    • Drop-off points and the number of users without documents ready (i.e. not certified)
    • Monitoring answers to the "have you an account already?" page, which is a potential slow-down for regular users (redesigned routing as a result)
    • Monitoring the impact of design changes to ensure the intended effect, e.g. registration completion rate (a rough sketch of this kind of monitoring follows this list)
    • Monitoring answers to feedback consent which is important for our research plans and customer satisfaction reporting
    • Monitoring interactions on the confirmation page with the print cover sheet link and the help for users without a printer. This feeds back into the question of how printing is presented on the start page (discussed with GDS content design for the start page)
    • Monitoring search queries for documents - anything unexpected? Do we need more synonyms?
    • Unexpected data which might suggest a misunderstanding (or a tracking bug), e.g. address or postage answers
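As referenced above, a minimal sketch of the drop-off and completion-rate monitoring, assuming a simplified set of page-view events per session; the step names and data are hypothetical stand-ins for what the service's analytics tool actually records.

```python
# Hypothetical page views per session; step names are illustrative only.
sessions = {
    "s1": ["start", "register", "document-search", "confirmation"],
    "s2": ["start", "register"],
    "s3": ["start"],
    "s4": ["start", "register", "document-search"],
}

FUNNEL = ["start", "register", "document-search", "confirmation"]

def step_counts(sessions, funnel):
    """Count how many sessions reached each step, to highlight drop-off points."""
    return {step: sum(step in pages for pages in sessions.values()) for step in funnel}

counts = step_counts(sessions, FUNNEL)
for prev, step in zip(FUNNEL, FUNNEL[1:]):
    rate = counts[step] / counts[prev] if counts[prev] else 0.0
    print(f"{prev} -> {step}: {rate:.0%} of sessions continue")

# Registration completion rate: sessions reaching confirmation out of those that started registering.
print(f"Registration completion rate: {counts['confirmation'] / counts['register']:.0%}")
```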

    Some areas we wish to explore as a result, based on early analytics:

    • Should the start page be less definitive about printing, given we offer an alternative? or would there be a shock factor getting to the end and seeing print options?
    • Do we need to talk about certification sooner, or is the current position optimal (has a logic to it as we need to know document format first, and comes before entering any personal details)?
    • Already actioned - should we have a sign-in route which bypasses the account check page? Yes, implemented

    How do the results feed into the design of the service?

Same process throughout - review research results with the team, identify issues, create user stories (or bugs) and prioritise - then re-test in the next round to validate that the changes have made the improvements without unintended side effects.

