Mark Barlow edited this page Jan 10, 2018 · 141 revisions

1. Understand user needs. Research to develop a deep knowledge of who the service users are and what that means for the design of the service.

Link to corresponding item in JIRA: https://jira.informed.com:8443/browse/FCOLOI-154

In the beta phase, the main purpose is to establish that the team has continued to build the service in a way that meets user needs, and this has been done in a way that makes the service easy for all users. We are particularly looking to see that findings from the user research are reflected in the design of the service as it progresses through this phase. Responses should cover both digital and assisted digital support. When doing user research for assisted digital, ensure that research is done specifically with (potential) users of this particular service who have the lowest level of digital skills. Recruitment and research with this audience will need to be done using offline methods.

Questions about user needs:

  • Who are the users?
  • What have you done to understand your users’ needs?
  • What are users trying to do when they encounter your service?
  • What are the needs that they have when they use this service? How do they meet those needs now?
  • What are the pain points?
  • Which users have the most challenging needs?
  • How have you been learning more about these challenging user needs?
  • What are the particular design challenges for this service with this audience?
  • What have you learned about the particular needs of people who are less confident online or not online?
  • What did you do to discover this?

Questions about usability:

  • How many rounds of usability testing have you done so far?
  • Who have you done usability testing with?
  • What were the tasks you set for participants, and what materials did you supply to help them complete the tasks (if relevant)?
  • Give us some specific examples of how aspects of the interface design have changed in each iteration in response to usability testing (show us your build/measure/learn cycles: what hypotheses did you test? what happened? what did you do?)
  • Can most people get through the service end-to-end without assistance?
  • Have you started testing methods for supporting people who do need assistance to get through the service?
  • Who have you done this testing with, and what were your findings?
  • Has your testing included the supporting content and proposed start page for the service?
  • What issues have you not yet resolved?
  • What are your hypotheses around how you might solve those issues? How will you test them?
  • What issues have you been unable to test or resolve in beta so far? How are you going to handle that as you move into public beta?
  • Have you tested whether the name of the service makes sense to your users?

Evidence

It is very useful to include your user researcher in the team presenting at the assessment to answer assessors' questions. The user researcher and/or service manager should be able to answer questions from the assessment panel by showing and referring to some or all of these user research artefacts (for the onscreen service and assisted digital support):

  • user research outputs from discovery that describe how users (including assisted digital users) currently meet the needs this service will address (e.g. a customer journey map or user needs map), the key pain points in the current journey, and a description of the user research that informed these outputs.
  • stories of people you have met, personas, profiles or some other way of telling the stories of the users (including assisted digital users) who will be using this service in the future.
  • the user needs you have identified for this service, including any specific needs of assisted digital users.
  • any key insights you have gained from the research that describe significant service design challenges for this project to overcome.
  • your research schedule for beta so far (who you did research with, when and where, including assisted digital users).
  • user research videos and accompanying user research analysis outputs for recent sprints.
  • examples showing how the design of various elements in the service has changed from iteration to iteration in response to user research.
  • findings from user research into assisted digital support, conducted through offline methods with potential assisted digital users of this specific service who have the lowest levels of digital skills, confidence and access.

Contents

Facts and figures
Focus groups & survey
Baseline usability tests
Prototype usability test 1
Prototype usability test 2
Prototype usability test 3
Prototype usability test 4
Popup usability tests
Notaries usability test 1
Payment and user account research
Account registration and payment usability test
Premium service popup usability test
Accessibility tests
Assisted digital research
Card sorting
Anonymous feedback
Transactional customer satisfaction survey
Personas
Browser data
Google analytics


Facts and figures

Inspired by the user research mailing list, here are the numbers....

Each research activity below lists the numbers involved, the audience, and the key insights.

  • Focus groups (3 groups, 15 people). Audience: 1. postal customers, mainly one-off users; 2. regular London premium service customers; 3. notaries. Insights: speed, simplicity, different journeys for different users; clarity on the difference between services (and access in one place); clarity on which documents are accepted; saving details such as payment card; confirmation email and status communication; reconciliation using own client ID; business users' needs not met by Verify at present (personal account).
  • Depth interviews (9). Audience: 4 postal users (3 business, 1 consumer), 5 staff. Insights: as above.
  • Semi-structured interviews (14). Audience: Milton Keynes drop-off customers. Insights: as above.
  • Quant survey about changes (479). Audience: 47% individuals, 41% businesses, 12% notaries; frequency: 38% one-off, 32% regular with multiple docs, 30% more than once but not regular. Insights: validated the points above, plus rejected some other ideas from the focus groups, e.g. PayPal and direct debit.
  • Usability tests, WhatUsersDo (10 tests, 90 people to date). Audience: 1. 5 non-British nationals ("Sanjay" persona); 2. 5 business employees ("Angela" persona); 3. 5 mixed-gender British nationals aged 30-50; 4. 5 less confident women aged 50-70 ("Ethel" persona); 5. 10 mixed-demographic users for the variant test; 6. 5 users aged 21-50; 7. 22 Sanjay/Angela/Michael users, inc. 6 on mobile; 8. 20 Sanjay/Ethel users, inc. 10 on mobile; 9. 5 non-native English speakers; 10. 8 non-native English speakers. Insights: 1-2. current site not meeting needs: terminology, usability and accessibility; 3. prototype 1 better than the form and intuitive, but work still needed on the document checker and confirmation page; 4. prototype 2: document checker and deadend page performing better, but still issues with the multiple-categories UX, selection feedback and the confirmation page; 5. prototype 3 variant test: the simple search variant was the clear winner over the categories version, on both user-expressed preference and speed of task completion; 6. prototype 4: design iterations since the GDS alpha design feedback validated with users; users expect a visible document basket; pre-auth text still not crystal clear; 7. create account and save card: some junk email issues; users want sign-in on the start page; 8. eligibility search update with synonyms, top searches and ZRP: to write up; 9. SAE iteration 1: to write up; 10. SAE iteration 2: to write up.
  • Usability tests, pop-up (3 tests, 19 people to date). Audience: 1. Old Admiralty Building, premium business users, 28 September; 2. Milton Keynes, drop-off users, 9 October; 3. Victoria, premium business users, 18 March. Insights: validated very high ratings for the early prototype, no preference to continue with paper, the speed and simplicity goal met, and successful service choice and confirmation pages; issues: misunderstanding of the account status page, very high expectations of how comprehensive the account will be, and some fields need to be added, e.g. number of documents.
  • Usability tests, notaries (4). Audience: notaries. Insights: validated ease, speed and the appeal of tracking information in the account.
  • Payment and user account research, businesses (16). Audience: MK drop-off (12) and premium business (4) customers. Insights: confirmed that the number of users involved in making an application varies between 1 and 5, but it is most common for one person to complete the entire end-to-end application process; nobody expressed concern over the potential change.
  • Accessibility (14). Audience: actual and potential postal customers. Insights: many issues identified and subsequently addressed, including page structure, dynamic content, labels and hidden text assistance.
  • Assisted digital research (3 by IFF telephone, 3 by Nick face to face; see below on outreach volumes of >40,000). Audience: existing users: 3 consumers, 1 solicitor, 2 notaries. Insights: IFF: friends/family completing the application, and/or completing it over the phone; Nick: phone/postal applications.
  • Private beta feedback (>150: 122 survey answers so far and c.30 direct feedback emails). Audience: a good spread of repeat and new users, see data in point 12. Insights: speed and simplicity needs being met; demand for further time savings through expanded account settings.
  • Card sorting (37). Audience: 30 consumer users via OptimalSort; 7 experts in a group categorisation workshop. Insights: groupings by users and experts, and the cards users found difficult to categorise.
  • Transactional quant survey Q1 (174). Audience: consumer and business users. Insights: 86% service satisfaction rating, the highest of all FCO transactions and something we need to maintain; also the highest acceptance of moving services online: 93% positive, 3% don't know, 4% negative/use intermediary (base: users who completed the transaction).
  • Transactional quant survey Q2-3 (551 of 1,277 surveyed). Audience: consumer and business users. Insights: satisfaction key drivers: 1. outcome, 2. professionalism of staff, 3. quality of information; dissatisfaction key drivers: 1. negative outcome, 2. incorrect information, 3. poor service; GOV.UK rating: 27% of legalisation users did not find everything or found it with difficulty; phone contact: legalisation phone ratings in line with consular contact centre ratings.
  • Transactional survey, November (1,648). Audience: consumer and business users. Insights: by now common issues: navigation, form, clarity, apostille missing, tracking.
  • Anonymous feedback on GOV.UK (1,125, all GOV.UK feedback to date). Audience: consumer and business users. Insights: consistent with usability tests and other qualitative insights; many issues with navigation, the order of the steps, the downloadable form, clarity about what to do, and terminology (users can't find 'apostille'); requests for a confirmation email and better status comms.

IFF Focus Groups and Survey

About:

  • The aims of the research were to understand the experiences of people who use the FCO’s document Legalisation service, and to explore ideas about the future delivery of the Legalisation service to customers.
  • Qualitative research: 5 depth interviews with staff, 3 customer focus groups, 4 depth interviews with postal customers
  • Quantitative research: online survey of 479 customers, 14 semi-structured interviews with MK drop off users

Key lessons, top user needs:

  • Speed and Simplicity are of the essence: changes must protect those 2 factors and current high satisfaction with this. Wave 1 of our Consular customer satisfaction research showed Legalisation customers were significantly more satisfied than the transactional customer average (86% vs 80% rating their satisfaction as 8, 9 or 10 out of 10).
  • Speed of document turnaround is the key user need, affects satisfaction above all measures (and the only influencing factor for notaries)
  • Status updates: for individuals, being kept informed about processing times is significant in driving satisfaction
  • One size does not fit all: different audiences need a tailored experience in order to maintain their high satisfaction
  • Measure as we go: our customer satisfaction tracking (see 2. Ongoing research) will allow us to maintain an overview of satisfaction for Legalisation customers versus the transactional customer average, and compare against other customers of other transaction types. We will be able to see if there are significant rises or dips in satisfaction levels, and may pick up additional evidence as to what may be driving any changes.

Improvements:

  • Pricing: reduced fees is most mentioned
  • Saved details: storing commonly used details for repeat users
  • Clarity: clearer detail about requirements; improved web site design
  • Summary page: being able to edit application details before submitting
  • Business information: make it clear where the information for business users is, a mishmash at present
  • Reconciliation: make it easier to reconcile applications with the client’s other paperwork
  • Confirmation email: receive a confirmation email summarising application
  • Status comms: confirmation, turnaround time, comms about delays

Challenging lessons:

  • Parcelling out of different parts of application to different staff members – integrated online application must not prevent this
  • Perception that filling out cover sheet by hand may be quicker (business services), and for some satisfied customers any change is a hard sell
  • 49% have experienced at least one problem, most commonly one of: 1. Delays / timing issues, 2. Wrong payment made, 3. Document not eligible or requires further confirmation

Least appeal:

  • Identity verification not relevant to staff member of a business / confusing given legalisation is all about checking a (normally 3rd party) identity / offensive to some notaries, “you’re questioning whether I am authorised to do this?”. Findings suggest Verify could reduce digital takeup of the service.
  • Paypal and debit card payment options have the least appeal compared to other improvements

Full report:

  • Available on request from markbarlow

Baseline usability tests

About the 2 tests:

  • Testing existing application pages and flow on GOV.UK, end to end with dummy payment
  • 1 consumer group, non-British nationals with a UK degree (Persona: Sanjay)
  • 1 business user group, legal/accounts dept employees (Persona: Angela)

Results summary, key issues:

  • Personal/Business: clarity needed
  • Navigation: confusing, users get lost and don't complete steps in intended order if at all. Need to be guided through with Next buttons
  • Eligibility checker: links to further resources on GOV.UK take users down rabbit holes
  • Photocopies/originals: users not clear at all on when they can send a photocopy and when they can't
  • Terminology: names of documents not always matching users' definitions
  • Payment: Barclaycard branding is confusing, single decimal is not user friendly
  • Confirmation page: make it clearer you need to print the page
  • Downloadable form: numerous issues, confusion about version names, not accessible on tablet, asked to circle a field when using editable version, no postal address

Note: this is consistent with the findings from reviewing the anonymous feedback, below

See full test result details here:


Prototype usability test 1

About the test:

  • Testing the prototype, completing an end to end standard application, then comparing that experience to using the form
  • Consumer group, British nationals with a UK degree aged 30-50, mixed gender
  • Sprint 2

We validated:

  • Intuitive flow: 'Clear' and 'clean' v1 of the prototype, 'intuitive', 'self-explanatory', 'good process flow'
  • Start page: information clear
  • What to send UI: checkbox and radios work well for focusing users on what they can send, copy/original/etc
  • Summary page: including edit detail & return flow - went down well, very clear, and edit task completed without problems
  • Preference: the prototype version of the process is preferred by all users (compared to existing form) and there is a general preference for an online method.

Key issues:

  • Document checker: make it clear you can select multiple and re-test. Work through combinations in workshop.
  • Certification question: make a clearer page title, indicate full explanation will be provided
  • Terminology: apostille needs explaining
  • Timing: one user comments that there is no indication of how long you have to send in the documents once you have submitted your form - need to repeat this on confirmation page (only on start page at time of test)
  • End pages: 2 of 5 users did not follow correctly, not clear enough yet; 1 did not understand they still need to act (post docs); and email confirmation needs to be shown a page earlier to reassure/reinforce alternate printing options

See full test result details here:


Prototype usability test 2

About the test:

  • Focusing on the new Finder style of the document checker with a deeper exploration; getting feedback on the deadend 'not certified' page to ensure this is understood; getting users to add docs for more than 1 country; then run through application to end
  • Consumer group, "Ethel" persona, female aged 50-70
  • Sprint 3

We validated:

  • categories are a must: all 5 users like having the category filter options available right away and start with them, even when document already visible towards top of list
  • document checker: some related issues, but using checkboxes in the main column wasn't as problematic as we expected
  • different address for errors: fine
  • postcode list alongside button: fine, more visible
  • add country: fine, discoverable and easy to use
  • deadend page: clearly understood that you need to go and get doc certified first, works well
  • confirmation page: works better, much clearer than last version
  • service choice: works well, clarity

Key issues:

  • multiple category selection: not clearly meeting user expectations, leading to confusion; one user was particular about the count, which would be a bigger issue if/when we repeat one document in several categories; we saw examples where adding a category didn't visibly change the list and looked as if it wasn't working (because the document was being added out of screen view, based on alphabetical order); overall, a one-by-one category view might be clearer
  • document checker: do we need to allow users to add number of copies of a document too? 2 users suggested it. The upcoming add/remove UI version would facilitate this more easily if it is needed
  • phone number: intl phone code selector confused a user a lot, plan to remove this
  • confirmation: needs the number of documents added at minimum (some users expect country too; we should keep an eye on this as feedback and the decision on that field evolve)

See full test result details here:


Prototype usability test 3

About the test:

  • 10 users of mixed ability and demographic
  • We tested two variants of the document checker, to try to get a clearer insight on the best direction:
  • Variant A - Categories and search, with full details of selected docs
  • Variant B - Search only, with total number of selected docs only
  • All users completed tasks on both versions, split evenly into 5 who saw A then B, and 5 who saw B then A
  • Sprints 4-5

Key findings:

  • Preference for variant B: 7 prefer search, 2 prefer categories, 1 on the fence
  • Speed advantage of variant B: on average 6 seconds faster per task, and 18 seconds faster across all tasks; some tasks showed greater variance: 14 seconds faster for a single document, 2 seconds faster for cross-category selection, 5 seconds faster within a category
  • Categories are confusing more than they are helping for many users
  • The search box on the categories version is lost amongst all the other components
  • Variant B "much much clearer" to understand, simpler, easier to use, "much better", less cluttered, more precise

Feedback on why categories variant is less preferable:

  • "too confusing", "too busy", "they just seem to overcomplicate"
  • too many categories
  • "I didn't get what I wanted so I decided to use search", "I should use the search box, that might been easier, but I didn't notice it immediately, because it's smaller" (search box smaller and left rail in this category version, not spotted at first by several users)
  • 1 user mixed up selecting category with selecting document
  • "Didn't quite know which one to click"

Key issues for variant B, the clear winner:

  • Main missing item from Variant B is visible feedback on the actual documents selected (only gives a summary; users who noticed this preferred the detail of variant A)
  • Document list is long and can be daunting, but alphabeticisation of list is again commented on appreciatively

Prototype usability test 4

About the test:

  • 5 users aged 21-50
  • End to end test with targeted questions on pages we had iterated since GDS design feedback e.g. service choice, styling of deadend page, Back link style (moved to top), feedback question (moved and reworded), text about pre-auth payments
  • Sprints 7-8

We validated:

  • All of the users immediately found the back button on the top left of the web page without any problems
  • Users all understood the copy for the ‘Standard Service’ with four of the five users specifically mentioning that the ‘Premium Service’ was intended for businesses.
  • The search and selection of documents was natural for users with some use of the search bar and some users scrolling through the document list
  • Deadend page: majority of users realised they must get their docs certified and noted or explored the links to find one
  • Number of documents page clear
  • Sending your docs page clear, no comments
  • New Feedback question answered without comment
  • On the check your answers page, users are most concerned about cost, their personal details, return address, number of documents

Key issues:

  • Document checker - several users expected a list/basket of the documents selected to build at the top of the page with one user initially mistaking the search bar for a field which would become populated with the documents chosen.
  • One user wanted the selected documents to be listed on the check your answers page, not just the total number
  • While most users understood payment would not be taken until after docs were legalised, it could still be more clearly / prominently stated

Popup usability tests

About the tests:

    1. Old Admiralty Building, premium business service users, Monday 28 September, 10 users
    2. Milton Keynes, MK drop off service users, Friday 9 October, 3 users
  • Objectives: capture first impressions; observe completion of end to end fast track application; explore account section in detail

Test results:

We validated:

  • Speed on track so far, with one exception
  • Simplicity and ease of use
  • Service choice page is an improvement, can compare services and find them easily
  • Countries being optional was welcomed; 1 user said remove it
  • Confirmation page all-in-one (payment, app ref) is an improvement over cover sheet
  • Account feature resonated strongly

Key issues:

  • Need number of documents per application in account
  • Need business name on registration
  • Terminology needs refining, confusion about "In progress" - before application submitted, or after submitted?
  • Expectations of account go beyond what we can deliver initially

See full test result details here:


Notaries usability test 1

About the test:

  • Remote testing of prototype, combined with questions and answers document
  • Invited 10, 4 responded

We validated:

  • Clarity
  • Speed
  • Appeal of account page, particularly tracking

Key issues:

  • Suggestion: show date and method of return for closed cases
  • Status expectation post-submission: information such as ‘received’ ‘processing’ ‘despatched’ ‘query’ etc. Tracking number for courier brought up again

See full test result details here:


Payment and user account research

About the test:

  • 16 users
  • Probing of user accounts, roles, payment to better understand needs for account

Key findings:

  • Number of users involved in making the application varies between 1 and 5
  • It is most common for one person to complete the entire end to end application process
  • Although procedures vary from business to business, nobody expressed concern over potential change

See full test result details here:


Account registration and payment usability test

About the test:

  • 20 users: Michael, Angela, Sanjay personas
  • Devices: 15 desktop, 5 mobile
  • Validate usability of account sign up process.
  • Forgotten password: once set up, test the process to retrieve a password

Initial issues report:

  • 'Start now' not being associated with creating an account - users expect the option to be prominent on the landing page
  • 'Create an account' link too discreet and overlooked with users beginning the application instead (select 'Standard service' then click 'Continue')
  • Not being able to exit the T&Cs once opened - users have to click back, which clears previously entered form data
  • Password format not obvious - most initially enter invalid passwords

Premium service popup usability test

About the test:

  • 6 users, 18 March 2016
  • Testing new applications for logged in users, using the private beta product

Key findings:

  • Some frequent users of the premium service felt the start page was an extra unnecessary step for them; they will likely bookmark the next page. One user struggled to find the premium service. We may need information higher up the start page on the different services (this relates to other usability testing findings on register/sign-in links)
  • The written information about saving card details was ignored by 3 users who just quickly clicked the 'pay' button
  • 1 user misinterpreted Barclaycard brand name, as happened in an earlier test

See full test result details here:


Accessibility tests

Test 1

About the test:

  • 5 users, visually impaired and dyslexic
  • Actual and potential customers
  • Assistive technologies used: MAGic version 11.0, SuperNova screen reader combined with screen magnifier, Read and Write 10 Gold
  • Testing standard application process and business application process

Key findings:

  • One, left column design pattern works well overall for linear page reading
  • Tabbed navigation worked well on form pages
  • When design deviates from one column pattern, some right hand screen content missed
  • Linear content experience and ability to only see small part of screen at a time threw up a few content sequencing issues.

See full test result details here:

Test 2

3 days of usability testing via DAC with 9 users with the following disabilities:

  • Blind
  • Low Vision
  • Colour Blind
  • Dyslexia
  • Limited limb mobility
  • Learning disabilities
  • Deaf
  • Asperger’s (ASD)
  • Anxiety/Panic disorder

Testing included the following assistive technologies:

  • JAWS screen reader
  • NVDA screen reader
  • VoiceOver, the Mac native screen reader
  • ZoomText screen reader and magnification application
  • MAGic screen magnification application
  • SuperNova magnifier and screen reader
  • Window-Eyes screen reader for Office
  • Dragon NaturallySpeaking speech recognition software
  • Keyboard-only input in lieu of a mouse or other pointing device

Testing report available on request


Assisted digital research

Identifying users

Methods used and their success. Note: leads do not all have consent and/or agree to participate when contacted.

  • Serco customer phone calls, July-Nov 2015: >1,000 screened or contacted; 3 leads with consent, 2 could not be contacted, 1 interview secured
  • IFF survey analysis, Nov 2015: >1,000; 4 leads with consent, 2 interviews secured
  • Leaflet mailshot, Jan-Mar 2016: >40,000 as of 23 March; 3 leads with consent
  • IFF survey analysis, Jan 2016: 998; 20 leads, but only 2 with consent to be contacted
  • Consular Insights customer analysis, Feb 2016: >1,000; 117 potential but unconfirmed AD leads, 96 permitted contact, 1 identified as AD but unwilling to participate
  • Premium counter direct intercepts: 52; 2 AD users, discounted as acting in a courier-only role
  • Professional participant recruitment company: n/a; said participants could only be reached through FCO-provided lists, so this route was discounted

1. IFF telephone research

  • Interviews with existing legalisation customers who were identified as AD and agreed to be contacted for feedback. See test details for more info.

Key findings:

  • Reliance on friends/family to help
  • Some desire for the reassurance of talking to people
  • Concern about security of data, payment, etc
  • The user with zero access is happy to use the phone, or to ask friends/family again
  • For the cautious users who do have access, it is the first-time scenario that contributes to the problem: once they understand the process better, they state more willingness to use the online service next time and see some advantages to choosing that channel

See full test result details here:

Assisted digital pain points and user needs:

2. Face-to-face research

3 face-to-face interviews: 1 consumer, 2 notaries

Key findings:

  • Concern that online payments are not secure
  • Concern about sending payment card details in the post
  • Users lower on the digital inclusion scale prefer telephone or written correspondence
  • Frustrated by the limitations of the present automated telephone service

Evidence: 01. Assisted digital research 2

Survey data

Digital inclusion data from the Q2/3 offline transactional survey, 577 legalisation customers:

  • I have never used the internet, and I don’t intend to: 0%
  • I used to use the internet, but I don’t plan on doing so again: 1%
  • I’d like to use the internet, but I feel unable to do so: <1%
  • I do use the internet, but I wish I didn’t have to: 1%
  • I use the internet, but I’m learning how to as I go: 1%
  • I can use the internet, but only for specific tasks: 2%
  • I have a basic set of digital skills which allow me to use the internet: 6%
  • I can confidently use the internet: 62%
  • I’m an expert user of the internet: 26%

  • See also 14. Digital takeup and 12. Intuitive design
  • A postal solution is currently in place for those without access - when they phone, they are posted a hard copy of the current application form - but we recognise this does not meet the DbD standard on its own
  • The Serco golden rule is to always direct the customer to the website and talk them through the steps
  • Based on research findings, we plan to extend that to completing the form online for someone over the phone and taking payment
  • We're also going to provide face-to-face support at our counters to those who need it

Card sorting

About the tests:

  • We have category filters to make it easier to navigate the long list of documents. How would users categorise the 60+ documents? Two exercises provided input into the terminology we will use:
    1. OptimalSort card sorting test
    2. Workshop of 7 expert internal staff, categorising documents based on their knowledge (inc. legal expertise)

Key findings:

  • Common groupings identified for both users and experts
  • Data to feed into search synonyms, given the UX route we are now taking is focused on search, not categories

See full card sorting result details here:


Anonymous feedback

Anonymous feedback from the various existing legalisation pages on GOV.UK can be downloaded here.

This contains all feedback to date, collected on 28 August 2015. The common issues are:

Get a document legalised:

  • Navigation: confusion about the separate steps / completed in the wrong order / unable to find the payment page
  • Printing: unable to print, no alternative instructions
  • Postage: seeking clarification on postage
  • Form: problems downloading the form
  • Payment: unspecified problems paying online
  • Status & timing: looking for status or turnaround time information

Document checker:

  • Clarity: advice not clear enough
  • Certification: trying to understand certification
  • Terminology: no mention of “apostille”, a term some users are trying to find
  • Specifics: lots of references to specific documents

Application form (50% of the feedback):

  • Can’t download: many reports of this
  • Can’t open or print: many reports of this
  • No payment reference: users have no payment reference as they have completed the steps in the wrong order
  • Need help: can’t fill in the form

Payment:

  • Confirmation: no confirmation email
  • Payment: unspecified problems paying

    Transactional Customer Satisfaction Survey

    The November report of the Transactional Customer Satisfaction Survey contains 1,648 responses, covering the period April–September 2015.

    The common issues are all familiar by now:

    Clarity of instruction (e.g. eligibility, certification, etc.)

    • "the website, however, is not very specific in its instructions is to which type of delivery i should have chosen and what the risks are."
    • "Not clear about certifying a document by a Solicitor. I did twice the process of legalization."
    • "clearer statements on gov.uk about what they can and cannot do."
    • "explain that original documents from other government departments need a notary stamp before an "apostille " can be issued"
    • "their website could be improved with clearer information with regards to criteria that need to be satisfied for the legalisation of documents."

    Missing crucial word 'apostille'

    • "Because the one piece of information that I needed (ie "apostille") wasn't on there. (Or if it was there, it wasn't very prominent because I still can't find it.)"
    • "when it comes to legalizing a university degree certificate, i discovered i needed an apostille first. this was not clear enough on the website, so the document was returned to me."
    • "because on the website there was no mention of it being an "apostille". so when i received it and realised that it was an apostille i found that i had spent the money on the legalisation and transit for nothing because canada is not a signatory to the apostille convention and i needed to send them for immigration there. in the end i had to pay a notary public another 50 pounds to notarise them."

    Application form not up to scratch

    • "really frustrating to have to use an odf document format for application. needed to load an app onto ipad to be able to use"
    • "The application form for the legalisation of documents has been amateurishly put together. I expect a form that can be filled in online and then be printed."

    Disjointed nav, separate steps

    • "the service was very expensive, and the forms are a bit awkward in that you have to pay online, then print the form totally separately and write on the transaction number, so even though the outcome was perfect, i wasn't as happy as i could have been."
    • "Orientation on the site is difficult. information is broken down on too many different pages and windows"
    • "It would help greatly if the information on your website was (a) ordered with more easily understood logic, (b)signposting on the website was better labelled, and (c) the information was written in plain English, not civilservice!"

    Tracking

    • "offer a more detailed tracking service (with a smaller window of delivery) for returning documents. have an accurate and detailed system for tracking a documents progress once it has been received."
    • "better tracking of posted parcels, improve communication on reception of documents and better instructions on website as to what kind of postage should be used for return envelopes."

    Personas

    Personas from Discovery can be downloaded here

    Summary:

    • Sanjay, 31, non-British with UK degree returning to UAE to work, postal service
    • Ethel, 60, getting married abroad, postal service
    • Angela, 23, Accounts department of HSBC, premium business service and postal service
    • Michael, 56, notary, representing clients, postal service

    Browser data

    Latest browser data available in FCO Piwik account. To be demo'd in assessment.

    Starting point: July 2015 GA data for legalisation pages on GOV.UK

    Devices:

    • 68% desktop
    • 22% mobile
    • 10% tablet

    Browsers:

    • Browsers with > 1% share account for 85% of the audience
    • Long tail of 15%
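The head/long-tail split above can be reproduced from the analytics export. A minimal sketch with made-up browser shares (the real figures live in the FCO Piwik account):

```python
# Hypothetical browser shares in percent - NOT the actual Piwik data.
shares = {
    "Chrome": 45.0, "Safari": 20.0, "IE": 12.0,
    "Firefox": 6.0, "Edge": 2.0,
    "Opera Mini": 0.8, "UC Browser": 0.7, "BlackBerry": 0.5,
}

# "Head" browsers: anything with more than a 1% share.
head = {browser: share for browser, share in shares.items() if share > 1.0}

# Everything else (including browsers too small to appear in the export)
# forms the long tail.
tail_share = 100.0 - sum(head.values())

print(f"Head browsers: {sorted(head)}")
print(f"Long tail: {tail_share:.1f}%")  # 15.0%
```

The long tail matters for testing: 15% of the audience is spread across many minor browsers, so the service needs progressive enhancement rather than per-browser fixes.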

    Detailed breakdown:


    Google Analytics

    Headline stats September:

    'Homepage' at https://www.gov.uk/get-document-legalised

    • 92,000 PVs
    • 36,000 users
    • 36% bounce rate

    Application form page https://www.gov.uk/government/publications/legalisation-application-form

    • 29,000 PVs
    • 14,000 users
    • 19% bounce rate
