
15. Use tools for analysis that collect performance data. Use this data to analyse the success of the service and to translate this into features and tasks for the next phase of development

Link to corresponding item in JIRA: https://jira.informed.com:8443/browse/FCOLOI-168

Questions

  • What have you instrumented and why?
  • Have you modelled user journeys, and are you able to track progression through your service so you can identify completions and areas of poor performance?
  • Where appropriate, have you built a funnel, exit paths?
  • What tools are you using to collect data?
  • Has the SIRO signed these off?
  • Where appropriate, have you anonymised the user IP address, have you opted out of data sharing with 3rd parties?
  • What analysis has been carried out on the service so far and how has this impacted on the backlog?
  • What is the ongoing roadmap for performance analysis, including performance of assisted digital support?
  • Who in the team is responsible for identifying actionable data insights, including for assisted digital support?
  • What is the next performance analysis user story?
  • Have you started discussions with GOV.UK about start and end pages?

Evidence

Service Manager able to:

  • explain what data sources and analysis have been undertaken in the alpha stage.
  • explain how the shape of the service has influenced the choice of metrics, data points and data sources.
  • explain the choice of analysis tools used in the beta (and alpha if appropriate).
  • show that appropriate information security and privacy issues have been addressed.
  • explain how they have modelled user journeys and will track progression through the service so they can identify completions and areas of poor performance in the beta.
  • talk clearly about evidence from qualitative and quantitative data, what they learned from these sources and what changes to user needs/improvements they identified.
  • talk through how these were prioritised and what features were changed or implemented.
  • talk about the ongoing roadmap for performance analysis, and explain who in the team is responsible for identifying actionable data insights during the beta, including for assisted digital support.
  • explain the next performance analysis user story.
  • show they have discussed and agreed start and end pages with GOV.UK and these are optimised.

What have you instrumented and why?

Demo Piwik

  • Standard metrics - users, page views, etc.
  • Events - granular answers and interactions not necessarily reflected in URLs, and either not captured at all in the backend caseworking tool or not captured in a way that can be reported
  • Campaigns - give insight into URLs shared by different services, i.e. a breakdown of where traffic comes from
  • Goals - combine the above to create funnels (see the sketch below)
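
A minimal sketch of how this kind of tracking might be wired up with the standard Piwik JavaScript tracker; the event categories, goal ID and campaign parameter shown are illustrative placeholders, not the service's real configuration.

```typescript
// Illustrative sketch only - assumes the standard Piwik tracker snippet has
// already created the global _paq command queue; names and IDs are placeholders.
declare const _paq: Array<unknown[]>;

// Standard metrics: each page records a page view (users, page views, etc.).
_paq.push(['trackPageView']);

// Events: granular answers and interactions not reflected in the URL,
// e.g. the answer given on a question page.
function trackAnswer(question: string, answer: string): void {
  _paq.push(['trackEvent', 'Answers', question, answer]);
}
trackAnswer('documents-certified', 'no');

// Campaigns: links shared by other services carry a pk_campaign parameter,
// e.g. https://service.example/start?pk_campaign=partner-service-a,
// so Piwik can break traffic down by source.

// Goals: a goal configured in Piwik (hypothetical ID 1 = "standard application
// submitted") is recorded on the confirmation page; chained goals form funnels.
_paq.push(['trackGoal', 1]);
```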

Have you modelled user journeys, and are you able to track progression through your service so you can identify completions and areas of poor performance?

Yes - demo Piwik

Where appropriate, have you built a funnel, exit paths?

Yes - demo Piwik. Funnels are built within the constraints of Piwik by chaining goals together (a sketch of reading the resulting goal conversions back out follows the list). Covering:

  • standard application - successful submission
  • premium application - successful submission
  • business drop-off application - successful submission
  • standard application - successful prevention (documents not certified)
  • registration
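
A hedged sketch, under the assumption that each funnel stage is configured as a separate Piwik goal, of pulling conversion counts for those stages out of the Piwik Reporting API and approximating drop-off by comparing counts between steps; the base URL, site ID, goal IDs and token are placeholders.

```typescript
// Sketch: read conversion counts for each goal in a funnel from the Piwik
// Reporting API, then compute step-to-step drop-off. Site ID, goal IDs and
// the auth token are placeholders, not the service's real configuration.

const PIWIK_BASE = 'https://piwik.example/index.php'; // placeholder
const TOKEN = 'anonymous-or-api-token';               // placeholder

interface FunnelStep { label: string; idGoal: number }

const standardApplicationFunnel: FunnelStep[] = [
  { label: 'Started application', idGoal: 1 },
  { label: 'Documents confirmed as certified', idGoal: 2 },
  { label: 'Payment completed', idGoal: 3 },
  { label: 'Application submitted', idGoal: 4 },
];

async function goalConversions(idGoal: number): Promise<number> {
  const params = new URLSearchParams({
    module: 'API',
    method: 'Goals.get',
    idSite: '1',
    idGoal: String(idGoal),
    period: 'week',
    date: 'today',
    format: 'JSON',
    token_auth: TOKEN,
  });
  const res = await fetch(`${PIWIK_BASE}?${params}`);
  const data = await res.json();
  return data.nb_conversions ?? 0;
}

async function reportFunnel(steps: FunnelStep[]): Promise<void> {
  let previous: number | undefined;
  for (const step of steps) {
    const conversions = await goalConversions(step.idGoal);
    const dropOff = previous ? (1 - conversions / previous) * 100 : 0;
    console.log(`${step.label}: ${conversions} (drop-off ${dropOff.toFixed(1)}%)`);
    previous = conversions;
  }
}

reportFunnel(standardApplicationFunnel);
```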

What tools are you using to collect data?

Piwik for web analytics
Casebook for application data

Has the SIRO signed these off?

Yes.
The whole service has been through the usual privacy, data, security signoffs with accompanying RMADS / assurance case documentation.

Where appropriate, have you anonymised the user IP address, have you opted out of data sharing with 3rd parties?

Yes, we anonymise IP addresses by masking 2 bytes e.g. 192.168.xxx.xxx
See also http://piwik.org/privacy/
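
As an illustration of what masking 2 bytes means in practice (Piwik applies this server-side through its privacy settings), the last two octets of the IPv4 address are zeroed before anything is stored; a sketch of the equivalent logic:

```typescript
// Sketch of the 2-byte IPv4 anonymisation described above: the last two
// octets are zeroed before the address is stored, so 192.168.12.34 is
// recorded as 192.168.0.0. Piwik applies this server-side via its privacy
// settings; this is only an illustration of the effect.

function anonymiseIpv4(ip: string, maskedBytes: number = 2): string {
  const octets = ip.split('.');
  if (octets.length !== 4) {
    throw new Error(`Not an IPv4 address: ${ip}`);
  }
  return octets
    .map((octet, i) => (i >= 4 - maskedBytes ? '0' : octet))
    .join('.');
}

console.log(anonymiseIpv4('192.168.12.34')); // "192.168.0.0"
```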

What analysis has been carried out on the service so far and how has this impacted on the backlog?

At this stage our focus is on:

  • Drop-off points and the number of users without documents ready (i.e. not certified)
  • Monitoring answers to the "have you an account already?" page, which is a potential slow-down for regular users
  • Monitoring answers to the feedback consent question, which is important for our research plans and customer satisfaction reporting
  • Monitoring interactions on the confirmation page with the print cover sheet link and the help for users without a printer. This feeds back into the question of how printing is presented on the start page
  • Monitoring search queries for documents - anything unexpected? do we need more synonyms? (see the sketch after this list)
  • Unexpected data which might suggest a misunderstanding (or a tracking bug), e.g. address or postage answers
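
A minimal sketch of how the document search on the eligibility checker could be reported to Piwik as a site search, so that zero-result queries surface in the Site Search report and feed the synonym list; the function and category names are hypothetical.

```typescript
// Sketch: report eligibility-checker document searches to Piwik as site
// searches, so queries with zero matches show up in the Site Search report
// and can suggest new synonyms. Names here are illustrative only.

declare const _paq: Array<unknown[]>;

function reportDocumentSearch(query: string, resultCount: number): void {
  // trackSiteSearch(keyword, category, resultsCount)
  _paq.push(['trackSiteSearch', query, 'document-eligibility', resultCount]);
}

// e.g. after the checker has matched the query against the document list:
reportDocumentSearch('apostille', 12);
reportDocumentSearch('marrige certificate', 0); // typo query, zero results -> synonym candidate
```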

With appropriate research to accompany it, this is feeding into:

  • Tracking bug fixes and enhancements to what we are tracking
  • Start page conversations with GDS
  • Iteration of the search function of the eligibility checker e.g. synonyms, top searches
  • Iteration of create account/sign in distinctions

What is the ongoing roadmap for performance analysis, including performance of assisted digital support?

The next phase of the roadmap will focus on our backend caseworking tool, with the ability to create apostilles due by the end of April, and integration with the payments and despatch backends being built after that. We will be looking at metrics such as:

  • Submission channels including Assisted Digital and legacy/offline
  • Turnaround time
  • Cases processed per day
  • Standard vs. non-standard cases
  • Validating time savings of the modernisation project

As we do this, we expect we will also identify changes or additions we want to make to the Piwik tracking, as the full end-to-end picture emerges.

For AD performance in particular, the casework reporting we are implementing will capture AD submissions that don't come through the front-end digital service, and will record the route used:

  • assisted application over the phone
  • offline paper form application

Who in the team is responsible for identifying actionable data insights, including for assisted digital support?

Mark Barlow
Ian McBride
Alison Diaz (AD)

What is the next performance analysis user story?

  • Improved instrumentation of the accounts section, including registration funnel
  • Fixes to some tracking where analysis suggests we are not capturing all the data, e.g. the service choice page
  • Fixes to some event tracking where the mapping does not look quite right (see the sketch below)
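
One way to review what the event tracking is actually capturing is to pull the recorded event actions back out of the Piwik Reporting API and compare them with the expected mapping; a rough sketch, with placeholder URL, site ID and token:

```typescript
// Sketch: list recorded event actions and their counts from the Piwik
// Reporting API so gaps or mis-mapped events stand out. URL, site ID and
// token are placeholders.

const PIWIK_BASE = 'https://piwik.example/index.php'; // placeholder
const TOKEN = 'anonymous-or-api-token';               // placeholder

async function listEventActions(period = 'week', date = 'today'): Promise<void> {
  const params = new URLSearchParams({
    module: 'API',
    method: 'Events.getAction',
    idSite: '1',
    period,
    date,
    format: 'JSON',
    token_auth: TOKEN,
  });
  const res = await fetch(`${PIWIK_BASE}?${params}`);
  const rows: Array<{ label: string; nb_events: number }> = await res.json();
  for (const row of rows) {
    console.log(`${row.label}: ${row.nb_events} events`);
  }
}

listEventActions();
```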

Have you started discussions with GOV.UK about start and end pages?

Yes:

  • Start page - ticket raised and in discussion with Gavan in the GDS content team
  • End page / feedback survey - ticket raised; Gavan has set it up, to go live pending the result of the assessment. The URL will be /done/get-document-legalised

Alpha phase answers

What data and analysis have you done during discovery from existing or legacy systems or the wider digital landscape?

  • Anonymous GOV.UK feedback for all the legalisation pages: analysed and summarised (see point 1)
  • Google Analytics on GOV.UK pages: browser info and general metrics, e.g. 92k page views, 36k users (see point 1)
  • Peter Jordan's report on Google Analytics, Trends, Hitwise: how users find the legalisation pages, highlighting:
    • Confusion about navigation
    • "apostille" is the #1 search term on all the key legalisation pages -> added in alpha
    • rank well for legalisation searches, but results also cover drugs (especially news results)
  • Content: request outstanding with Gwen / GOV.UK content team for refreshed path and on-page search analysis for business users, with regard to conversations about the start page (due in August, still on the content team's to-do list)
  • Analysis of audience numbers, sampling
  • Detailed IFF qualitative user research per point 1
  • Internal process report: data on process timings, re-typing and the number of errors/un-legalisable documents (10%), from a commissioned report

What analysis have you made on the alpha?

  • Usability testing per point 1, on both the existing pages (as a baseline) and our prototype; remote and pop-up in-person sessions
  • Detailed IFF quantitative user research per point 1
  • Data collection and initial interviews for assisted digital needs

Who in the team is responsible for analysis during the alpha, including for assisted digital support?

  • Prototype: led by Mark and Mike
  • Assisted digital: Alison Diaz, lead for AD and digital takeup; IFF research & Consular Insights team