
10. Test environment

mjrix edited this page Mar 24, 2016 · 16 revisions

10. Be able to test the end-to-end service in an environment identical to that of the live version, including on all common browsers and devices, and using dummy accounts and a representative sample of users.

Link to corresponding item in JIRA: https://jira.informed.com:8443/browse/FCOLOI-163

Questions

  • What environments do you have?
  • How quickly and easily can you create a new environment?
  • What data exists in your pre-production environments?
  • How are you gaining confidence that your service will perform under expected loads?
  • How are you checking that your system works on all the supported devices?

Evidence

Service Manager able to:

  • explain what environments they have.

Environment overview diagram - https://docs.google.com/presentation/d/1T4_04OpLwoS5BOUPPDmIvPR1sp5MbMLDq_GqKO16UZ0/edit?pref=2&pli=1#slide=id.p

  • Development Environment – Used by the dev team to build new features and refine existing ones.
  • Build / Integration Environment – Automated build, deploy and test pipeline. Environment used to verify the build, execute automated unit tests, deploy code and run end-to-end automated tests.
  • UAT Environment – Used to verify new and refined features, predominantly functional and UX testing.
  • Live Environment – High availability public facing environment. Deployment orchestrated by Jenkins with manual trigger.

With the exception of the Case Management API, all external services are available in all environments.

  • explain how quickly and easily they can create a new environment.
  • The environments are hosted on a cloud infrastructure (Skyscape).
  • Once the initial set-up is complete, the team can commission new environments themselves.
  • Because of the platform's security constraints, some actions (e.g. networking changes) require support from Kainos.
  • explain what data exists in their pre-production environments.
  • Application data within the service is transient and stored for a maximum of 60 days after the application has been processed.
  • The test environment contains sample accounts across each service.
  • Accounts also exist with various levels of associated data (e.g. existing applications, multiple addresses, etc.).
  • These are used as part of regression testing to verify each build (prior to deployment to UAT).
  • explain how they are gaining confidence that their service will perform under expected loads (including assisted digital routes).
  • Load tests have been developed using the JMeter tool, based on typical user journeys.

  • Load testing was initially executed from behind the firewall. Tests planned before public beta will run from outside the firewall, using the same JMeter scripts executed via the BlazeMeter cloud service.

  • Initial results have been positive, giving confidence that the service can meet its target usage requirements:

     * Capable of storing up to 5GB of application data
     * Support up to 50 concurrent user sessions
     * Capable of handling up to 50,000 page hits per day
    
  • describe testing environments, systems, and approaches for non-digital parts of the service (including assisted digital routes).
  • Non-digital applications are created in the back-office case management system for which there is a dedicated test environment.
  • Assisted Digital customers will also use the back-office channel, facilitated by FCO staff and Serco.
  • explain how they are checking that their system works on all the supported devices.

Analytics:

Latest data in the FCO Piwik account

The original starting point was July 2015 Google Analytics data for legalisation pages on GOV.UK

Devices:

  • 68% desktop
  • 22% mobile
  • 10% tablet

Browsers:

  • Browsers with a share greater than 1% account for 85% of the audience
  • Long tail of 15%

Browser and device compatibility testing:

  • Wide distribution of browsers across the development team
  • Selenium automated testing - executed via Sauce Labs on multiple platforms.
  • Videos recorded and being reviewed to verify presentation on different platforms as well as functional correctness.
  • demonstrate their service in a live-like environment.
  • Today's demonstration is within the UAT environment, which is fully representative of the live environment.
  • Connected to a test version of the back-office system and the Barclaycard site.

What environments do you have?

  • currently an environment where we host versions of the Design Prototype (developed using the GOV.UK prototype toolkit)
  • build server hosting Jenkins CI tool where containers are unit tested
  • integration environment where microservices will be automatically deployed to allow end-to-end integration testing
  • other environments will be provisioned for beta and live systems

Automated Testing Tools and Techniques

  • Mocha framework for automated unit testing of Node microservice applications as part of CI pipeline
  • automated UI testing with Selenium (managed through Sauce Labs), which also allows us to test browser and device compatibility
  • load testing using JMeter and an online tool called BlazeMeter

Browser data

Detailed breakdown:
