
4. Build the service using the agile, iterative and user-centred methods set out in the manual.

Link to corresponding item in JIRA: https://jira.informed.com:8443/browse/FCOLOI-157

Questions

  • Talk us through how you have worked in an agile way during alpha and how you are doing so in beta?
  • What tools and techniques have you used during alpha to enable this way of working?
  • How are you reviewing and iterating your processes?
  • How are you communicating within the team?
  • Can you give an example from alpha of how you have responded to user research and usability testing?
  • How are you governing the service?

Evidence

Service Manager able to:

  • clearly explain how the service has worked in an agile way during alpha and will continue to do so in beta, giving examples of using agile tools and techniques.

    • Scrum framework.
    • 2 week sprints, with the usual ceremonies (daily stand-ups, planning, review, retrospective, in-sprint backlog refinement).
    • Self-organising team, with blockers taken up by the most appropriate team member.
    • JIRA used to manage the backlog and sprints.
  • explain how the service has reviewed and iterated their processes to be responsive during beta.

Each sprint ends with an all-team retrospective to review agile processes and tools (Start -> Stop -> Continue). Examples of improvements made as a result include:

  • Instigated a joint show and tell with the casebook team to help the FCO legalisation team get a handle on what the end-to-end system would look like, and to help them prepare for the change.
  • For stories involving UX, we incorporated a step in the development chain for Developers, UX and the Product Owner to align expectations.
  • Adopted a User Story Map view of the backlog to plan releases, communicate with stakeholders and allow those less familiar with JIRA to engage.
  • explain how the team has used agile tools and techniques to communicate within the team.

    • Distributed team, with frequent team get-togethers – at least once per sprint.
    • Sharing of user research and outputs of user testing across the entire team (WUD outputs, private beta feedback, write-ups of pop-up testing, our Wiki).
    • Collaboration tools – Skype, whiteboard and screen sharing.
    • Single source of truth and progress within JIRA.
    • Test sessions captured through JIRA Capture.
  • give an example of how the service has responded to user research and usability testing during beta.

    • Please refer to point 3 (Ongoing Research).
  • show that governance is proportional and not imposed, is based on clear and measurable goals, follows “go and see” rather than “wait and hear”, has a clear focus on managing change and risk in real time rather than at arbitrary points, and is human centred not process centred.

    • We have taken care to ensure that delivery governance controls are proportional to risk/complexity.
    • Risks, assumptions, issues and/or dependencies are raised with the relevant person as and when they occur as part of Daily Scrums - verbal communication and collaboration are given precedence over written reports.
    • Inter-team blockers and external dependencies are managed through a weekly session and an actions log.
    • At the end of each Sprint, a short 'Sprint and Highlight Report' is produced to measure performance against agreed Sprint objectives, and to document headline risks, assumptions, issues and dependencies for reference.

Original alpha answers below:

Questions

Talk us through how you have worked in an agile way during Alpha and how you are doing so in Beta?

  • We continue to adopt the Scrum framework as our Agile delivery methodology.
  • We continue to deliver in 2 week Sprints with the typical Scrum events.
  • We continue to have a clear set of objectives that we want to achieve during the Beta Phase but the way that we achieve these objectives flexes depending on what we learn from our users and our own reflections.
  • We hold collaborative Product Backlog grooming sessions throughout each Sprint.
  • We have a comprehensive Definition of Done and process for acceptance of bugs ('Verified' label in JIRA).
  • Each Sprint follows a cycle of prototyping, testing with users and refining.

What tools and techniques have you used during Alpha to enable this way of working?

  • Tools

    • Atlassian JIRA Agile for issue tracking and management
    • Atlassian JIRA Capture for recording testing and acceptance feedback
    • GitHub Wiki for collaboration
    • During the Alpha Phase, we made extensive use of the GDS Alpha Front-end Toolkit for rapid prototyping
  • Techniques

    • User Story Mapping to understand and communicate how delivery of the Product Backlog is distributed across Releases.
    • Example Mapping with FCO Subject Matter Experts to collaboratively understand the detail of complex/uncertain user stories, and to identify options that should be prototyped and presented to users.
    • Behaviour Driven Development so that we have a common language for articulating needs and acceptance criteria (see the sketch below).
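
In practice this means acceptance criteria are written in a Given/When/Then form that the whole team, including FCO Subject Matter Experts, can read, with step definitions behind them so the criteria can also run as automated checks. The sketch below is purely illustrative rather than taken from the real backlog: the scenario wording, the Order model, the flat fee and the use of the Python behave library are assumptions made for this example, not the service's actual code.

```python
# Illustrative sketch only: a Gherkin-style scenario with Python step
# definitions in the behave framework. All service details (document
# types, fees, the Order model) are invented for the example.
#
# A feature file such as features/legalisation.feature would contain:
#   Scenario: Applicant adds documents to their order
#     Given the applicant has started a legalisation application
#     When they add 2 birth certificate documents to the order
#     Then the order total should be 60 pounds

from behave import given, when, then

FEE_PER_DOCUMENT = 30  # hypothetical flat fee, in GBP


class Order:
    """Minimal stand-in for the application's order model."""

    def __init__(self):
        self.documents = []

    def add_documents(self, doc_type, quantity):
        self.documents.extend([doc_type] * quantity)

    @property
    def total(self):
        return FEE_PER_DOCUMENT * len(self.documents)


@given("the applicant has started a legalisation application")
def step_start_application(context):
    context.order = Order()


@when("they add {quantity:d} {doc_type} documents to the order")
def step_add_documents(context, quantity, doc_type):
    context.order.add_documents(doc_type, quantity)


@then("the order total should be {expected:d} pounds")
def step_check_total(context, expected):
    assert context.order.total == expected
```

Running behave over the feature file executes these steps in order, so the agreed acceptance criteria double as automated checks.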

How are you reviewing and iterating your processes?

  • We hold a Sprint Retrospective at the end of each 2 week Sprint.
  • We consider what worked well from a team behaviours, working practices and process point of view.

How are you able to adapt your processes to be responsive and iterate?

  • We use Atlassian JIRA Capture to maintain an active dialogue that allows us to elicit feedback quickly and incorporate this into the prototype.
  • Each update to the prototype is managed as a distinct release with accompanying release notes so that the whole team is aware of and can understand iterative changes that are being made and focus their review process.
  • As mentioned above, we hold a Sprint Retrospective at the end of each 2 week Sprint and consider what worked well from a team behaviours, working practices and process point of view (https://jira.informed.com:8444/display/FW/Sprint+Retrospectives)

How are you communicating within the team?

  • Daily Scrums.
  • Sprint Planning, Review and Retrospective workshops.
  • Sharing of information and ideas using Atlassian JIRA and GitHub Wiki.
  • Test feedback and bugs communicated via JIRA (using Capture to create annotated screenshots, e.g. https://jira.informed.com:8443/browse/FCOLOI-377)

Can you give an example from Alpha of how you have responded to user research and usability testing?

  • Please refer to point 3 (Ongoing Research)

How are you governing the service?

  • We have taken care to ensure that delivery governance controls are proportional to risk/complexity.
  • We have selected a team with relevant skills and experience that is capable of self-managing.
  • Risks, assumptions, issues and/or dependencies are raised with the relevant person as and when they occur as part of Daily Scrums - verbal communication and collaboration are given precedence over written reports.
  • At the end of each Sprint, a short 'Sprint and Highlight Report' is produced to measure performance against agreed Sprint objectives, and to document headline risks, assumptions, issues and dependencies for reference.

Evidence

Service Manager able to:

  • clearly explain how the service is working in an agile way, using agile tools and techniques.
  • explain how the service has reviewed and iterated their processes to be responsive.
  • explain how the team are using agile tools and techniques to communicate within the team.
  • give an example of how the service has responded to user research and usability testing.
  • show that governance is proportional and not imposed, is based on clear and measurable goals, follows “go and see” rather than “wait and hear”, has a clear focus on managing change and risk in real time rather than at arbitrary points, and is human centred not process centred.