
What are Digital Service Reviews?

Applying the Government of Canada's Digital Standards in digital service design, development, and delivery.

Digital Service Reviews mark the end of each phase in the service design lifecycle. Their purpose is to decide how the team and its work should evolve: whether to move to the next phase, make changes before proceeding, or pivot to pursue a new direction.

Reviews are an opportunity for the team to reflect on what they’ve learned or created during the last phase and what they want to do similarly or differently for the next phase.

During a review, the product team shares the work completed during the last phase of service design and demonstrates how it responds to users’ needs by meeting the Digital Standards.

Reviews also help our team at DECD gather lessons learned so we can share those experiences to guide future teams.

Each review will be a little different to meet the varying needs and contexts of product teams.

What are the Digital Standards?

The Digital Standards are “the foundation of the government’s shift to becoming more agile, open, and user-focused.” The 10 standards are outlined in the Government of Canada Digital Standards Playbook:

  1. Design with users
  2. Iterate and improve frequently
  3. Work in the open by default
  4. Use open standards and solutions
  5. Address security and privacy risks
  6. Build in accessibility from the start
  7. Empower staff to deliver better services
  8. Be good data stewards
  9. Design ethical services
  10. Collaborate widely

Product Approach to the Service Design Life Cycle


What it isn’t

  • Criticizing or poking holes in each other’s work (e.g. “Yeah, but you can’t do that because X”)
  • A one-size-fits-all, fixed compliance exercise
  • A unilateral process where assessors give direction and teams listen and implement

What it is

  • Constructive feedback that helps build up each other’s work (e.g. “Yeah, and have you thought about X instead? How could we address that together?”)
  • An ongoing process designed with product teams and adapted to meet their varying needs and contexts as they evolve
  • A reciprocal relationship where product teams work with assessors to identify and make changes

We tested an Alpha Assessment with the Secure Client Hub team in Digital Experience and Client Data (DECD):

Alpha
End-of-phase Goal: Decide whether the service design team is ready to proceed to Beta to start building and testing a functional prototype.
Expectations:
  • Tested prototypes that demonstrate the potential design of the service
  • A vision and plan for how the service will be built in Beta
  • An understanding of what’s required for the service to be supported
Responsibilities:
  • Alpha Learning Report
  • Alpha Standards Self-Assessment
  • Product Demo
Potential Outcomes:
  • Proceed to Beta
  • Respond to gaps and next steps before proceeding to Beta
  • Pivot and re-enter Discovery or Alpha
  • Stop development
