Definition Of Quality

Anthoula Wojczak edited this page Aug 26, 2021 · 9 revisions

Browser Support

All functionality must work in any browser listed in the following queries:

UX Requirements

Viewport

Test only against the 360x460 viewport (portrait mode). Landscape mode and desktop mode are not prioritized.

Appearance

The primary call to action must be "above the fold" in the viewport.

Controls

All UI controls must respond to user input unless they are visibly indicated as disabled.

Technical Requirements

PWA Checklist

All PWA checklist items must be supported.

  • ServiceWorker installs and caches
  • manifest.json exists and all assets it lists exist
  • All supported icon sizes exist and are listed in the manifest
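As a sketch of the manifest items above, a minimal manifest.json might look like the following. All names, paths, and icon sizes here are illustrative assumptions, not the project's actual assets:

```json
{
  "name": "Example Storefront",
  "short_name": "Storefront",
  "start_url": "/",
  "display": "standalone",
  "background_color": "#ffffff",
  "theme_color": "#ffffff",
  "icons": [
    { "src": "/icons/icon-192x192.png", "sizes": "192x192", "type": "image/png" },
    { "src": "/icons/icon-512x512.png", "sizes": "512x512", "type": "image/png" }
  ]
}
```

Every asset listed here (the start URL and each icon file) must actually exist at the listed path for the checklist item to pass.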

Definition of Done

This defines the criteria that must be met before a user story can be considered complete. It is applied consistently and serves as the official gate between In Progress and Done. The Definition of Done (DoD) ensures that delivered features are complete in functionality and of high quality.

Code is written.

The most basic requirement for any user story or issue to be Done is that it is built. It must satisfy all Acceptance Criteria defined in the user story. The code should also be written with the Test Plan in mind, covering all known use cases. All new code must be covered with appropriate automated tests where available, and existing code must have updated tests. Code must also be performant and compliant.

Code must meet the following standards:

  • Satisfies all Acceptance Criteria
  • Satisfies the Test Plan
  • New code is covered with appropriate automated tests
  • Existing / refactored code must have updated tests
  • Code meets performance standards
  • Code meets security compliance standards
  • Translations have been added for new strings

Code is performant.

The code must not degrade storefront performance. This can be measured using Lighthouse.

Payload Size

  • No single asset on a landing page should be larger than 400KB. Use bundlesize to measure this metric.
  • Total JavaScript payload should not be larger than 500KB. Check this manually or use either Webpack Bundle Analyzer or bundlesize.
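The asset-size budget above can be enforced automatically with a bundlesize entry in package.json. This is a sketch; the glob path is an assumption about the project's build output, not its actual layout:

```json
{
  "bundlesize": [
    { "path": "./dist/*.js", "maxSize": "400 kB" }
  ]
}
```

With this in place, running `bundlesize` in CI fails the build when any matched asset exceeds the budget.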

User Responsiveness

Code is peer-reviewed, and approved.

During code review, developers evaluate a PR from the perspectives of architecture and implementation, answering the following questions:

Functionality

  • Does the proposed solution satisfy the acceptance criteria?
  • Does the proposal cover all use cases? Did review uncover any new ones?

Approach

  • Is the proposed solution clear and obvious?
  • Could any parts be made more clear or obvious?
  • Does the proposal utilize our existing tools?
  • Does the proposal differ from previous solutions? Why?
  • What implications would accepting the proposal have?
  • Does the proposal follow the Open/Closed Principle (open for extension, closed for modification)?

Hygiene

  • Is the code efficient?
  • Are there any errors, gaps, or redundancies?
  • Does the code adhere to our accepted best practices?
  • Does the code follow our own patterns and precedents?
  • Does the code style match our own?
  • Have translations been added for new strings?

Code is deployed to test environment.

Functionality is deployed to a cloud test environment so that stakeholders can review a demo of the functionality and provide their approvals.

Feature is OK'd by stakeholders.

During UX review, a UX designer evaluates the story from a UX perspective, asking the following questions:

  • Does the UX match the provided mockups closely enough?
  • Is the end result successful or does it need revisions?

During PO review, the Product Owner evaluates the story, asking the following questions:

  • Is this important to do now?
  • Does this meet the Acceptance Criteria?

Code passes functional, regression, and smoke testing.

During QA, a QA engineer (another team member who has neither written nor peer-reviewed the code) evaluates the code from a functional perspective, answering the following questions:

Functional

The QA engineer goes through a Functional PR checklist, and can approve the code once all checklist items are satisfied.

Tests

  • Have unit tests been included for new code?
  • Have unit tests been updated for existing code?
  • Have E2E Functional tests been included for new critical user flows?
  • Have E2E Functional tests been updated for existing user flows?
  • Have manual Zephyr test cases been created for new critical user flows that cannot be automated?

Regression

  • Does the regression suite pass?
  • Do manual workflows work as expected?
  • Are there any new build or runtime errors?
  • Does the proposal have any unintended side effects?
  • Have Lighthouse and WPT scores changed?

Code is documented.

Functionality and backwards-incompatible changes are documented in the necessary developer documentation.

  • If the code creates a new public API, does it have the necessary jsdoc blocks?
  • If the code changes an existing public API, are the jsdoc blocks updated as necessary?
  • If the code introduces a new feature, are there notes in a PR, wiki, or markdown page somewhere that answer:
    • What it does
    • How it works (at a high level)
    • How storefront developers can use it
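As an illustration of the jsdoc expectation for a new public API, a documented helper might look like the following. The function, its parameters, and its behavior are hypothetical examples, not part of the actual storefront API:

```javascript
/**
 * Formats a price given in minor currency units as a display string.
 * (Hypothetical helper, for illustration of jsdoc conventions only.)
 *
 * @param {number} cents - The price in cents, e.g. 1999 for $19.99.
 * @param {string} [symbol="$"] - Currency symbol to prepend.
 * @returns {string} The formatted price, e.g. "$19.99".
 */
function formatPrice(cents, symbol = "$") {
  // Convert minor units to major units and fix to two decimal places.
  return `${symbol}${(cents / 100).toFixed(2)}`;
}

console.log(formatPrice(1999)); // → "$19.99"
```

A block like this gives reviewers enough to check the API's contract (types, defaults, return shape) without reading the implementation.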

Help documentation is updated.

Functionality and backwards-incompatible changes are documented in the necessary user documentation.

Code is delivered.

When we merge our code to the delivery branch, we consider it Ready for Release. Once our code is ready to be released to market, we publish release packages to package managers, and then we can call our story "Done".