Content author test cases #33

Open · 25 tasks

danielnaab opened this issue Jan 26, 2024 · 1 comment
Comments

danielnaab commented Jan 26, 2024

As a content author, I would like to define a test specification for my interviews, so I may feel confident they behave as I intend without extensive manual testing.

(As a developer, I would like to define assertions for our partner's real-world forms, so that as we iterate quickly on feature development, we can feel confident we are not breaking behavior in our primary deliverable.)

Open questions:

  • What kind of guarantees would a content author be looking for?
  • What is an appropriate level of granularity for user-generated test specifications?
  • What would an assertion look like?
  • What kind of fixtures are required to set up a test case? Is mock user-entered data enough? (A rough sketch of one possible spec shape follows this list.)
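
To make these questions concrete, here is a minimal sketch of what a declarative, "no code" test spec might look like once serialized. Every name in it is hypothetical (nothing here exists in the codebase yet); it assumes mock user-entered data is a sufficient fixture and that assertions target observable form state rather than backend behavior:

```ts
// Hypothetical spec shape; none of these names come from the
// actual codebase. One scenario = one fixture plus its assertions.

type FieldValue = string | number | boolean;

interface FormTestScenario {
  description: string;
  // The fixture: mock user-entered data, keyed by field id.
  input: Record<string, FieldValue>;
  // Assertions over the resulting form state.
  expect: {
    visibleFields?: string[];        // fields that should be shown
    hiddenFields?: string[];         // fields that should be hidden
    errors?: Record<string, string>; // expected validation messages
    outcome?: string;                // e.g. "eligible" / "needs-guardian"
  };
}

// Example scenario a content author might express through a GUI:
const scenario: FormTestScenario = {
  description: 'Applicants under 18 see the guardian section',
  input: { 'applicant-age': 16 },
  expect: {
    visibleFields: ['guardian-name', 'guardian-contact'],
    outcome: 'needs-guardian',
  },
};
```

At this granularity, one scenario maps to one row in a "no code" table: a description, some mock field values, and a handful of expected outcomes.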

Acceptance criteria:

  • A "no code" interface to define test cases exists
  • Test scenarios may be run in-browser, with user-friendly pass/fail notifications
  • The main form builder interface communicates the pass/fail status of the form
  • Form test specs should not be dependent on form backend features (they should work with multiple implementations; see the interface sketch after this list)
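
One way to satisfy both the in-browser requirement and backend independence would be a runner that depends only on an abstract engine interface, so any implementation that can evaluate a form against mock input can execute the same specs. A sketch, reusing the hypothetical types above:

```ts
// Hypothetical runner contract; reuses FieldValue and
// FormTestScenario from the sketch above.

interface FormEngine {
  // Evaluate a form against mock user input and return the
  // observable state the assertions are checked against.
  evaluate(
    formId: string,
    input: Record<string, FieldValue>,
  ): Promise<{
    visibleFields: string[];
    errors: Record<string, string>;
    outcome?: string;
  }>;
}

interface ScenarioResult {
  description: string;
  passed: boolean;
  failures: string[]; // human-readable messages for the UI
}

async function runScenario(
  engine: FormEngine,
  formId: string,
  scenario: FormTestScenario,
): Promise<ScenarioResult> {
  const state = await engine.evaluate(formId, scenario.input);
  const failures: string[] = [];
  for (const field of scenario.expect.visibleFields ?? []) {
    if (!state.visibleFields.includes(field)) {
      failures.push(`expected field "${field}" to be visible`);
    }
  }
  if (scenario.expect.outcome && state.outcome !== scenario.expect.outcome) {
    failures.push(
      `expected outcome "${scenario.expect.outcome}", got "${state.outcome}"`,
    );
  }
  // (hiddenFields and errors checks would follow the same pattern.)
  return {
    description: scenario.description,
    passed: failures.length === 0,
    failures,
  };
}
```

Because the runner only sees FormEngine, the form builder could render ScenarioResult.failures directly as user-friendly pass/fail notifications, regardless of which backend evaluated the form.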

Overview

As a _, I would like _, so that I can _.

Context

Optional: Any reference material or thoughts we may need later, or assumptions about prior or future work that's out of scope for this story.

  • [ ]

Acceptance Criteria

Required outcomes of the story

  • [ ]

Research Questions

  • Optional: Any initial questions for research

Tasks

Research, design, and engineering work needed to complete the story.

  • [ ]

Definition of done

The "definition of done" ensures our quality standards are met with each bit of user-facing behavior we add. Everything that can be done incrementally should be done incrementally, while the context and details are fresh. If it’s inefficient or “hard” to do so, the team should figure out why and add OPEX/DEVEX backlog items to make it easier and more efficient.

  • Behavior
    • Acceptance criteria met
    • Implementation matches design decisions
  • Documentation
    • ADRs (/documents/adr folder)
    • Relevant README.md(s)
  • Code quality
    • Code refactored for clarity and no design/technical debt
    • Adhere to separation of concerns; code is not tightly coupled, especially to 3rd party dependencies; dependency rule followed
    • Code is reviewed by team member
    • Code quality checks passed
  • Security and privacy
    • Automated security and privacy gates passed
  • Testing tasks completed
    • Automated tests pass
    • Unit test coverage of our code >= 90%
  • Build and deploy
    • Build process updated
    • API(s) are versioned
    • Feature toggles created and/or deleted, and each toggle documented
    • Source code is merged to the main branch

Decisions

  • Optional: Any decisions we've made while working on this story
danielnaab changed the title from “As a user, I would like to create test cases for my interviews, so I may feel confident they behave as I intend without extensive manual testing” to “Content author test cases” on Jan 26, 2024
JennyRichards-Flexion commented:
9/6/24: In conversation with Jim, this item is MEDIUM priority.
