WIP that captures Axis Group's framework for evaluating dashboards during quick design sprints

jessielian/evaluation-toolkit


Why do we test?

As designers, we are often confronted with the question "How do we know our design works?" — whether as a matter of introspection or when presenting and justifying our design decisions.

While testing seems like the obvious answer, it is important to realize that testing is not a single summative phase at the end of the project (as in the SDLC). Rather, it is a formative process distributed throughout the design process.

Testing, creativity, and empathy are the fundamental pillars of the user-centered design sprint process at Axis.

Using these fundamentals we are able to iteratively deliver applications that are useful, usable, and delightful at every stage.

Cupcake Analogy

Testing allows us to step back and back up the assumptions we made along the way with empirical evidence. All in all, it helps us:

  • Mitigate the risk of major redesign later in the process (small changes are bound to happen)

  • Increase the likelihood of user adoption

But how do we REALLY test during a sprint?

Because testing is distributed among our sprint phases, it can manifest in many ways ranging from casual coffee shop studies to controlled experiments depending on the research context, time at hand, and access to participants.

And while testing is a core philosophy of the UCD process, a point of tension emerges during Lean Design Sprints, where time and resources are limited and compromises must be made in the name of fast delivery.

How then can designers balance the need for robust testing versus quick turnaround times?

Heuristic approach

A heuristic technique, often called simply a heuristic, is any approach to problem solving, learning, or discovery that employs a practical method not guaranteed to be optimal or perfect, but sufficient for the immediate goals.

Where finding an optimal solution is impossible or impractical, heuristic methods can be used to speed up the process of finding a satisfactory solution. Heuristics can be mental shortcuts that ease the cognitive load of making a decision.

Examples of this method include using a rule of thumb, an educated guess, an intuitive judgment, a guesstimate, stereotyping, profiling, or common sense.

Source: Wikipedia

Know your guidelines before you break them

The goal of this repository is to serve as a toolkit:

  • To help identify pitfalls in our current evaluation process, so designers can be more cognizant of them and control for them when possible.
  • To provide enough guidelines to designers to reduce the cognitive load of decision making at every stage.

This repository will also attempt to identify tactics designers can employ to improve their evaluation process.

Specifically, the approach advocated by this repository is three-pronged:

  1. Follow best practices; don't reinvent the wheel
  2. Know what to measure
  3. Know how to measure it

With these guidelines, designers can determine success criteria for their project, prioritize what to test versus what not to test, and identify shortcut methods for testing.

This is a WIP living document. Please read our Contribution Guidelines to help refine this repository.
