To compare viewers we propose an adaptable evaluation framework that considers different user profiles. The basis of the framework is a set of criteria weighted according to user needs.
Criteria can be decomposed into subcriteria and grouped into more general categories, forming a multi-level hierarchical structure that can be analyzed at different levels of detail to ease score interpretation. In our case the considered categories are: technical, archive (open, save, and networking), workflow (productivity and efficiency), visualization (volume rendering, MPR, DICOM conformity, etc.), tools (annotations and measures), other modalities, and processing.
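As an illustration only (the actual spreadsheet defines its own criteria, weights, and scales), here is a minimal Python sketch of how such a hierarchical weighted score can be aggregated; all names, weights, and scores below are hypothetical:

```python
# Hypothetical sketch: aggregate a criteria tree into one score per
# viewer. A leaf holds a score; an inner node holds weighted children.
# Sibling weights are assumed to sum to 1 at each level.

def weighted_score(node):
    """Recursively compute the weighted average score of a criteria tree."""
    if "children" in node:
        return sum(weighted_score(child) * child["weight"]
                   for child in node["children"].values())
    return node["score"]

# Illustrative fragment of the hierarchy (scores on an assumed 0-10 scale).
viewer = {
    "children": {
        "visualization": {
            "weight": 0.4,
            "children": {
                "volume rendering": {"weight": 0.5, "score": 8},
                "MPR":              {"weight": 0.3, "score": 9},
                "DICOM conformity": {"weight": 0.2, "score": 7},
            },
        },
        "tools": {
            "weight": 0.3,
            "children": {
                "annotations": {"weight": 0.5, "score": 6},
                "measures":    {"weight": 0.5, "score": 7},
            },
        },
        "workflow": {"weight": 0.3, "score": 8},
    },
}

print(f"overall score: {weighted_score(viewer):.2f}")  # 7.59
```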
The evaluation framework is useful for understanding and prioritizing new development goals, and can easily be adapted to express different needs by altering the weights.
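Reusing the sketch above, adapting the framework to a user profile amounts to swapping in a different set of weights; the profiles and numbers here are again purely illustrative:

```python
import copy

# Hypothetical user profiles: each one re-weights the top-level
# categories of the `viewer` tree from the sketch above.
profiles = {
    "radiologist": {"visualization": 0.5, "tools": 0.3, "workflow": 0.2},
    "developer":   {"visualization": 0.2, "tools": 0.2, "workflow": 0.6},
}

def apply_profile(tree, weights):
    """Return a copy of the criteria tree with category weights replaced."""
    adapted = copy.deepcopy(tree)
    for category, weight in weights.items():
        adapted["children"][category]["weight"] = weight
    return adapted

for name, weights in profiles.items():
    score = weighted_score(apply_profile(viewer, weights))
    print(f"{name}: {score:.2f}")  # radiologist: 7.60, developer: 7.72
```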
You can contribute by opening pull requests or sending us modified versions of the spreadsheet with new viewers, criteria, etc.
Please document the changes you make to ease the merging task.