Files in this repo:

  • Heuristics_Guidelines_DiscussionOfMethods.md
  • LiteratureReview.md
  • README.md
  • Recommendations_NextSteps.md
  • Report on Usefulness Assessment.docx
  • SummaryReport.pdf
  • UseCaseDescription.md
  • UseCase_Scenarios_final.xlsx

Project_Assessment_Heuristics

This repo contains the results of an ESIP Products & Services Testbed Proposal to develop a heuristic approach to evaluate the effectiveness / usefulness of the communication vehicles that could be used by a research project or tool. It is designed to be a cross-disciplinary approach using both "usability" concepts and "technology assessment" concepts.

Problem Statement

The ESIP Products & Services Committee:

The ESIP Products & Services (P&S) Committee’s Testbeds are among the main venues within ESIP in which solutions to common problems faced by the Earth sciences data communities can be explored. While the original P&S Testbed focuses on the first phase of research and/or product development, considered the “incubation” phase, more recent testbeds associated with the P&S have focused on later stages of research and product development, in which successful incubator projects move toward “infusion” into more established, sustained, and financially supportable environments. The projects at this infusion stage that ESIP is helping to evaluate belong to NASA’s Advanced Information Systems Technology (AIST) program. See: https://esto.nasa.gov/info_technologies_aist.html.

Need for practical heuristics for evaluation of projects:

With all of these projects, community evaluation is an important aspect of the P&S Committee’s work. To date, however, no practical heuristics have been identified by which evaluators can assess the overall effectiveness, i.e., usefulness, of the means by which a research project conveys information about itself, such as the research problem being studied, its findings, and/or its proposed solution. For instance, how well can another researcher who has a problem similar to the one being explored determine the specific use case(s) being addressed by the project? How easy is it to navigate the research project’s web pages to find the information needed to decide whether the research product is promising for one’s own use? These kinds of questions are asked by ESIP evaluators, but also by ESIP members (and anyone perusing the information). Rules of thumb that research PIs and product developers, as well as evaluators, can apply to check that the communication vehicles being created effectively convey the most important information would be broadly useful to the ESIP community.

Goal of Research

To develop a heuristic approach to evaluating the effectiveness / usefulness of the various means of communication by which a research project informs one or more targeted audiences about the project’s goals, objectives, technologies, methods, workflow or any other information pertinent to those interested in adopting or adapting them.

[Note: An easily understood and apt definition of a “heuristic approach” from Wikipedia states that “A heuristic technique (/hjᵿˈrɪstᵻk/; Ancient Greek: εὑρίσκω, "find" or "discover"), often called simply a heuristic, is any approach to problem solving, learning, or discovery that employs a practical method not guaranteed to be optimal or perfect, but sufficient for the immediate goals.” (https://en.wikipedia.org/wiki/Heuristic)]

Brief Discussion of Methods

The methods used for this research involved the following:

  • Literature review of both the "usability" literature and the "technology assessment" literature related to NASA TRLs
  • Development of use cases
  • Development of ways to rank the effectiveness ("Usefulness") of Communication Vehicles such as project web pages
  • Development of Evaluator Checklists
  • Receipt of feedback on the draft Evaluator Checklists
  • Draft Guidelines in support of user interface usability assessment

More information about each of these methods and their results can be found in related documents in this repo, including the full report.
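To illustrate how an Evaluator Checklist might feed into a usefulness ranking, the sketch below aggregates hypothetical checklist ratings for a single communication vehicle (e.g., a project web page). The item names, the 1–5 rating scale, and the simple averaging are illustrative assumptions, not the project’s actual checklist or scoring method:

```python
# Hypothetical sketch: aggregating evaluator checklist ratings for one
# communication vehicle. Items and the 1-5 scale are illustrative only.
from statistics import mean

# Each checklist item maps to one score per evaluator (scale: 1 = poor, 5 = excellent).
ratings = {
    "Use case(s) clearly stated": [4, 5, 3],
    "Findings easy to locate": [2, 3, 2],
    "Navigation supports adoption decision": [3, 4, 4],
}

# Per-item mean highlights which aspects of the communication vehicle need work.
for item, scores in ratings.items():
    print(f"{item}: mean = {mean(scores):.2f}")

# A single overall score allows rough ranking across communication vehicles.
overall = mean(s for scores in ratings.values() for s in scores)
print(f"Overall usefulness score: {overall:.2f}")
```

A weighted mean (weighting items by importance to the target audience) would be a natural refinement if some checklist questions matter more than others.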

A discussion of the findings and the recommendations / next steps at the completion of the project are also included as documents in this repo.