This release parallels the formal publication of the FAIR Metrics rubric and exemplar Metrics set (https://www.nature.com/articles/sdata2018118). It contains the latest updates to the Metrics, based on responses to Issues raised on GitHub, along with the results of the evaluation study. It also contains the strawman version of the Metrics Evaluator code (for access to the Evaluator interface, contact any of the authors).
The initial metrics have been updated based on GitHub user comments and corrections. In addition, we evaluated the Metrics via a questionnaire. The results of this evaluation are in the "Evaluation Of Metrics" folder in this repository, and discussions about the questionnaire, the answers provided, and what is or is not an "acceptable" answer are ongoing on the GitHub Issues pages.
The FAIR Metrics Authoring group (Wilkinson, Dumontier, Sansone, Schultz, Doorn and Olavo Bonino) has produced this initial set of generally applicable metrics spanning all of the FAIR sub-principles. These are intended as a starting point for broader community discussion about both the authoring of Metrics (the rubric) and the proposed metrics themselves.