CrowdTruth framework for crowdsourcing ground truth for training & evaluation of AI systems
CrowdTruth


This library processes crowdsourcing results from Amazon Mechanical Turk and CrowdFlower following the CrowdTruth methodology. A full description of the metrics is given in the CrowdTruth 2.0 paper cited below. For more information, see http://crowdtruth.org.
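The CrowdTruth metrics capture inter-annotator disagreement using cosine similarity between annotation vectors. As a minimal illustration of that idea (a sketch of the concept, not the library's actual code), assuming a closed task with three candidate labels:

```python
# Illustrative sketch of the core CrowdTruth idea: each worker's judgments
# on a media unit form an annotation vector, and agreement is measured
# with cosine similarity between vectors.
from math import sqrt

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = sqrt(sum(a * a for a in u))
    norm_v = sqrt(sum(b * b for b in v))
    if norm_u == 0 or norm_v == 0:
        return 0.0
    return dot / (norm_u * norm_v)

# Annotation vectors over three candidate labels; 1 = worker selected it.
worker_a = [1, 1, 0]
worker_b = [1, 0, 0]

# The media unit vector aggregates all worker vectors for that unit.
unit_vector = [sum(col) for col in zip(worker_a, worker_b)]  # [2, 1, 0]

# Worker-unit agreement: how close is this worker to the crowd's aggregate?
sim = cosine(worker_a, unit_vector)
```

The actual metrics weight these similarities by worker and annotation quality, iterating until convergence; see the paper for the full definitions.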

If you use this software in your research, please consider citing:

@article{CrowdTruth2,
  author    = {Anca Dumitrache and Oana Inel and Lora Aroyo and Benjamin Timmermans and Chris Welty},
  title     = {CrowdTruth 2.0: Quality Metrics for Crowdsourcing with Disagreement},
  year      = {2018},
  url       = {https://arxiv.org/abs/1808.06080},
}


Installation

To install the stable version from PyPI, install pip for your OS, then install the package using:

```
pip install crowdtruth
```

To install the latest version from source, download the library and install it using:

```
python setup.py install
```

Tutorial

The following tutorial is a collection of slides, exercises and Jupyter notebooks that explains what the CrowdTruth methodology is and how to use it in practice. If you are already familiar with CrowdTruth, you can skip straight to the guide on how to run this library.

- Introduction to CrowdTruth
- Task Design & Building CrowdTruth Annotation Vectors
- Data Processing & CrowdTruth Metrics
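The end-to-end usage pattern covered in the tutorial looks roughly like the sketch below. The configuration class and its column names are illustrative placeholders for your own task, not a real schema; check the tutorial notebooks for the exact fields your input file needs. The import is guarded so the sketch degrades gracefully when the package or input file is absent:

```python
# Rough sketch of a CrowdTruth run; MyTaskConfig and its column names
# are hypothetical placeholders for an actual task configuration.
import os

try:
    import crowdtruth
    from crowdtruth.configuration import DefaultConfig
    HAVE_CROWDTRUTH = True
except ImportError:
    HAVE_CROWDTRUTH = False

if HAVE_CROWDTRUTH:
    class MyTaskConfig(DefaultConfig):
        # Columns read from the crowdsourcing platform's results file.
        inputColumns = ["unit_text"]
        outputColumns = ["chosen_label"]
        open_ended_task = False
        # Candidate labels that form each annotation vector.
        annotation_vector = ["label_a", "label_b", "none"]

        def processJudgments(self, judgments):
            # Hook for task-specific clean-up of the raw judgments.
            return judgments

if HAVE_CROWDTRUTH and os.path.exists("results.csv"):
    # Load the raw platform output and compute the CrowdTruth metrics.
    data, config = crowdtruth.load(file="results.csv", config=MyTaskConfig())
    results = crowdtruth.run(data, config)
    print(results["units"].head())    # per-unit quality scores
    print(results["workers"].head())  # per-worker quality scores
```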