RECAST

An Interactive System to Understand End-User Recourse and Interpretability of Toxicity Detection Models


For more information, check out our manuscript:

Enabling User Recourse and Interpretability of Toxicity Detection Models with Interactive Visualization. Austin P Wright, Omar Shaikh, Haekyu Park, Will Epperson, Muhammed Ahmed, Stephane Pinel, Duen Horng (Polo) Chau, and Diyi Yang. Proc. ACM Hum.-Comput. Interact. 5, CSCW1, Article 181 (April 2021).

This project consists of two components: a frontend and a backend, each with its own setup procedure. See the README in each folder for component-specific instructions; a hedged sketch of a typical setup follows the clone commands below.

To begin, clone or download this repository:

git clone https://github.com/poloclub/RECAST.git

# use degit if you don't want to download the commit history
degit poloclub/RECAST
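
The component READMEs are the authoritative setup guides. As a rough sketch only, a typical flow for a Python backend and a Node frontend might look like the following; the folder names come from this repository, but every command below is an assumption rather than documented behavior:

# backend (assumes a Python project with a requirements.txt; file names are hypothetical)
cd backend
pip install -r requirements.txt
python app.py          # hypothetical entry point

# frontend (assumes a Node project; scripts are hypothetical)
cd ../frontend
npm install
npm run dev            # hypothetical dev-server script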

Consent Form

RECAST's "alternative suggestions" feature generates non-toxic alternatives to toxic text. In some instances, however, RECAST recommends human-labeled toxic alternatives that the backend model does not detect as toxic. Completing this form indicates that a user or researcher agrees not to use RECAST to maliciously circumvent deployed toxicity classifiers.

As a precaution, we've withheld the trained Jigsaw model that RECAST uses to generate these alternatives. You can train a model on your own; alternatively, completing the form will notify an author, who will send you a trained model file. A hedged sketch of one possible training flow follows.
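
This repository does not document a training procedure, so the following is only a sketch under stated assumptions: it uses the Jigsaw Toxic Comment Classification data from Kaggle, and the training script name and flags are hypothetical, not files this README names:

# download the Jigsaw Toxic Comment Classification data (requires the Kaggle CLI)
kaggle competitions download -c jigsaw-toxic-comment-classification-challenge
unzip jigsaw-toxic-comment-classification-challenge.zip -d data/

# train a toxicity classifier; the script and flags below are hypothetical
python backend/train.py --data data/train.csv --out model.bin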

Credits

RECAST was created by Austin P Wright, Omar Shaikh, Haekyu Park, Will Epperson, Muhammed Ahmed, Stephane Pinel, Duen Horng (Polo) Chau, and Diyi Yang.

Citation

@article{10.1145/3449280,
author = {Wright, Austin P. and Shaikh, Omar and Park, Haekyu and Epperson, Will and Ahmed, Muhammed and Pinel, Stephane and Chau, Duen Horng (Polo) and Yang, Diyi},
title = {RECAST: Enabling User Recourse and Interpretability of Toxicity Detection Models with Interactive Visualization},
year = {2021},
issue_date = {April 2021},
publisher = {Association for Computing Machinery},
address = {New York, NY, USA},
volume = {5},
number = {CSCW1},
url = {https://doi.org/10.1145/3449280},
doi = {10.1145/3449280},
journal = {Proc. ACM Hum.-Comput. Interact.},
month = apr,
articleno = {181},
numpages = {26},
keywords = {natural language processing, content moderation, interactive visualization, toxicity detection, intervention}
}

License

The software is available under the MIT License.

Contact

If you have any questions, feel free to open an issue or contact Omar Shaikh.