# Replication Materials: Truth be told: How “true” and “false” labels influence user engagement with fact-checks

Replication materials for Aruguete, Bachmann, Calvo, Valenzuela, and Ventura, “Truth be told: How ‘true’ and ‘false’ labels influence user engagement with fact-checks,” forthcoming at *New Media & Society*.

Abstract: When do users share fact-checks on social media? We describe a survey experiment conducted during the 2019 election in Argentina measuring the propensity of voters to share corrections to political misinformation that randomly confirm or challenge their initial beliefs. We find evidence of selective sharing—the notion that individuals prefer to share pro-attitudinal rather than counter-attitudinal fact-checks. This effect, however, is conditioned by the type of adjudication made by fact-checkers. More specifically, in line with motivated reasoning processes, respondents report a higher intent to share confirmations (i.e., messages fact-checked with a “true” rating) than refutations (i.e., messages fact-checked with a “false” rating). Experimental results are partially confirmed with a regression discontinuity analysis of observational Twitter data and replicated with additional experiments. Our findings suggest that fact-checkers could increase exposure to their verifications on social media by framing their corrections as confirmations of factually correct information.

The latest pre-print can be found here. The published version is here.

## Tutorial

This README file provides an overview of the replication materials for the article. The R scripts used in the article can be found under the folder `Codes`. The survey and behavioral data are under the folder `data`. Results are exported to the folder `output`.

## Codes

## Data