pizzagate

Quantifying the Persistence of Misinformation: A Case Study in R

As we saw firsthand in the 2016 presidential election, more users than ever treat social media platforms as news outlets, a shift that can influence election results. This has put increased pressure on platforms to implement fact-checking and other corrective measures to ensure that future elections are fair.

An experiment conducted in 2018 by Ethan Porter, Thomas J. Wood, and David Kirby aimed to directly quantify the impact of issuing corrections on audience beliefs. This survey experiment randomly assigned participants to a control group or a treatment group. Members of the control group were exposed to two randomly selected fake news stories; members of the treatment group were exposed to two randomly selected fake news stories along with corrections to those stories. All respondents were then asked how strongly they agreed with the content of the news stories on a 5-point agreement scale ranging from Strongly disagree to Strongly agree. The experiment found that respondents assigned to the treatment group were less likely to believe the fake news stories.
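
To make the design concrete, here is a minimal sketch in R of how such data might be structured; the column names and simulated values are illustrative assumptions, not the study's actual data.

```r
# Illustrative data structure only -- column names and values are assumed.
set.seed(42)
n <- 500

# Random assignment: 1 = treatment (story + correction), 0 = control (story only)
treated <- rbinom(n, size = 1, prob = 0.5)

# Agreement with a fake story on a 5-point Likert scale
# (1 = Strongly disagree, ..., 5 = Strongly agree); treated respondents are
# simulated to agree slightly less, mirroring the direction of the reported finding.
agreement <- pmin(pmax(round(rnorm(n, mean = 3.5 - 0.5 * treated, sd = 1)), 1), 5)

survey <- data.frame(
  treatment = factor(treated, levels = c(0, 1), labels = c("Control", "Treatment")),
  agreement = agreement
)

head(survey)
table(survey$treatment)
```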

I wanted to conduct an exploratory data analysis, a balance test, and a multiple linear regression to estimate the average treatment effect (ATE) for this experiment.
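
As a rough illustration of that workflow, the sketch below reuses the simulated survey data frame from above; the covariates used for the balance test and the adjusted regression (age, party_id) are hypothetical placeholders, not variables from the study.

```r
# Balance test sketch: under successful randomization, pre-treatment covariates
# should not differ systematically between groups. The covariate names below
# (age, party_id) are hypothetical placeholders.
# t.test(age ~ treatment, data = survey)

# Average treatment effect as a simple difference in group means
with(survey, mean(agreement[treatment == "Treatment"]) -
             mean(agreement[treatment == "Control"]))

# Equivalent regression estimate: the coefficient on treatment is the ATE
ate_simple <- lm(agreement ~ treatment, data = survey)
summary(ate_simple)

# Multiple linear regression adjusting for (hypothetical) covariates
# ate_adjusted <- lm(agreement ~ treatment + age + party_id, data = survey)
# summary(ate_adjusted)
```

Because assignment is random, the unadjusted and covariate-adjusted estimates should be similar; adjusting for pre-treatment covariates mainly serves to tighten the standard errors.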

(See the PDF Testing the Persistence of Misinformation.pdf for more details.)
