MIT’s anti-misinformation accuracy prompts being tested by tech companies
The spread of inaccuracies on social media – including political “fake news” and misinformation about COVID-19 and vaccines – has led to a plethora of problems in areas as disparate as politics and public health.
To address the issue, MIT Sloan School Professor David Rand and his colleagues have been developing effective interventions that technology companies can use to curb misinformation online.
A study published in the Harvard Kennedy School Misinformation Review by Rand and other researchers at MIT’s Sloan School of Management, Political Science Department, and Media Lab, as well as the University of Regina and Google’s Jigsaw unit, introduces a suite of such interventions that prompt people to think about accuracy before sharing.
“Previous research had shown that most social media users are often fairly adept at spotting falsehoods when asked to judge accuracy,” says Rand. “The problem was that this didn’t always stop them from spreading misinformation because they would simply forget to think about accuracy when deciding what to share. So, we set out to develop prompts that could slow the spread of false news by helping people stop and reflect on the accuracy of what they were seeing before they click ‘share.'”
In April and May 2020, the team showed 9,070 social media users a set of 10 false and 10 true headlines about COVID-19 and measured how likely they were to share each one. The team tested a variety of approaches to help users remember to think about accuracy, and therefore be more discerning in their sharing.
These approaches included: first, having users rate the accuracy of a neutral (non-COVID) headline before starting the study, thereby making them more likely to think about accuracy when they went on to decide what news they’d share; second, providing a very short set of digital literacy tips; and third, asking users how important it was to them to share only accurate news (almost everyone thinks it’s very important).
By shifting participants’ attention to accuracy in these ways, the study showed it was possible to strengthen the link between a headline’s perceived accuracy and the likelihood of its being shared – that is, to get people to factor accuracy into their sharing decisions – thereby reducing their intentions to share false news.
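To make that outcome measure concrete, here is a minimal, purely illustrative Python sketch of how one could quantify “sharing discernment” – the gap between sharing rates for true and false headlines – in a prompted versus an unprompted group. This is not the study’s analysis code, and all the numbers are invented for illustration.

```python
# Illustrative sketch only: hypothetical data, not results from the study.
# "Sharing discernment" here means the mean sharing rate for true headlines
# minus the mean sharing rate for false headlines.

def sharing_discernment(true_shares: list[float], false_shares: list[float]) -> float:
    """Return mean sharing intention for true headlines minus mean for false headlines."""
    return sum(true_shares) / len(true_shares) - sum(false_shares) / len(false_shares)

# Hypothetical per-headline sharing rates (proportion of participants willing to share).
control_true, control_false = [0.32, 0.30, 0.35], [0.27, 0.29, 0.25]
prompted_true, prompted_false = [0.33, 0.31, 0.36], [0.18, 0.20, 0.17]

print(f"Control discernment:  {sharing_discernment(control_true, control_false):.2f}")
print(f"Prompted discernment: {sharing_discernment(prompted_true, prompted_false):.2f}")
# A larger gap in the prompted condition would indicate that sharing tracks accuracy more closely.
```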
This work isn’t confined to the ivory tower.
MIT is working with Jigsaw to see how these approaches can be applied in practice. Accuracy prompts offer a fundamentally different way to reduce the spread of misinformation online: rather than playing catch-up with corrections and warnings, they get ahead of the problem, helping people avoid unwittingly engaging with misinformation in the first place.
“Most internet users want help navigating information quality, but ultimately want to decide for themselves what is true. This approach avoids the challenges of ‘labeling’ information true or false. It’s inherently scalable. And in these initial studies, users found accuracy prompts helpful for navigating information quality, so we’re providing people with tools that help accomplish pre-existing goals,” according to Rocky Cole, one of the Jigsaw researchers on the team.
“From a practical perspective,” says Rand, “this study provides platform designers with a menu of effective accuracy prompts to choose from and to cycle through when creating user experiences to increase the quality of information online. We are excited to see technology companies taking these ideas seriously and hope that on-platform testing and optimization will lead to the adoption of new features to fight online misinformation.”
Rand and Cole co-authored the paper “Developing an accuracy-prompt toolkit to reduce COVID-19 misinformation online,” published in the Harvard Kennedy School Misinformation Review, with MIT Ph.D. student Ziv Epstein, MIT Political Science Professor Adam Berinsky, University of Regina Assistant Professor Gordon Pennycook, and Google Technical Research Manager Andrew Gully.