June 27, 2023

Fact-checkers gather evidence to verify or debunk claims in a wide variety of contexts and circumstances — so it makes sense that they would value similar methods in evaluating their own work. Likewise, academic research about fact-checking utilizes an evidence-based approach to analyze fact-checking as a journalistic field by investigating the specific processes used for verification and debunking, as well as for measuring belief and skepticism. 

Academic researchers have studied fact-checking since the recent wave of fact-checking organizations launched in the early 2000s. Many of their studies focus on fact-checking effectiveness — whether fact-checking corrects false beliefs and under what conditions. Other studies examine specific methods for debunking false information. Still others focus on topical fact-checking, such as coverage of the COVID-19 pandemic.

Many academic researchers apply rigorous statistical methods to empirically grounded samples. This research regularly appears in peer-reviewed journals, further validating those methods.

As a member of the fact-checking community, first as a reporter and then as editor-in-chief of PolitiFact, I have long been fascinated by how academics study fact-checking, and in turn by the way that fact-checkers think about academic research. 

The fact-checking field is intellectually diverse and highly interdisciplinary. Some fact-checkers come from academia itself. Other fact-checkers started in narrative political journalism. Still others came to fact-checking from computer science, or the legal profession, or civic engagement. Some of the most experienced fact-checkers even conduct their own research on fact-checking and publish their own reports.

I wanted to know more about the ways that fact-checkers think about academic research in 2023, and I especially wanted to gauge whether there was consensus among journalists on what makes for helpful research. To answer those questions, I conducted a survey in spring 2023, querying fact-checkers and academics who work in the misinformation field. My goal was to ascertain the practical utility of academic research about fact-checking and to elicit suggestions of specific pieces of research that were most helpful to fact-checkers. I also asked secondary questions on future directions for research.

Ideally, I hoped the survey would result in increased awareness of specific pieces of research and suggest future research questions. 

This survey was not conducted randomly, nor was it a scientific sampling of fact-checkers. Its results reflect the responses of those who chose to participate — primarily fact-checkers who are part of the International Fact-Checking Network and academics who participate in the Combating Fake News listserv. The survey was conducted between March 29 and April 18, 2023, and generated 101 unique responses. The majority of responses, 59%, came from the fact-checking community, with 26% of respondents identifying as members of the academic community and 15% identifying as members of both groups equally. Seventy percent of respondents claimed affiliation with the International Fact-Checking Network.

Key findings

Overall, the results indicate that participants found academic research highly useful to their work. Participants rated most types of research as either “very useful” or “useful.” Among the types of research most often deemed “very useful” or “useful” were:

  • Research on the effectiveness of new formats for fact-checking (video fact-checks, graphics, illustrations, multimedia, etc.) at 87%.
  • Studies of how misinformation spreads at 84%.
  • Studies of the audience for fact-checking (people’s openness to correction, demography, sharing behavior, etc.) at 82%.
  • Experimental studies testing the effectiveness of fact-checking in correcting false beliefs at 80%.

Participants also highly valued topical research, selecting studies on the fact-checking of elections and voting as the most valuable. The percentages of participants who rated topical studies as “very useful” or “useful” were:

  • Elections and voting at 89%. 
  • Climate change at 85%.
  • Vaccines at 81%.
  • Health care at 78%.
  • Political polarization at 74%.

The survey also asked what topics need more fact-checking research and allowed respondents to answer any way they wished. The most commonly requested topics centered on questions of fact-checking effectiveness, such as the impact of fact-checking on media consumption; whether fact-checking is effective at countering propaganda; and how effective fact-checking is on social media. Other requests for more research centered on professional issues such as training and tools, artificial intelligence, and gender and LGBTQ+ issues.

Fact-checkers said they had little awareness of most research about fact-checking: only 17% of respondents strongly agreed or agreed that they were aware of most of the research that exists about fact-checking. Among fact-checkers, 78% said that academics could do more to make their work relevant to fact-checkers, and 81% of academic researchers agreed. Interestingly, only 54% of academic researchers agreed or strongly agreed that fact-checkers needed to do more to learn about research and incorporate it into their work, while 73% of fact-checkers felt this way. These findings indicate that fact-checkers understand the potential practical benefits of academic research and are eager for more exposure to it.

Five notable studies; 15 noted researchers

The survey asked about specific studies or publications that were useful to fact-checkers, but opinions ranged widely, and there was no consensus on “top” studies. Additionally, 63% said they could not suggest a single piece of research as “especially useful” to fact-checkers. 

Among those who did suggest research, respondents cited more than 100 individual pieces of research as most useful. We have collected those works for public review into a list on Zotero, an academic citation tool. Most studies were cited only once, but five scholarly articles or reports were referenced by two or more participants. They included:

“The psychological drivers of misinformation belief and its resistance to correction,” in the online journal Nature Reviews Psychology, by Ullrich Ecker, Stephan Lewandowsky, John Cook, Philipp Schmid, Lisa Fazio, Nadia Brashier, Panayiota Kendeou, Emily Vraga and Michelle Amazeen. (2022)

This study synthesizes research on how people form false beliefs, both from exposure to inaccurate information and from social cues, emotions and political opinion. Social exclusion, for example, can increase belief in conspiracy theories. The article analyzes the dynamics of both debunking and prebunking, especially when combined with social factors such as corrective information shared by trusted friends or fact-based alternative accounts. Best practices on social media include linking to expert sources and correcting quickly and early while avoiding harsh social confrontation. The authors emphasize that more study is needed to avoid overgeneralization about specific facets of combating misinformation, and they urge researchers to rely less on short-term studies and to examine big-picture trends over longer periods of time.

“The Debunking Handbook,” by multiple authors including Stephan Lewandowsky, John Cook and Ullrich Ecker. (2020)

“The Debunking Handbook” is a concise, pithy guide to making decisions about fact-checking, yet it was compiled by 22 academic researchers and is supported by 108 footnotes. It offers advice on prebunking, deciding when and what to fact-check, and structuring fact checks for maximum impact. Fact-checking newsrooms might consider reading and discussing the report as a group.

“Architects of networked disinformation: Behind the scenes of troll accounts and fake news production in the Philippines,” by Jonathan Corpus Ong and Jason Vincent A. Cabañes. (2018)

This ethnographic study takes a deep dive into the coordinated disinformation campaigns that occurred in the Philippines before and after the 2016 presidential election of Rodrigo Duterte. Through interviews and observation, the study uncovers both the professionalized political operators and the everyday digital workers who produced the campaigns. The researchers find gaps in campaign finance laws and platform regulation, but they also caution against oversimplified solutions, warning that social media fact-checking programs, though well-meaning, do not address “the professionalized and institutionalized work structures and financial incentives that normalize and reward ‘paid troll’ work.”

“The global effectiveness of fact-checking: Evidence from simultaneous experiments in Argentina, Nigeria, South Africa, and the United Kingdom,” by Ethan Porter and Thomas J. Wood. (2021)  

In this comparative study, researchers conducted simultaneous experiments in four countries to test whether fact-checking could reduce false beliefs and whether the effects were country-specific. The results show that fact-checking reduced false beliefs in all four countries, with surprisingly little variation among them. The study used fact-checks from Full Fact, Africa Check and Chequeado.

“Correction format has a limited role when debunking misinformation,” by Briony Swire-Thompson, John Cook, Lucy Butler, Jasmyne Sanderson, Stephan Lewandowsky and Ullrich Ecker. (2021)

Researchers conducted experiments testing whether the format of a fact-check mattered: whether the truth was presented first, whether the falsehood was presented first, or whether the falsehood wasn’t mentioned at all and only a truthful account was presented. The results indicate that, provided the key ingredients of a correction were present, the format did not make a considerable difference in the outcome. The researchers conclude that providing corrective information is more important than how the correction is presented.

In addition to specific studies, the survey also asked participants to list individual researchers whose work they consider useful. Participants mentioned the following 15 researchers three or more times:

  • Adam Berinsky
  • Leticia Bode
  • Joan Donovan
  • Ullrich Ecker
  • Lucas Graves
  • Stephan Lewandowsky
  • Rasmus Kleis Nielsen
  • Brendan Nyhan
  • Jonathan Ong
  • Gordon Pennycook
  • David Rand
  • Kate Starbird
  • Briony Swire-Thompson
  • Emily Vraga
  • Claire Wardle

Finally, we asked open-ended questions seeking suggestions for improvements to both research and fact-checking. Responses generally included the following ideas:

  • Researchers and fact-checkers should work together more closely to create studies that have practical applications to fact-checking work.
  • More research is needed for non-English-speaking contexts, especially in India and Latin America.
  • Research needs to be more accessible to fact-checkers, either by removing journal paywalls or by including action-step summaries.
  • Researchers and fact-checkers need more in-person or live interactions at conferences and other meetings to facilitate communication.

Conclusion and next steps

The opinions expressed in the survey clearly show that fact-checkers appreciate academic research and would welcome more familiarity with research findings. Yet questions remain about how time-pressed fact-checkers might engage more deeply with complicated research and integrate its sometimes subtle findings into their daily work.

Academic research on misinformation remains a discipline outside the natural rhythms of fact-checking newsrooms, and academic findings are often presented with long-winded language that fails to clearly emphasize key takeaways. Specific instructions to “do this, not that” are seldom present. Sometimes there are no takeaways; the research itself yields findings that are unclear, contradictory or highly nuanced.

One clear path ahead for the fact-checking community would be to more regularly use research that is crafted for general audiences, such as “The Debunking Handbook.” An editor could assign fact-checkers to read the material, which is very accessible, then convene a group discussion on how fact-checking reports might follow some of the handbook’s recommendations.

Additionally, the International Fact-Checking Network could report and write more regularly on academic research that is particularly applicable to fact-checking journalism. The network could conduct online question-and-answer sessions with individual researchers for the benefit of fact-checkers and other interested communities. The network could also serve as a convening power to bring academics and fact-checkers together for conferences or regular webinars.

Finally, fact-checkers themselves could more widely acknowledge that they need to apply more effort to finding and integrating research findings into their work, even when such efforts feel costly in terms of time and energy. Building more bridges between the academic and fact-checking communities could increase the impact of fact-checking and achieve more clarity of knowledge for all.

Related: Survey results highlights (pdf)

Correction: Due to a data error, this report originally overreported how much awareness fact-checkers said they had about academic research. The correct figure is that only 17% of respondents strongly agreed or agreed that they were aware of most of the research that exists about fact-checking.

Angie Drobnic Holan is the director of the International Fact-Checking Network, which supports and promotes fact-checking worldwide. Before assuming that role in June 2023, Holan…

