TEC Talks: Misinformation and Disinformation – How Social Media’s Obsession with Scale Supercharged Disinformation

Subscribe to the ThinkND podcast on Apple, Spotify, or Google.

Featured Speakers: 

  • Joan Donovan, Research Director of the Shorenstein Center on Media, Politics and Public Policy, Harvard University
  • Mark McKenna ’97, John P. Murphy Foundation Professor of Law; Founding Director, Notre Dame Technology Ethics Center
  • Elizabeth M. Renieris, Professor of the Practice; Founding Director, Notre Dame-IBM Technology Ethics Lab

In the aftermath of the insurrection at the U.S. Capitol building on January 6th, 2021, Joan Donovan, PhD, Research Director of the Shorenstein Center on Media, Politics and Public Policy, wrote an article called “Everything Open will be Exploited” for the Harvard Business Review. Donovan was angry that “we could see it coming”: the event had been presaged by escalating violent speech on the internet and by violence at several state capitols. Despite researchers, journalists, and civil society organizations all sounding the alarm in advance, Donovan saw a willful ignorance of the problem of scale and of how it tied into the events of the day.

Donovan turned to an example of harm related to scale: medical misinformation. When people seeking accurate, timely, relevant, local medical information instead receive misinformation, the consequences are serious, because medical misinformation can lead individuals to change their health behaviors. Before diving into that discussion, Donovan presented several important definitions, including media, manipulation, disinformation, and networked conspiracies.

Donovan noted that misinformation can be seen as a lifecycle with five stages: organization and planning; seeding and development across the open web and platforms; responses from industry, politicians, and civil society; mitigation; and adjustment. Scale comes into play in the “responses” phase, when individuals with online audiences, such as politicians, activists, and journalists, may amplify the disinformation. Until recently, few mitigation strategies were used; those that exist include journalists debunking the disinformation and the subjects “clapping back” to refute it. Adjustments come both from the parties generating disinformation and, increasingly, from internet platforms changing their policies, such as by banning, removing, or flagging content that promotes disinformation.

Donovan stepped back to discuss the internet as a system around which society is ordered. While one may think of the internet as an “information commons,” it is more of an “advertising system that blankets the world.” Gatekeepers to the advertising system are extremely influential on political, social, and economic life. These gatekeepers, such as Facebook and Amazon, don’t see themselves as information services, and are under no public interest obligations in the way that other media, such as radio, are.

Donovan emphasized that data matters in the aggregate; organizations that are interested in data care not so much about individuals’ data, but about the connections between massive amounts of individual data. These organizations seek to avoid the politicization of this data, but in the lead-up to the 2020 election, different organizations took different policy approaches to the issue. For example, Facebook did nothing to either promote or censor political speech, while Twitter banned all political ads.

Circling back to COVID-19, Donovan became interested in internet searches about where COVID-19 came from, as this type of question became a “networked terrain” for disinformers. Her research team noted a spike in scientific publishing around COVID-19 and coined the term “cloaked science” for the tactic of presenting scientific jargon, technical language, and graphs to lend credibility to claims being made. Cloaked science spreads through preprint servers, data repositories, and journals, as well as through the baiting of journalists.

Donovan then dove into a specific example of cloaked science to illustrate the media manipulation lifecycle. A news story alleging that COVID-19 was a bioweapon developed in China was published on a Chinese news site with close links to right-wing news media in the United States (planning). A scientific paper by a scientist alleging that COVID-19 was created in a lab was posted to a scientific preprint site and gained over one million views (seeding). This was unusual, as most scientific papers receive only a few hundred downloads, even those from top experts in the field. The story was covered in American right-wing news media, with the scientist portrayed as a whistleblower who had been silenced by the Chinese government, fled to the United States, and was then silenced again by American social media companies, since Twitter had taken down her account for disinformation, along with the links to her paper on the preprint site (response). Scientists and experts in the U.S. quickly debunked the contents of the scientist’s paper, and Facebook added interstitial warnings to her media interviews (mitigation). In the final phase, the scientist wrote more papers and posted them to the preprint server, essentially exploiting the server’s openness to spread disinformation. The preprint site responded by flagging content on its website to warn readers that it could be misleading (adaptation).

Visit the event page for more.


  • Medical misinformation is dangerous because individuals looking for accurate, timely, relevant, and local information may instead encounter falsehoods that lead them to change their health behaviors. (6:50)
  • Media manipulation is a lifecycle, consisting of planning, seeding, response, mitigation, and adaptation. (11:40)
  • The internet is a system around which society is ordered, but is better understood as an advertising system rather than an information commons. The parties that manage the gates of the advertising system wield massive political, social, and economic influence. (19:02)
  • Individual data doesn’t matter. Data only matters in the aggregate – that is, in relation to other data. Internet platforms seek to avoid the politicization of the data they gather. (22:00)
  • Everything open will be exploited, and being a good steward of information will require a reduction of scale, which is bad for the platforms’ business model. (36:09)

  • “Openness is at issue. Openness really only works as a value on the net when scale is a measure by which you judge everything, including profits.” (Joan Donovan, 3:13)
  • “When people respond is usually when [disinformation] becomes a thing. That is when it actually scales. And that’s what’s important in thinking about why scale matters here, because if nobody responds, and someone is just wrong on the internet, which happens every day, I guess the big key here is to not be the one that is wrong and achieving scale. … we really look at the responses by industries that are being affected, activists, politicians, journalists, that is, people who have audiences online and can make the piece of disinformation or the network behind it seem more important than they are.” (Joan Donovan, 12:35)
  • “As data capture and data harvesting has become in and of itself its own business, we just have to be wary that it’s not necessarily that we’re building an information commons. We’re building an advertising system that blankets the world. And whoever owns the gates to that advertising system, especially who gets to enter and what they get to see, they actually have a much bigger influence on our politics and our society as well as our economy because of that facilitation of coordination, like, massive social coordination.” (Joan Donovan, 19:00)
  • “What’s at stake here is not actually individuals and their uniqueness. We’re all digital snowflakes; however, data only matters in the aggregate. … That is, in a relationship with other kinds of data.” (Joan Donovan, 21:53)
  • “Everything open will be exploited. You have to think about what it means to run the platform. You have to care about the community that is engaging with the things you’re hosting. And what that actually implies is reducing the scale to a human level where you can be good stewards of information. Of course, that is terrible for business.” (Joan Donovan, 36:00)

Technology Ethics Center, University of Notre Dame