Misinformation and Disinformation

Conspiracy theories and other false claims have always been part of our discourse, even (and perhaps especially) our politics. But modern technologies have changed the scale of the problem, with profound implications for our culture and for democracy. This lecture focuses on the role of technology in promoting mis- and disinformation, the ethical problems that creates, and the technical, legal, and institutional responses best suited to our modern challenges.
Apr 26, 2024

Top 9 Learning Moments

  1. Conspiracy theories and misinformation proliferate on the platforms because they maximize engagement.
  2. [At Facebook,] we didn't want the public to think that we actually could clean up certain parts of the platform. And so you'll hear a lot about how it's too hard or there's not enough technical solutions, or all of these things. I'll tell you that some of the things that we proposed in our teams were 100% technically feasible, and [Facebook] still wouldn't do it. – Yael Eisenstat
  3. Narrative solutions are one way to address the problem of disinformation, because they make the problem more approachable.
  4. Combating mis- and disinformation requires an interdisciplinary team, which in turn requires diverse voices on the team.
  5. We definitely deserve the right to anonymous communication privately, but for the sake of social contracts, where conversation and speech has an impact on society, we have an obligation to identify ourselves and identify our backgrounds and motivations. – David Magerman
  6. People ought to be transparent about what their goals are, what their intentions are. If Google would just say 'We want to be evil,' I'd be OK with them. The problem is that they went through this whole thing of 'Don't be evil,' and then they went and were evil for a while. – David Magerman
  7. The biggest challenge for mid-sized, "second wave" platforms is scale and speed, as they have found themselves with tens of millions of users from all over the world in just a few years.
  8. Users taking on the burdens of content moderation may help them more fully realize their role in, and impact on, their online communities, but it also creates the danger of moderators becoming gatekeepers.
  9. One of the benefits of being a smaller company is being able to iterate on community guidelines, and we consistently engage with civil society groups and academics, getting input on what our community guidelines should be, that they're clear, and that they are translated into the languages in which our users are interacting with us. – Clint Smith

Interested in learning more?

This series is hosted by ThinkND, the University of Notre Dame's online learning community that connects you with videos, podcasts, articles, courses, and other resources to inspire minds and spark conversations on everything from faith and politics to science, technology, and your career.

Listen to the Series

Subscribe to the ThinkND podcast on Apple, Spotify, or Google.

Featured Speakers

Joan Donovan, Research Director of the Shorenstein Center on Media, Politics and Public Policy, Harvard University

Roger McNamee, Author of Zucked

Ifeoma Ozoma, Founder and Principal, Earthseed

Danielle Citron, Professor of Law, University of Virginia School of Law

Yael Eisenstat, formerly of the CIA and Facebook

Siva Vaidhyanathan, Professor, University of Virginia

Ryan Calo, Lane Powell & D. Wayne Gittinger Endowed Professorship; Professor of Law, University of Washington

Mutale Nkonde, 2020–2021 Fellow, Notre Dame Institute for Advanced Study

David Magerman, Differential VC

Julie Owono, Executive Director; Member of the Facebook Oversight Board, Internet Sans Frontieres

Clint Smith, Chief Legal Officer, Discord
