TEC Talks: A Guide for Handling Mis- and Disinformation

Few contemporary problems can be addressed by reference to a single discipline. Misinformation — a critical issue of our time — is no different. Join us for a conversation between Center for an Informed Public co-founder Ryan Calo, AI for the People founder and CEO Mutale Nkonde, Notre Dame Technology Ethics Center Director Mark McKenna, and Notre Dame-IBM Technology Ethics Lab Director Elizabeth Renieris on the importance of interdisciplinary teams to understanding and resisting misinformation.

Read the biographies of the speakers here: Science and Technology Studies Toolkit: A Guide for Handling Mis- and Disinformation

Read the event recap, watch the video, or listen to the podcast below.


Highlights

  • Narrative solutions are one way to address the problem of disinformation, because they make the problem more approachable. 
  • Combating mis- and disinformation requires an interdisciplinary team, which in turn requires diverse voices on the team. 
  • “Law professors – we don’t do a lot of interdisciplinary research necessarily, but when we do, we train our students to get a sense of a very complex set of issues well enough to be able to persuade and argue about it. We have a facility with trying to get our minds around it in context and in an efficient way and talk about it fluently. And in addition, more so than almost any other discipline, we’re interested in bridging that descriptive to prescriptive barrier.” (Ryan Calo, 10:24)
  • “So much of the advocacy that’s going on in this space is really about the sky falling down…That is not how we galvanize people to action. We actually want to galvanize people to action through hope, through joy, through curiosity, and through hopefully having some fun, so gamifying people into becoming warriors for truth.” (Mutale Nkonde, 15:10)
  • It’s possible for researchers to develop a “hero complex” when working on problems that affect specific communities without ever building a relationship with the community whose problem they purport to solve. Adopting the mindset of a subject matter expert, rather than a spokesperson, can mitigate that. 
  • Marrying harms from mis- and disinformation to existing civil rights protections may be a better solution than relying on ethics, which is difficult to legislate. 
  • “We developed a formal method called the Diverse Voices Project, in which we take our early stage policy recommendations and convene experiential experts from different communities that might be affected by it. And we present them with the document and we ask, what’s broken? That might mean we talk to people living with disabilities, we might talk to folks who have been formerly incarcerated. And in addition to people with that lived experience, our experiential panels also consist of people who study or advocate on behalf of those groups, because sometimes the individual might have the lived experience but not see the patterns that you see from advocating on behalf of that community.”  (Ryan Calo, 37:37)
  • “We’re a communications firm, we’re not a civil rights firm. We’re in service of translating this research. We found that we have to do a lot of world building. Just because I’ve been speaking about facial recognition for God knows how many years, it doesn’t mean that anyone in the communities I’m interested in – Black and brown communities – are thinking about these issues because there are closer quality-of-life issues they’re interested in.”   (Mutale Nkonde, 39:43)
  • Creating deterrents at the diplomatic and commercial levels may also help to mitigate mis- and disinformation. 

Event recap

Ryan Calo, Lane Powell and D. Wayne Gittinger Professor at the University of Washington School of Law, began by discussing novel efforts to educate the public and raise awareness of misinformation and disinformation tactics. Mutale Nkonde, CEO of AI for the People, and Calo both stressed the importance of lawyers in crafting solutions to this problem, even though it begins as a technology and design problem.

Nkonde highlighted that, coming from a journalism background, she is keen to focus the narrative on solutions and make the problem of disinformation approachable rather than paint a picture of the sky falling. If people can grasp disinformation in the context of a familiar narrative, the problem becomes more tractable. She cited instances in which Black people were told not to vote in the presidential election; in response, her organization assembled a team of micro-influencers to counter that narrative.

Combating mis- and disinformation requires an interdisciplinary team, according to Calo. He stressed that there is much work his lab feels it cannot do well for lack of diversity, so having diverse voices on the team is essential. Nkonde added the example of policing: her organization studies how policing is becoming a technical project, with the increasing use of facial recognition technology and other algorithms, and assesses the risks of each of these technical projects in collaboration with legal practitioners.

However, academics should avoid a hero complex in this work, Nkonde said. It is easy to gain notoriety for one’s role in solving these problems without having any relationship with the community for which one purports to solve them. Academia and the tech industry both need more humility about their role in providing solutions, recognizing that it is a subject matter expert role rather than one that entitles academics or technologists to speak on behalf of the affected community. Nkonde pointed to the Netflix film The Social Dilemma, which presented technologists, many of whom contributed to creating the very problems the film discusses, as the architects of the burgeoning solutions. She called this a dishonest narrative: rather than seeking heroes or cultivating a single-genius story, we should recognize that we are all in this together. Calo echoed the point, and both urged greater support from institutions and individuals for communities and people on the ground working toward solutions to problems that affect those communities, even when the problems are technological in origin. 

Nkonde closed by arguing that the biggest piece missing from the conversation is marrying harms to existing civil rights protections rather than relying on ethics, which does not define harm and is therefore difficult to legislate. Calo added that the role of deterrence needs more careful consideration with respect to mis- and disinformation. Quasi-state actors must be deterred at a diplomatic level, while major commercial contributors to misinformation respond to commercial incentives. All such deterrence considerations must be added to the conversation, Calo said.


View the discussion recorded on Monday, April 5, 2021, with Mark McKenna, Elizabeth M. Renieris, and special guests Ryan Calo and Mutale Nkonde.


Listen to the discussion wherever, whenever, on The ThinkND Podcast:


Register to participate in future discussions.
