TEC Talks: Misinformation and Disinformation – What Do We Value? The Ethics of Tech Accountability

Subscribe to the ThinkND podcast on Apple, Spotify, or Google.

Featured Speakers: 

  • Roger McNamee, Author of Zucked 
  • Ifeoma Ozoma, Founder and Principal, Earthseed
  • Mark McKenna ’97, John P. Murphy Foundation Professor of Law; Founding Director, Notre Dame Technology Ethics Center
  • Elizabeth M. Renieris, Professor of the Practice; Founding Director, Notre Dame-IBM Technology Ethics Lab

Roger McNamee, author of the New York Times bestseller Zucked: Waking Up to the Facebook Catastrophe, opened his talk by stating that there is a conflict of value systems between engineering, which optimizes for efficiency, and democracy, which treasures Enlightenment values such as self-determination.  When a software system reaches nationwide scale, that conflict pits efficiency against systems that value personal choice and the ability to express oneself through self-determining democratic institutions.  Ifeoma Ozoma, founder and principal of Earthseed, opened her part of the talk with a related observation: there is not a lack of values in tech today, but rather a value system that prioritizes profitability.  That priority, by default, curtails other values such as safety and fairness.  As an example, she pointed to the platforms’ defense of free speech as a universal right and as an excuse for not removing harmful content — a pursuit of engagement that, she argued, creates a safe space only for certain people and derails the speech and engagement opportunities of everyone else on the platform.  Ozoma agreed with McNamee that this structure, which functions essentially as a 50/50 power split between engineering and business, shows that platforms inarguably prioritize growth over other values, and that valuing growth over “actual human beings” creates the sorts of problems we face on social media today.

On the topic of scale, McNamee pointed out that, on its face, consumers enjoy the platforms’ engagement model — it delivers a steady stream of dopamine hits.  The scale of these platforms, however, has undermined democratic values and has already derailed democracies, as shown by Brexit and several other elections worldwide.  He also discussed the “Real Facebook Oversight Board,” which criticizes the official Facebook Oversight Board; McNamee likewise criticized the official board as overly legalistic, functioning merely as a complaint system.  Such a system does not address the broader responsibility Facebook must acknowledge it has to ensure that the information ecosystem is not diluted or polluted by problematic posts.  Ozoma agreed that the Facebook Oversight Board is insufficient and does nothing to address the responsibilities of Facebook’s actual board of directors; real change, she suggested, might come only if Zuckerberg’s majority control of that board were addressed.  Ozoma also said that scale itself is not the issue; rather, safety needs to be prioritized regardless of a platform’s scale.  By safety, Ozoma meant an ethics- and values-driven thoughtfulness across all product surfaces, from the handling of problematic posts to the design of recommendation algorithms.  McNamee punctuated this point with a reminder that Section 230’s effective blanket immunity has prevented any accountability on the part of tech companies to anticipate or mitigate harm.

The speakers were then pressed on the specific types of harms being created.  Ozoma highlighted health misinformation, arguing that understanding how health misinformation spreads can shed light on misinformation systems of all kinds.  A lack of focus on the communities first harmed ultimately harms everyone, because the same strategies first deployed against small minority groups are recycled with new topics and messaging across larger groups.  McNamee argued that the platforms profit enormously from misinformation because it boosts engagement, and that any claim from the platforms that they are unaware of their role in the problem should be dismissed out of hand.

McNamee then raised the way conspiracy theories proliferate across platforms, and how this proliferation is core to the platforms’ business model: hate speech, disinformation, and misinformation are the types of posts that maximize engagement.  The two separate realities such proliferation has created, he said, must be addressed.  Ozoma highlighted the pattern of tech ethicists raising these issues and then being pushed out of their companies, such as the recent case of Timnit Gebru, who was hired as a tech ethicist at Google and subsequently forced out.  Ozoma urged listeners to be aware of the power of non-disclosure agreements in this process: because of NDAs, the companies control all of the information needed to conduct true research, and thus control the narrative by deciding with whom to share that information.

In short, both speakers highlighted that there would be a real cost to the platforms’ bottom line if they treated safety as a value — which, McNamee argued, is why regulation is key.  Elizabeth Renieris asked: if misinformation and disinformation are merely symptoms of a greater problem, what would regulation look like, and what would it address?  McNamee suggested a focus on privacy as a form of self-determination, along with regulation of data sharing, ownership, and other practices that undermine users’ control over their own data.  He also recommended a greater focus on safety, meaning holding engineers directly accountable for poor designs, as we do in physical engineering industries.  Third, he recommended pursuing antitrust actions more seriously, in particular using the threat of criminal jail sentences as leverage for a better set of outcomes.  Ozoma recommended a focus on worker power: first, increasing worker protections, including stronger whistleblower protections and decoupling healthcare from employment, so that everyone has equal freedom to speak up against wrongdoing.  Such protections would let workers share more freely the details and nuances of day-to-day decision-making that undermine user privacy, for example.  She pointed to legislation currently being pushed in California as a sample law that addresses these issues with solutions.  McNamee echoed that democratic engagement is paramount right now to address these core ethical concerns.

Visit the event page for more.


  • There is a conflict between “efficiency values” and “enlightenment values.” The technology industry prioritizes efficiency values at the expense of enlightenment values, such as self-determination and democratic process.
  • Understanding health misinformation is a helpful lens for understanding all types of misinformation.
  • Conspiracy theories and misinformation proliferate on the platforms because they maximize engagement.
  • Addressing platform safety concerns, especially with respect to mis- and disinformation, may harm the platforms’ bottom line and require regulatory solutions.
  • Other solutions proposed include tech worker empowerment, non-disclosure agreement (NDA) reform, and new antitrust regulation.

  • “The United States has a culture in business that the shareholder is the only stakeholder that matters. And if you think about it, if you only care about one of your stakeholders – you don’t care about your employees, you don’t care about the communities where your employees live, you don’t care about your customers or suppliers or the country you live in – that’s going to excuse all manner of bad behavior.” (Roger McNamee, 17:15)
  • “Part of the issue is that we don’t have any sort of consensus across the [technology] industry as a whole about what it means to put safety first and to design with safety in mind.”  (Ifeoma Ozoma, 31:50)

Technology Ethics Center, University of Notre Dame