TEC Talks: Misinformation and Disinformation – Learning from Our Mistakes: What Can Smaller Platforms Learn from Ethical Challenges at Scale?

Subscribe to the ThinkND podcast on Apple, Spotify, or Google.

Featured Speakers: 

  • Julie Owono, Executive Director, Internet Sans Frontières; Member, Facebook Oversight Board
  • Clint Smith, Chief Legal Officer, Discord
  • Elizabeth M. Renieris, Professor of the Practice; Founding Director, Notre Dame-IBM Technology Ethics Lab

The panelists started by identifying the core challenges that platforms, particularly small platforms, are grappling with today. Clint Smith, Chief Legal Officer at Discord, said the primary challenges have to do with “scale and speed.” A company like Discord, which started just a few years ago, has found itself with tens of millions of users who often engage with the product for hours every day. The speed of that growth and the dynamic nature of the content to be moderated pose a challenge under those conditions, particularly as communications on the platform shift even more toward video and voice.

Julie Owono, Executive Director at Internet Sans Frontières, said most of these platforms are created in the U.S. but aim, often successfully, for global reach. Yet the problems Smith described, and their solutions, play out locally. Localization can help platforms anticipate and understand how to deal with problems as they arise. Using disinformation as an example, she said that open debates of important issues in some countries are often classified too quickly as disinformation campaigns and moderated by those in Silicon Valley who run the apps. These decisions, made geographically far from the discussion, limit important discourse and freedom of expression.

Smith said that Discord, as a “second-wave” company, started with a strong product design that put the user first. On Discord, this translates into giving users many options to filter content, a form of user-based content moderation. Users also experience Discord through small communities, and these communities largely moderate themselves through delegated authority. Discord steps in only as a final, third tier of moderation when user- and community-level controls do not suffice. Owono said the idea that users bear the brunt of keeping platform content safe is important: it gives users a strong incentive to internalize their role in their communities, something they may do only subconsciously in offline interactions. That said, she noted that moderators can sometimes become gatekeepers, with harmful results.
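This layered model can be pictured as a simple escalation pipeline. The sketch below is not Discord’s implementation; every type, function, and rule in it (Verdict, Message, userFilter, communityModerate, moderate, flaggedByReport) is a hypothetical assumption, written only to make the three tiers concrete.

```typescript
// Hypothetical sketch of a three-tier moderation flow; none of these names
// come from Discord's actual API.

type Verdict = "allowed" | "filtered" | "removed" | "escalated";

interface Message {
  authorId: string;
  communityId: string;
  text: string;
}

// Tier 1: user-level filters (e.g., a per-user keyword blocklist).
function userFilter(msg: Message, blockedWords: string[]): boolean {
  return blockedWords.some((w) => msg.text.toLowerCase().includes(w));
}

// Tier 2: community-level rules enforced by delegated moderators.
function communityModerate(msg: Message, communityRules: RegExp[]): boolean {
  return communityRules.some((rule) => rule.test(msg.text));
}

// Tier 3: the platform itself reviews only what the first two tiers
// did not resolve, such as user reports.
function moderate(
  msg: Message,
  blockedWords: string[],
  communityRules: RegExp[],
  flaggedByReport: boolean
): Verdict {
  if (userFilter(msg, blockedWords)) return "filtered"; // hidden for this user only
  if (communityModerate(msg, communityRules)) return "removed"; // removed by community rules
  if (flaggedByReport) return "escalated"; // sent to the platform's own review queue
  return "allowed";
}

// Example: a message passes user and community checks but was reported,
// so it escalates to platform review.
console.log(
  moderate(
    { authorId: "u1", communityId: "c1", text: "hello world" },
    ["spamword"],
    [/scam-link\.example/],
    true
  )
); // -> "escalated"
```

In this toy flow, each tier handles what it can, and only unresolved cases reach the platform, mirroring Smith’s description of Discord intervening last.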

Smith said Discord cares greatly about its users outside of the U.S., and is troubled that report volume per user outside the U.S. is lower than for U.S.-based users. He asked Owono why she thought this might be, and whether she thinks there are barriers that keep non-U.S. users from making reports. Owono said that, first, the report button is not a feature “that speaks to everyone”; in some parts of the world, for example, calling the police is not an effective strategy. Second, freedom of expression is a newer primary value for users in some countries; many have only felt empowered with freedom of expression rights for perhaps the last five years. With this in mind, Owono said reporting requires more than engagement with the content itself from these users; it requires engagement with the philosophy of community guidelines overall. She suggested giving these users more explanation so they feel empowered to use the report button. This would have a beneficial effect at the societal level as well, as these users will pass that empowerment on to their offline networks.

Turning to Discord’s business model, Smith said Discord makes money directly from users. It carries no advertising, so Discord has no incentive for anything to go viral on its platform. Instead, users pay Discord directly with money, rather than data, for transparent “extras” on the platform. He said this transparency is important so that users understand what they are getting and how much it costs. This isn’t unusual: Netflix users pay for that service and, as a result, do not assume the service relies on their data. Owono said that ad-supported business models have helped freedom of expression in some ways: in many parts of the world, it was difficult to get news because of declining revenues for traditional press outlets. However, she said transparency, like that which Smith described at Discord, is paramount. Increasing transparency into what data is being used, and how, is step one for customers. Platforms should not assume that people do not want that transparency because they would rather not know what is going on behind the scenes; they do.

Smith said anonymity and pseudonymity are important for many Discord users. For example, LGBTQ+ kids who are unable to identify as such in their offline lives are kept safe by these policies on Discord. Furthermore, if Discord is used to plan an offline physical crime, Discord provides the plain text of those messages to authorities, and it is open with users about this. He said this strikes a balance between fully encrypted messaging services like Signal and completely open, non-anonymous (at least under their own policies) platforms like Facebook. Owono said the key term in this issue is “law enforcement requests.” Companies should do due diligence before responding to these requests, to ensure they are proper and not coming from authoritarian governments that will abuse their power with that information. All platform policies on this topic should focus on how the rules and processes would affect the most marginalized users in the most authoritarian communities offline. Smith agreed, adding that any U.S.-based legal reform will only protect U.S. users and will not offer solutions for users outside the U.S.

Visit the event page for more.

Key Takeaways:

  • The biggest challenge for mid-sized, “second wave” platforms is scale and speed, as they have found themselves with tens of millions of users from all over the world in just a few years.
  • Users taking on the burdens of content moderation may help them more fully realize their role in, and impact on, their online communities, but it also creates the danger of moderators becoming gatekeepers.
  • Especially in geographies outside of the U.S., getting users to effectively use the “Report” function will take more effort and engagement on the part of the platform, as users are often new to the concept of free expression and the philosophy of community guidelines that support the Report function.
  • Transparency on the part of the platforms about when, how, and why users’ data is used is key.
  • Policies should be developed by taking into account how they might impact the most marginalized members of a platform’s user base.

Quotes:

  • “This idea that users will actually bear the brunt of keeping communities safe or will be at the forefront of doing that is of course very important. It empowers users as well and opens their minds on issues they’ve never thought about in community, in a physical space. That said, there are several challenges around that. First of all, how are we making sure that the community is diverse? Of course, we all have our own values, but there are values that are beyond us and that basically bring us all together and help us keep our communities as peaceful as possible. I’m wondering if by transferring the responsibility of the peacefulness of the spaces, are we making sure also we’re having necessary conversations around, what are actually those values that, no matter the space we’re in should be upheld?” (Julie Owono, 14:10)
  • “One of the benefits of being a smaller company [is] being able to iterate on community guidelines, and we consistently engage with civil society groups and academics, getting input on what our community guidelines should be and that they’re clear, and that they are translated into the languages that our users are interacting with us [in].” (Clint Smith, 18:33)
  • “There are certainly parts of the world that trigger way more rapid interventions than others, and that’s unfortunate…We’ve read that there were basically money incentives. ‘Well, this country, we don’t have much revenue there, why should we care? Why should we spend so much?’ But I think that’s a false assumption, and that’s a very bad market decision, because actually those small markets, where there are no revenues, at least for now, these are actually the markets where things are tested, precisely because they are small, precisely because there are no protections, and precisely because nobody cares.” (Julie Owono, 23:58)
  • “In almost all of the discussions is the tension of these large platforms, these gatekeeper platforms, and are they subject to separate rules from the smaller internet entrepreneurs around, or the mid-sized platforms like Discord?” (Clint Smith, 48:35)

Technology Ethics Center, University of Notre Dame