TEC Talks: Misinformation and Disinformation – The Ethics of Verification, Identity, and Anonymity on the Internet

Subscribe to the ThinkND podcast on Apple, Spotify, or Google.

Featured Speakers: 

  • David Magerman, Differential VC
  • Jillian C. York, Director for International Freedom of Expression, Electronic Frontier Foundation

Jillian C. York, Director for International Freedom of Expression at the Electronic Frontier Foundation, said she first homed in on questions of anonymity and identity verification online nearly a decade ago. She shared a story about Facebook’s real-names policy, which she said affects people from minority backgrounds in unpredictable and unfair ways. She said this led her to look more closely at the critical role anonymity plays in many people’s online experiences. David Magerman, co-founder and Managing Partner at Differential Ventures, said he spent most of his career in quantitative finance because it was where the richest, most accessible data and computing could be found at the time. He said he first became interested in online anonymity and the use of data to manipulate individual users after one of his colleagues discreetly funded efforts such as Cambridge Analytica’s work in the 2016 election.

Magerman said that as an early data scientist he promoted the use of all types of human-centered data, and he now feels it is incumbent on people in his position to address the problems this has created across society. He cited the Freedom From Facebook campaign as one place where he and others like him can get involved.

York said anonymity is paramount to ensuring freedom of expression online. She said the progress we have seen in many social movements over the past few years has been largely due to the ability of people in marginalized groups to connect online, often anonymously, under pseudonyms, or without the knowledge of their offline communities. This form of grassroots work is important all over the world, she said, giving the example of the Arab Spring.

Magerman emphasized the important difference between private speech and public discourse. Private conversation, he said, should be able to happen without “oversight”; the idea that our identities and the content of our conversations are being recorded, tracked, and monitored is problematic. Public speech is different, he argued, and in those cases anonymity is more problematic than it is helpful: the source of misinformation about COVID-19, for example, should be knowable. The scale and rate of disinformation is amplified by the lack of accountability in public discourse, Magerman said. For the sake of the social contract, where speech has an impact on broader society, he argued it is incumbent on all of us not to shield our identity when we fund a movement or otherwise promote a particular speaker or their speech. York disagreed somewhat with Magerman’s position but said she sees both sides; many marginalized people, such as those living in non-democratic countries, need anonymity just to function, she said.

The speakers discussed the idea of online identification tracking as a way to hold individuals who commit crimes accountable. Magerman was more supportive of the idea, while York raised several problems with it. Magerman suggested governments need enforcement abilities in the digital realm just as they have in the physical realm. He said the first step is to put in place a government we can trust with that type of authority, which would implicitly require harvesting and storing data tied to individuals’ true identities. York did not disagree but asked whether, even if this could be accomplished in the US, it would be a realistic model for people who do not live there.

The conversation moved to surveillance capitalism and private companies’ profiting from user data. York argued the core issue with this model is that it prevents freedom of expression, and she said these companies in some ways have more control over our speech than any government does today. Magerman, for his part, was asked about his prior statement that the problem with the internet is that we have “too much privacy.” He defended it by saying we should not expect our data online to be private: given data security breaches by hackers alone, we should assume our data is not secure anywhere, that the people we least want to have this information already have it, and that we should “behave accordingly” online. He stressed that it is unhelpful to try to protect data through law and policy while failing to protect it from hackers; doing so empowers the bad actors without empowering the good actors who could use our data to better society on and offline. He suggested that making audit logs accessible and tracking data use at that level would be a helpful step forward. York also endorsed audit-log-style user empowerment and other transparency mechanisms between companies and users. She suggested, for example, that private companies should inform users when their data has been requested by or shared with a government, and said this is starting to happen through the Digital Services Act and Digital Markets Act in Europe.

In terms of solutions, Magerman suggested a two-pronged approach. First, educating the public about how their data is used: if people had the opportunity to decide for themselves what data to share and for what purpose, free of manipulation, we would be closer to understanding how to move forward. Second, encrypting and auditing the use of behavioral data, which would prevent unwarranted uses while providing the opportunity to study how the data is being used and determine which uses should be warranted (and which should not). Magerman and York both expressed some hesitation about cryptocurrencies for the same reasons they are concerned about the unmitigated use of individual user data: crypto makes a person’s behavior and activity easy to track in yet another, particularly sensitive, realm of life – wealth and financial decisions. However, York said that in some countries crypto feels like a necessity because of the poor monetary governance of the local currency.

Visit the event page for more.


  • Jillian C. York argues that anonymity is important to promote free expression on the internet, especially grassroots organizing among members of marginalized groups around the world.
  • David Magerman argues that there is a difference between private speech – which should be absolutely free from oversight – and public discourse, which should be attached to a real identity to promote accountability.
  • The speakers agreed that internet platforms must be more accountable and transparent to their users, perhaps through audit logging that would give users insight into how and when their data is used.
  • The burgeoning use of cryptocurrency provides yet another way for individual data to be tracked, especially sensitive financial data.

  • “Looking historically, many of the movements we now think of and accept today really just wouldn’t have existed without the right to anonymity, or at least the right to private conversation. When we think about scientific development, when we think about LGBT rights, women’s rights, all of these things – even the civil rights movement – the progress that we’ve seen happened because people were able to talk in ways they felt safe, and anonymity is an enabler of that.”  (Jillian C. York, 8:23)
  • “We definitely deserve the right to anonymous communication privately, but for the sake of social contracts, where conversation and speech has an impact on society, we have an obligation to identify ourselves and identify our backgrounds and motivations.”  (David Magerman, 12:10)
  • “To the idea of ‘too much privacy,’ I kind of want to flip that a little bit and talk about what companies keep private, because I think that’s the other piece of this puzzle. That these companies are completely opaque in so many of their practices – whether we’re talking about encryption, which David was just talking about, or the ways in which they moderate content, the ways in which they apply their own rules…The vast majority of social media platforms do not notify users when their data has been requested by their government.”  (Jillian C. York, 31:14)
  • “People ought to be transparent about what their goals are, what their intentions are. If Google would just say ‘We want to be evil,’ I’d be OK with them. The problem is that they went through this whole thing of ‘Don’t be evil,’ and then they went and were evil for a while.”  (David Magerman, 51:15)

Technology Ethics Center, University of Notre Dame