Where We’re Going: Privacy – Campus Viewpoint


Featured Speakers: 

  • Corey Angst, Professor of IT, Analytics, and Operations at Notre Dame’s Mendoza College of Business
  • Kirsten Martin, William P. and Hazel B. White Professor of Technology Ethics at the University of Notre Dame’s Mendoza College of Business
  • Mark McKenna, John P. Murphy Foundation Professor of Law at the Notre Dame Law School and the Director of the Notre Dame Technology Ethics Center
  • Ann Tenbrunsel, David E. Gallo Professor of Business Ethics in the College of Business Administration at the University of Notre Dame

In the first session of Where We’re Going: Privacy, moderator Mark McKenna opened by introducing the interdisciplinary discussion on technology ethics. He observed that “in an ever-increasing data-driven world, privacy considerations have grown ever more complicated and important,” and noted that many faculty colleagues dedicate their research to privacy policies and expectations. With that, he introduced Corey Angst, who shared that he has come to see privacy as an unexpectedly subjective concept, one understood differently from person to person. Comfort with the information being collected varies as well: each person must ask, “How much privacy am I willing to give up in exchange for something else?”

Ann Tenbrunsel, who conducts this research alongside Angst, expanded on exactly what their team studies: the perceived ethicality of notices of privacy practices, or NPPs. She explained that how ethical an NPP was judged to be depended largely on how its written statement was framed: people tended to view data used internally and for necessary purposes as more ethical than data used externally or for strategic purposes. During the coronavirus pandemic, ethicality ratings increased across all demographics, suggesting a sense of collective responsibility; yet despite that increase, privacy concerns related to NPPs remained stable. Tenbrunsel expressed curiosity about future research, wondering whether people would eventually become more “worried about the data that was collected [during the pandemic].”

Also addressing individuals’ reactions to collected data, Kirsten Martin discussed her recent research on location data and people’s expectations around it. She described how most people associate the term “location data” with GPS coordinates, which is not always what companies mean by it. Martin noted that, overall, people’s privacy expectations are often not met and that the reality of data collection rarely measures up. Even so, she affirmed that “people assume that their privacy expectations are met in notices.” According to Martin, the notices people find most credible are the vaguer, more informal ones, because these encourage individuals to trust a company and assume their expectations are being met. Knowing this, companies take advantage of that ambiguity, and the assumptions it invites, in their NPPs.

Building on the idea that people distrust how some companies might use their data, Tenbrunsel observed that, with so much data being given out, Americans feel there is a need for more legislation governing companies that collect it. Angst mentioned a study showing that users of Facebook’s virtual reality technology could be identified very specifically, beyond just age, race, or physical traits, simply by using the Oculus headset. McKenna agreed with Angst that there needs to be more mainstream awareness of technology’s inference capabilities, adding that in this “information economy,” people underestimate “the way new technologies can plug into existing data.”

Furthermore, Martin mentioned that even as the topic of data becomes more mainstream, there are aggregators, many of them hard to identify, that gather large amounts of data yet remain largely unregulated. Angst shared a similar concern about the vast amount of data that has already been collected. He emphasized that while we shouldn’t stop doing research, we need to find responsible ways to gather data. In agreement, McKenna elaborated that there need to be clear goals for how data will be used; legal problems tend to arise when data is collected first and decisions about its use are made later. Questions from the audience touched on making privacy notices more straightforward, similar to GDPR practices in Europe, but the panel agreed that being more specific was unlikely to increase trust. McKenna closed the conversation by suggesting it would be beneficial to follow the GDPR’s lead in having plans for how data will be used after it has been gathered.

Visit the event page for more.


  • No one discipline has a monopoly on tech ethics, and the questions that arise from this developing area require interdisciplinary cooperation to answer.
  • Although factors such as wording or context make ethicality ratings on privacy notices fluctuate, concerns about privacy tend to remain static.
  • People tend to give data collectors the benefit of the doubt and assume that their privacy expectations are being met.
  • There should be more mainstream awareness of how much data is being collected and of the capabilities it gives companies that have adequate technology.
  • Data collection is not inherently wrong; however, gathering data in a responsible way should be a priority.

  • “In an ever-increasing data-driven world, privacy considerations have grown ever more complicated and important.” — Mark McKenna, 5:43
  • “Privacy is different to just about everyone you ask.” — Corey Angst, 8:23
  • “Americans want more legislation to regulate privacy laws with all the data being given out.” — Ann Tenbrunsel, 27:47
  • “People underestimate, especially in this information economy, the way that new technologies plug into existing data.” — Mark McKenna, 35:00
  • “Just because we disclose data doesn’t mean that we don’t have privacy expectations as to what happens to that data.” — Kirsten Martin, 43:45

Technology Ethics Center, University of Notre Dame