An Evolutionary Case for Better Privacy Regulations
Host Kirsten Martin is joined by Laura Brandimarte, an assistant professor of management information systems at the University of Arizona's Eller College of Management. Holding a Ph.D. in public policy and management from Carnegie Mellon University, she specializes in privacy and behavioral economics, including the psychology of self-disclosure and the social dynamics of privacy decision-making and information-sharing.

Laura came on the show to talk about a paper she coauthored with Alessandro Acquisti (Carnegie Mellon University) and Jeff Hancock (Stanford University) titled "How privacy's past may shape its future," which appeared in January in Science magazine. Drawing on research suggesting that privacy has been present throughout human history, Laura explains that privacy management is about our ability to moderate what we share and with whom, not about never sharing anything. But she notes that the strategies humans have developed evolutionarily to manage our privacy—e.g., having a conversation in hushed tones so no one but the person we're speaking to hears—often don't have an online equivalent and thus aren't helpful in that context.

Laura also discusses why an overreliance on the "notice and consent" approach to privacy—typified by a website presenting users with a long set of terms and conditions when they go to use it—makes it difficult, if not impossible, for people to arrive at the best privacy decisions for themselves. Drawing on an analogy from the automotive industry and citing a lack of incentives for data holders to change how they handle that data, she and her coauthors argue for regulations that move beyond notice and consent and shift responsibility for sound privacy practices to those gathering our data in the first place.
Listen to the Episode
Presented by Notre Dame Technology Ethics Center
- Article Discussed in the Episode: “How privacy’s past may shape its future”
- Laura’s Bio
- Episode Transcript
At the end of each episode, Kirsten asks for a recommendation about another scholar in tech ethics whose work our guest is particularly excited about. Laura highlighted Joy Buolamwini, founder of the Algorithmic Justice League, an organization devoted to equitable and accountable AI.