The Science Lab: Not All Faces are Alike

Facial recognition technology has drawn controversy over claims of bias, and research at Notre Dame has shown that face recognition accuracy differs across demographic groups. The same research is beginning to uncover the causes of those differences. This event also explores how data can benefit society, along with the challenges of relying on the numbers alone.

Read the biographies of the speakers here: Part 2: Not All Faces are Alike

Read the event recap, watch the video, or listen to the podcast below.


Highlights

  • “Research is very intellectually refreshing for me and I can’t stand to not have the questions we are working on unanswered” (Marie Lynn Miranda; 16:39)
  • Biometrics is the idea of measuring a living entity to identify it (20:14)
  • “All of us in biometrics think nothing is like what you see on C.S.I. or any similar show” (Kevin Bowyer; 23:57)
  • Biometrics uses numbers and measurements to analyze an image and comes up with what is known as a feature vector. Occasionally, false matches between faces can occur (24:07)
  • “Facial recognition has become much more effective. It has greater accuracy and the error rates have gone down substantially. That’s made it go from something that people play around with in the lab to things that people actually use in the real world” (Roger Woodard; 30:04)
  • As facial recognition has developed, its applications continue to grow (30:32)
  • Ethical considerations are an important part of how this technology will develop (49:40)

Recap

The second virtual event in the Numbers Can Lie series featured a discussion led by Roger Woodard, teaching professor in the Department of Applied and Computational Mathematics and Statistics, with Marie Lynn Miranda, Charles and Jill Fischer Provost and Professor of Applied and Computational Mathematics and Statistics at the University of Notre Dame, and Kevin Bowyer, the Shubmehl-Prein Professor of Computer Science and Engineering. Miranda talked briefly about her research in children’s environmental health, followed by Bowyer and his research and involvement in facial recognition. The key points made in this event focused on what facial recognition technology is, what it’s being used for, and finally, what ethical questions are being asked as facial recognition technology develops.

Although Miranda’s time on the call was short, she highlighted her research on environmental health, specifically lead contamination. Her approach is distinctive in that she makes sure important details within a data set are captured: when data is averaged, information that affects the results of the research is often lost. Miranda added that her research is intellectually refreshing alongside her job as provost, and that she is happy to have the time to conduct it.
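A toy illustration of that point, using hypothetical numbers (not from Miranda's research): an overall average can look unremarkable while hiding a subgroup at serious risk.

```python
# Hypothetical blood lead levels (ug/dL) from two neighborhoods.
neighborhood_lead_levels = {
    "A": [1.2, 1.4, 1.1, 1.3],   # low-exposure area
    "B": [8.9, 9.4, 10.1, 9.8],  # high-exposure area
}

# The overall average looks moderate...
all_values = [v for levels in neighborhood_lead_levels.values() for v in levels]
overall_mean = sum(all_values) / len(all_values)
print(f"Overall mean: {overall_mean:.2f} ug/dL")  # 5.40

# ...but per-neighborhood averages reveal the detail the overall mean hides.
for name, levels in neighborhood_lead_levels.items():
    print(f"Neighborhood {name}: {sum(levels) / len(levels):.2f} ug/dL")
```

Here the overall mean of 5.40 ug/dL conceals the fact that one neighborhood averages 9.55 ug/dL while the other averages 1.25, which is exactly the kind of detail that disappears when data are averaged too early.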

The discussion shifted to the main portion of the talk on facial recognition. Bowyer defined biometrics as measuring a living entity in order to identify it; facial recognition is one form of biometrics, alongside recognition of signatures, speech, and gait patterns. To do this, a biometric system analyzes an image and reduces it to a set of numbers and measurements known as a feature vector. Occasionally, the algorithms produce false matches between faces.
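The feature-vector idea can be sketched in a few lines. This is a simplified illustration with made-up four-dimensional vectors (real systems use vectors with hundreds of dimensions produced by a trained network); it compares vectors with cosine similarity against a threshold and shows how a different person's vector can still clear the threshold, producing a false match.

```python
import math

def cosine_similarity(u, v):
    """Cosine of the angle between two feature vectors (1.0 = identical direction)."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

# Toy feature vectors (hypothetical values for illustration only).
alice_enrolled  = [0.90, 0.10, 0.30, 0.50]
alice_new_photo = [0.85, 0.15, 0.28, 0.52]  # same person, slightly different image
bob             = [0.88, 0.12, 0.33, 0.48]  # different person with similar features

THRESHOLD = 0.99  # similarity above this counts as a "match"

print(cosine_similarity(alice_enrolled, alice_new_photo))  # genuine match
print(cosine_similarity(alice_enrolled, bob))              # also above threshold: a false match
```

In this toy example both comparisons exceed the threshold, so Bob is incorrectly matched to Alice's enrolled image, the same kind of error that, at scale, has led to wrongful arrests.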

Bowyer offered specific examples of how facial recognition technology is being used. One application is driver-vigilance monitoring in newer cars: a camera detects whether the driver is looking at the road or beginning to fall asleep and can issue an alert if there is a problem. Facial recognition is also being used to help catch shoplifters in retail stores. The technology is not always used ethically, however; for example, China has used it to track ethnic minorities. It has the power to be used both for good and to cause harm. Bowyer mentioned that people have been wrongly arrested because of false matches produced by facial recognition systems. Recognizing these errors and biases within the technology is important as time and technology progress.

The conversation closed with a discussion of what Bowyer’s research team at Notre Dame is doing to address ethical issues raised by this type of technology. Bowyer stressed the importance of recognizing flaws and inaccuracies in the technology and, from there, asking why those inaccuracies are present and how they can be fixed. Some facial recognition products claim to predict whether someone is a criminal, or what someone’s sexual orientation is, from facial images alone. Bowyer called some of these uses dangerous and asserted the importance of not causing unintended harm as these technologies continue to advance.


Listen to the discussion with Marie Lynn Miranda, the Charles and Jill Fischer Provost of the University of Notre Dame and a professor of applied and computational mathematics and statistics (ACMS), and Kevin Bowyer, the Shubmehl-Prein Professor of Computer Science and Engineering, recorded live on Friday, October 23, at 12 p.m. ET.



