Numbers Can Lie: When algorithms work perfectly but fail miserably – Not All Faces are Alike

Subscribe to the ThinkND podcast on Apple, Spotify, or Google.

Featured Speakers: 

  • Roger Woodard, Teaching Professor in the Department of Applied and Computational Mathematics and Statistics and Director of the Online MS in Data Science, University of Notre Dame
  • Marie Lynn Miranda, Charles and Jill Fischer Provost and Professor of Applied and Computational Mathematics and Statistics, University of Notre Dame
  • Kevin Bowyer, Schubmehl-Prein Professor of Computer Science and Engineering, University of Notre Dame

The second virtual event in the Numbers Can Lie series featured a discussion led by Roger Woodard, teaching professor in the Department of Applied and Computational Mathematics and Statistics, with Marie Lynn Miranda, Charles and Jill Fischer Provost and Professor of Applied and Computational Mathematics and Statistics at the University of Notre Dame, and Kevin Bowyer, the Schubmehl-Prein Professor of Computer Science and Engineering. Miranda spoke briefly about her research in children’s environmental health, after which Bowyer discussed his research and involvement in facial recognition. The conversation focused on what facial recognition technology is, how it is being used, and what ethical questions are being raised as the technology develops.

Although Miranda’s time on the call was short, she highlighted her research on environmental health, specifically lead contamination. A distinctive feature of her approach is ensuring that important details within a data set are preserved: when data are averaged, information that affects the results of the research is often lost. Miranda added that research is intellectually refreshing alongside her duties as provost, and that she is happy to still have time to conduct it.

The discussion then shifted to the main portion of the talk, on facial recognition. Bowyer defined biometrics as measuring a living entity in order to identify it; facial recognition is one form of biometrics, alongside signature, speech, and gait recognition. A biometric system analyzes an image and converts its measurements into a set of numbers known as a feature vector, which can then be compared against other feature vectors. Occasionally, these comparisons produce false matches between different faces.
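To make the feature-vector idea concrete, here is a minimal sketch (not taken from any system discussed in the talk) of how two feature vectors might be compared using cosine similarity and a decision threshold. The vectors, dimensions, and threshold value are all illustrative; real face-recognition systems use much higher-dimensional vectors, and choosing the threshold is exactly where false matches and false non-matches trade off:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def is_match(a, b, threshold=0.9):
    """Declare a match if similarity clears the threshold.

    A lower threshold catches more true matches but also
    produces more false matches between different faces.
    """
    return cosine_similarity(a, b) >= threshold

# Illustrative feature vectors for three face images (made up for this sketch).
face_1 = [0.12, 0.80, 0.35, 0.44]
face_2 = [0.13, 0.78, 0.36, 0.45]   # near-duplicate of face_1
face_3 = [0.90, 0.10, 0.05, 0.70]   # a different face

print(is_match(face_1, face_2))  # similar vectors: match
print(is_match(face_1, face_3))  # dissimilar vectors: no match
```

The key design point is that the system never compares images directly, only their numeric summaries, which is why accuracy hinges on how well the feature vector captures what makes a face distinctive.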

Bowyer offered specific examples of how facial recognition technology is being used. One is driver monitoring in newer cars: a camera detects whether the driver is watching the road or beginning to fall asleep, and can alert the driver if there is a problem. Facial recognition is also used to help catch shoplifters in retail stores. But the technology is not always used ethically; China, for example, has used it to track ethnic minorities. The same technology can be put to positive or detrimental use, and Bowyer noted that people have been wrongly arrested because of false matches produced by facial recognition systems. Recognizing these errors and biases is important as the technology continues to advance.

The conversation closed with a discussion of what Bowyer’s research team at Notre Dame is doing to address ethical issues raised by this type of technology. Bowyer stressed the importance of recognizing the technology’s flaws and inaccuracies, then asking why those inaccuracies are present and how they can be fixed. Some facial recognition products claim to predict from facial images alone whether someone is a criminal, or what someone’s sexual orientation is. Bowyer called such uses dangerous and asserted the importance of not causing unintended harm as these technologies continue to advance.

The famous saying, “numbers don’t lie,” might work when reporting the score of a football game, but even then, they don’t tell the whole story. In this lecture, you will learn the basics behind data science and discover how easily human bias can be encoded into computer models. The results of algorithms have implications not only for who may obtain a fair loan, but also for who stays in prison and who is released, and who will be favored by machine learning “decisions.” With so many parts of our lives impacted by Big Data, how do scientists balance algorithms and ethics?

Visit the event page for more.


  • “Research is very intellectually refreshing for me and I can’t stand to not have the questions we are working on unanswered” (Marie Lynn Miranda; 16:39)
  • Biometrics is the idea of measuring a living entity to identify it (20:14)
  • “All of us in biometrics think nothing is like what you see on C.S.I. or any similar show” (Kevin Bowyer; 23:57)
  • Biometrics uses numbers and measurements to analyze an image and comes up with what is known as a feature vector. Occasionally, false matches between faces can occur (24:07)
  • “Facial recognition has become much more effective. It has greater accuracy and the error rates have gone down substantially. That’s made it go from something that people play around with in the lab to things that people actually use in the real world” (Roger Woodard; 30:04)
  • As facial recognition has developed, its applications continue to grow (30:32)
  • Ethical considerations are an important part of how this technology will develop (49:40)

Digest 157 | Lucy Family Institute for Data & Society | University of Notre Dame