Facial Recognition – the Good, the Bad, the Ugly
Facial recognition is arguably one of the most prominent applications of computer vision across all verticals, ranging from pure entertainment all the way up to law enforcement and security. I wrote this to inspire you to experiment and familiarise yourself with the technology. I'll also point out some of the pain points and threats that stem from its potential misuse. Another reason I wrote the article is to show my support for the Black Lives Matter initiative. I was inspired by a report by the Algorithmic Justice League and decided to elaborate on the concerns they've raised.
What is Facial Recognition
Facial or face recognition technologies are designed to identify you as a person by analysing your face. They are capable of extracting a human face out of an image or a video. Think security cameras on the streets, or at busy spots like festivals, sports events or an airport terminal. Facial recognition isn't as reliable as other biometric means of identification, such as fingerprints, but it's favoured for its non-invasiveness.
How Facial Recognition Works
As humans, we are good at visuals. We can discern a familiar face in a large crowd of people. We also remember the facial features of family members or those we care about – the size of a nose, protruding cheekbones, the shape of the eyes etc. Where you see a face, a facial recognition system sees data backed by algorithms. Detected faces are stored as, well, data. Once stored, new detections are compared against a large database of others and feed into iterative improvements of the detection model.
Face recognition comprises the following steps.
- Step 1: Face Detection – As you scan your ID card or sit in front of a camera or walk past it, the system will identify and frame your face.
- Step 2: Data Capture – The detected face is extracted and stored as data (image) for a later comparison.
- Step 3: Face Analysis – The captured image is transformed into a mathematical model, a facial signature. The system looks for facial features, such as the edges of your eyes, the distance between nose and mouth and many others. There are likely to be dozens of features the system will use to locate facial landmarks, also known as key points. Depending on the algorithm, the system ends up with either a 2D or a 3D model of your face.
- Step 4: Face Recognition – Equipped with a facial signature, the system can now easily compare your face with millions of others. That's the interesting part, and a controversial one, as you will see later in this post.
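To make Steps 3 and 4 a bit more concrete, here is a minimal sketch of the comparison stage. The vectors, names and the 0.8 threshold below are all made up for illustration; a real system derives signatures (embeddings) with 128+ dimensions from a deep network, but the matching principle – measuring how close two signature vectors are – is the same.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two facial-signature vectors (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def is_same_person(sig_a: np.ndarray, sig_b: np.ndarray, threshold: float = 0.8) -> bool:
    """Declare a match when the two signatures are similar enough."""
    return cosine_similarity(sig_a, sig_b) >= threshold

# Toy 4-dimensional "signatures"; real embeddings are much longer.
alice_scan_1 = np.array([0.9, 0.1, 0.4, 0.7])
alice_scan_2 = np.array([0.85, 0.15, 0.38, 0.72])  # same face, new photo
bob = np.array([0.1, 0.9, 0.7, 0.2])

print(is_same_person(alice_scan_1, alice_scan_2))  # True
print(is_same_person(alice_scan_1, bob))           # False
```

Note that the system never answers with a hard yes or no internally; it produces a similarity score, and someone has to pick the threshold. That choice is exactly where the accuracy trade-offs discussed later come from.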
Demo
Time to cut the bull*. I wanted to show you how easily accessible face recognition is as a technology. FAAMG and other front runners in the industry did a pretty good job at breaking it down for those of us who are not well versed in data science. You don't even have to be a developer or technically inclined. What I am going to show didn't warrant a single line of code.
Amazon Rekognition allows you to upload images and videos into the cloud and let them be analysed by various machine learning models. Sign-up is mandatory, but there is a free plan / trial period.
In the first step I simply uploaded one of my personal pictures into the cloud. Note how Amazon promptly estimated my gender, mood and age category based on facial landmarks.
Next, I was curious about the actual recognition side of things. I pulled out a corporate photo, where I look like a stranger, in my opinion. Amazon were quick to prove me wrong!
To be honest, it’s not that surprising after all, is it? Key facial features won’t change as much. In my next experiment I was curious about fooling the system by introducing some obstacles, such as growing a beard or wearing shades. Yup, it made a difference.
In the first round, I let Amazon compare two images that are still fairly close to each other, but where I am wearing shades. I was identified as the same person with close to 87% confidence. Not too shabby.
Lastly, I finally managed to confuse the system enough that the match was rejected. However, if you look closely at the results, you will see that Amazon were still well over 80% confident they were looking at the very same person!
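If you'd rather script the experiment than click through the console, the comparison I ran maps onto Rekognition's `compare_faces` API via boto3. The call itself is only shown as a comment, since it needs an AWS account, credentials and real image files (the file names here are placeholders); the runnable part is a small helper that pulls the best similarity score out of a response shaped like the one Amazon returned for my shades photos.

```python
def best_match_similarity(response: dict) -> float:
    """Return the highest Similarity score (in percent) from a
    compare_faces response, or 0.0 when no face matched."""
    matches = response.get("FaceMatches", [])
    return max((m["Similarity"] for m in matches), default=0.0)

# What a real call looks like (requires boto3 and AWS credentials):
#   import boto3
#   client = boto3.client("rekognition")
#   with open("me.jpg", "rb") as src, open("me_shades.jpg", "rb") as tgt:
#       response = client.compare_faces(
#           SourceImage={"Bytes": src.read()},
#           TargetImage={"Bytes": tgt.read()},
#           SimilarityThreshold=80,
#       )

# A trimmed-down response, shaped like the one from my shades experiment:
sample_response = {
    "FaceMatches": [{"Similarity": 86.9, "Face": {"Confidence": 99.9}}],
    "UnmatchedFaces": [],
}
print(best_match_similarity(sample_response))  # 86.9
```

The `SimilarityThreshold` parameter is the service-side equivalent of the threshold choice from earlier: faces scoring below it land in `UnmatchedFaces` instead of `FaceMatches`.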
This concludes my little experiment. I hope you feel inspired to learn more about this fascinating technology. I am sure you will have questions about privacy, and those are valid concerns. Let's have a look at practical implications and use cases.
Shall I Be Worried?
Given the high density of monitoring systems in built-up areas, you probably should be. Apparently, there are more than half a million CCTV cameras in London, and an average Londoner is caught on camera hundreds of times a day. I wouldn't go as far as questioning the need for increased monitoring in the interest of security and crime prevention. Nevertheless, CCTVs are omnipresent, and that's a fact. Don't take my word for it, do your own research – for example, in this article you can find information about the density of surveillance cameras in several countries worldwide. Personally, I found some of these numbers alarming.
Biased Algorithms
While data being collected without our explicit consent is arguably the single biggest threat, particular care must be taken when the technology is used for law enforcement purposes. Misidentification can have dire consequences, and there are more issues to be concerned about. This article suggests a high rate of false positives, which warrants human intervention in order to avoid unlawful arrests.
The Algorithmic Justice League provides a detailed summary of risks concerning human rights and privacy.
- Lowered accuracy with underrepresented groups can lead to misidentification
- Misidentification leads to discrimination – Say your school's monitoring system is biased towards a certain ethnic group you are not part of. As a result, you are frequently reported as being absent. What does that say about you as a student?
- Over-reliance on emotion analysis during the hiring process results in discrimination – Models are often trained on a company's top performers, while some candidates simply lack rich facial expressions, don't follow the favoured speech patterns etc.
- Insufficient regulations of data ownership can lead to misuse and harassment – Consider monitoring of tenants and their family members by landlords due to a simple premise: Only those who live here should be able to enter the building, right?
The list goes on and the report includes specific examples too. I encourage you to read it.
Meaningful Use of Facial Recognition
On the bright side, there are many legitimate uses of the technology. Instead of focusing on the obvious ones, I've picked three use cases that are perhaps less known or deemed unusual.
- Integration with NLP to Help Visually Impaired People – A vast amount of content is multimedia, i.e. visual. Objects, not just faces, recognised by a computer vision system can be described with words – verbs, nouns, adjectives etc. That's where NLP comes into the picture (literally!) and helps create scene descriptions that are rich in detail and sound natural. Combining these two AI disciplines brings many more benefits, namely in robotics. I recommend reading this article.
- Disease Diagnostics – Face recognition has the potential to identify certain diseases and rare genetic disorders. Could it become part of a regular medical check-up in the future? Volume 43 of the Pepperdine Law Review looks at the tradeoffs between the novel technology and privacy in detail.
- Help Prevent ATM Scams – When withdrawing cash, your face is captured and compared against a stored ID card in the bank’s database. Simple and efficient.
Thanks for reading this far, and I hope you feel inspired to find out more about facial recognition. Please let me know your thoughts in the comment section below. In my next post, I'll show you how to connect your device's camera to a simple Python application and do some magic. Stay tuned and thanks for your time.