Introduction
Humans have an innate ability to recognize and distinguish between faces, and now computers can do the same. This opens up a wide range of applications. Face detection and recognition can be used to improve access control and security, as the latest Apple iPhone does (see gif below), to process payments without physical cards (the iPhone does this too), to identify criminals, and to enable personalized healthcare and other services. Face detection and recognition is a heavily researched topic and there are many resources available online. We have tried multiple open source projects to find the ones that are simplest to implement while still being accurate. We have also created a pipeline for detection, recognition and emotion understanding on any input image.
This blog is divided into 3 parts:
- Facial Detection — Ability to detect the location of a face in any input image or frame. The output is the bounding box coordinates of the detected faces.
- Facial Recognition — Compare multiple faces to identify which faces belong to the same person. This is done by comparing face embedding vectors.
- Emotion Detection — Classifying the emotion on the face as happy, angry, sad, neutral, surprise, disgust or fear.
So let’s get started!
Facial Detection
Facial detection is the first part of our pipeline. We have used the Python library face_recognition, which we found easy to install and very accurate at detecting faces. The library scans the input image and returns the bounding box coordinates of all detected faces.
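A minimal sketch of how this looks with the face_recognition library (the image filename here is just a placeholder):

```python
import face_recognition

# Load the input image (the path is a placeholder for your own file)
image = face_recognition.load_image_file("group_photo.jpg")

# Detect all faces; each entry is a (top, right, bottom, left) bounding box
face_locations = face_recognition.face_locations(image)

print(f"Found {len(face_locations)} face(s) in this image")
for top, right, bottom, left in face_locations:
    print(f"Face at top: {top}, right: {right}, bottom: {bottom}, left: {left}")
```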
Facial Recognition
Facial recognition verifies whether two faces belong to the same person. Its uses are huge in security, biometrics, entertainment, personal safety, etc. The same Python library, face_recognition, used for face detection can also be used for face recognition, and in our testing it performed well. Given two faces, it compares them and returns the result as True or False. The steps involved in facial recognition are listed below (a short code sketch follows the list):
- Find the face in an image
- Analyze its facial features
- Compare the features of the 2 input faces
- Return True if they match, else False
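Here is a minimal sketch of these steps with face_recognition (the image filenames are placeholders, and the snippet assumes each image contains at least one detectable face):

```python
import face_recognition

# Load the two images to compare (paths are placeholders)
known_image = face_recognition.load_image_file("person_a.jpg")
unknown_image = face_recognition.load_image_file("person_b.jpg")

# Find each face and compute its embedding vector
known_encoding = face_recognition.face_encodings(known_image)[0]
unknown_encoding = face_recognition.face_encodings(unknown_image)[0]

# Compare the embeddings; compare_faces returns a list of True/False results
results = face_recognition.compare_faces([known_encoding], unknown_encoding)
print("Match" if results[0] else "No match")
```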
Emotion Detection
Humans are used to picking up non-verbal cues from facial emotions, and now computers are also getting better at reading emotions. So how do we detect emotions in an image? We have built a CNN to detect emotions. The emotions are classified into 7 classes — happy, sad, fear, disgust, angry, neutral and surprise.
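As a rough illustration, a small CNN for 7-class emotion classification could look like the Keras sketch below. The layer sizes and the 48x48 grayscale input shape are assumptions for the example, not the exact architecture used in our pipeline:

```python
from tensorflow.keras import layers, models

# Illustrative CNN for 7-class emotion classification.
# Input: 48x48 grayscale face crops (an assumption; the actual
# preprocessing and layer sizes may differ).
model = models.Sequential([
    layers.Conv2D(32, (3, 3), activation="relu", input_shape=(48, 48, 1)),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(64, (3, 3), activation="relu"),
    layers.MaxPooling2D((2, 2)),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dropout(0.5),
    # 7 outputs: happy, sad, fear, disgust, angry, neutral, surprise
    layers.Dense(7, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
```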