INDEX
Sr. No. Particulars
1 Introduction
2 Kinds of Emotions That Can Be Detected
3 How It Works
4 How It Is Trained
5 Different Fields
6 Disadvantages
7 Summary
Introduction
Emotion recognition is one of the many facial recognition technologies that have developed and grown through the years. Currently, facial emotion recognition software is used to allow a program to examine and process the expressions on a human's face. Using advanced image processing, this software functions like a human brain, which makes it capable of recognizing emotions too.
It is AI, or "Artificial Intelligence", that detects and studies different facial expressions and combines them with any additional information presented to it. This is useful for a variety of purposes, including investigations and interviews, and allows authorities to detect the emotions of a person with just the use of technology.
KINDS OF EMOTIONS THAT CAN BE DETECTED
Emotion recognition can detect and recognize different facial expressions using Facial Expression Analysis. Below is a table showing emotions along with their common corresponding facial expressions:
Emotion   Facial Expression
Anger     Lowered and furrowed eyebrows; intense gaze; raised chin
Joy       Raised corners of mouth into a smile
Surprise  Dropped jaw; raised brows; wide eyes
Fear      Open mouth; wide eyes; furrowed brows
Sadness   Furrowed brows; lip corner depressor
Anxiety   Biting of the lips
HOW IT WORKS
Facial emotion detection technology is becoming more and more advanced every year. The AI detects and studies facial expressions and weighs many factors to conclude what emotion the person is showing, such as:
• Location of the eyebrows and eyes
• Position of the mouth
• Distinct changes in the facial features
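Factors like these can be approximated as simple geometric measurements over facial landmark points. Below is a minimal sketch; the landmark names and pixel coordinates are hypothetical, and a real system would obtain them from a face-landmark detector rather than hard-code them:

```python
import math

def distance(p, q):
    """Euclidean distance between two landmark points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def geometric_features(landmarks):
    """Derive simple distances that shift with expression changes."""
    return {
        # gap between brow and eye (shrinks when the brows are furrowed)
        "brow_eye_gap": distance(landmarks["left_brow"], landmarks["left_eye"]),
        # vertical mouth opening (grows with surprise or fear)
        "mouth_opening": distance(landmarks["upper_lip"], landmarks["lower_lip"]),
        # mouth width (grows when the corners are raised into a smile)
        "mouth_width": distance(landmarks["left_mouth_corner"],
                                landmarks["right_mouth_corner"]),
    }

# Hypothetical landmarks for a neutral face (pixel coordinates).
neutral = {
    "left_brow": (30, 40), "left_eye": (30, 55),
    "upper_lip": (60, 100), "lower_lip": (60, 108),
    "left_mouth_corner": (45, 104), "right_mouth_corner": (75, 104),
}
print(geometric_features(neutral))
```

Comparing these distances against a neutral baseline is one simple way a system can notice, for example, that the mouth has widened into a smile.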
A study published in 2012 on emotion recognition summarized the system's algorithm as a pipeline:

Input Image → Preprocessing and Resize → Difference Measurements → Emotion Recognition

with a Knowledge Base feeding the Difference Measurements step.
KNOWLEDGE BASE
This base contains images that are used for comparison and
recognizing emotion variations. The images are stored in the database.
Every time an input is given to the system, it finds a relevant image
from its knowledge base by comparing the stored pictures and the
input to come up with an output.
PREPROCESSING AND RESIZE
This step enhances the input and removes different types of noises.
After that, the input image will be resized, typically with the use of the
eye selection method.
DIFFERENCE MEASUREMENTS
During this step, the system will find any differences between the
input image and the stored images and will finally lead to the emotion
recognition step.
EMOTION RECOGNITION
This is the final step of the process. The comparison is made, and the
final output is given depending on the differences found.
HOW IT IS TRAINED
The software for emotion detection undergoes training to ensure that outputs are correct and appropriate. Understanding the inputs and outputs is essential, so the algorithms must learn to recognize human emotions. There are two approaches used to achieve that:
Categorical
Under this approach, emotions fall into a finite set of fixed classes: contempt, sadness, surprise, disgust, anger, happiness, and fear.
Dimensional
Under this approach, emotions cannot be defined concretely and instead exist on a spectrum. The PAD emotional state model has three dimensions, while the Circumplex model of affect uses two.
The choice of approach has important consequences for modelling emotions with Machine Learning. When a categorical model of human emotion is chosen, a classifier would likely be created, and texts and images would be labelled with human emotions like "happy," "sad," etc.
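The two approaches can be contrasted in a short sketch. The PAD axis ranges and the sample values below are illustrative assumptions, not taken from any published model:

```python
# Categorical: a finite, fixed set of emotion classes.
CATEGORIES = {"contempt", "sadness", "surprise", "disgust",
              "anger", "happiness", "fear"}

def categorical_label(prediction):
    """A classifier's output must be one of the fixed classes."""
    if prediction not in CATEGORIES:
        raise ValueError(f"unknown emotion class: {prediction}")
    return prediction

# Dimensional: an emotion is a point on a continuous spectrum.
# PAD uses three axes: Pleasure, Arousal, Dominance (here scaled to [-1, 1]).
def pad_point(pleasure, arousal, dominance):
    assert all(-1.0 <= v <= 1.0 for v in (pleasure, arousal, dominance))
    return (pleasure, arousal, dominance)

print(categorical_label("happiness"))  # a discrete class label
print(pad_point(0.8, 0.5, 0.4))        # a continuous point in emotion space
```

A categorical model maps naturally onto a classifier, while a dimensional model maps onto a regressor predicting a point in a continuous space.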
DIFFERENT FIELDS
Choosing which model or training data to use is not the end of it, as there are different inputs that can be given to the system to analyse. Below are the main fields for emotion recognition:
VIDEO
It is one of the major datasets available in Affective Computing, and lots of research is still ongoing to understand how to use video in emotion recognition. An example of modern software in this field is the camera on mobile phones.
IMAGE
Just like video, it is one of the major datasets available in emotion recognition. Some researchers have stated that emotions classified in images could be applied to the automatic tagging of pictures with emotional categories and the sorting of video sequences into various genres.
SPEECH
Usually, speech is transcribed into text for analysis, but this method is not well suited to emotion recognition. Various research is still ongoing to explore the use of speech itself, instead of transcribed text, for emotion recognition applications.
TEXT
Document-Term Matrix, or DTM, is the usual data structure used for texts. It is a matrix recording the frequency of each word in each document. However, it is not well suited to determining emotions, since it works on individual words. A text is perceived based on its tone, punctuation, etc., and these factors are all being considered by researchers in their ongoing search for new data structures for texts.
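A DTM, and the limitation described above, can be illustrated with a minimal standard-library sketch (the sample documents are made up):

```python
from collections import Counter

def document_term_matrix(documents):
    """Return (vocabulary, rows of per-document word counts)."""
    tokenised = [doc.lower().split() for doc in documents]
    vocabulary = sorted({word for doc in tokenised for word in doc})
    rows = [[Counter(doc)[word] for word in vocabulary] for doc in tokenised]
    return vocabulary, rows

docs = ["great great service", "service was slow"]
vocab, matrix = document_term_matrix(docs)
print(vocab)   # ['great', 'service', 'slow', 'was']
print(matrix)  # [[2, 1, 0, 0], [0, 1, 1, 1]]
# Tone leaves no trace in these counts: a sarcastic "great great service"
# produces exactly the same row as a sincere one.
```

Because only per-word frequencies survive, sarcasm, tone, and punctuation, all of which carry emotional signal, are lost.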
CONVERSATION
Emotion recognition in conversation focuses on acquiring emotions from discussions between two or more persons. Datasets in this field commonly come from free samples on social platforms. However, there are still many challenges, including the presence of sarcasm in a dialogue, emotion shifts between interlocutors, and conversational-context modelling.
WHAT EMOTION RECOGNITION IS USED FOR TODAY
Nowadays, emotion recognition is used for various purposes that some people do not even notice on a daily basis. Here are some of the areas where emotion recognition is beneficial:
SECURITY MEASURES
Emotion recognition is already used by schools and other institutions, since it can help prevent violence and improve the overall security of a place.
HR ASSISTANCE
There are companies that use AI with emotion recognition API capabilities as HR assistants. The system helps determine whether a candidate is honest and truly interested in the position by evaluating intonations, facial expressions, and keywords, and by creating a report for the human recruiters' final assessment.
CUSTOMER SERVICE
There are systems launched nowadays that are installed in customer service centres. Using cameras equipped with artificial intelligence, a customer's emotions can be compared before and after visiting the centre to determine how satisfied they are with the service they have received. If the score is low, the system can advise the employees to improve the service quality.
DIFFERENTLY ABLED CHILDREN
There is a project using a system in Google Glass smart glasses that aims to
help autistic children interpret the feelings of people around them. When a
child interacts with other people, clues about the other person’s emotions are
provided using graphics and sound.
AUDIENCE ENGAGEMENT
Companies are also using emotion recognition to determine their business
outcomes in terms of the audience’s emotional responses. Apple also
released a new feature in their iPhones where an emoji is designed to mimic a
person’s facial expressions, called Animoji.
VIDEO GAME TESTING
Video games are tested to gain feedback from the user to determine if the
companies have succeeded in their goals. Using emotion recognition during
these testing phases, the emotions a user is experiencing in real-time can be
understood, and their feedback can be incorporated in making the final
product.
HEALTHCARE
The healthcare industry is taking advantage of facial emotion recognition nowadays. It is used to determine whether a patient needs medicine, or to help physicians decide whom to see first.
DISADVANTAGES
Just like any developing technology, emotion recognition is not perfect and has its imperfections and challenges. One challenge is that datasets are labelled by people, and different people can read and interpret emotions in different ways. Also, some visual cues like furrowed eyebrows can indicate emotions other than anger, while other cues may be subtle hints of anger even though they are not obvious.
Another issue faced by this technology arises when detecting emotions from people of different colours. There are models that detect more anger in Black people. This means that training sets need to be more diverse, and experts are already working to fix this.
Aside from the challenges it poses, emotion recognition also seems to be causing moral problems, as it sometimes may invade personal space. There are places where emotion recognition is prohibited by law, since people do not like AI interpreting their emotions. There are also places, like California, where law enforcers are banned from using such technology, since it violates the rights of citizens and can lead to them being wary of the authorities.
SUMMARY
• A study published in 2015 showed that recognition results improve when confounding factors are removed from the input images and the lighting is adequate.
• AI Emotion Recognition technology is continuously being studied and improved to solve and provide solutions to arising risks and issues, and to avoid any cultural and ethical problems in society.