Perception of Emotion in Human-Robot Interaction / Muhammad Faisal Zia

By: Zia, Muhammad Faisal
Contributor(s): Supervisor: Dr. Sara Ali
Material type: Text
Islamabad: SMME-NUST; 2022
Description: 59 p.; soft copy; 30 cm
Subject(s): MS Robotics and Intelligent Machine Engineering
DDC classification: 629.8

Perception of emotion is an intuitive reflection of a person's internal state that requires no verbal communication. Visual emotion recognition has been studied broadly, and several end-to-end deep neural network (DNN)-based and machine learning-based models have been proposed, but they cannot readily be deployed on low-specification devices such as robots and vehicles. DNN-based Facial Emotion Recognition (FER) approaches eliminate the drawbacks of conventional handcrafted feature-based FER methods; even so, they suffer from high processing costs and large memory requirements, so their application in fields such as Human-Robot Interaction (HRI) and Human-Computer Interaction (HCI) is constrained by the available hardware. In this study, we present a computationally inexpensive and robust FER system for the perception of six basic emotions (disgust, surprise, fear, anger, happiness, and sadness) that is capable of running on embedded devices with constrained specifications. After pre-processing the input images, geometric features are extracted from the detected facial landmarks, taking into account the relative spatial positions of influential landmarks. The extracted features are then used to train an SVM classifier. The proposed FER system was trained and evaluated experimentally on two databases, the Karolinska Directed Emotional Faces (KDEF) and the Extended Cohn-Kanade (CK+) databases. Fusion of the KDEF and CK+ datasets at the training level was also employed to generalize the system's response to variations in ethnicity, race, and national and provincial backgrounds. The results show that the proposed FER system is suited to real-time embedded applications with constrained specifications, yielding accuracies of 96.8%, 86.7%, and 86.4% on CK+, KDEF, and the fusion of CK+ and KDEF, respectively. As part of our future research objectives, the developed system will enable a robotic agent to perceive emotion and interact naturally during HRI without the need for additional hardware.
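The pipeline described in the abstract (pre-processing, facial landmark detection, geometric feature extraction, and SVM classification) can be sketched roughly as follows. This is a minimal illustration under stated assumptions rather than the thesis implementation: it assumes landmarks have already been located by a standard detector such as a 68-point facial landmark model, and the feature definition (normalized pairwise landmark distances), the RBF kernel, and the function names are illustrative choices, not details taken from the thesis.

# Minimal sketch of a landmark-based geometric-feature + SVM pipeline.
# Illustrative only: landmark detection is assumed to happen elsewhere
# (e.g., a 68-point facial landmark detector), and the feature
# definition, kernel, and hyperparameters are assumptions.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC


def geometric_features(landmarks: np.ndarray) -> np.ndarray:
    """Map an (N, 2) array of facial landmark coordinates to a
    scale-invariant vector of pairwise landmark distances."""
    # Center and normalize so the features do not depend on face size
    # or image resolution (an assumed convention).
    center = landmarks.mean(axis=0)
    scale = np.linalg.norm(landmarks - center, axis=1).mean()
    pts = (landmarks - center) / scale
    # All pairwise Euclidean distances between landmarks encode the
    # relative spatial positions of influential points.
    diffs = pts[:, None, :] - pts[None, :, :]
    dists = np.linalg.norm(diffs, axis=-1)
    rows, cols = np.triu_indices(len(pts), k=1)
    return dists[rows, cols]


def train_fer_classifier(landmark_sets, labels):
    """Train an SVM on geometric features; `landmark_sets` is a list of
    (N, 2) landmark arrays, `labels` the six emotion classes (0..5)."""
    X = np.stack([geometric_features(lm) for lm in landmark_sets])
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    clf.fit(X, labels)
    return clf

Under this sketch, fusing the CK+ and KDEF training sets amounts to concatenating their landmark arrays and labels before calling train_fer_classifier; at test time, geometric_features is applied to a new face's landmarks and the fitted classifier's predict method returns one of the six emotion labels.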
