DEER: Deep emotion-sets for fine-grained emotion recognition

Author Identifier

Nima Mirnateghi: https://orcid.org/0000-0002-1814-7452

Syed Afaq Ali Shah: https://orcid.org/0000-0003-2181-8445

Document Type

Conference Proceeding

Publication Title

Proceedings - 2024 25th International Conference on Digital Image Computing: Techniques and Applications, DICTA 2024

First Page

158

Last Page

165

Publisher

IEEE

School

School of Science

RAS ID

71853

Comments

Tahir, S., Mirnateghi, N., Shah, S. A. A., & Sohel, F. (2024, November). DEER: Deep emotion-sets for fine-grained emotion recognition. In 2024 International Conference on Digital Image Computing: Techniques and Applications (DICTA) (pp. 158-165). IEEE. https://doi.org/10.1109/DICTA63115.2024.00034

Abstract

For robots to interact effectively with humans in the wild, they must accurately recognize human emotions. Achieving this requires capturing the salient facial features that reliably convey emotion. Most facial emotion recognition (FER) research has classified emotions from single-shot images, and in some cases several networks have been used to vote on each image. These approaches work well, but there is room for improvement in precision. In this paper, we propose emotion-sets, a unique encoding of face image data (spanning various people and face angles), for classifying emotions, as opposed to conventional single-image-based classification. For each image in an emotion-set, the prediction confidence for each emotion is used as a vote, and the results are produced by combining two distinct voting methods, Majority Voting and Weighted Voting. The proposed method achieves state-of-the-art accuracy on the Facial Emotion Recognition 2013 (FER2013), Cohn-Kanade (CK+), and Facial Emotion Recognition Group (FERG) datasets without using techniques such as data augmentation, feature extraction, or extra training data, which most state-of-the-art works rely on. Our experimental findings indicate that the proposed emotion-set classification yields more accurate results than current state-of-the-art FER methods.
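To illustrate the set-level fusion described above, the following is a minimal sketch in Python of combining Majority Voting and Weighted Voting over per-image confidence scores. The function name, the emotion labels, and the combination rule (normalise-and-average) are illustrative assumptions, not the authors' implementation; the paper's exact fusion may differ.

```python
import numpy as np

# Hypothetical label set; the actual classes depend on the dataset used.
EMOTIONS = ["anger", "disgust", "fear", "happy", "neutral", "sad", "surprise"]

def classify_emotion_set(confidences: np.ndarray) -> str:
    """Fuse per-image predictions for one emotion-set.

    confidences: (n_images, n_classes) softmax confidence scores
    produced by a base FER classifier for each image in the set.
    """
    n_classes = confidences.shape[1]

    # Majority Voting: each image casts one vote for its top-scoring class.
    top_classes = confidences.argmax(axis=1)
    majority_votes = np.bincount(top_classes, minlength=n_classes)

    # Weighted Voting: each image contributes its full confidence vector,
    # so classes accumulate confidence mass rather than discrete votes.
    weighted_votes = confidences.sum(axis=0)

    # Combine the two schemes (assumed here: normalise each and average).
    combined = (
        majority_votes / majority_votes.sum()
        + weighted_votes / weighted_votes.sum()
    )
    return EMOTIONS[int(combined.argmax())]

# Usage: an emotion-set of three images over seven emotion classes.
rng = np.random.default_rng(0)
scores = rng.dirichlet(np.ones(len(EMOTIONS)), size=3)
print(classify_emotion_set(scores))
```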

DOI

10.1109/DICTA63115.2024.00034

Access Rights

subscription content
