A Robust Facial Expression Recognition Algorithm Based on Multi-Rate Feature Fusion Scheme
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Park, Seo-Jeon | - |
dc.contributor.author | Kim, Byung-Gyu | - |
dc.contributor.author | Chilamkurti, Naveen | - |
dc.date.accessioned | 2022-04-19T08:49:11Z | - |
dc.date.available | 2022-04-19T08:49:11Z | - |
dc.date.issued | 2021-11 | - |
dc.identifier.issn | 1424-8220 | - |
dc.identifier.issn | 1424-3210 | - |
dc.identifier.uri | https://scholarworks.sookmyung.ac.kr/handle/2020.sw.sookmyung/146144 | - |
dc.description.abstract | In recent years, capturing human emotion has become increasingly important as the field of artificial intelligence (AI) develops. Facial expression recognition (FER) is the task of understanding human emotion through facial expressions. We propose a robust multi-depth network that efficiently classifies facial expressions by feeding it diverse, reinforced features. We design the inputs to the multi-depth network as minimally overlapped frames in order to provide richer spatio-temporal information. To exploit the multi-depth structure, we suggest a multirate-based 3D convolutional neural network (CNN) built on a multirate signal processing scheme. In addition, the input images are normalized adaptively according to their intensity, and the output features of all depth networks are reinforced by a self-attention module. The reinforced features are then concatenated and classified by a joint fusion classifier. On the CK+ database, the proposed scheme achieves a comparable accuracy of 96.23%. On the MMI and GEMEP-FERA databases, it outperforms other state-of-the-art models with accuracies of 96.69% and 99.79%, respectively. On the AFEW database, which is known for its very wild environment, the proposed algorithm achieves an accuracy of 31.02%. | - |
dc.language | English | - |
dc.language.iso | ENG | - |
dc.publisher | MDPI | - |
dc.title | A Robust Facial Expression Recognition Algorithm Based on Multi-Rate Feature Fusion Scheme | - |
dc.type | Article | - |
dc.publisher.location | Switzerland | - |
dc.identifier.doi | 10.3390/s21216954 | - |
dc.identifier.scopusid | 2-s2.0-85117298886 | - |
dc.identifier.wosid | 000720033200001 | - |
dc.identifier.bibliographicCitation | SENSORS, v.21, no.21 | - |
dc.citation.title | SENSORS | - |
dc.citation.volume | 21 | - |
dc.citation.number | 21 | - |
dc.type.docType | Article | - |
dc.description.isOpenAccess | N | - |
dc.description.journalRegisteredClass | scie | - |
dc.description.journalRegisteredClass | scopus | - |
dc.relation.journalResearchArea | Chemistry | - |
dc.relation.journalResearchArea | Engineering | - |
dc.relation.journalResearchArea | Instruments & Instrumentation | - |
dc.relation.journalWebOfScienceCategory | Chemistry, Analytical | - |
dc.relation.journalWebOfScienceCategory | Engineering, Electrical & Electronic | - |
dc.relation.journalWebOfScienceCategory | Instruments & Instrumentation | - |
dc.subject.keywordPlus | FACE | - |
dc.subject.keywordPlus | SYSTEMS | - |
dc.subject.keywordAuthor | deep learning | - |
dc.subject.keywordAuthor | facial expression recognition (FER) | - |
dc.subject.keywordAuthor | 3D convolutional neural network (3D CNN) | - |
dc.subject.keywordAuthor | multirate signal processing | - |
dc.subject.keywordAuthor | minimum overlapped frame structure | - |
dc.subject.keywordAuthor | self-attention | - |
dc.subject.keywordAuthor | multi-depth network | - |
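The abstract above describes feeding a multi-depth network with minimally overlapped frame sets sampled at different rates, in the spirit of multirate signal processing. The paper's exact sampling construction is not reproduced in this record; the sketch below is only a hypothetical illustration of the general idea, where each depth network receives a fixed-length clip strided by its own rate, and each clip begins where the previous one ended so that overlap between clips stays minimal. The function name, clip length, and rate set are all assumptions for illustration.

```python
def multirate_clips(num_frames, clip_len=8, rates=(1, 2, 4)):
    """Hypothetical sketch: sample one clip of frame indices per rate.

    Each clip strides the frame sequence by its rate, and each clip
    starts where the previous one left off (wrapping modulo the video
    length) so that overlap between the clips is kept minimal.
    """
    clips = []
    start = 0
    for rate in rates:
        # clip_len indices spaced `rate` frames apart, wrapping if needed
        idx = [(start + i * rate) % num_frames for i in range(clip_len)]
        clips.append(idx)
        # advance past the span this clip covered to minimize overlap
        start = (start + clip_len * rate) % num_frames
    return clips
```

For a 60-frame video this yields a dense clip at rate 1, a medium-rate clip at stride 2, and a coarse clip at stride 4, giving each depth network a different temporal resolution of the same sequence.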