
Sensors Applications on Emotion Recognition

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Intelligent Sensors".

Deadline for manuscript submissions: 10 June 2024 | Viewed by 2852

Special Issue Editors


Guest Editor
Department of Computer Science and Information Engineering, National Taichung University of Science and Technology, Taichung City 404348, Taiwan
Interests: IoT; social computing

Guest Editor
Department of Computer Science and Information Engineering, National Taichung University of Science and Technology, Taichung City 404348, Taiwan
Interests: big data analysis; social network mining

Special Issue Information

Dear Colleagues,

Emotion recognition is an active research subject in many fields that use human emotional reactions as a signal for marketing, automation, entertainment, technical equipment, and human–robot interaction. Sensors are central to detecting human emotions and have driven many developments in this area. The recognition and evaluation of emotions remain complex problems due to their interdisciplinary nature; however, sensors supply rich information for both tasks. Many scientific disciplines, including psychology, medical sciences, data analysis, and mechatronics, are involved in research into sensor applications for emotion recognition.

This Special Issue aims to bring together researchers and practitioners working on the design, development, and evaluation of sensor-based emotion recognition systems, and to provide a comprehensive view of the latest research and advances in the field. We invite comprehensive reviews, case studies, and research articles on theoretical and methodological interdisciplinary sensor applications in emotion recognition. In particular, sensor application technologies specifically devised, adapted, or tailored to address problems in emotion recognition are welcome.

Prof. Dr. Jason C. Hung
Dr. Neil Yuwen Yen
Dr. Hao-Shang Ma
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to the website. Once registered, authors can access the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • body posture
  • facial expression
  • gesture analysis
  • electroencephalography (EEG)
  • electrocardiography (ECG)
  • galvanic skin response (GSR)
  • heart rate variability (HRV)
  • sensor technologies for emotion recognition (e.g., physiological sensors, facial recognition, voice analysis)
  • machine learning and artificial intelligence techniques for emotion recognition
  • wearable sensors and Internet of Things (IoT) devices for emotion recognition
  • ethical and privacy issues related to sensor-based emotion recognition systems
  • applications of sensor-based emotion recognition in different domains (e.g., healthcare, education, entertainment, marketing)
  • user studies and evaluations of sensor-based emotion recognition systems

Published Papers (4 papers)


Research

14 pages, 6945 KiB  
Article
Portable Facial Expression System Based on EMG Sensors and Machine Learning Models
by Paola A. Sanipatín-Díaz, Paul D. Rosero-Montalvo and Wilmar Hernandez
Sensors 2024, 24(11), 3350; https://doi.org/10.3390/s24113350 - 23 May 2024
Viewed by 359
Abstract
One of the biggest challenges for computers is collecting data from human behavior, such as interpreting human emotions. Traditionally, this process is carried out with computer vision or multichannel electroencephalograms; however, these approaches require heavy computational resources located far from end users or from where the dataset was collected. In contrast, sensors can capture muscle reactions and respond on the spot, preserving information locally without relying on powerful computers. Therefore, the research subject is the recognition of the six primary human emotions using electromyography (EMG) sensors in a portable device. The sensors are placed on specific facial muscles to detect happiness, anger, surprise, fear, sadness, and disgust. The experimental results showed that the Cortex-M0 microcontroller provides enough computational capability to store a deep learning model with a classification score of 92%. Furthermore, we demonstrate the necessity of collecting data from natural environments and how they need to be processed by a machine learning pipeline.
(This article belongs to the Special Issue Sensors Applications on Emotion Recognition)
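The abstract notes that raw EMG must pass through a machine learning pipeline before classification, but the paper's exact feature extraction is not given here. As a hedged sketch, a common first step on resource-constrained hardware is computing windowed RMS amplitude per channel; the window length and toy signal below are illustrative assumptions, not the paper's configuration:

```python
import math

def rms_windows(emg, window):
    """Root-mean-square amplitude per non-overlapping window of EMG samples."""
    return [
        math.sqrt(sum(x * x for x in emg[i:i + window]) / window)
        for i in range(0, len(emg) - window + 1, window)
    ]

# Toy trace: a quiet baseline followed by a burst of muscle activity.
emg = [0.01, -0.02, 0.015, -0.01] * 16 + [0.5, -0.6, 0.55, -0.45] * 16
feats = rms_windows(emg, 64)
# feats[1] (the burst) is far larger than feats[0] (rest), so a small
# classifier can separate active from inactive facial muscles.
```

Such compact features are one plausible way a deep learning model could fit on a microcontroller-class device.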

24 pages, 2189 KiB  
Article
Generating Synthetic Health Sensor Data for Privacy-Preserving Wearable Stress Detection
by Lucas Lange, Nils Wenzlitschke and Erhard Rahm
Sensors 2024, 24(10), 3052; https://doi.org/10.3390/s24103052 - 11 May 2024
Viewed by 439
Abstract
Smartwatch health sensor data are increasingly utilized in smart health applications and patient monitoring, including stress detection. However, such medical data often comprise sensitive personal information and are resource-intensive to acquire for research purposes. In response to this challenge, we introduce the privacy-aware synthesis of multi-sensor smartwatch health readings related to moments of stress, employing Generative Adversarial Networks (GANs) and Differential Privacy (DP) safeguards. Our method not only protects patient information but also enhances data availability for research. To ensure its usefulness, we test synthetic data from multiple GANs and employ different data enhancement strategies on an actual stress detection task. Our GAN-based augmentation methods demonstrate significant improvements in model performance, with private DP training scenarios observing an 11.90–15.48% increase in F1-score, while non-private training scenarios still see a 0.45% boost. These results underline the potential of differentially private synthetic data in optimizing utility–privacy trade-offs, especially with the limited availability of real training samples. Through rigorous quality assessments, we confirm the integrity and plausibility of our synthetic data, which, however, are significantly impacted when increasing privacy requirements.
(This article belongs to the Special Issue Sensors Applications on Emotion Recognition)

13 pages, 865 KiB  
Article
Electroencephalogram-Based Facial Gesture Recognition Using Self-Organizing Map
by Takahiro Kawaguchi, Koki Ono and Hiroomi Hikawa
Sensors 2024, 24(9), 2741; https://doi.org/10.3390/s24092741 - 25 Apr 2024
Viewed by 427
Abstract
Brain–computer interfaces (BCIs) allow information to be transmitted directly from the human brain to a computer, enhancing the ability of human brain activity to interact with the environment. In particular, BCI-based control systems are highly desirable because they can control equipment used by people with disabilities, such as wheelchairs and prosthetic legs. BCIs make use of electroencephalograms (EEGs) to decode the human brain’s status. This paper presents an EEG-based facial gesture recognition method based on a self-organizing map (SOM). The proposed method uses the α, β, and θ power bands of the EEG signals as gesture features, and the SOM-Hebb classifier is used to classify the feature vectors. We applied the proposed method to develop an online facial gesture recognition system. The facial gestures were defined by combining facial movements that are easy to detect in EEG signals. The recognition accuracy of the system was examined through experiments and ranged from 76.90% to 97.57%, depending on the number of gestures recognized. The lowest accuracy (76.90%) occurred when recognizing seven gestures, which is still quite accurate compared to other EEG-based recognition systems. The online recognition system was implemented in MATLAB and took 5.7 s to complete the recognition flow.
(This article belongs to the Special Issue Sensors Applications on Emotion Recognition)
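The abstract states that the α, β, and θ power bands of the EEG signal serve as gesture features. A minimal sketch of band-power feature extraction via the FFT; the sampling rate, band edges, and synthetic trace below are assumptions for illustration, not the paper's actual settings:

```python
import numpy as np

def band_power(signal, fs, band):
    """Mean spectral power of `signal` within the frequency `band` (Hz)."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= band[0]) & (freqs < band[1])
    return psd[mask].mean()

def eeg_features(signal, fs=256):
    """Feature vector of theta, alpha, and beta band powers."""
    bands = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
    return {name: band_power(signal, fs, b) for name, b in bands.items()}

# Synthetic one-second EEG-like trace dominated by a 10 Hz (alpha) rhythm
# with a weaker 20 Hz (beta) component.
fs = 256
t = np.arange(fs) / fs
sig = np.sin(2 * np.pi * 10 * t) + 0.1 * np.sin(2 * np.pi * 20 * t)
feats = eeg_features(sig, fs)
```

A vector of such band powers is the kind of low-dimensional feature a SOM-based classifier can organize into gesture clusters.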

17 pages, 3003 KiB  
Article
The Difference in the Assessment of Knee Extension/Flexion Angles during Gait between Two Calibration Methods for Wearable Goniometer Sensors
by Tomoya Ishida and Mina Samukawa
Sensors 2024, 24(7), 2092; https://doi.org/10.3390/s24072092 - 25 Mar 2024
Viewed by 677
Abstract
Frontal and axial knee motion can affect the accuracy of knee extension/flexion measurement using a wearable goniometer. The purpose of this study was to test the hypothesis that calibrating the goniometer on an individual’s body would reduce errors in the knee flexion angle during gait compared to bench calibration. Ten young adults (23.2 ± 1.3 years) were enrolled. Knee flexion angles during gait were simultaneously assessed using a wearable goniometer sensor and an optical three-dimensional motion analysis system, and the absolute error (AE) between the two methods was calculated. The mean AE across a gait cycle was 2.4° (0.5°) for the on-body calibration, and the AE was acceptable (<5°) throughout the gait cycle (range: 1.5–3.8°). The mean AE for the on-bench calibration was 4.9° (3.4°) (range: 1.9–13.6°). Statistical parametric mapping (SPM) analysis revealed that the AE of the on-body calibration was significantly smaller than that of the on-bench calibration during 67–82% of the gait cycle. The results indicated that on-body calibration of a goniometer sensor had acceptable and better validity than on-bench calibration, especially for the swing phase of gait.
(This article belongs to the Special Issue Sensors Applications on Emotion Recognition)
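The abstract's validity criterion is an absolute error (AE) below 5° between the goniometer and the optical motion-capture reference across the gait cycle. A minimal sketch of that comparison; the angle traces below are toy data, not the study's measurements:

```python
def absolute_error(goniometer, mocap):
    """Point-wise absolute error (degrees) between two knee-flexion traces."""
    return [abs(g - m) for g, m in zip(goniometer, mocap)]

# Hypothetical traces sampled at 101 points (0-100% of the gait cycle).
mocap = [5 + 0.6 * i for i in range(101)]   # reference trace (toy ramp)
onbody = [m + 2.0 for m in mocap]           # goniometer with a small 2-degree offset

ae = absolute_error(onbody, mocap)
mean_ae = sum(ae) / len(ae)
# The trace passes the acceptability criterion when AE stays below 5 degrees
# at every point of the gait cycle.
acceptable = max(ae) < 5.0
```

With real data the error varies across the cycle, which is why the study also applies SPM to locate where in the cycle the two calibrations differ.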
