Journal of Intelligent Communication

Article

A Deep Learning-Based Approach for Emotion Classification Using Stretchable Sensor Data

Refat, C. M. M. (2025). A Deep Learning-Based Approach for Emotion Classification Using Stretchable Sensor Data. Journal of Intelligent Communication, 4(1), 74–86. https://doi.org/10.54963/jic.v4i1.961

Authors

  • Chowdhury Mohammad Masum Refat

    Department of Mechanical Engineering, Osaka University, 1-1 Yamadaoka, Suita 565-0871, Japan

Facial expressions play a vital role in human communication, especially for individuals with motor impairments who rely on alternative interaction methods. This study presents a deep learning-based approach for real-time emotion classification using stretchable strain sensors integrated into a wearable system. The sensors, fabricated with conductive silver ink on a flexible Tegaderm substrate, detect subtle facial muscle movements. Positioned strategically on the forehead, upper lip, lower lip, and left cheek, these sensors effectively capture emotions such as happiness, neutrality, sadness, and disgust. A data pipeline incorporating Min-Max normalization and SMOTE balancing addresses sensor noise and class imbalance, while dimensionality-reduction techniques such as PCA and t-SNE enhance data visualization. The system’s classification performance was evaluated using standard machine learning metrics, achieving an overall accuracy of 76.6%, with notable success in distinguishing disgust (86.0% accuracy) and neutrality (81.0% accuracy). This work offers a flexible, cost-effective, and biocompatible solution for emotion recognition, with potential applications in rehabilitation robotics, assistive technologies, and human-computer interaction.
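The preprocessing steps named in the abstract can be sketched as follows. This is an illustrative sketch only, not the authors' implementation: the toy data, channel count (four strain-sensor channels), and the minimal SMOTE-style interpolation are assumptions, and a production pipeline would typically use library implementations (e.g. scikit-learn and imbalanced-learn) instead.

```python
import numpy as np

rng = np.random.default_rng(0)

def min_max_normalize(X):
    # Min-Max normalization: scale each sensor channel to [0, 1].
    lo, hi = X.min(axis=0), X.max(axis=0)
    return (X - lo) / (hi - lo)

def smote_oversample(X_min, n_new, k=3):
    # Minimal SMOTE-style sketch: synthesize a new minority sample by
    # interpolating between a random minority sample and one of its
    # k nearest minority-class neighbours.
    synthetic = []
    for _ in range(n_new):
        i = rng.integers(len(X_min))
        dists = np.linalg.norm(X_min - X_min[i], axis=1)
        nbrs = np.argsort(dists)[1:k + 1]   # skip the sample itself
        j = rng.choice(nbrs)
        gap = rng.random()                  # interpolation factor in [0, 1)
        synthetic.append(X_min[i] + gap * (X_min[j] - X_min[i]))
    return np.vstack(synthetic)

# Toy data: 40 frames x 4 assumed strain channels
# (forehead, upper lip, lower lip, left cheek).
X = rng.normal(size=(40, 4))
X_norm = min_max_normalize(X)
X_minority = X_norm[:5]                     # pretend under-represented class
X_balanced = np.vstack([X_norm, smote_oversample(X_minority, 10)])
print(X_balanced.shape)                     # (50, 4)
```

Because SMOTE samples are convex combinations of already-normalized points, the balanced set stays within the [0, 1] range expected by the downstream classifier.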

Keywords:

Facial Expression Recognition, Emotion Classification, Stretchable Sensors, Deep Learning, Wearable Technology, Rehabilitation Robotics
