Expression Detection of Children with Special Needs Using YOLOv4-Tiny


Husri Sidi(1); Aviv Yuniar Rahman(2*); Fitri Marisa(3);

(1) Universitas Widyagama Malang
(2) Universitas Widyagama Malang; Asia e University
(3) Universitas Widyagama Malang
(*) Corresponding Author

  

Abstract


This research addresses the challenge of detecting emotional expressions in children with special needs, who often rely on nonverbal communication because of difficulties with verbal expression. Traditional emotion detection methods struggle to recognize the subtle emotions of these children accurately, which can create communication barriers in educational and therapeutic settings. This study proposes the YOLOv4-Tiny model, a lightweight and efficient object detection architecture, to detect four key facial expressions: Angry, Happy, Smile, and Afraid. The dataset consists of 1500 images, evenly distributed across the four expression classes and captured under controlled conditions. The model was evaluated with several metrics, including Confidence, Precision, Recall, F1-Score, and Mean Average Precision (mAP), across different training-to-testing data splits. The YOLOv4-Tiny model achieved high accuracy, with a perfect mAP of 100% for balanced and slightly imbalanced splits and a minimum mAP of 93.1% for more imbalanced splits. This performance highlights the model's robustness and its potential for use in educational and therapeutic environments, where understanding emotional expressions is critical to providing tailored support to children with special needs. The proposed system offers a significant improvement over traditional methods, enhancing communication and emotional support for this vulnerable population.
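
To make the detection pipeline concrete, the sketch below loads a Darknet-format YOLOv4-Tiny model through OpenCV's DNN module and draws the predicted expression class on an input image. This is a minimal illustration, not the authors' implementation: the configuration, weight, and image file names are hypothetical placeholders (the paper's trained network is not distributed with this page), and the 0.5 confidence threshold is an assumed default rather than the authors' setting.

    import cv2

    # Hypothetical file names for illustration; the paper's trained
    # network is not distributed with this page.
    CFG = "yolov4-tiny-expressions.cfg"
    WEIGHTS = "yolov4-tiny-expressions.weights"
    CLASSES = ["Angry", "Happy", "Smile", "Afraid"]  # the four classes in the abstract

    # Load the Darknet model through OpenCV's DNN module.
    net = cv2.dnn.readNetFromDarknet(CFG, WEIGHTS)
    model = cv2.dnn_DetectionModel(net)
    # 416x416 is the usual YOLOv4-Tiny input size; pixels are scaled to [0, 1].
    model.setInputParams(size=(416, 416), scale=1 / 255.0, swapRB=True)

    image = cv2.imread("child_face.jpg")  # placeholder input image
    class_ids, confidences, boxes = model.detect(
        image, confThreshold=0.5, nmsThreshold=0.4
    )

    # Draw each detected expression with its confidence score.
    for cls, conf, (x, y, w, h) in zip(class_ids, confidences, boxes):
        label = f"{CLASSES[int(cls)]}: {float(conf):.2f}"
        cv2.rectangle(image, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.putText(image, label, (x, y - 5),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 255, 0), 2)

    cv2.imwrite("detected.jpg", image)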


Keywords


Children with Special Needs; Emotion Detection; Nonverbal Communication; YOLOv4-Tiny
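
For reference, the Precision, Recall, and F1-Score reported in the abstract follow the standard detection definitions: a prediction counts as a true positive when its class matches and its box overlaps a ground-truth box sufficiently (an IoU threshold of 0.5 is the common convention, assumed here), and mAP averages the per-class average precision over the four expression classes. A minimal sketch with illustrative counts, not the paper's actual numbers:

    def detection_metrics(tp: int, fp: int, fn: int) -> dict:
        """Per-class Precision, Recall, and F1 from detection counts."""
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
        return {"precision": precision, "recall": recall, "f1": f1}

    # Example: 95 correct "Happy" detections, 3 false alarms, 2 missed faces.
    print(detection_metrics(tp=95, fp=3, fn=2))
    # -> {'precision': 0.969..., 'recall': 0.979..., 'f1': 0.974...}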

  
  



Digital Object Identifier

DOI: https://doi.org/10.33096/ilkom.v16i3.1609.221-227

Copyright (c) 2024 Husri Sidi, Aviv Yuniar Rahman, Fitri Marisa

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.