
Acta Biomedica Scientifica


End-to-end convolutional neural network for automatic encoding facial descriptor (N-CNN) in the diagnosis of intrauterine distress

https://doi.org/10.29413/ABS.2023-8.4.4

Abstract

Background. Existing methods for studying intrauterine distress, despite their prevalence, still have limitations, so the study and assessment of fetal movements during ultrasound diagnostics could become a convenient and affordable additional tool for diagnosing this pathological condition.

The aim of the study. To assess the prevalence and diagnostic significance of a known set of fetal facial movements for the timely detection of intrauterine distress.

Methods. This prospective single-center study included 225 fetuses at gestational ages from 32 to 40 weeks. The FIGO chart was used as the criterion for confirming intrauterine distress. Facial movements in all fetuses were assessed using the BabyFACS technique, in which each action unit (AU) is coded in strict accordance with the chart of motor descriptors (MDs). Statistical processing was carried out in SPSS Statistics 20 (IBM Corp., USA). The Mann–Whitney U test served as the main statistical test, with a threshold of 0.05 chosen to interpret p-values.
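As an illustration of the group comparison described above, the minimal sketch below runs a Mann–Whitney U test on per-fetus AU counts in Python with SciPy (the authors themselves used SPSS); the variable names and sample values are hypothetical placeholders, not the study's data.

```python
# Hedged sketch of the comparison from Methods: per-fetus occurrence counts
# of one motor descriptor are compared between groups with the Mann-Whitney
# U test. All values below are illustrative, not taken from the paper.
from scipy.stats import mannwhitneyu

distress_au4 = [3, 5, 4, 6, 2, 5, 4]  # hypothetical AU4 counts, distress group
control_au4 = [1, 0, 2, 1, 0, 1, 2]   # hypothetical AU4 counts, control group

u_stat, p_value = mannwhitneyu(distress_au4, control_au4, alternative="two-sided")
print(f"U = {u_stat}, p = {p_value:.5f}")

# The paper interprets results against a 0.05 significance threshold.
if p_value < 0.05:
    print("AU frequency differs significantly between groups")
```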

Results. Although AU1, AU2, AU3 and AU4 occurred in both groups, these MDs were recorded significantly more often in the group with confirmed distress (p = 0.00001). Facial units AU9 and AU20 were found only in fetuses with intrauterine distress and, within the overall MD assessment, can be considered the main search signs that specialists should look for first. All motor descriptors showed high positive predictive value and diagnostic sensitivity, with the highest values registered for AU9 and AU20.
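The two metrics named above follow from a standard 2×2 confusion matrix; the short sketch below spells out their definitions with hypothetical counts, since the abstract does not report the underlying table.

```python
# Definitions of the diagnostic metrics reported in Results, computed from
# a 2x2 confusion matrix. The example counts are hypothetical.
def sensitivity(tp: int, fn: int) -> float:
    """True-positive rate: share of distress cases in which the MD was observed."""
    return tp / (tp + fn)

def positive_predictive_value(tp: int, fp: int) -> float:
    """Share of MD-positive fetuses that actually had intrauterine distress."""
    return tp / (tp + fp)

# Hypothetical AU9 example: observed in 40 of 45 distress cases and in no
# controls, consistent with a descriptor "found only in fetuses with distress".
print(sensitivity(40, 5))                # 0.888...
print(positive_predictive_value(40, 0))  # 1.0
```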

Conclusion. Assessment of facial units during ultrasound diagnostics can serve as a convenient additional tool for diagnosing developing intrauterine distress and requires further study.

About the Authors

N. V. Korotaeva
Voronezh State Medical University named after N.N. Burdenko; Voronezh Regional Clinical Hospital No. 1
Russian Federation

Natalia V. Korotaeva – Cand. Sc. (Med.), Associate Professor at the Department of Neonatology and Pediatrics; Neonatologist at the Department of the Pathology of Newborns and Premature Children, Perinatal Center 

Studencheskaya str. 10, Voronezh 394036;
Moskovsky ave. 151, Voronezh 394066



L. I. Ippolitova
Voronezh State Medical University named after N.N. Burdenko
Russian Federation

Lyudmila I. Ippolitova – Dr. Sc. (Med.), Head of the Department of Neonatology and Pediatrics 

Studencheskaya str. 10, Voronezh 394036



E. S. Pershina
Voronezh State Medical University named after N.N. Burdenko; Voronezh Regional Clinical Hospital No. 1
Russian Federation

Elena S. Pershina – Teaching Assistant at the Department of Neonatology and Pediatrics; Neonatologist at the Department of the Pathology of Newborns and Premature Children, Perinatal Center 

Studencheskaya str. 10, Voronezh 394036;
Moskovsky ave. 151, Voronezh 394066





For citations:


Korotaeva N.V., Ippolitova L.I., Pershina E.S. End-to-end convolutional neural network for automatic encoding facial descriptor (N-CNN) in the diagnosis of intrauterine distress. Acta Biomedica Scientifica. 2023;8(4):32-38. https://doi.org/10.29413/ABS.2023-8.4.4



This work is licensed under a Creative Commons Attribution 4.0 License.


ISSN 2541-9420 (Print)
ISSN 2587-9596 (Online)