A deep learning approach to identify the fetal head position using transperineal ultrasound during labor / Ramirez Zegarra, Ruben; Conversano, Francesco; Dall'Asta, Andrea; Giovanna Di Trani, Maria; Fieni, Stefania; Morello, Rocco; Melito, Chiara; Pisani, Paola; Iurlaro, Enrico; Tondo, Marta; Gabriel Iliescu, Dominic; Nagy, Rodica; Vaso, Edvin; Abou-Dakn, Michael; Muslu, Gülhan; Lau, Wailam; Hung, Catherine; Sirico, Angelo; Lanzone, Antonio; Rizzo, Giuseppe; Mappa, Ilenia; Lees, Christoph; Usman, Sana; Winkler, Alice; Braun, Christian; Levy, Roni; Vaisbuch, Edi; Hassan, Wassim A.; Taylor, Sasha; Vimercati, Antonella; Mazzeo, Allegra; Moe Eggebø, Torbjørn; Amo Wiafe, Yaw; Ghi, Tullio; Casciaro, Sergio. - In: EUROPEAN JOURNAL OF OBSTETRICS, GYNECOLOGY, AND REPRODUCTIVE BIOLOGY. - ISSN 0301-2115. - (2024). [10.1016/j.ejogrb.2024.08.012]

A deep learning approach to identify the fetal head position using transperineal ultrasound during labor

Rizzo, Giuseppe;
2024

Abstract

Objectives: To develop a deep learning (DL) model using convolutional neural networks (CNNs) to automatically identify the fetal head position at transperineal ultrasound in the second stage of labor.

Material and methods: Prospective, multicenter study including singleton, term, cephalic pregnancies in the second stage of labor. We assessed the fetal head position using transabdominal ultrasound and subsequently obtained an image of the fetal head on the axial plane using transperineal ultrasound, labeling it according to the transabdominal ultrasound findings. The ultrasound images were randomly allocated into three datasets containing a similar proportion of images of each subtype of fetal head position (occiput anterior, posterior, right and left transverse): the training dataset included 70 %, the validation dataset 15 %, and the testing dataset 15 % of the acquired images. The DL model was constructed using three convolutional neural networks (CNNs) working simultaneously for the classification of fetal head position, with the pre-trained ResNet18 model employed as the foundational framework for feature extraction and classification. CNN1 was trained to differentiate between occiput anterior (OA) and non-OA positions, CNN2 classified fetal head malpositions into occiput posterior (OP) or occiput transverse (OT) position, and CNN3 classified the remaining images as right or left OT. The performance of the algorithm was evaluated in terms of accuracy, sensitivity, specificity, F1-score and Cohen's kappa.

Results: Between February 2018 and May 2023, 2154 transperineal images were included from eligible participants across 16 collaborating centers. The overall performance of the model for the classification of the fetal head position in the axial plane at transperineal ultrasound was excellent, with an accuracy of 94.5 % (95 % CI 92.0–97.0), a sensitivity of 95.6 % (95 % CI 96.8–100.0), a specificity of 91.2 % (95 % CI 87.3–95.1), an F1-score of 0.92 and a Cohen's kappa of 0.90. The best performance was achieved by CNN1 (OA position vs fetal head malpositions), with an accuracy of 98.3 % (95 % CI 96.9–99.7), followed by CNN2 (OP vs OT positions), with an accuracy of 93.9 % (95 % CI 89.6–98.2), and finally CNN3 (right vs left OT position), with an accuracy of 91.3 % (95 % CI 83.5–99.1).

Conclusions: We have developed a DL model capable of assessing fetal head position using transperineal ultrasound during the second stage of labor with excellent overall accuracy. Future studies should validate our DL model using larger datasets and real-time patients before introducing it into routine clinical practice.
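The abstract describes the architecture only; as an illustration, the minimal Python/PyTorch sketch below shows how three ResNet18-based binary classifiers of the kind described (CNN1: OA vs non-OA, CNN2: OP vs OT, CNN3: left vs right OT) could be assembled and applied to a transperineal image. This is not the authors' code: the sequential decision flow, the 224x224 input size, the ImageNet normalization, the class orderings and all names are assumptions made for the example.

# Hypothetical sketch of the three-CNN classification scheme described above.
import torch
import torch.nn as nn
from torchvision import models, transforms

def make_binary_resnet18():
    # Pre-trained ResNet18 with its final fully connected layer replaced by a
    # two-class head, used as the feature-extraction and classification backbone.
    net = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
    net.fc = nn.Linear(net.fc.in_features, 2)
    return net

cnn1, cnn2, cnn3 = (make_binary_resnet18() for _ in range(3))
for net in (cnn1, cnn2, cnn3):
    net.eval()  # inference mode; the trained weights would be loaded here

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

@torch.no_grad()
def classify_head_position(image):
    # image: a PIL image of the axial transperineal plane (assumed input format).
    x = preprocess(image.convert("RGB")).unsqueeze(0)
    if cnn1(x).argmax(1).item() == 0:   # class 0 assumed to be OA
        return "occiput anterior"
    if cnn2(x).argmax(1).item() == 0:   # class 0 assumed to be OP
        return "occiput posterior"
    return "left occiput transverse" if cnn3(x).argmax(1).item() == 0 \
        else "right occiput transverse"

On a held-out test set, the metrics reported in the Results (accuracy, sensitivity, specificity, F1-score and Cohen's kappa) could then be computed with, for example, scikit-learn's confusion_matrix, f1_score and cohen_kappa_score.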
Artificial intelligence; Pattern recognition; Intrapartum ultrasound; Fetal head malposition; Translabial ultrasound
01 Journal publication::01a Journal article

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/1717179