Automated joint skull-stripping and segmentation with Multi-Task U-Net in large mouse brain MRI databases / De Feo, R.; Shatillo, A.; Sierra, A.; Valverde, J. M.; Grohn, O.; Giove, F.; Tohka, J.. - In: NEUROIMAGE. - ISSN 1053-8119. - 229:(2021), pp. 1-12. [10.1016/j.neuroimage.2021.117734]

Automated joint skull-stripping and segmentation with Multi-Task U-Net in large mouse brain MRI databases

De Feo R.; Giove F.
2021

Abstract

Skull-stripping and region segmentation are fundamental steps in preclinical magnetic resonance imaging (MRI) studies, and these common procedures are usually performed manually. We present Multi-task U-Net (MU-Net), a convolutional neural network designed to accomplish both tasks simultaneously. MU-Net achieved higher segmentation accuracy than state-of-the-art multi-atlas segmentation methods, with an inference time of 0.35 s and no pre-processing requirements. We trained and validated MU-Net on 128 T2-weighted mouse MRI volumes as well as on the publicly available MRM NeAt dataset of 10 MRI volumes. We tested MU-Net on an unusually large dataset of 1782 mouse brain MRI volumes, combining several independent studies of both healthy and Huntington's disease model animals, and measured average Dice scores of 0.906 (striata), 0.937 (cortex), and 0.978 (brain mask). Further, we explored the effectiveness of our network in the presence of different architectural features, including skip connections and the recently proposed framing connections, as well as the effect of the age range of the training-set animals. These high evaluation scores demonstrate that MU-Net is a powerful tool for segmentation and skull-stripping, decreasing the inter- and intra-rater variability of manual segmentation. The MU-Net code and the trained model are publicly available at https://github.com/Hierakonpolis/MU-Net.
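
The abstract describes a single network performing skull-stripping and region segmentation simultaneously. As a minimal sketch of this multi-task pattern (a shared feature extractor feeding two output heads, one for the brain mask and one for region labels), the following PyTorch snippet may help; the class name, layer sizes, and region count are illustrative assumptions, not the actual MU-Net architecture, which is available in the repository linked above.

    import torch
    import torch.nn as nn

    class TwoHeadSegNet(nn.Module):
        # Illustrative multi-task network: shared features, two task outputs.
        def __init__(self, in_ch=1, n_regions=4):
            super().__init__()
            # Shared feature extractor, a stand-in for the full U-Net body
            self.backbone = nn.Sequential(
                nn.Conv2d(in_ch, 16, 3, padding=1), nn.ReLU(),
                nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
            )
            # Head 1: one-channel logits for the binary brain mask (skull-stripping)
            self.mask_head = nn.Conv2d(16, 1, 1)
            # Head 2: per-class logits for region segmentation
            self.region_head = nn.Conv2d(16, n_regions, 1)

        def forward(self, x):
            feats = self.backbone(x)
            return self.mask_head(feats), self.region_head(feats)

    net = TwoHeadSegNet()
    slices = torch.randn(2, 1, 128, 128)  # toy batch of 2D slices
    mask_logits, region_logits = net(slices)
    print(mask_logits.shape, region_logits.shape)  # (2, 1, 128, 128) and (2, 4, 128, 128)

Sharing the feature extractor means the two tasks are trained jointly and a single forward pass produces both outputs, which is what makes the sub-second inference time reported above possible for the pair of tasks.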
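
The Dice scores reported above measure the voxel-wise overlap between a predicted mask and a manual reference, with 1.0 indicating perfect agreement. Below is a minimal NumPy sketch of the metric, using toy 2D arrays in place of real 3D label volumes; the function name is a hypothetical example, not part of the MU-Net codebase.

    import numpy as np

    def dice_score(pred, target):
        # Dice = 2 * |A intersect B| / (|A| + |B|)
        pred = pred.astype(bool)
        target = target.astype(bool)
        denom = pred.sum() + target.sum()
        if denom == 0:
            return 1.0  # both masks empty: treat as perfect agreement
        return 2.0 * np.logical_and(pred, target).sum() / denom

    pred = np.array([[1, 1, 0], [0, 1, 0]])     # toy predicted mask
    manual = np.array([[1, 1, 0], [0, 0, 1]])   # toy manual reference
    print(dice_score(pred, manual))  # 0.666..., i.e. 2*2 / (3+3)
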
Year: 2021
Keywords: brain; deep learning; mice; MRI; segmentation; U-Net
Type: 01 Journal publication::01a Journal article
Files attached to this item

File: De Feo_Automated_2021.pdf
Access: open access
Type: Publisher's version (published version with the publisher's layout)
License: Creative Commons
Size: 2.2 MB
Format: Unknown

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/1487121
Citations
  • PMC: 10
  • Scopus: 22
  • Web of Science: 21