
Cervical cancer risk prediction with robust ensemble and explainable black boxes method

Curia, Francesco
2021

Abstract

Clinical decision support systems (CDSS) that rely on intelligent algorithms, such as machine learning or deep learning, suffer from the fact that the methods used are often hard to interpret, and it is difficult to understand how certain decisions are made; the opacity of some methods, sometimes deliberate because of concerns such as data privacy or the protection of intellectual property, makes these systems very complicated. In addition to these problems, the results obtained are also poorly interpretable; the clinical context therefore demands methods that are as accurate as possible, transparent techniques, and explainable results. This work addresses the development of cervical cancer, a disease that mainly affects the female population. To introduce advanced machine learning techniques into a clinical decision support system that is transparent and explainable, a robust and accurate ensemble method is presented, evaluated in terms of error and sensitivity for the classification of the possible development of the aforementioned pathology, together with advanced explainability and interpretability techniques (explainable machine learning), such as LIME and Shapley values, applied to the CDSS context. The results obtained are not only interesting but also understandable and can be applied in the treatment of this type of problem.
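For readers who want a concrete picture of the workflow outlined in the abstract, the sketch below shows how an ensemble classifier can be paired with LIME and Shapley-value (SHAP) explanations. It is a minimal illustration, not the paper's actual pipeline: the dataset, feature names, labels, and hyperparameters are all assumed for demonstration.

```python
# Minimal sketch (illustrative only): an ensemble classifier for a binary
# cervical-cancer risk label, explained with SHAP and LIME.
# The data, feature names, and settings below are hypothetical assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
import shap                                          # Shapley-value attributions
from lime.lime_tabular import LimeTabularExplainer   # local surrogate explanations

# Hypothetical risk-factor matrix X (patients x features) and binary labels y.
rng = np.random.default_rng(0)
feature_names = ["age", "num_pregnancies", "smokes_years", "hpv_positive"]
X = rng.random((500, len(feature_names)))
y = (X[:, 3] > 0.5).astype(int)                      # toy target, for demonstration only

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)

# Shapley-value attributions for the test set (TreeExplainer handles tree ensembles).
shap_values = shap.TreeExplainer(model).shap_values(X_test)

# LIME: local explanation of the ensemble's prediction for a single patient.
lime_exp = LimeTabularExplainer(
    X_train, feature_names=feature_names, class_names=["low risk", "high risk"]
).explain_instance(X_test[0], model.predict_proba, num_features=4)
print(lime_exp.as_list())                            # per-feature contributions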
Cervical cancer; Ensemble; Interpretable AI; Risk prediction
01 Journal publication::01a Journal article
Cervical cancer risk prediction with robust ensemble and explainable black boxes method / Curia, Francesco. - In: HEALTH AND TECHNOLOGY. - ISSN 2190-7188. - 11:(2021), pp. 875-885. [10.1007/s12553-021-00554-6]
Files attached to this record
File: Curia_Cervical_2021.pdf (open access)
Note: DOI 10.1007/s12553-021-00554-6
Type: Publisher's version (published version with the publisher's layout)
License: Creative Commons
Size: 1.37 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/1547698
Citations
  • PMC: ND
  • Scopus: 15
  • Web of Science: 8