
Robust Stochastic Graph Generator for Counterfactual Explanations / Prado-Romero, M. A.; Prenkaj, B.; Stilo, G. - 38:19 (2024), pp. 21518-21526. (Paper presented at the 38th AAAI Conference on Artificial Intelligence, AAAI 2024, held in Vancouver, British Columbia, Canada) [10.1609/aaai.v38i19.30149].

Robust Stochastic Graph Generator for Counterfactual Explanations

Prenkaj, B. (second author); Stilo, G. (last author)
2024

Abstract

Counterfactual Explanation (CE) techniques have garnered attention as a means to provide insights to users engaging with AI systems. While extensively researched in domains such as medical imaging and autonomous vehicles, Graph Counterfactual Explanation (GCE) methods have been comparatively under-explored. GCEs generate a new graph similar to the original one, with a different outcome grounded on the underlying predictive model. Among these GCE techniques, those rooted in generative mechanisms have received relatively limited investigation despite demonstrating impressive accomplishments in other domains, such as artistic styles and natural language modelling. The preference for generative explainers stems from their capacity to generate counterfactual instances during inference, leveraging autonomously acquired perturbations of the input graph. Motivated by the rationales above, our study introduces RSGG-CE, a novel Robust Stochastic Graph Generator for Counterfactual Explanations able to produce counterfactual examples from the learned latent space considering a partially ordered generation sequence. Furthermore, we undertake quantitative and qualitative analyses to compare RSGG-CE's performance against state-of-the-art (SoA) generative explainers, highlighting its increased ability to engender plausible counterfactual candidates.
2024
38th AAAI Conference on Artificial Intelligence, AAAI 2024
explainable AI; generative AI; algorithmic recourse; graph neural networks; deep learning
04 Conference proceedings publication::04b Conference paper in volume
Files attached to this item
There are no files associated with this item.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/1723542
Warning: the displayed data have not been validated by the university.

Citations
  • PMC: not available
  • Scopus: 2
  • Web of Science (ISI): not available