Early detection and accurate diagnosis of Autism Spectrum Disorder are critical for enabling timely interventions and personalized support. However, conventional diagnostic approaches remain largely subjective, posing challenges in achieving consistency and reliability. Recent clinical studies suggest that eye gaze patterns offer valuable insights into cognitive and attentional mechanisms underlying Autism Spectrum Disorder, highlighting their potential as biomarkers for objective assessment. In this work, we introduce a patch-based Graph Neural Network framework designed to analyze eye-tracking data by integrating fixation maps with the corresponding visual stimuli observed by both neurotypical and neurodivergent individuals. The proposed method constructs a hierarchical graph representation of image patches, progressively aggregating visual and attentional information to enhance feature extraction. A combination of Graph and Convolutional Neural Networks is employed for classification, with the two components capturing spatial and contextual dependencies between regions of interest, respectively. Experimental evaluations and analysis demonstrate the effectiveness of our approach in distinguishing Autism Spectrum Disorder from neurotypical behavior, underscoring the potential of graph-based learning for neurodevelopmental assessments.
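As a rough illustration of the patch-based graph idea sketched in the abstract, the snippet below builds a simple graph from an image and its fixation map: each non-overlapping patch becomes a node whose features combine mean pixel intensity and mean fixation density, with edges linking 4-neighbouring patches. This is a hypothetical minimal sketch for intuition only; the patch size, node features, and edge scheme are assumptions, not the paper's actual pipeline.

```python
import numpy as np

def build_patch_graph(image, fixation_map, patch=8):
    """Hypothetical patch-graph construction (not the paper's exact method).

    Each non-overlapping patch x patch block is one node; node features are
    [mean pixel intensity, mean fixation density]; edges connect patches
    that are horizontal or vertical grid neighbours.
    """
    H, W = fixation_map.shape
    rows, cols = H // patch, W // patch
    feats, edges = [], []
    for r in range(rows):
        for c in range(cols):
            sl = (slice(r * patch, (r + 1) * patch),
                  slice(c * patch, (c + 1) * patch))
            feats.append([image[sl].mean(), fixation_map[sl].mean()])
            i = r * cols + c
            if c + 1 < cols:
                edges.append((i, i + 1))      # right neighbour
            if r + 1 < rows:
                edges.append((i, i + cols))   # bottom neighbour
    return np.array(feats), edges

# Toy usage: a 32x32 stimulus with one heavily fixated region.
img = np.random.rand(32, 32)
fix = np.zeros((32, 32))
fix[8:16, 8:16] = 1.0                 # fixations concentrated in one patch
x, e = build_patch_graph(img, fix, patch=8)
print(x.shape, len(e))                # (16, 2) 24  (4x4 grid of patches)
```

In a real GNN pipeline, `x` and `e` would typically feed a graph-learning library (e.g. as a node feature matrix and edge index), and the hierarchical aggregation described in the abstract would coarsen this patch grid over successive layers.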

Patch-based graph neural network for autism classification via eye gaze analysis / Verdone, A.; Colonnese, F.; Rosato, A.; Panella, M. - (2025), pp. 1-8. (2025 International Joint Conference on Neural Networks, IJCNN 2025, Roma (Italy)) [10.1109/IJCNN64981.2025.11228803].

Patch-based graph neural network for autism classification via eye gaze analysis

Verdone A.; Colonnese F.; Rosato A.; Panella M.
2025

2025
2025 International Joint Conference on Neural Networks, IJCNN 2025
graph neural network; autism classification; eye gaze analysis
04 Conference proceedings publication::04b Conference paper in volume
Files attached to this product

File: Verdone_Patch-based-graph_2025.pdf
Access: repository managers only
Type: Publisher's version (published version with the publisher's layout)
License: All rights reserved
Size: 2.21 MB
Format: Adobe PDF
Contact the author

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/1757938
Citations
  • Scopus: 0