
Fast and Effective GNN Training through Sequences of Random Path Graphs

Bonchi F.; Nerini F. P.
2025

Abstract

We present GERN, a novel, scalable framework for training GNNs on node classification tasks, based on effective resistance, a standard tool in spectral graph theory. Our method progressively refines the GNN weights on a sequence of random spanning trees, suitably transformed into path graphs which, despite their simplicity, retain essential topological and node information of the original input graph. The sparse nature of these path graphs substantially lightens the computational burden of GNN training. This not only enhances scalability but also improves accuracy in the subsequent test phase, especially in small-training-set regimes, which are of great practical importance since in many real-world scenarios labels are hard to obtain. In these settings, our framework effectively counters the training deterioration caused by overfitting when the training set is small. Our method also mitigates common issues such as over-squashing and over-smoothing while avoiding under-reaching. Although our framework is flexible and can be deployed with several types of GNNs, in this paper we focus on graph convolutional networks and carry out an extensive experimental investigation on a number of real-world graph benchmarks, achieving simultaneous improvements in training speed and test accuracy over a wide pool of representative baselines.
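The core pipeline the abstract describes — sampling a random spanning tree of the input graph and flattening it into a path graph — can be sketched as follows. This is an illustrative sketch only, not the paper's implementation: the function names are ours, the spanning tree is drawn with the Aldous–Broder random-walk algorithm, a DFS-preorder linearization stands in for whatever effective-resistance-based transformation GERN actually applies, and the GNN training step itself is omitted.

```python
import random

def random_spanning_tree(adj, seed=0):
    # Aldous-Broder: random-walk the graph until every node is visited;
    # the edges used on first entry to each node form a uniform spanning tree.
    rng = random.Random(seed)
    nodes = list(adj)
    visited = {nodes[0]}
    tree = {n: [] for n in nodes}          # adjacency lists of the tree
    cur = nodes[0]
    while len(visited) < len(nodes):
        nxt = rng.choice(adj[cur])
        if nxt not in visited:             # first entry: keep this edge
            visited.add(nxt)
            tree[cur].append(nxt)
            tree[nxt].append(cur)
        cur = nxt
    return tree

def tree_to_path(tree, root):
    # Linearize the tree into a path graph by visiting nodes in DFS
    # preorder and connecting consecutive nodes in that order.
    order, stack, seen = [], [root], set()
    while stack:
        v = stack.pop()
        if v in seen:
            continue
        seen.add(v)
        order.append(v)
        stack.extend(tree[v])
    return list(zip(order, order[1:]))     # edges of the path graph
```

Training would then iterate: sample a fresh spanning tree, linearize it, and run a few GNN epochs on the resulting sparse path graph before resampling.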
2025
31st ACM SIGKDD Conference on Knowledge Discovery and Data Mining, KDD 2025
effective resistance; gnn; training acceleration
04 Conference proceedings publication::04b Conference paper in volume
Fast and Effective GNN Training through Sequences of Random Path Graphs / Bonchi, F.; Gentile, C.; Nerini, F. P.; Panisson, A.; Vitale, F. - 1:(2025), pp. 49-60. (31st ACM SIGKDD Conference on Knowledge Discovery and Data Mining, KDD 2025, Toronto, Canada) [10.1145/3690624.3709301].
Files attached to this item

Bonchi_Fast_postprint_2025.pdf
  Access: open access
  Note: https://doi.org/10.1145/3690624.3709301
  Type: Post-print (version after peer review, accepted for publication)
  License: All rights reserved
  Size: 1.34 MB
  Format: Adobe PDF

Bonchi_Fast_2025.pdf
  Access: open access
  Note: https://doi.org/10.1145/3690624.3709301
  Type: Publisher's version (published version with the publisher's layout)
  License: All rights reserved
  Size: 1.81 MB
  Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/1754077
Citations
  • Scopus: 3
  • Web of Science: 3