Yang, L.; Lio, P.; Shen, X.; Zhang, Y.; Peng, C. Adaptive multi-scale Graph Neural Architecture Search framework. Neurocomputing, 599 (2024). ISSN 0925-2312. DOI: 10.1016/j.neucom.2024.128094

Adaptive multi-scale Graph Neural Architecture Search framework

Lio, P.
2024

Abstract

Graph neural networks (GNNs) have gained significant attention for their ability to learn representations from graph-structured data, in which message passing and feature fusion strategies play an essential role. However, traditional Graph Neural Architecture Search (GNAS) mainly focuses on optimization with a static receptive field to ease the search process. To efficiently exploit latent relationships between non-adjacent nodes as well as edge features, this work proposes a novel two-stage approach that optimizes GNN structures more effectively by adaptively aggregating neighborhood features at multiple scales. This adaptive multi-scale GNAS assigns optimal weights to neighbors at different distances, adapting to the graph and the learning task. In addition, it incorporates latent relationships and edge features into message passing, and it supports different feature fusion strategies. Compared with traditional approaches, the proposed method efficiently explores a much larger and more diversified search space. We also prove that traditional multi-hop GNNs are low-pass filters, which can remove important high-frequency components of signals from remote neighbors in a graph, and that they are not even expressive enough to distinguish some simple regular graphs, justifying the superiority of our approach. Experiments on seven datasets across three graph learning tasks (graph regression, node classification, and graph classification) demonstrate that our method yields significant improvements over state-of-the-art GNAS approaches and human-designed GNNs. For example, the 12-layer AM-GNAS achieved an MAE of 0.102 on the ZINC dataset, an improvement of over 25%.
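To make the multi-scale aggregation idea in the abstract concrete, the following is a minimal sketch based on the abstract alone: the function names, the softmax-weighted combination over hop distances, and the use of a symmetrically normalized adjacency are assumptions of this illustration, not details taken from the paper.

import numpy as np

def normalized_adjacency(A):
    """Symmetrically normalized adjacency with self-loops:
    A_hat = D^{-1/2} (A + I) D^{-1/2}."""
    A_tilde = A + np.eye(A.shape[0])
    d = A_tilde.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_tilde @ D_inv_sqrt

def adaptive_multiscale_aggregate(A, X, logits):
    """Combine hop-0..hop-K neighborhood aggregates with per-scale weights.

    A      : (n, n) adjacency matrix
    X      : (n, f) node features
    logits : (K+1,) learnable scores, one per scale; a softmax turns
             them into mixing weights, so different graphs and tasks
             can emphasize different neighborhood radii.
    """
    A_hat = normalized_adjacency(A)
    weights = np.exp(logits) / np.exp(logits).sum()  # softmax over scales
    out = np.zeros_like(X, dtype=float)
    H = X.astype(float)
    for w_k in weights:
        out += w_k * H   # add the current hop's aggregate, weighted
        H = A_hat @ H    # propagate one more hop
    return out

# Toy usage: a 4-node path graph, 2-d features, 3 scales (hops 0, 1, 2).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.random.randn(4, 2)
logits = np.zeros(3)  # uniform scale weights before any training
print(adaptive_multiscale_aggregate(A, X, logits))

In the paper's setting the per-scale weights would be found by the architecture search rather than fixed; the zero logits here exist purely for demonstration.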
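The low-pass claim in the abstract can be unpacked with a standard spectral argument from the GNN literature; this derivation is a generic sketch, not the paper's own proof. Let $\hat{A} = \tilde{D}^{-1/2}\tilde{A}\tilde{D}^{-1/2}$ be the self-loop-normalized propagation operator and $\tilde{L} = I - \hat{A} = U \Lambda U^\top$ the corresponding normalized Laplacian, with eigenvalues $\lambda_i \in [0, 2)$. A fixed $K$-hop propagation acts on a signal $x$ as
$$\hat{A}^K x = U\,(I - \Lambda)^K\, U^\top x,$$
so the component of $x$ at graph frequency $\lambda_i$ is scaled by $(1 - \lambda_i)^K$, which decays geometrically for every $\lambda_i > 0$. Stacking hops therefore attenuates high-frequency structure, which is what makes a traditional multi-hop GNN a low-pass filter and motivates learning the per-scale weights instead.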
Graph Neural Architecture Search; Graph neural networks; Graph representation learning
01 Journal publication::01a Journal article
Files attached to this item
There are no files associated with this item.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/1723993