
Cell Attention Networks / Giusti, Lorenzo; Battiloro, Claudio; Testa, Lucia; Di Lorenzo, Paolo; Sardellitti, Stefania; Barbarossa, Sergio. - (2023), pp. 1-8. (Paper presented at the International Joint Conference on Neural Networks (IJCNN) 2023, held in Gold Coast, Australia) [10.1109/IJCNN54540.2023.10191530].

Cell Attention Networks

Giusti, Lorenzo; Battiloro, Claudio; Testa, Lucia; Di Lorenzo, Paolo; Sardellitti, Stefania; Barbarossa, Sergio
2023

Abstract

Since their introduction, graph attention networks have achieved outstanding results in graph representation learning tasks. However, these networks consider only pairwise relations between features associated with the nodes and are therefore unable to fully exploit the higher-order and long-range interactions present in many real-world datasets. In this paper, we introduce a neural architecture that operates on data defined over the nodes and edges of a graph, represented as the 1-skeleton of a regular cell complex, and is able to capture insightful higher-order and long-range interactions. In particular, we exploit the lower and upper neighborhoods, as encoded in the cell complex, to design two independent masked self-attention mechanisms, thus generalizing the conventional graph attention strategy. The approach is hierarchical and incorporates the following steps: i) a lifting algorithm that learns (additional) edge features from node features; ii) a cell attention mechanism to find the optimal combination of edge features over both lower and upper neighbors; iii) a hierarchical edge pooling mechanism to extract a compact, meaningful set of features. The experimental results show that this method compares favorably with the state of the art on graph-based learning tasks while maintaining low complexity.
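The two independent masked self-attention mechanisms over lower and upper edge neighborhoods described above can be sketched roughly as follows. This is a minimal NumPy illustration, not the authors' implementation: the function names, the GAT-style scoring, and the LeakyReLU/tanh nonlinearities are assumptions for the sake of the example.

```python
import numpy as np

def masked_attention(X, W, a, mask):
    """GAT-style masked self-attention over one edge neighborhood (illustrative sketch).

    X: (E, F) edge features; W: (F, F') projection; a: (2F',) attention vector;
    mask: (E, E) boolean neighborhood indicator (lower or upper adjacency).
    """
    H = X @ W                                   # projected edge features, (E, F')
    E = H.shape[0]
    scores = np.empty((E, E))
    for i in range(E):
        for j in range(E):
            z = np.concatenate([H[i], H[j]]) @ a
            scores[i, j] = z if z > 0 else 0.2 * z      # LeakyReLU
    scores = np.where(mask, scores, -np.inf)    # mask out non-neighbors
    alpha = np.zeros_like(scores)
    for i in range(E):                          # row-wise softmax, guarding empty rows
        row = scores[i]
        if np.isfinite(row).any():
            m = row[np.isfinite(row)].max()
            w = np.where(np.isfinite(row), np.exp(row - m), 0.0)
            alpha[i] = w / w.sum()
    return alpha @ H                            # attention-weighted aggregation

def cell_attention_layer(X, W_lo, a_lo, W_up, a_up, lower_mask, upper_mask):
    """Sum two independent attention streams (lower and upper neighbors) per edge."""
    return np.tanh(masked_attention(X, W_lo, a_lo, lower_mask)
                   + masked_attention(X, W_up, a_up, upper_mask))
```

The key point the sketch conveys is that the same edge features are attended over twice, under two different masks derived from the cell complex, before the streams are combined.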
2023
International Joint Conference on Neural Networks (IJCNN) 2023
topological deep learning; geometric deep learning; attention networks; cell complexes
04 Publication in conference proceedings::04b Conference paper in volume
Files attached to this product

Giusti_Cell_2023.pdf (access: archive administrators only)
  Type: Publisher's version (published with the publisher's layout)
  License: All rights reserved
  Size: 3.73 MB
  Format: Adobe PDF
  Full text: contact the author

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/1687999
Citations
  • PMC: not available
  • Scopus: 1
  • Web of Science: 0