
Learning sheaf Laplacian optimizing restriction maps

Leonardo Di Nino; Sergio Barbarossa; Paolo Di Lorenzo
2024

Abstract

The aim of this paper is to propose a novel framework to infer the sheaf Laplacian, including the underlying graph topology and the restriction maps, from a set of data observed over the graph's nodes. The proposed method is based on sheaf theory, an important generalization of graph signal processing. The learning problem seeks the sheaf Laplacian that minimizes the total variation of the observed data, where the variation over each edge is also locally minimized by optimizing the associated restriction maps. Compared to alternative methods based on semidefinite programming, our solution is significantly more computationally efficient, as all of its fundamental steps are solved in closed form. The method is numerically tested on data consisting of vectors defined over subspaces of varying dimension at each node. We demonstrate how the resulting graph is influenced by two key factors: the cross-correlation and the dimensionality difference of the data residing on the graph's nodes.
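To make the objective described in the abstract concrete, the following is a minimal sketch in the standard cellular-sheaf notation of topological signal processing; it is not taken from the paper, and the symbols are generic placeholders: x_u is the data vector on node u, F_{u ⊴ e} the restriction map from node u to edge e, L_F the resulting sheaf Laplacian, and x_1, ..., x_K the observed signals. Any constraints or regularizers that rule out trivial solutions are part of the paper's contribution and are not reproduced here.

% Sheaf total variation (Dirichlet energy) of a signal x over the graph:
x^{\top} L_{\mathcal{F}} x \;=\; \sum_{e=\{u,v\}} \bigl\| \mathcal{F}_{u \trianglelefteq e}\, x_u - \mathcal{F}_{v \trianglelefteq e}\, x_v \bigr\|_2^2

% Learning problem suggested by the abstract: choose the restriction maps
% (and hence the edge set and L_F) so the observed data are maximally smooth:
\min_{\{\mathcal{F}_{v \trianglelefteq e}\}} \;\; \sum_{k=1}^{K} x_k^{\top} L_{\mathcal{F}}\, x_k
\quad \text{subject to constraints excluding the trivial solution } L_{\mathcal{F}} = 0

Intuitively, edges whose optimized restriction maps leave a large residual contribute heavily to this objective, so the learned topology tends to connect nodes whose (possibly different-dimensional) data can be well aligned by linear maps, which is consistent with the cross-correlation and dimensionality effects highlighted in the abstract.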
58th Annual Asilomar Conference on Signals, Systems, and Computers
representation learning; topological signal processing
04 Publication in conference proceedings::04b Conference paper in a volume
Learning sheaf Laplacian optimizing restriction maps / Di Nino, Leonardo; Barbarossa, Sergio; Di Lorenzo, Paolo. - (2024), pp. 59-63. (58th Annual Asilomar Conference on Signals, Systems, and Computers, Asilomar, Pacific Grove, CA) [10.1109/IEEECONF60004.2024.10942997].
Files attached to this item
Di_Nino_Learning-sheaf-laplacian_2024.pdf — publisher's version (published layout); license: All rights reserved; 637.08 kB; Adobe PDF; restricted access (archive administrators only; contact the author).

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this item: https://hdl.handle.net/11573/1732985
Citations
  • PMC: not available
  • Scopus: 0
  • Web of Science: not available