
Well-conditioned Spectral Transforms for Dynamic Graph Representation / Zhou, B.; Liu, X.; Liu, Y.; Huang, Y.; Lio, P.; Wang, Y. G. - 198:(2022). (Paper presented at the 1st Learning on Graphs Conference, LOG 2022, held virtual/online.)

Well-conditioned Spectral Transforms for Dynamic Graph Representation

Lio, P. (2022)

Abstract

This work establishes a fully-spectral framework to capture informative long-range temporal interactions in a dynamic system. We connect the spectral transform to low-rank self-attention mechanisms and investigate its energy-balancing effect and computational efficiency. Based on these observations, we leverage the adaptive power method SVD and global graph framelet convolution to encode time-dependent features and graph structure for continuous-time dynamic graph representation learning. The former serves as an efficient high-order linear self-attention with determined propagation rules, and the latter establishes scalable and transferable geometric characterization for property prediction. Empirically, the proposed model learns well-conditioned hidden representations on a variety of online learning tasks, and it achieves top performance with a reduced number of learnable parameters and faster propagation speed.
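The abstract's "power method SVD" refers to approximating the dominant singular subspace by repeated matrix multiplication rather than a full decomposition, which is what makes a low-rank attention surrogate cheap. As a rough illustration only (this is a generic subspace-iteration sketch in NumPy, not the paper's adaptive variant; the function name and parameters are hypothetical):

```python
import numpy as np

def power_method_svd(X, k, n_iter=8, seed=0):
    """Approximate rank-k SVD of X via subspace (power) iteration.

    Illustrative sketch: alternately multiply a random k-dimensional
    basis by X and X^T, re-orthonormalizing with QR to keep the basis
    well-conditioned, then take an exact SVD of the small projection.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    Q = rng.standard_normal((d, k))
    for _ in range(n_iter):
        Q, _ = np.linalg.qr(X @ Q)    # n x k basis for the column space
        Q, _ = np.linalg.qr(X.T @ Q)  # d x k basis for the row space
    B = X @ Q                          # project onto the learned subspace
    U, s, Vt = np.linalg.svd(B, full_matrices=False)
    return U, s, Vt @ Q.T              # U: n x k, s: (k,), V^T: k x d
```

The per-iteration cost is O(ndk) matrix products instead of the O(nd·min(n, d)) of a dense SVD, which is the efficiency argument behind using such a transform as a linear-complexity stand-in for full self-attention.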
2022
1st Learning on Graphs Conference, LOG 2022
Continuous time systems; Graphic methods; Machine learning
04 Publication in conference proceedings::04b Conference paper in volume
Files attached to this item
File: Zhou_well-conditioned_2022.pdf
Open access
Note: https://openreview.net/forum?id=kQsniwmGgF5
Type: Pre-print (manuscript submitted to the publisher, prior to peer review)
License: Creative Commons
Size: 1.05 MB
Format: Adobe PDF
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/1727977
Citazioni
  • PMC: n/a
  • Scopus: 1
  • Web of Science: 0