
How Framelets Enhance Graph Neural Networks / Zheng, X.; Zhou, B.; Gao, J.; Wang, Y. G.; Lio, P.; Li, M.; Montufar, G. - 139:(2021), pp. 12761-12771. (Paper presented at the International Conference on Machine Learning, held virtually online).

How Framelets Enhance Graph Neural Networks

Lio P.;
2021

Abstract

This paper presents a new approach for assembling graph neural networks based on framelet transforms, which provide a multi-scale representation for graph-structured data. We decompose an input graph into low-pass and high-pass frequency coefficients for network training, which then defines a framelet-based graph convolution. The framelet decomposition naturally induces a graph pooling strategy that aggregates the graph features into low-pass and high-pass spectra; it accounts for both the feature values and the geometry of the graph data and conserves the total information. Graph neural networks equipped with the proposed framelet convolution and pooling achieve state-of-the-art performance in many node and graph prediction tasks. Moreover, we propose shrinkage as a new activation for the framelet convolution, which thresholds high-frequency information at different scales. Compared to ReLU, shrinkage activation improves model performance on denoising and signal compression: noise in both node features and graph structure can be significantly reduced by accurately cutting off the high-pass coefficients from the framelet decomposition, and the signal can be compressed to less than half its original size with well-preserved prediction performance.
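To make the abstract's idea concrete, the sketch below shows one way a framelet-style graph convolution with a shrinkage activation could look. It is not the authors' implementation: it uses a full eigendecomposition of the normalized Laplacian (which would not scale to large graphs), a single-scale cosine/sine low-pass/high-pass split, and an arbitrary soft-threshold value; all function names and parameter choices are assumptions made for illustration only.

```python
# Illustrative sketch (NOT the paper's code): a spectral low/high split of a graph
# signal, learnable diagonal filtering in the spectral domain, and soft-shrinkage
# applied to the high-pass coefficients before mapping back to the node domain.
import numpy as np

def normalized_laplacian(A):
    """Symmetric normalized Laplacian L = I - D^{-1/2} A D^{-1/2}."""
    d = A.sum(axis=1)
    with np.errstate(divide="ignore"):
        d_inv_sqrt = np.where(d > 0, 1.0 / np.sqrt(d), 0.0)
    return np.eye(A.shape[0]) - (A * d_inv_sqrt[:, None]) * d_inv_sqrt[None, :]

def lowhigh_transform(X, A):
    """Split node features into low-pass and high-pass spectral coefficients.
    The cosine/sine filters satisfy g_low^2 + g_high^2 = 1 on the spectrum,
    a tight-frame-like condition (single scale only, for simplicity)."""
    L = normalized_laplacian(A)
    lam, U = np.linalg.eigh(L)               # eigenvalues of L lie in [0, 2]
    t = np.clip(lam / 2.0, 0.0, 1.0)          # rescale spectrum to [0, 1]
    g_low, g_high = np.cos(np.pi * t / 2), np.sin(np.pi * t / 2)
    Xhat = U.T @ X                             # graph Fourier coefficients
    return U, g_low[:, None] * Xhat, g_high[:, None] * Xhat

def soft_shrink(C, thresh):
    """Shrinkage activation: soft-threshold the coefficients."""
    return np.sign(C) * np.maximum(np.abs(C) - thresh, 0.0)

def framelet_style_conv(X, A, w_low, w_high, thresh=0.1):
    """One layer: diagonal spectral scaling per channel, shrinkage on the
    high-pass part, then reconstruction back to the node domain."""
    U, C_low, C_high = lowhigh_transform(X, A)
    C_low = w_low[:, None] * C_low                           # low-pass filtering
    C_high = soft_shrink(w_high[:, None] * C_high, thresh)   # high-pass + shrinkage
    return U @ C_low + U @ C_high

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = np.array([[0, 1, 0, 0],
                  [1, 0, 1, 1],
                  [0, 1, 0, 1],
                  [0, 1, 1, 0]], dtype=float)   # toy 4-node graph
    X = rng.normal(size=(4, 3))                  # 3-dimensional node features
    w_low, w_high = np.ones(4), np.ones(4)       # stand-ins for learnable weights
    print(framelet_style_conv(X, A, w_low, w_high).shape)  # (4, 3)
```

In the paper's setting the transform is multi-scale and the spectral filters are approximated rather than computed by explicit eigendecomposition; the sketch only conveys the decompose-filter-shrink-reconstruct pattern described in the abstract.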
2021
International Conference on Machine Learning
Chemical activation; Convolution; Machine learning; Shrinkage; Signal processing
04 Publication in conference proceedings::04b Conference paper in volume
Files attached to this record
File: Zheng_preprint_How-Framelets_2021.pdf
Access: open access
Type: Pre-print (manuscript submitted to the publisher, prior to peer review)
License: Creative Commons
Size: 16.7 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/1720266
Citations
  • PubMed Central: ND
  • Scopus: 20
  • Web of Science (ISI): 18