
From Bricks to Bridges: Product of Invariances to Enhance Latent Space Communication

Irene Cannistraci (first author); Luca Moschella; Marco Fumero; Valentino Maiorca; Emanuele Rodolà

Abstract

It has been observed that representations learned by distinct neural networks conceal structural similarities when the models are trained under similar inductive biases. From a geometric perspective, identifying the classes of transformations and the related invariances that connect these representations is fundamental to unlocking applications, such as merging, stitching, and reusing different neural modules. However, estimating task-specific transformations a priori can be challenging and expensive due to several factors (e.g., weights initialization, training hyperparameters, or data modality). To this end, we introduce a versatile method to directly incorporate a set of invariances into the representations, constructing a product space of invariant components on top of the latent representations without requiring prior knowledge about the optimal invariance to infuse. We validate our solution on classification and reconstruction tasks, observing consistent latent similarity and downstream performance improvements in a zero-shot stitching setting. The experimental analysis comprises three modalities (vision, text, and graphs), twelve pretrained foundational models, nine benchmarks, and several architectures trained from scratch.
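As a rough illustration of the idea described in the abstract, the sketch below builds a product space by concatenating latent components that are each invariant to a different class of transformations, here cosine similarities and negated Euclidean distances to a set of anchor samples. The function names, the specific invariances, and the anchor-based projection are illustrative assumptions, not the paper's exact implementation.

```python
# Hypothetical sketch of a "product of invariances" over a latent space.
# Each component projects latents onto a set of anchors under a different
# similarity/distance, inducing a different invariance; the components are
# then concatenated into a product space. Illustrative only.
import torch
import torch.nn.functional as F

def cosine_component(z: torch.Tensor, anchors: torch.Tensor) -> torch.Tensor:
    # Invariant to rotations and rescaling of the latent space.
    return F.normalize(z, dim=-1) @ F.normalize(anchors, dim=-1).T

def euclidean_component(z: torch.Tensor, anchors: torch.Tensor) -> torch.Tensor:
    # Invariant to translations of the latent space.
    return -torch.cdist(z, anchors, p=2)

def product_of_invariances(z: torch.Tensor, anchors: torch.Tensor) -> torch.Tensor:
    # Concatenate the invariant components to form the product space.
    components = [cosine_component(z, anchors), euclidean_component(z, anchors)]
    return torch.cat(components, dim=-1)

# Usage: z are latent vectors from any encoder, anchors are encodings of a
# shared subset of samples; downstream modules consume the product space.
z = torch.randn(32, 512)         # batch of latent vectors
anchors = torch.randn(100, 512)  # encoded anchor samples
out = product_of_invariances(z, anchors)  # shape: (32, 200)
```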
2024
The Twelfth International Conference on Learning Representations
invariance, latent space, latent communication, zero-shot stitching, representation learning, relative representation
04 Publication in conference proceedings::04b Conference paper in volume
From Bricks to Bridges: Product of Invariances to Enhance Latent Space Communication / Cannistraci, Irene; Moschella, Luca; Fumero, Marco; Maiorca, Valentino; Rodolà, Emanuele. - (2024). (Paper presented at The Twelfth International Conference on Learning Representations, held in Vienna, Austria).
Files attached to this record
There are no files associated with this record.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/1712069
Note: the displayed data have not been validated by the university.
