
Integrating Structure and Sequence: Protein Graph Embeddings via GNNs and LLMs / Ceccarelli, F.; Giusti, L.; Holden, S. B.; Lio, P. - 1:(2024), pp. 582-593. (Paper presented at the International Conference on Pattern Recognition Applications and Methods, held in Rome, Italy) [10.5220/0012453600003654].

Integrating Structure and Sequence: Protein Graph Embeddings via GNNs and LLMs

Lio P.
2024

Abstract

Proteins perform much of the work in living organisms, and consequently the development of efficient computational methods for protein representation is essential for advancing large-scale biological research. Most current approaches struggle to efficiently integrate the wealth of information contained in the protein sequence and structure. In this paper, we propose a novel framework for embedding protein graphs in geometric vector spaces by learning an encoder function that preserves the structural distance between protein graphs. Utilizing Graph Neural Networks (GNNs) and Large Language Models (LLMs), the proposed framework generates structure- and sequence-aware protein representations. We demonstrate that our embeddings are successful in the task of comparing protein structures, while providing a significant speed-up compared to traditional approaches based on structural alignment. Our framework achieves strong results in the task of protein structure classification; in particular, when compared to other work, the proposed method shows an average F1-score improvement of 26% on out-of-distribution (OOD) samples and of 32% when tested on samples coming from the same distribution as the training data. Our approach finds applications in areas such as drug prioritization, drug repurposing, and disease subtype analysis, among others.
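The core idea in the abstract, learning an encoder so that distances between protein embeddings approximate structural distances between the corresponding protein graphs, can be illustrated with a minimal sketch of such a distance-preservation objective. Everything below is a hypothetical illustration (function name, array shapes, and the choice of Euclidean embedding distance are assumptions), not the paper's actual implementation or API.

```python
import numpy as np

def distance_preservation_loss(embeddings: np.ndarray, struct_dist: np.ndarray) -> float:
    """Mean squared error between pairwise embedding distances and
    precomputed structural distances (e.g., alignment-derived scores).

    embeddings : (n, d) array, one d-dimensional embedding per protein.
    struct_dist: (n, n) symmetric matrix of target structural distances.
    """
    n = embeddings.shape[0]
    # Pairwise Euclidean distances in the embedding space via broadcasting.
    diff = embeddings[:, None, :] - embeddings[None, :, :]
    emb_dist = np.sqrt((diff ** 2).sum(-1))
    # Compare only the upper triangle, so each protein pair is counted once.
    iu = np.triu_indices(n, k=1)
    return float(((emb_dist[iu] - struct_dist[iu]) ** 2).mean())

# Toy usage: three proteins embedded in 2-D, with target distances that the
# first two embedding pairs already match exactly.
emb = np.array([[0.0, 0.0], [3.0, 4.0], [0.0, 1.0]])
target = np.array([[0.0, 5.0, 1.0],
                   [5.0, 0.0, 4.0],
                   [1.0, 4.0, 0.0]])
print(distance_preservation_loss(emb, target))
```

An encoder trained to minimize a loss of this form yields embeddings whose geometry mirrors the structural-distance matrix, which is what makes fast nearest-neighbor comparison possible in place of expensive pairwise structural alignment.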
2024
International Conference on Pattern Recognition Applications and Methods
Graph Neural Networks; Large Language Models; Protein Representation Learning
04 Conference proceedings publication::04b Conference paper in volume
Files attached to this record
Ceccarelli_Integrating_2024.pdf

Open access

Note: https://www.scitepress.org/Link.aspx?doi=10.5220/0012453600003654
Type: Publisher's version (published version with the publisher's layout)
License: Creative Commons
Size: 2.32 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/1728991
Citations
  • PMC: not available
  • Scopus: 0
  • Web of Science: not available