
Type-Less yet Type-Aware Inductive Link Prediction with Pretrained Language Models / De Bellis, Alessandro; Bufi, Salvatore; Servedio, Giovanni; Anelli, Vito Walter; Di Noia, Tommaso; Di Sciascio, Eugenio. - (2025), pp. 27181-27197. (Paper presented at the Empirical Methods in Natural Language Processing conference, held in Suzhou, China) [10.18653/v1/2025.emnlp-main.1383].

Type-Less yet Type-Aware Inductive Link Prediction with Pretrained Language Models

Servedio, Giovanni
2025

Abstract

Inductive link prediction is emerging as a key paradigm for real-world knowledge graphs (KGs), where new entities frequently appear and models must generalize to them without retraining. Predicting links over previously unseen entities requires generalizable node features such as subgraph structure, type annotations, and ontological constraints. However, explicit type information is often missing or incomplete, and even when available it tends to be coarse-grained, sparse, and error-prone due to manual annotation. In this work, we explore the potential of pre-trained language models (PLMs) to enrich node representations with implicit type signals. We introduce TyleR, a Type-less yet type-awaRe approach to subgraph-based inductive link prediction that leverages PLMs for semantic enrichment. Experiments on standard benchmarks demonstrate that TyleR outperforms state-of-the-art baselines in scenarios with scarce type annotations and sparse graph connectivity. To ensure reproducibility, we share our code at https://github.com/sisinflab/tyler.
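
To illustrate the kind of PLM-based semantic enrichment the abstract describes, the sketch below encodes entity text with a frozen PLM and mean-pools token embeddings into node features; because only text is needed, unseen entities receive features without retraining. The model choice (bert-base-uncased), the pooling strategy, and the example descriptions are assumptions for illustration, not details taken from the paper.

```python
# Hypothetical sketch of PLM-based semantic enrichment for inductive link
# prediction: entity descriptions are encoded with a frozen PLM and the
# pooled embeddings serve as type-aware node features for a downstream
# subgraph scorer. All modeling choices here are illustrative assumptions.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
plm = AutoModel.from_pretrained("bert-base-uncased")
plm.eval()  # the PLM is used as a frozen feature extractor

@torch.no_grad()
def encode_entities(descriptions: list[str]) -> torch.Tensor:
    """Mean-pool PLM token embeddings into one vector per entity."""
    batch = tokenizer(descriptions, padding=True, truncation=True,
                      return_tensors="pt")
    hidden = plm(**batch).last_hidden_state          # (B, T, H)
    mask = batch["attention_mask"].unsqueeze(-1)     # (B, T, 1)
    summed = (hidden * mask).sum(dim=1)              # ignore padding tokens
    return summed / mask.sum(dim=1).clamp(min=1)     # (B, H)

# Even entities unseen at training time get meaningful features,
# since only their textual description is required.
feats = encode_entities([
    "Suzhou, a city in Jiangsu province, China",
    "EMNLP, a conference on empirical methods in NLP",
])
print(feats.shape)  # torch.Size([2, 768])
```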
2025
Empirical Methods in Natural Language Processing
inductive link prediction; pretrained language models; knowledge graph
04 Conference proceedings publication::04b Conference paper in volume
Files attached to this product
There are no files associated with this product.


Use this identifier to cite or link to this document: https://hdl.handle.net/11573/1755158
Warning: the displayed data have not been validated by the university.
