
REBEL: Relation Extraction By End-to-end Language generation

Huguet Cabot, Pere-Lluís; Navigli, Roberto
2021

Abstract

Extracting relation triplets from raw text is a crucial task in Information Extraction, enabling multiple applications such as populating or validating knowledge bases, fact-checking, and other downstream tasks. However, it usually involves multi-step pipelines that propagate errors or are limited to a small number of relation types. To overcome these issues, we propose the use of autoregressive seq2seq models. Such models have previously been shown to perform well not only in language generation, but also in NLU tasks such as Entity Linking, thanks to their framing as seq2seq tasks. In this paper, we show how Relation Extraction can be simplified by expressing triplets as a sequence of text, and we present REBEL, a seq2seq model based on BART that performs end-to-end relation extraction for more than 200 different relation types. We show our model's flexibility by fine-tuning it on an array of Relation Extraction and Relation Classification benchmarks, attaining state-of-the-art performance on most of them.
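As an illustration of the triplet-as-text approach described in the abstract, the sketch below loads a BART-based seq2seq checkpoint through the Hugging Face transformers API, generates a linearized output, and decodes it back into (head, relation, tail) triplets. The checkpoint name "Babelscape/rebel-large" and the <triplet>/<subj>/<obj> marker format are assumptions based on the publicly released REBEL model, not details stated on this record; the parser is a minimal sketch that handles one relation per head.

```python
# Minimal sketch of end-to-end relation extraction with a seq2seq model.
# Assumption: the public REBEL checkpoint "Babelscape/rebel-large" and its
# <triplet> head <subj> tail <obj> relation linearization; adjust if the
# actual release differs.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Babelscape/rebel-large")
model = AutoModelForSeq2SeqLM.from_pretrained("Babelscape/rebel-large")

text = "Punta Cana is a resort town in the Dominican Republic."
inputs = tokenizer(text, return_tensors="pt")
generated = model.generate(**inputs, max_length=256, num_beams=3)
# Keep the special marker tokens so we can recover the triplet structure.
decoded = tokenizer.batch_decode(generated, skip_special_tokens=False)[0]

def parse_triplets(seq: str):
    """Turn the linearized output into (head, relation, tail) tuples."""
    triplets = []
    # Strip framing tokens, then split on the assumed triplet marker.
    seq = seq.replace("<s>", "").replace("</s>", "").replace("<pad>", "")
    for chunk in seq.split("<triplet>"):
        if "<subj>" in chunk and "<obj>" in chunk:
            head, rest = chunk.split("<subj>", 1)
            tail, relation = rest.split("<obj>", 1)
            triplets.append((head.strip(), relation.strip(), tail.strip()))
    return triplets

print(parse_triplets(decoded))
```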
2021 Conference on Empirical Methods in Natural Language Processing
Relation Extraction; Information Extraction; seq2seq; Computational Linguistics
04 Publication in conference proceedings::04b Conference paper in volume
REBEL: Relation Extraction By End-to-end Language generation / Huguet Cabot, Pere-Lluís; Navigli, Roberto. - (2021), pp. 2370-2381. (Paper presented at the 2021 Conference on Empirical Methods in Natural Language Processing, held in Punta Cana, Dominican Republic) [10.18653/v1/2021.findings-emnlp.204].
Files attached to this record
Cabot_REBEL_2021.pdf
  Access: open access
  Type: Publisher's version (published version with the publisher's layout)
  Licence: Creative Commons
  Size: 375.83 kB
  Format: Adobe PDF

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/1604164
Citations
  • PMC: ND
  • Scopus: 178
  • Web of Science (ISI): 51