While cross-lingual techniques are finding increasing success in a wide range of Natural Language Processing tasks, their application to Semantic Role Labeling (SRL) has been strongly limited by the fact that each language adopts its own linguistic formalism, from PropBank for English to AnCora for Spanish and PDT-Vallex for Czech, inter alia. In this work, we address this issue and present a unified model to perform cross-lingual SRL over heterogeneous linguistic resources. Our model implicitly learns a high-quality mapping for different formalisms across diverse languages without resorting to word alignment and/or translation techniques. We find that not only is our cross-lingual system competitive with the current state of the art, but it is also robust to low-data scenarios. Most interestingly, our unified model is able to annotate a sentence in a single forward pass with all the inventories it was trained with, providing a tool for the analysis and comparison of linguistic theories across different languages. We release our code and model at https://github.com/SapienzaNLP/unify-srl.
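The abstract's key architectural idea, that one shared representation can be decoded into every role inventory at once, can be illustrated with a minimal conceptual sketch. This is not the authors' implementation: the shared encoder is reduced to a fixed vector, the per-inventory classification heads use toy weights, and all names (`head`, `label_all_inventories`, the weight values) are illustrative assumptions.

```python
# Conceptual sketch (NOT the authors' code): a shared sentence encoder feeds
# one classification head per role inventory, so a single forward pass yields
# a label in every formalism the model was trained on.

def argmax(scores):
    """Index of the highest score."""
    return max(range(len(scores)), key=scores.__getitem__)

def head(hidden, weights, labels):
    """A linear classification head: scores = W . h, then argmax to a label."""
    scores = [sum(w * h for w, h in zip(row, hidden)) for row in weights]
    return labels[argmax(scores)]

# Toy per-inventory heads (one per linguistic resource); weights are made up.
INVENTORIES = {
    "PropBank":   ([[1.0, 0.0], [0.0, 1.0]], ["ARG0", "ARG1"]),
    "AnCora":     ([[0.5, 0.5], [1.0, -1.0]], ["arg0-agt", "arg1-pat"]),
    "PDT-Vallex": ([[0.0, 1.0], [1.0, 0.0]], ["ACT", "PAT"]),
}

def label_all_inventories(hidden):
    """One shared representation -> a label in every inventory at once."""
    return {name: head(hidden, W, labels)
            for name, (W, labels) in INVENTORIES.items()}

print(label_all_inventories([0.9, 0.1]))
# -> {'PropBank': 'ARG0', 'AnCora': 'arg1-pat', 'PDT-Vallex': 'PAT'}
```

The point of the sketch is that the heads share their input: nothing per-language or per-formalism happens before the final classification layers, which is what lets a single forward pass produce annotations in all inventories simultaneously.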

Unifying Cross-Lingual Semantic Role Labeling with Heterogeneous Linguistic Resources / Conia, Simone; Bacciu, Andrea; Navigli, Roberto. - (2021), pp. 338-351. (Paper presented at the North American Association for Computational Linguistics conference, held online) [10.18653/v1/2021.naacl-main.31].

Unifying Cross-Lingual Semantic Role Labeling with Heterogeneous Linguistic Resources

Simone Conia (first author); Andrea Bacciu (second author); Roberto Navigli (last author)
2021

North American Chapter of the Association for Computational Linguistics (NAACL)
natural language processing; artificial intelligence; deep learning; semantic role labeling; multilinguality; cross-linguality
04 Publication in conference proceedings :: 04b Conference paper in a volume
Files attached to this record

File: Conia_Unifyng_2021.pdf
Access: open access
Note: DOI: 10.18653/v1/2021.naacl-main.31
Type: Publisher's version (published with the publisher's layout)
License: All rights reserved
Size: 1.92 MB
Format: Adobe PDF

File: Frontespizio-indice.pdf
Access: open access
Note: URL: https://aclanthology.org/2021.naacl-main
Type: Other attached material
License: All rights reserved
Size: 822.19 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this record: https://hdl.handle.net/11573/1569652
Citations
  • PMC: not available
  • Scopus: 33
  • Web of Science: 17