

Cross-lingual AMR Aligner: Paying Attention to Cross-Attention

Martinez Lorenzo, Abelardo Carlos; Huguet Cabot, Pere Lluís; Navigli, Roberto (Supervision)
2023

Abstract

This paper introduces a novel aligner for Abstract Meaning Representation (AMR) graphs that can scale cross-lingually, and is thus capable of aligning units and spans in sentences of different languages. Our approach leverages modern Transformer-based parsers, which inherently encode alignment information in their cross-attention weights, allowing us to extract this information during parsing. This eliminates the need for English-specific rules or the Expectation Maximization (EM) algorithm that have been used in previous approaches. In addition, we propose a guided supervised method using alignment to further enhance the performance of our aligner. We achieve state-of-the-art results in the benchmarks for AMR alignment and demonstrate our aligner’s ability to obtain them across multiple languages.
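To make the core idea concrete, the sketch below shows how cross-attention weights can be read off a Hugging Face encoder-decoder model during generation and turned into token-level alignments by taking, for each generated token, the most-attended source position. It is a minimal illustration, not the authors' released code: the checkpoint name, the averaging over layers and heads, and the argmax decoding are assumptions made only for demonstration.

```python
# Hedged sketch: extracting alignments from cross-attention during generation.
# The checkpoint below is a placeholder; an actual AMR parser checkpoint
# (whose decoder emits a linearized graph) would be used in practice.
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

MODEL_NAME = "facebook/bart-base"  # placeholder, not the paper's parser

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_NAME)

sentence = "The boy wants to go."
inputs = tokenizer(sentence, return_tensors="pt")

# Generate while keeping the attention weights produced at each decoding step.
out = model.generate(
    **inputs,
    max_new_tokens=32,
    output_attentions=True,
    return_dict_in_generate=True,
)

# out.cross_attentions has one entry per generated token; each entry is a tuple
# over decoder layers of tensors shaped (batch, heads, 1, source_len).
src_tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
for step, step_attn in enumerate(out.cross_attentions):
    # Average over layers and heads, keep the single batch element and query position.
    attn = torch.stack(step_attn).mean(dim=(0, 2))[0, 0]  # shape: (source_len,)
    src_pos = int(attn.argmax())                           # most-attended source token
    print(f"generated token {step} -> source token '{src_tokens[src_pos]}' (pos {src_pos})")
```

In the paper's setting the decoder output is a linearized AMR graph, so the per-step argmax over source positions yields a graph-unit-to-word alignment; the vanilla BART checkpoint above merely stands in to show where the cross-attention weights live in the generation output.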
2023
Association for Computational Linguistics
Semantic Parsing; AMR; AMR parsing; Cross-lingual AMR parsing; AMR alignment; Cross-Attention
04 Publication in conference proceedings::04b Conference paper published in a volume
Cross-lingual AMR Aligner: Paying Attention to Cross-Attention / Martinez Lorenzo, Abelardo Carlos; Huguet Cabot, Pere Lluís; Navigli, Roberto. - (2023), pp. 1726-1742. (Paper presented at the Association for Computational Linguistics conference held in Toronto, Canada) [10.18653/v1/2023.findings-acl.109].
Files attached to this item

AbelardoCarlosMartínez_Cross-lingual_2023.pdf
Access: open access
Note: https://aclanthology.org/2023.findings-acl.109.pdf
Type: Publisher's version (published with the publisher's layout)
License: Creative Commons
Size: 1.47 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/1688042
Citations
  • Scopus: 0