Campolungo, Niccolò; Pasini, Tommaso; Emelin, Denis; Navigli, Roberto (2022). Reducing Disambiguation Biases in NMT by Leveraging Explicit Word Sense Information. In Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Seattle, USA, pp. 4824-4838. DOI: 10.18653/v1/2022.naacl-main.355
Reducing Disambiguation Biases in NMT by Leveraging Explicit Word Sense Information
Campolungo, Niccolò (first author); Pasini, Tommaso (second author); Navigli, Roberto (last author)
2022
Abstract
Recent studies have shed some light on a common pitfall of Neural Machine Translation (NMT) models, stemming from their struggle to disambiguate polysemous words without lapsing into their most frequently occurring senses in the training corpus. In this paper, we first provide a novel approach for automatically creating high-precision sense-annotated parallel corpora, and then put forward a specifically tailored fine-tuning strategy for exploiting these sense annotations during training without introducing any additional requirement at inference time. The use of explicit senses proved to be beneficial to reduce the disambiguation bias of a baseline NMT model, while, at the same time, leading our system to attain higher BLEU scores than its vanilla counterpart in 3 language pairs.
| File | Size | Format |
|---|---|---|
| Campolungo_Reducing-Disambiguation_2022.pdf (open access; publisher's version, with the publisher's layout; Creative Commons license; DOI: 10.18653/v1/2022.naacl-main.355) | 486.85 kB | Adobe PDF |
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.


