
Embedding Words and Senses Together via Joint Knowledge-Enhanced Training

Mancini, Massimiliano; Camacho-Collados, José; Iacobacci, Ignacio Javier; Navigli, Roberto
2017

Abstract

Word embeddings are widely used in Natural Language Processing, mainly due to their success in capturing semantic information from massive corpora. However, their creation process does not allow the different meanings of a word to be automatically separated, as it conflates them into a single vector. We address this issue by proposing a new model which learns word and sense embeddings jointly. Our model exploits large corpora and knowledge from semantic networks in order to produce a unified vector space of word and sense embeddings. We evaluate the main features of our approach both qualitatively and quantitatively in a variety of tasks, highlighting the advantages of the proposed method in comparison to state-of-the-art word- and sense-based models.
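The core idea described in the abstract is that word and sense vectors are trained together in a single space, with a CBOW-style objective in which a word's context is enriched with senses drawn from a semantic network. The following is a minimal illustrative sketch of such a joint update, not the paper's actual implementation: the toy vocabulary, the sense identifiers (bank_1, bank_2), the positive-only update (no negative sampling), and the choice of which senses attach to a given occurrence are all assumptions made here for illustration.

```python
import numpy as np

# Toy setup: words and senses share one vocabulary and one vector space.
# Which senses attach to an occurrence would come from a semantic network;
# here it is hard-coded for illustration.
DIM = 50
rng = np.random.default_rng(0)

vocab = ["the", "money", "river", "bank", "bank_1", "bank_2"]  # words + senses
W_in = {t: rng.normal(scale=0.1, size=DIM) for t in vocab}     # input vectors
W_out = {t: rng.normal(scale=0.1, size=DIM) for t in vocab}    # output vectors

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def joint_cbow_step(context, targets, lr=0.025):
    """One positive-example update: the averaged context (words AND senses)
    predicts both the target word and its associated sense, so gradients
    flow through word and sense vectors in the same space."""
    h = np.mean([W_in[c] for c in context], axis=0)  # hidden layer
    grad_h = np.zeros(DIM)
    for t in targets:                    # e.g. the word "bank" and "bank_2"
        err = sigmoid(W_out[t] @ h) - 1.0            # positive label = 1
        grad_h += err * W_out[t]
        W_out[t] -= lr * err * h                     # update output vectors
    for c in context:                                # propagate to inputs
        W_in[c] -= lr * grad_h / len(context)

# Sentence fragment "... the money ... bank ...": context words plus the
# sense a semantic network would link here ("bank_2" = financial sense).
joint_cbow_step(context=["the", "money", "bank_2"], targets=["bank", "bank_2"])
```

Because senses appear both as context items and as prediction targets, their vectors are directly comparable with word vectors after training, which is what enables the unified word-and-sense space the abstract refers to.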
21st Conference on Computational Natural Language Learning (CoNLL 2017)
Natural Language Processing; capturing semantic information; Related work
04 Publication in conference proceedings::04b Conference paper published in a volume
Embedding Words and Senses Together via Joint Knowledge-Enhanced Training / Mancini, Massimiliano; Camacho-Collados, José; Iacobacci, Ignacio Javier; Navigli, Roberto. - ELECTRONIC. - (2017), pp. 100-111. (Paper presented at the 21st Conference on Computational Natural Language Learning (CoNLL 2017), held in Vancouver, Canada) [10.18653/v1/K17-1012].
Files attached to this item

File: Mancini_Embedding-Words_2017.pdf
Access: open access
Type: Publisher's version (published version with the publisher's layout)
License: Creative Commons
Size: 357.53 kB
Format: Adobe PDF

File: Mancini_Frontespizio-indice_Embedding-Words_2017.pdf
Access: open access
Type: Other attached material
License: Creative Commons
Size: 161.74 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this item: https://hdl.handle.net/11573/975557
Citations
  • PMC: not available
  • Scopus: 49
  • Web of Science: not available