
Using self-attention LSTMs to enhance observations in goal recognition / Amado, L.; Paludo Licks, G.; Marcon, M.; Fraga Pereira, R.; Meneguzzi, F. - (2020). (Paper presented at the 2020 International Joint Conference on Neural Networks, IJCNN 2020, held in Glasgow, Scotland) [10.1109/IJCNN48605.2020.9207597].

Using self-attention LSTMs to enhance observations in goal recognition

Amado L.; Paludo Licks G.; Marcon M.; Fraga Pereira R.; Meneguzzi F.
2020

Abstract

Goal recognition is the task of identifying the goal an observed agent is pursuing. The quality of the recognition results depends on the quality of the observed information, and in most goal recognition approaches accuracy decreases significantly in settings with missing observations. To mitigate this issue, we develop a learning model based on LSTMs, leveraging attention mechanisms, to enhance observed traces by predicting missing observations in goal recognition problems. We experiment using a dataset of goal recognition problems and apply the model to enhance the observation traces where observations are missing. We evaluate the technique using a state-of-the-art goal recognizer in four different domains to compare accuracy between the standard and the enhanced observation traces. Experimental evaluation shows that recurrent neural networks with self-attention mechanisms improve the accuracy metrics of state-of-the-art goal recognition techniques by an average of 60%.
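As a rough illustration of the self-attention component the abstract refers to, the sketch below computes scaled dot-product self-attention over a sequence of observation embeddings in NumPy. This is not the authors' implementation: the function and variable names are hypothetical, and learned query/key/value projections are omitted for brevity.

```python
import numpy as np

def self_attention(H):
    """Scaled dot-product self-attention over a sequence of hidden
    states H (shape: timesteps x features). For illustration, queries,
    keys, and values are the hidden states themselves, with no learned
    projection matrices."""
    d = H.shape[-1]
    scores = H @ H.T / np.sqrt(d)                     # pairwise similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # row-wise softmax
    return weights @ H                                # context-enriched states

# Toy trace: 4 observation embeddings with 3 features each.
H = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0]])
C = self_attention(H)
print(C.shape)  # (4, 3)
```

Each output row is a convex combination of all timesteps, which is what lets a model weigh the whole trace when predicting a missing observation; in the paper's setting this layer would sit on top of LSTM hidden states.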
2020
2020 International Joint Conference on Neural Networks, IJCNN 2020
goal recognition; self-attention; lstm
04 Publication in conference proceedings::04b Conference paper in a volume
Files attached to this item
File: Amado_Using_2020.pdf (access: archive administrators only)
Type: Published version (publisher's layout)
License: All rights reserved
Size: 951.26 kB
Format: Adobe PDF
Availability: contact the author

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/1670727
Citations
  • Scopus: 3
  • Web of Science: 1