
Enhancing Deep Sequence Generation with Logical Temporal Knowledge / Umili, E.; Paludo Licks, G.; Patrizi, F. - 3779:(2024), pp. 23-34. (Paper presented at the 3rd International Workshop on Process Management in the AI Era (PMAI 2024), co-located with the 27th European Conference on Artificial Intelligence (ECAI 2024), held in Santiago de Compostela, Spain, October 19, 2024).

Enhancing Deep Sequence Generation with Logical Temporal Knowledge

Umili E.; Paludo Licks G.; Patrizi F.
2024

Abstract

Despite significant advancements in deep learning for sequence forecasting, neural models are typically trained only on data, and the incorporation of high-level prior logical knowledge into their training remains a hard challenge. This limitation hinders the exploitation of background knowledge, such as common sense or domain-specific information, in predictive tasks performed by neural networks. In this work, we propose a principled approach to integrate prior knowledge expressed in Linear Temporal Logic over finite traces (LTLf) into deep autoregressive models for multistep symbolic sequence generation (i.e., suffix prediction) at training time. Our method represents the logical knowledge through continuous probabilistic relaxations and employs a differentiable schedule for sampling the next symbol from the network. We test our approach on synthetic datasets based on background knowledge expressed in Declare, inspired by Business Process Management (BPM) applications. The results demonstrate that our method consistently improves the performance of the neural predictor, achieving lower Damerau-Levenshtein (DL) distances from the target sequences and higher satisfaction rates of the logical knowledge compared to models trained solely on data.
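To make the method description concrete, here is a minimal PyTorch sketch of the two ingredients the abstract names: a differentiable schedule for sampling the next symbol (realized here with a Gumbel-Softmax relaxation, which is an assumption, since the abstract does not fix the mechanism) and a continuous probabilistic relaxation of a temporal constraint used as a training loss. The toy alphabet, the product-based soft semantics of the standard Declare template response(a, b), and the linear stand-in model are all illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only (see caveats above): differentiable suffix sampling
# plus a probabilistic relaxation of one Declare constraint as a loss term.
import torch
import torch.nn.functional as F

VOCAB = ["a", "b", "c"]                      # assumed toy alphabet
A, B = VOCAB.index("a"), VOCAB.index("b")

def sample_suffix(logits_fn, prefix_onehot, steps, tau=1.0):
    """Autoregressively sample `steps` soft one-hot symbols.
    Gumbel-Softmax keeps the sampling step differentiable."""
    seq = prefix_onehot                      # (T, |V|)
    for _ in range(steps):
        logits = logits_fn(seq)              # (|V|,)
        sym = F.gumbel_softmax(logits, tau=tau, hard=False)
        seq = torch.cat([seq, sym.unsqueeze(0)], dim=0)
    return seq

def soft_response(seq):
    """Relaxation of the Declare template response(a, b): every occurrence
    of `a` is eventually followed by `b`. Treats the rows of `seq` as symbol
    probabilities and returns a satisfaction degree in [0, 1]."""
    T = seq.shape[0]
    sat = torch.tensor(1.0)
    for t in range(T):
        p_a = seq[t, A]
        if t + 1 < T:                        # P(b at some step after t)
            p_eventually_b = 1.0 - torch.prod(1.0 - seq[t + 1:, B])
        else:
            p_eventually_b = torch.tensor(0.0)
        # soft implication: 1 - P(a at t and never b afterwards)
        sat = sat * (1.0 - p_a * (1.0 - p_eventually_b))
    return sat

# Toy stand-in for the neural predictor: logits from the mean of the sequence.
W = torch.randn(len(VOCAB), len(VOCAB), requires_grad=True)
logits_fn = lambda seq: seq.mean(dim=0) @ W

prefix = F.one_hot(torch.tensor([A]), num_classes=len(VOCAB)).float()
suffix = sample_suffix(logits_fn, prefix, steps=5)
loss_logic = -torch.log(soft_response(suffix) + 1e-8)   # knowledge loss term
loss_logic.backward()                # gradients reach W through the sampling
print(loss_logic.item(), W.grad.norm().item())
```

In a full training loop this knowledge term would presumably be added, with a weight, to the usual cross-entropy data loss, which is what allows the knowledge to shape training rather than being checked only at inference.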
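The evaluation metric is also easy to pin down. The sketch below is a standard, self-contained implementation of the Damerau-Levenshtein distance in its common optimal string alignment variant (insertions, deletions, substitutions, and adjacent transpositions each cost 1); it is included only to make the reported metric concrete and is not taken from the paper.

```python
def damerau_levenshtein(s, t):
    """Restricted (optimal string alignment) Damerau-Levenshtein distance:
    minimum number of insertions, deletions, substitutions, and adjacent
    transpositions turning sequence s into sequence t."""
    m, n = len(s), len(t)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i
    for j in range(n + 1):
        d[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if s[i - 1] == t[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution
            if (i > 1 and j > 1 and s[i - 1] == t[j - 2]
                    and s[i - 2] == t[j - 1]):
                d[i][j] = min(d[i][j], d[i - 2][j - 2] + 1)  # transposition
    return d[m][n]

assert damerau_levenshtein("abcd", "acbd") == 1   # one adjacent transposition
assert damerau_levenshtein("ab", "ab") == 0
```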
3rd International Workshop on Process Management in the AI Era (PMAI 2024), co-located with the 27th European Conference on Artificial Intelligence (ECAI 2024), Santiago de Compostela, Spain, October 19, 2024
suffix prediction; neurosymbolic AI; deep learning with logical knowledge; linear temporal logic
04 Publication in conference proceedings::04b Conference paper in volume
Files attached to this item

File: Umili_Enhancing-Deep-Sequence_2024.pdf
Access: open access
Note: https://ceur-ws.org/Vol-3779/paper4.pdf
Type: Publisher's version (published with the publisher's layout)
License: Creative Commons
Size: 416.42 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/1727985
Citations
  • PMC: not available
  • Scopus: 0
  • Web of Science: not available