
ARIEL: Brain-Computer Interfaces meet Large Language Models for Emotional Support Conversation / Sorino, P.; Biancofiore, G. M.; Lofu, D.; Colafiglio, T.; Lombardi, A.; Narducci, F.; Di Noia, T. - (2024), pp. 601-609. (Paper presented at the 32nd ACM Conference on User Modeling, Adaptation and Personalization, UMAP 2024, held in Cagliari, Italy) [10.1145/3631700.3665193].

ARIEL: Brain-Computer Interfaces meet Large Language Models for Emotional Support Conversation

Colafiglio T.;
2024

Abstract

In an era of unprecedented virtual connectivity, individuals paradoxically often find themselves disconnected from genuine human interaction. The rise of remote working arrangements, compounded by the influence of digital communication platforms, has fostered a sense of isolation. Consequently, the prevailing socio-technological landscape underscores the critical need for innovative solutions that address this emotional void. Conversational systems help people with everyday tasks through informative dialogues, and recent applications employ them for emotional support conversation tasks. Nevertheless, their understanding of human feelings is limited, as they rely solely on information discernible from the text or on the users' own emotional declarations. Recently, Brain-Computer Interfaces (BCIs), devices that analyze electroencephalographic (EEG) signals, have become increasingly popular given their minimally invasive nature, low cost, and ability to detect users' emotional states reliably. Hence, we propose ARIEL, an emotionAl suppoRt bcI dEvices and Llm-based conversational agent that aims to support users' emotional states through conversation while monitoring them via BCI. In this way, the users' feelings can be assessed reliably, making the conversational agent aware of their emotional evolution during the conversation. Our framework makes the LLaMA 2 chat model communicate with a BCI-based emotion recognition system to achieve the emotional support conversation goal. We also present a controlled running example that shows the potential of our model and its effective functioning, enabled by a carefully designed hard-prompt strategy. In future work, we will conduct an in vivo experiment to evaluate the system and its components.
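The abstract describes an architecture in which a BCI-based emotion recognition module informs the chat model through a fixed ("hard") prompt. The sketch below is purely illustrative and is not the authors' implementation: the `detect_emotion` interface, the emotion labels, and the prompt template are all assumptions made for the example, standing in for the paper's EEG classifier and LLaMA 2 prompting.

```python
# Illustrative sketch (assumed interfaces, not the paper's code): injecting a
# BCI-detected emotion label into a hard prompt for an emotional support LLM.

def detect_emotion(eeg_features):
    """Stand-in for the BCI emotion classifier: maps an assumed
    (valence, arousal) pair in [0, 1] to a coarse emotion label."""
    valence, arousal = eeg_features
    if valence >= 0.5:
        return "positive"
    return "anxious" if arousal >= 0.5 else "sad"

def build_hard_prompt(user_message, emotion):
    """Compose a fixed (hard) prompt that makes the chat model aware of
    the user's current emotional state before it generates a reply."""
    return (
        "You are an emotional support assistant. "
        f"The user's detected emotional state is: {emotion}. "
        "Respond with empathy and support.\n"
        f"User: {user_message}\nAssistant:"
    )

# Example: low valence, high arousal -> "anxious" under the assumed mapping.
emotion = detect_emotion((0.2, 0.8))
prompt = build_hard_prompt("I feel overwhelmed by remote work.", emotion)
print(prompt)
```

In a real system, `detect_emotion` would be replaced by the EEG-signal classifier and the composed prompt would be sent to the LLaMA 2 chat model at every turn, so the agent tracks the user's emotional evolution as the abstract describes.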
2024
32nd ACM Conference on User Modeling, Adaptation and Personalization, UMAP 2024
Brain-Computer Interface; Conversational Agent; Emotion Recognition; Emotional Support Conversation; Large Language Model; Machine Learning
04 Publication in conference proceedings::04b Conference paper in volume
Files attached to this product

Soriano_ARIEL-Brain-Computer_2024.pdf
Access: open access
Note: https://dl.acm.org/doi/pdf/10.1145/3631700.3665193
Type: Publisher's version (published version with the publisher's layout)
License: All rights reserved
Size: 619.49 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/1727073
Citations
  • PMC: n/a
  • Scopus: 0
  • Web of Science (ISI): 0