
A System for Mobile Active Music Listening Based on Social Interaction and Embodiment / Varni, Giovanna; Mancini, Maurizio; Volpe, Gualtiero; Camurri, Antonio. - In: MOBILE NETWORKS AND APPLICATIONS. - ISSN 1383-469X. - 16:(2011), pp. 375-384. [10.1007/s11036-010-0256-4]

A System for Mobile Active Music Listening Based on Social Interaction and Embodiment

MANCINI, MAURIZIO; VOLPE, GUALTIERO;
2011

Abstract

Social interaction and embodiment are key issues for future User Centric Media. Social networks and games are increasingly characterized by an active, physical participation of the users. The integration in mobile devices of a growing number of sensors to capture users' physical activity (e.g., accelerometers, cameras) and context information (e.g., GPS, location) supports novel systems capable of connecting audiovisual content processing and communication to users' social behavior, including joint movement and physical engagement. In this paper, a system enabling a novel paradigm for social, active experience of sound and music content is presented. An instance of such a system, named Sync‘n’Move, allowing two users to explore a multi-channel pre-recorded music piece as the result of their social interaction, and in particular of their synchronization, is introduced. This research has been developed in the framework of the EU-ICT Project SAME (www.sameproject.eu) and has been presented at the Agora Festival (IRCAM, Centre Pompidou, Paris, June 2009). On that occasion, Sync‘n’Move was evaluated by both expert and non-expert users, and results are briefly presented. Perspectives on the impact of such a novel paradigm and system on future User Centric Media are finally discussed, with a specific focus on social active experience of audiovisual content.
mobile active music listening; social signal processing; synchronization
01 Journal publication::01a Journal article
Files attached to this item
There are no files associated with this item.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/1528128
Warning! The displayed data have not been validated by the university.

Citations
  • PMC: ND
  • Scopus: 16
  • Web of Science: 9