
An Interactive Tool for Sketch-Based Annotation / Antico, Matteo; Avola, Danilo; Bottoni, Paolo; Hawash, Amjad; Kanev, Kamen; Parisi Presicce, Francesco. - In: JAPANESE JOURNAL OF APPLIED PHYSICS. - ISSN 1347-4065. - (2016), p. 011604. [10.7567/JJAPCP.4.011604]

An Interactive Tool for Sketch-Based Annotation

Avola, Danilo; Bottoni, Paolo; Hawash, Amjad; Kanev, Kamen
2016

Abstract

Annotation of Web content is becoming a widespread activity by which users appropriate and enrich information available on the Net. MADCOW (Multimedia Annotation of Digital Content Over the Web) is a system for annotating HTML pages that provides a uniform interactive paradigm for making annotations on text, images and video. MADCOW has also recently included novel features for groups and group annotations with reference to ontological domains, but interaction with the MADCOW client is currently limited to the use of common input devices (e.g., keyboard, mouse), requiring a precise selection of the portions to be annotated. In this paper we present a sketch-based interface, which can be used to annotate not only content but also aspects related to the presentation of the information. While interacting with a standard Web browser, users can draw free-hand geometrical shapes (e.g., circles, rectangles, closed paths) to select specific parts of the Web pages to be annotated. The interaction mode depends on the adopted input device. For example, users interacting with touch-screen devices (e.g., smartphones, tablets) can draw shapes with their fingers, but in principle any device able to detect sketching gestures can be supported (e.g., graphic tablets, optical pens). The paper discusses interaction aspects together with an overview of the system architecture. Finally, preliminary experimental tests and some considerations on usability are reported.
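
The selection mechanism described in the abstract — closing a free-hand path around parts of a Web page — can be illustrated with a short sketch. The TypeScript below is a minimal, hypothetical illustration and not MADCOW's actual implementation: the overlay element, the `enableLassoSelection` function and its parameters are assumptions for exposition. It collects a pointer stroke (finger, pen or mouse) on a transparent overlay and uses a standard ray-casting point-in-polygon test to decide which annotatable elements fall inside the drawn shape.

```typescript
// Free-hand ("lasso") selection sketch. All names are hypothetical,
// not MADCOW's actual API.

type Point = { x: number; y: number };

// Ray-casting point-in-polygon test: count crossings of a horizontal
// ray from p; an odd count means p lies inside the closed stroke.
function insidePolygon(p: Point, poly: Point[]): boolean {
  let inside = false;
  for (let i = 0, j = poly.length - 1; i < poly.length; j = i++) {
    const a = poly[i], b = poly[j];
    if ((a.y > p.y) !== (b.y > p.y) &&
        p.x < ((b.x - a.x) * (p.y - a.y)) / (b.y - a.y) + a.x) {
      inside = !inside;
    }
  }
  return inside;
}

// Collect a stroke from pointer events on a transparent overlay, then
// report the page elements whose centre lies inside the drawn path.
function enableLassoSelection(
  overlay: HTMLElement,
  candidates: string,              // CSS selector for annotatable elements
  onSelect: (els: Element[]) => void
): void {
  let stroke: Point[] = [];
  let drawing = false;

  overlay.addEventListener("pointerdown", (e) => {
    drawing = true;
    stroke = [{ x: e.pageX, y: e.pageY }];
  });

  overlay.addEventListener("pointermove", (e) => {
    if (drawing) stroke.push({ x: e.pageX, y: e.pageY });
  });

  overlay.addEventListener("pointerup", () => {
    drawing = false;
    if (stroke.length < 3) return; // too short to form a closed path
    const selected = Array.from(document.querySelectorAll(candidates))
      .filter((el) => {
        const r = el.getBoundingClientRect();
        // Convert viewport coordinates to page coordinates so they
        // match the pageX/pageY values recorded in the stroke.
        const centre: Point = {
          x: r.left + window.scrollX + r.width / 2,
          y: r.top + window.scrollY + r.height / 2,
        };
        return insidePolygon(centre, stroke);
      });
    onSelect(selected);
  });
}

// Example use: log the paragraphs and images circled by the user.
// enableLassoSelection(myOverlayDiv, "p, img", (els) => console.log(els));
```

Pointer events make the same handler work for fingers, styluses and mice, which matches the device-independent interaction described in the abstract; a production system would also need stroke smoothing and a policy for partially enclosed elements.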
MADCOW; Sketch-Based Interfaces; Human-Computer Interaction
01 Journal publication::01a Journal article
Files attached to this product
File: Avola_Interactive-Tool_2016.pdf
Access: archive administrators only (contact the author)
Type: Publisher's version (published version with the publisher's layout)
License: All rights reserved
Size: 728.27 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/1245531