
FATCHA: biometrics lends tools for CAPTCHAs / DE MARSICO, Maria; Marchionni, Luca; Novelli, Andrea; Oertel, Michael. - In: MULTIMEDIA TOOLS AND APPLICATIONS. - ISSN 1380-7501. - STAMPA. - 76:4(2017), pp. 5117-5140. [10.1007/s11042-016-3518-8]

FATCHA: biometrics lends tools for CAPTCHAs

DE MARSICO, Maria; Marchionni, Luca; Novelli, Andrea; Oertel, Michael
2017

Abstract

This paper presents a novel strategy to implement a CAPTCHA (Completely Automated Public Turing test to tell Computers and Humans Apart). The aim of these tests is to distinguish easily and reliably between real human users and (malicious) bots. The approach underlying FATCHA is to exploit real-time capture of human actions instead of the human ability to recognize visual or auditory items. The latter approach explicitly requires proposing a challenge that is difficult for an automatic responder but easy for a human. However, it is often the case that pursuing the first feature leads to losing the second one. Moreover, the user's task may be hindered by specific disabilities. In the FATCHA approach, the system instead asks the user to carry out some trivial gesture, e.g., rotating or moving the head. The webcam, available in almost all computers and mobile devices, captures the user's gesture, and the server (hosting the service to protect) matches it against the requested one. The service can be extended with a second module that allows users to authenticate by face recognition instead of a password. Conversely, the FATCHA gesture challenge can be used as a liveness test to avoid biometric spoofing. Multimodal interaction is thus the basis both for an advanced Human Interactive Proof (HIP) test and for robust, comfortable authentication. © 2016 Springer Science+Business Media New York
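The challenge-and-match flow described above (the server requests a trivial head gesture, then checks that the captured motion matches the request) can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes a client-side face tracker has already reduced the webcam stream to per-frame head-pose angles, and all names and thresholds are hypothetical.

```python
import random

# Hypothetical gesture vocabulary; the server asks the user to perform one.
GESTURES = ["turn_left", "turn_right", "nod_up", "nod_down"]

def issue_challenge(rng=random):
    """Server side: pick a random head gesture for the user to perform."""
    return rng.choice(GESTURES)

def classify_gesture(yaw_series, pitch_series, threshold=15.0):
    """Classify a head gesture from per-frame yaw/pitch angles (degrees).

    Assumes an upstream pose estimator produced the angle series from
    the webcam; convention here: positive yaw = turn toward user's left,
    positive pitch = tilt upward. Returns None if no clear gesture.
    """
    yaw_range = max(yaw_series) - min(yaw_series)
    pitch_range = max(pitch_series) - min(pitch_series)
    if yaw_range >= pitch_range and yaw_range >= threshold:
        # Dominant horizontal motion: direction given by the larger excursion.
        return "turn_left" if max(yaw_series) >= -min(yaw_series) else "turn_right"
    if pitch_range >= threshold:
        return "nod_up" if max(pitch_series) >= -min(pitch_series) else "nod_down"
    return None  # motion too small or ambiguous: treat as a failed challenge

def verify(challenge, yaw_series, pitch_series):
    """The CAPTCHA passes only if the performed gesture matches the request."""
    return classify_gesture(yaw_series, pitch_series) == challenge
```

Because the challenge is chosen at random per session, a bot replaying a prerecorded gesture clip would match only by chance, which is also why the same mechanism serves as the liveness test mentioned in the abstract.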
Accessibility; BOT; CAPTCHA; Denial of service; Human face detection; Human Interactive Proofs; Multimodal interaction; Usability; Media Technology; Hardware and Architecture; Computer Networks and Communications; Software
01 Journal publication::01a Journal article
Files attached to this product
DeMarsico_FATCHA_2017.pdf — Publisher's version (published layout); License: All rights reserved; 1.67 MB, Adobe PDF; access restricted to archive managers (contact the author)
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/911609
Citations
  • Scopus: 5
  • Web of Science (ISI): 3