The Effects of Crowd Worker Biases in Fact-Checking Tasks / Draws, T.; La Barbera, D.; Soprano, M.; Roitero, K.; Ceolin, D.; Checco, A.; Mizzaro, S. - (2022), pp. 2114-2124. (Paper presented at the 5th ACM Conference on Fairness, Accountability, and Transparency, FAccT 2022, held in Seoul, South Korea) [10.1145/3531146.3534629].

The Effects of Crowd Worker Biases in Fact-Checking Tasks

Checco A.;
2022

Abstract

Due to the increasing amount of information shared online every day, the need for sound and reliable ways of distinguishing between trustworthy and untrustworthy information is as present as ever. One technique for performing fact-checking at scale is to employ human intelligence in the form of crowd workers. Although earlier work has suggested that crowd workers can reliably identify misinformation, cognitive biases of crowd workers may reduce the quality of truthfulness judgments in this context. We performed a systematic exploratory analysis of publicly available crowdsourced data to identify a set of potential systematic biases that may occur when crowd workers perform fact-checking tasks. Following this exploratory study, we collected a novel data set of crowdsourced truthfulness judgments to validate our hypotheses. Our findings suggest that workers generally overestimate the truthfulness of statements and that different individual characteristics (e.g., their belief in science) and cognitive biases (e.g., the affect heuristic and overconfidence) can affect their annotations. Interestingly, we find that, depending on the general judgment tendencies of workers, their biases may sometimes lead to more accurate judgments.
2022
5th ACM Conference on Fairness, Accountability, and Transparency, FAccT 2022
fact-checking; crowdsourcing; information retrieval
04 Pubblicazione in atti di convegno::04b Atto di convegno in volume
Files attached to this record
There are no files associated with this record.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/1696308
Warning: the displayed data have not been validated by the university.

Citations
  • Scopus: 16