A Checklist to Combat Cognitive Biases in Crowdsourcing

Tim Draws, Alisa Rieger, Oana Inel, Ujwal Gadiraju, Nava Tintarev

Research output: Chapter in Book/Report/Conference proceeding › Conference article in proceeding › Academic › peer-reviewed


Abstract

Recent research has demonstrated that cognitive biases such as the confirmation bias or the anchoring effect can negatively affect the quality of crowdsourced data. In practice, however, such biases go unnoticed unless they are specifically assessed or controlled for. Task requesters need to ensure that task workflow and design choices do not trigger workers’ cognitive biases. Moreover, to facilitate the reuse of crowdsourced data collections, practitioners can benefit from understanding whether and which cognitive biases may be associated with the data. To this end, we propose a 12-item checklist, adapted from business psychology, to combat cognitive biases in crowdsourcing. We demonstrate the practical application of this checklist in a case study on viewpoint annotations for search results. Through a retrospective analysis of relevant crowdsourcing research published at HCOMP in 2018, 2019, and 2020, we show that cognitive biases may often affect crowd workers but are typically not considered as potential sources of poor data quality. The checklist we propose is a practical tool that requesters can use to improve their task designs and appropriately describe potential limitations of collected data. It contributes to a body of efforts towards making human-labeled data more reliable and reusable.

Original language: English
Title of host publication: Proceedings of the Ninth AAAI Conference on Human Computation and Crowdsourcing
Subtitle of host publication: HCOMP-21
Editors: Ece Kamar, Kurt Luther
Publisher: AAAI Press
Pages: 48-59
Number of pages: 12
Volume: 9
ISBN (Print): 978-1-57735-872-5
DOIs
Publication status: Published - 2021
Event: 9th AAAI Conference on Human Computation and Crowdsourcing - Online
Duration: 14 Nov 2021 - 18 Nov 2021
Conference number: 9
https://www.humancomputation.com/

Conference

Conference: 9th AAAI Conference on Human Computation and Crowdsourcing
Abbreviated title: HCOMP 2021
Period: 14/11/21 - 18/11/21
Internet address: https://www.humancomputation.com/
