TY - JOUR
T1 - Personalized and diverse task composition in crowdsourcing
AU - Alsayasneh, Maha
AU - Amer-Yahia, Sihem
AU - Gaussier, Eric
AU - Leroy, Vincent
AU - Pilourdault, Julien
AU - Borromeo, Ria Mae
AU - Toyama, Motomichi
AU - Renders, Jean Michel
N1 - Funding Information:
This work was partially funded by ANR-13-CORD-0020. This paper is an extension of the DSAA 2016 paper “Task Composition in Crowdsourcing” by the same authors.
Publisher Copyright:
© 2017 IEEE.
PY - 2018/1
Y1 - 2018/1
N2 - We study task composition in crowdsourcing and the effect of personalization and diversity on performance. A central process in crowdsourcing is task assignment, the mechanism through which workers find tasks. On popular platforms such as Amazon Mechanical Turk, task assignment is facilitated by the ability to sort tasks by dimensions such as creation date or reward amount. Task composition improves task assignment by producing, for each worker, a personalized summary of tasks referred to as a Composite Task (CT). We propose different ways of producing CTs and formulate an optimization problem that finds, for a worker, the most relevant and diverse CTs. We show empirically that workers' experience is greatly improved by personalization, which aligns CTs with workers' skills and preferences. We also study and formalize various ways of diversifying tasks in each CT. Task diversity is grounded in organization studies that have shown its impact on worker motivation [33]. Our experiments show that diverse CTs contribute to improving outcome quality. More specifically, we show that while task throughput and worker retention are best with ranked lists, crowdwork quality is highest with CTs diversified by requesters, thereby confirming that workers seek to expose their "good" work to many requesters.
AB - We study task composition in crowdsourcing and the effect of personalization and diversity on performance. A central process in crowdsourcing is task assignment, the mechanism through which workers find tasks. On popular platforms such as Amazon Mechanical Turk, task assignment is facilitated by the ability to sort tasks by dimensions such as creation date or reward amount. Task composition improves task assignment by producing, for each worker, a personalized summary of tasks referred to as a Composite Task (CT). We propose different ways of producing CTs and formulate an optimization problem that finds, for a worker, the most relevant and diverse CTs. We show empirically that workers' experience is greatly improved by personalization, which aligns CTs with workers' skills and preferences. We also study and formalize various ways of diversifying tasks in each CT. Task diversity is grounded in organization studies that have shown its impact on worker motivation [33]. Our experiments show that diverse CTs contribute to improving outcome quality. More specifically, we show that while task throughput and worker retention are best with ranked lists, crowdwork quality is highest with CTs diversified by requesters, thereby confirming that workers seek to expose their "good" work to many requesters.
KW - Crowdsourcing
KW - Task assignment
KW - Task composition
KW - Task diversity
UR - http://www.scopus.com/inward/record.url?scp=85031105160&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85031105160&partnerID=8YFLogxK
U2 - 10.1109/TKDE.2017.2755660
DO - 10.1109/TKDE.2017.2755660
M3 - Article
AN - SCOPUS:85031105160
SN - 1041-4347
VL - 30
SP - 128
EP - 141
JO - IEEE Transactions on Knowledge and Data Engineering
JF - IEEE Transactions on Knowledge and Data Engineering
IS - 1
ER -