

Motivation and Automatic Quality Assessment in Paid Crowdsourcing Online Labor Markets




Building on the growing availability and popularity of crowdsourced data, this project answers essential questions about the quality and characteristics of such data. It focuses on how to assess, monitor, and assure data quality in the context of crowdsourcing, and examines factors influencing the motivation of workers.

Despite the growing popularity of crowdsourcing, one essential problem remains untackled: anonymously paid micro-tasks are frequently corrupted by a certain proportion of workers who do not focus sufficiently on the job or who disregard the instructions and submit random data. This leads to noisy responses and inaccurate results, and thus to a considerable deterioration of data quality.

In this project, we intend to answer the following research questions:

  • How can we measure and predict the quality of responses to predefined online jobs such as surveys, multimedia recordings, or free-text responses?
  • What is the relationship between the motivation of an online worker and performance, and how can a worker’s motivation be influenced?
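One common way to estimate response quality, related to the trapping questions studied in this project, is to embed gold-standard questions with known answers into a job and filter out workers whose gold accuracy falls below a threshold. The sketch below is illustrative only; all names and the 0.75 threshold are assumptions, not the project's actual method.

```python
# Hypothetical sketch: screening crowd workers by their accuracy on
# embedded gold-standard ("trapping") questions. The function names,
# data layout, and threshold are illustrative assumptions.

def gold_accuracy(answers, gold):
    """Fraction of gold questions a worker answered correctly."""
    hits = sum(1 for q, expected in gold.items() if answers.get(q) == expected)
    return hits / len(gold)

def filter_workers(submissions, gold, threshold=0.75):
    """Keep only submissions from workers who pass the gold check."""
    return {worker: answers
            for worker, answers in submissions.items()
            if gold_accuracy(answers, gold) >= threshold}

# Example: worker w2 misses one of two gold questions (accuracy 0.5)
# and is filtered out; w1 answers both correctly and is kept.
gold = {"g1": "A", "g2": "B"}
submissions = {
    "w1": {"g1": "A", "g2": "B", "task1": "response"},
    "w2": {"g1": "C", "g2": "B", "task1": "response"},
}
kept = filter_workers(submissions, gold)
```

In practice the threshold trades off data yield against quality: a stricter cutoff discards more noise but also more honest workers who make occasional mistakes.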



Time Frame:
4/2012 - 3/2015
Team Members:
Babak Naderi
Software Campus Partners:
TU-Berlin, Deutsche Telekom AG
Funding by:
Bundesministerium für Bildung und Forschung - BMBF


Naderi, Babak and Polzehl, Tim and Beyer, André and Pilz, Tibor and Möller, Sebastian (2014). Crowdee: Mobile Crowdsourcing Micro-task Platform for Celebrating the Diversity of Languages. Proc. 15th Ann. Conf. of the Int. Speech Comm. Assoc. (Interspeech 2014), Show & Tell Session. ISCA, 1496–1497.

Naderi, Babak and Wechsung, Ina and Polzehl, Tim and Möller, Sebastian (2014). Development and Validation of Extrinsic Motivation Scale for Crowdsourcing Micro-task Platforms. Proceedings of the 2014 International ACM Workshop on Crowdsourcing for Multimedia. ACM, 31–36.

Naderi, Babak and Wechsung, Ina and Möller, Sebastian (2015). Effect of Being Observed on the Reliability of Responses in Crowdsourcing Micro-task Platforms. Quality of Multimedia Experience (QoMEX), 2015 Seventh International Workshop on. IEEE, 1–2.

Naderi, Babak and Polzehl, Tim and Wechsung, Ina and Köster, Friedemann and Möller, Sebastian (2015). Effect of Trapping Questions on the Reliability of Speech Quality Judgments in a Crowdsourcing Paradigm. 16th Ann. Conf. of the Int. Speech Comm. Assoc. (Interspeech 2015). ISCA, 2799–2803.
