October 30, 2013
Computer Science Conference Room
Speaker: Yiling Chen, Associate Professor, School of Engineering and Applied Sciences, Harvard
Crowdsourcing and human computation have shown great potential for harnessing human intelligence for computational tasks. For example, hundreds of thousands of micro-tasks are completed every day on Amazon Mechanical Turk, a popular online labor market, and volunteers contribute to scientific projects on citizen science websites such as Zooniverse. In this talk, I will discuss a series of online experiments on understanding how financial incentives affect the performance of crowd workers. We consider three settings: (1) a task requester can evaluate the quality of the work, and payment to a worker is contingent on the work quality; (2) a task requester cannot evaluate the quality of the work, and payment to a worker depends on the quantity of work produced or the amount of time spent; and (3) a task requester cannot evaluate the quality of the work and uses a theoretically grounded mechanism, the peer prediction mechanism, to reward workers. Our experimental results reveal interesting psychological biases and strategic behavior of crowd workers.
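For readers unfamiliar with the mechanism named in setting (3): peer prediction (Miller, Resnick, and Zeckhauser, 2005) pays a worker by scoring the prediction about a peer's report that is implied by the worker's own report under a common prior. The sketch below is illustrative only; the prior and signal probabilities are made-up assumptions, not values from the talk or the experiments.

```python
# Minimal sketch of a peer prediction payment rule for a binary-signal task.
# All model parameters below (prior, signal accuracies) are illustrative
# assumptions; the actual experiments may use a different instantiation.
import math

P_GOOD = 0.5          # assumed prior probability that the item's true label is 1
P_SIGNAL_GIVEN = {    # assumed P(worker's signal = 1 | true state)
    1: 0.8,           # workers usually observe the true label
    0: 0.2,
}

def posterior_peer_signal(my_report: int) -> float:
    """P(peer reports 1 | my report), under the common prior.

    Bayes' rule: update the belief about the true state from my own
    signal, then predict the peer's conditionally independent signal.
    """
    like_good = P_SIGNAL_GIVEN[1] if my_report == 1 else 1 - P_SIGNAL_GIVEN[1]
    like_bad = P_SIGNAL_GIVEN[0] if my_report == 1 else 1 - P_SIGNAL_GIVEN[0]
    post_good = like_good * P_GOOD / (like_good * P_GOOD + like_bad * (1 - P_GOOD))
    # Marginalize over the true state to predict the peer's signal.
    return post_good * P_SIGNAL_GIVEN[1] + (1 - post_good) * P_SIGNAL_GIVEN[0]

def payment(my_report: int, peer_report: int) -> float:
    """Log-scoring-rule payment for my implied prediction of the peer.

    Because the log score is strictly proper, truthful reporting is a
    Bayes-Nash equilibrium under the common-prior assumption.
    """
    q = posterior_peer_signal(my_report)
    return math.log(q) if peer_report == 1 else math.log(1 - q)
```

With these numbers, agreeing with the peer pays more than disagreeing (e.g., `payment(1, 1) > payment(1, 0)`), which is the lever the mechanism uses to reward informative reports without the requester ever checking work quality directly.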
Yiling Chen is the John L. Loeb Associate Professor of Natural Sciences and Associate Professor of Computer Science at Harvard's School of Engineering and Applied Sciences. As of fall 2013, she is a visiting researcher at Microsoft Research New York City. Before joining Harvard, she spent two years in the Microeconomic and Social Systems group of Yahoo! Research in New York City. Her general research interests lie at the intersection of computer science and economics; she is interested in designing and analyzing social computing systems according to both computational and economic objectives. She received her Ph.D. in Information Sciences and Technology from the Pennsylvania State University. Prof. Chen is a recipient of an NSF CAREER Award and was recognized by IEEE Intelligent Systems as one of AI's 10 to Watch in 2011.