
1 Human Computation and Crowdsourcing: Survey and Taxonomy
Uichin Lee, Sept. 7, 2011

2 Human Computation: A Survey and Taxonomy of a Growing Field
Alexander J. Quinn, Benjamin B. Bederson, CHI 2011

3 Human Computation
– Computer scientists (in the artificial intelligence field) have long tried to emulate human-like abilities, e.g., language, visual processing, reasoning
– Alan Turing wrote in 1950: “The idea behind digital computers may be explained by saying that these machines are intended to carry out any operations which could be done by a human computer.”
– L. von Ahn’s 2005 doctoral thesis was about human computation
– The field is now thriving: business, art, R&D, HCI, databases, artificial intelligence, etc.

4 Definition of Human Computation
– The term dates back to 1938 in the philosophy and psychology literature, and to 1960 in the computer science literature (by Turing)
– Modern usage was inspired by von Ahn’s 2005 dissertation, titled “Human Computation,” and the work leading to it
  – “…a paradigm for utilizing human processing power to solve problems that computers cannot yet solve.”

5 Definition of Human Computation
– “…the idea of using human effort to perform tasks that computers cannot yet perform, usually in an enjoyable manner.” (Law, von Ahn, 2009)
– “…a new research area that studies the process of channeling the vast internet population to perform tasks or provide data towards solving difficult problems that no known efficient computer algorithms can yet solve” (Chandrasekar, et al., 2010)
– “…a technique that makes use of human abilities for computation to solve problems.” (Yuen, Chen, King, 2009)
– “…a technique to let humans solve tasks, which cannot be solved by computers.” (Schall, Truong, Dustdar, 2008)
– “A computational process that involves humans in certain steps…” (Yang, et al., 2008)
– “…systems of computers and large numbers of humans that work together in order to solve problems that could not be solved by either computers or humans alone” (Quinn, Bederson, 2009)
– “…a new area of research that studies how to build systems, such as simple casual games, to collect annotations from human users.” (Law, et al., 2009)

6 Related Ideas
– Crowdsourcing
– Social computing
– Data mining
– Collective intelligence

7 Crowdsourcing
– “Crowdsourcing is the act of taking a job traditionally performed by a designated agent (usually an employee) and outsourcing it to an undefined, generally large group of people in the form of an open call.” (Jeff Howe)
– Human computation replaces computers with humans, whereas crowdsourcing replaces traditional human workers with members of the public
  – HC: replacement of computers with humans
  – CS: replacement of insourced workers with crowdsourced workers
– Some crowdsourcing tasks can also be considered human computation tasks
  – Example: hiring crowdsourced workers for translation jobs
  – Machine translation (fast, but low quality) vs. human translation (slow, high quality)

8 Social Computing
– Wikipedia:
  – “…supporting any sort of social behavior in or through computational systems” (e.g., blogs, email, IM, SNS, wikis, social bookmarking)
  – “…supporting computations that are carried out by groups of people” (e.g., collaborative filtering, online auctions, prediction markets, reputation systems)
– Some other definitions:
  – “…applications and services that facilitate collective action and social interaction online with rich exchange of multimedia information and evolution of aggregate knowledge…” (Parameswaran, Whinston, 2007)
  – “…the interplay between persons' social behaviors and their interactions with computing technologies” (Dryer, Eisbach, Ark, 1999)

9 Data Mining
– “Data mining is defined broadly as the application of specific algorithms for extracting patterns from data.” (Fayyad, Piatetsky-Shapiro, Smyth, 1996)
– While data mining deals with human-created data, it does not involve human computation
  – Google PageRank “only” uses human-created data (links)

10 Collective Intelligence
– Overarching notion: large groups of loosely organized people can accomplish great things working together
  – Traditional studies focused on the decision-making capabilities of a large group of people
– Taxonomical “genome” of collective intelligence
  – “…groups of individuals doing things collectively that seem intelligent” (Malone, 2009)
– Collective intelligence generally encompasses human computation and social computing

11 Relationship Diagram
[Venn diagram relating Collective Intelligence, Data Mining, Crowdsourcing, Social Computing, and Human Computation]

12 Classifying Human Computation
– Motivation: what motivates people to perform HC?
– Human skill: what kinds of human skills do HC tasks require?
– Aggregation: how to combine the results of HC tasks?
– Quality control: how to control the quality of the results of HC tasks?
– Processing order: in what order do the different roles (requester, worker, computer) act?
– Task-request cardinality: requester vs. worker cardinality
(A small illustrative sketch of these dimensions follows after this list.)
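To make the six dimensions concrete, here is a minimal sketch (not from the surveyed papers) of how one system might be recorded along them. The class, its field names, and the string values are illustrative assumptions; the ESP Game classification follows the examples given on the later slides.

```python
# Minimal sketch of describing a human computation system along the six
# classification dimensions. All names here are hypothetical illustrations.
from dataclasses import dataclass
from typing import List


@dataclass
class HumanComputationSystem:
    name: str
    motivation: List[str]        # e.g., "pay", "enjoyment", "altruism", ...
    human_skill: str             # e.g., "visual recognition"
    aggregation: str             # e.g., "collection", "wisdom of crowds", ...
    quality_control: List[str]   # e.g., "output agreement", "redundancy", ...
    processing_order: str        # order of requester, worker, computer roles
    cardinality: str             # task-request cardinality


# Example: ESP Game, classified using the values from the later slides.
esp_game = HumanComputationSystem(
    name="ESP Game",
    motivation=["enjoyment"],
    human_skill="visual recognition",
    aggregation="collection (image labels)",
    quality_control=["output agreement"],
    processing_order="worker -> requester -> computer",
    cardinality="many-to-many",
)

print(esp_game)
```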

13 Motivation Examples
– Pay (financial rewards): Mechanical Turk (online labor marketplace), ChaCha (mobile Q&A), LiveOps (a distributed call center)
– Altruism (helping other people for good): helpfindjim.com (search for Jim Gray), Naver KiN, Yahoo! Answers
– Enjoyment (fun): Games With A Purpose (GWAP), http://www.gwap.com, e.g., ESP Game, Tag a Tune
– Reputation (recognition): volunteer translators at childrenslibrary.org, Naver KiN, Yahoo! Answers
– Implicit work: reCAPTCHA

14 Quality Control Examples
– Output agreement: ESP Game (a game for labeling images); an answer is accepted only if the pair of players agree on the same answer
– Input agreement: Tag-a-Tune; two players listen to inputs (music), describe what they hear, and decide whether they are listening to the same piece or different pieces
– Economic models: when money is the motivating factor, economic models can be used to elicit quality answers (e.g., a game-theoretic model of the worker's rating to reduce the incentive to cheat)
– Defensive task design: design tasks so that it is difficult to cheat (e.g., comprehension questions)
– Redundancy: each task is given to multiple people to separate the wheat from the chaff
– Statistical filtering: filter or aggregate the data in some way that removes the effects of irrelevant work
– Multilevel review: one set of workers does the work; a second set reviews the results and rates the quality (e.g., Soylent's find-fix-verify)
– Automatic check: fold.it (protein folding game); answers are easy to check with a computer but hard to find
– Reputation system: workers are motivated to provide quality answers by a reputation scoring system (e.g., Mechanical Turk, Naver KiN)
– Expert check: a trusted expert skims or cross-checks results for relevance and apparent accuracy
(A small sketch combining redundancy with statistical filtering follows after this list.)
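As an illustration of the redundancy and statistical-filtering mechanisms above, here is a minimal sketch that keeps the answer most workers agree on. The function name, the agreement threshold, and the example labels are illustrative assumptions, not taken from any of the cited systems.

```python
# Minimal sketch of redundancy plus statistical filtering: each task is sent to
# several workers and the most common answer is kept if enough workers agree.
from collections import Counter


def majority_vote(answers, min_agreement=0.6):
    """Return the most common answer if it reaches the agreement threshold,
    otherwise None (the task should then be reassigned or reviewed)."""
    if not answers:
        return None
    answer, votes = Counter(answers).most_common(1)[0]
    if votes / len(answers) >= min_agreement:
        return answer
    return None


# Example: five workers label the same image; one of them is careless.
labels = ["cat", "cat", "dog", "cat", "cat"]
print(majority_vote(labels))  # -> "cat"
```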

15 Aggregation Examples
– Collection (to build a knowledge base): artificial intelligence research; building a large DB of common-sense facts (e.g., people can't brush their hair with a table). Examples: ESP Game, reCAPTCHA, FACTory, Verbosity, etc.
– Wisdom of crowds (statistical processing of data): the average guess of ordinary people can be very close to the actual outcome; e.g., Ask500People, NewsFutures, Iowa Electronic Markets
– Search: a large number of volunteers sift through photos or videos, searching for some desired scientific phenomenon, person, or object; e.g., helpfindjim.com, Stardust@home project
– Iterative improvement: showing the previous worker's answers to elicit better answers; e.g., MonoTrans
– Active learning: classifier training; the system selects the samples that would give the greatest training benefit and sends them out for manual annotation
– Genetic algorithms (search/optimization): Free Knowledge Exchange, PicBreeder
– None (if each task is performed independently): VizWiz (a mobile app that lets a blind user take a photo and ask a question)
(A small wisdom-of-crowds sketch follows after this list.)
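As an illustration of wisdom-of-crowds aggregation, here is a minimal sketch that combines many independent numeric guesses with a robust statistic; the guess values and names are made up for illustration.

```python
# Minimal wisdom-of-crowds sketch: aggregate independent numeric guesses with
# the median so that a few wild outliers do not dominate the estimate.
import statistics


def crowd_estimate(guesses):
    """Aggregate independent guesses; the median is robust to outliers."""
    return statistics.median(guesses)


# Example: guesses for the number of jelly beans in a jar (illustration data).
guesses = [950, 1020, 1100, 980, 5000, 1005, 990]
print(crowd_estimate(guesses))  # -> 1005
```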

16 Human Skills, Processing Order, Task-Request Cardinality
Human skills and examples:
– Visual recognition: ESP Game
– Language understanding: Soylent
– Basic human communication: ChaCha
Processing order and examples:
– Computer → Worker (→ Requester): reCAPTCHA
– Worker (player) → Requester → Computer (aggregation): ESP Game (image labeling)
– Computer → Worker → Requester → Computer: Cyc infers a large number of common-sense facts → FACTory, a GWAP where workers (players) solve problems and Cyc performs the aggregation
– Requester → Worker: Mechanical Turk
Task-request cardinality and examples:
– One-to-one (one worker to one task): ChaCha
– Many-to-many (many workers to many tasks): ESP Game
– Many-to-one (many workers to one task): helpfindjim.com (Jim Gray)
– Few-to-one (few workers to one task): VizWiz

17 Crowdsourcing Systems on the World-Wide Web
Anhai Doan, Raghu Ramakrishnan, Alon Y. Halevy
Communications of the ACM, Vol. 54, No. 4, Pages 86-96, 2011

18 Crowdsourcing Systems (CS)
– Defining crowdsourcing systems is tricky
  – Explicit vs. implicit collaboration to solve something?
– A CS system “enlists a crowd to help solve a problem defined by the system owners”
– Such a system must address the following issues:
  – How to recruit and retain users?
  – What contributions can users make?
  – How to combine user contributions to solve the target problem?
  – How to evaluate users and their contributions?
(A small sketch of evaluating users follows after this list.)
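One way a system owner might address “how to evaluate users and their contributions” is a simple per-user reputation score built from contributions that can be checked (e.g., against gold-standard tasks). The sketch below is an illustrative assumption, not a mechanism described by Doan et al.; the class name and scoring rule are hypothetical.

```python
# Minimal sketch of evaluating users: track how often a user's checked
# contributions were correct and expose that fraction as a reputation score.

class UserReputation:
    def __init__(self):
        self.correct = 0
        self.total = 0

    def record(self, was_correct: bool) -> None:
        """Record the outcome of one contribution that could be checked."""
        self.total += 1
        if was_correct:
            self.correct += 1

    @property
    def score(self) -> float:
        """Fraction of checked contributions that were correct (0.5 if unknown)."""
        return self.correct / self.total if self.total else 0.5


# Example: a user with four checked contributions, three of them correct.
alice = UserReputation()
for outcome in [True, True, False, True]:
    alice.record(outcome)
print(round(alice.score, 2))  # -> 0.75
```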

19 Explicit Crowdsourcing Systems

20 Implicit Crowdsourcing Systems

21 Summary
– Definitions of human computation and crowdsourcing
– Relationship with other related ideas
– Classifying human computation and crowdsourcing systems
  – Motivation, human skill, aggregation, quality control, processing order, task-request cardinality
  – Nature of collaboration, architecture, recruitment, human skill

