Crowd Computing
Code: 22293
Department: DCET
ECTS: 6
Scientific area: Computer Engineering
Total working hours: 156
Total contact time: 30

Crowdsourcing and human computation are emerging themes that combine computer science and economics to understand how people can be engaged to solve complex tasks that are still beyond the capabilities of algorithms and artificial intelligence. In this curricular unit, students will acquire crowd-programming skills.

Upon completing this course unit, the student is expected to be able to:
  1. Develop applications that use crowdsourcing platforms such as Amazon Mechanical Turk and oDesk (a sketch follows this list);
  2. Apply usability techniques and principles when adapting tasks to the crowd in order to obtain high-quality responses;
  3. Use statistical methods to improve the quality of the work received;
  4. Create systems that interact with the crowd's work in real time;
  5. Conduct experiments to better understand the differences between different sources of crowd work.
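
As a concrete illustration of learning outcome 1, the sketch below publishes a single labelling task (a HIT) on the Amazon Mechanical Turk sandbox through the boto3 SDK. It is a minimal example under invented assumptions: the external task URL is a placeholder, and the reward, timing, and assignment values are arbitrary choices rather than course requirements.

    # Minimal sketch (illustrative only): publish one HIT on the Mechanical Turk
    # sandbox with boto3. The external task URL, reward and timing values are
    # invented placeholders; a real application would host the task form itself
    # and collect the submitted answers afterwards.
    import boto3

    mturk = boto3.client(
        "mturk",
        region_name="us-east-1",
        endpoint_url="https://mturk-requester-sandbox.us-east-1.amazonaws.com",
    )

    # ExternalQuestion: the worker sees a page served from our own (placeholder) URL.
    question_xml = """
    <ExternalQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2006-07-14/ExternalQuestion.xsd">
      <ExternalURL>https://example.org/sentiment-task</ExternalURL>
      <FrameHeight>450</FrameHeight>
    </ExternalQuestion>
    """

    response = mturk.create_hit(
        Title="Label the sentiment of a short tweet",
        Description="Read one tweet and choose positive or negative.",
        Keywords="sentiment, labelling, quick",
        Reward="0.05",                    # USD per assignment
        MaxAssignments=5,                 # five independent workers enable majority voting
        LifetimeInSeconds=24 * 3600,      # the HIT stays available for one day
        AssignmentDurationInSeconds=300,  # each worker gets five minutes
        Question=question_xml,
    )
    print("Created HIT:", response["HIT"]["HITId"])

The same client can later retrieve workers' submissions with list_assignments_for_hit and approve or reject each assignment.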

  1. Introduction to crowdsourcing: concepts of crowdsourcing, human computation, and collective intelligence.
  2. Crowd workers: tool design; new-generation interfaces; ethical principles of crowdsourcing.
  3. Crowdsourcing platforms: Amazon Mechanical Turk; CrowdFlower; oDesk.
  4. Programming concepts for human computation.
  5. Iterative and parallel processing.
  6. Taxonomy for crowdsourcing and human computation: motivation; quality control; aggregation; human capabilities; and flow and process control.
  7. Workflows.
  8. Quality control: agreement-based methods; gold standards; economic incentives; reputation systems (an aggregation sketch follows this list).
  9. Specialized crowds.
  10. Real-time crowdsourcing.
  11. Machines and the crowds.
  12. Case studies.
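
As a concrete illustration of the quality-control topic (item 8), the sketch below combines gold-standard questions with agreement-based aggregation by majority vote. The worker answers and the 0.5 accuracy threshold are invented for illustration; in practice the labels would come from the results exported by a crowdsourcing platform.

    # Minimal sketch (illustrative only): filter workers on gold-standard
    # questions, then aggregate the remaining labels per task by majority vote.
    from collections import Counter, defaultdict

    # worker_id -> {task_id: label}; tasks named "gold*" have known answers.
    answers = {
        "w1": {"gold1": "cat", "t1": "dog", "t2": "cat"},
        "w2": {"gold1": "cat", "t1": "dog", "t2": "cat"},
        "w3": {"gold1": "dog", "t1": "cat", "t2": "dog"},  # fails the gold question
    }
    gold = {"gold1": "cat"}

    def worker_accuracy(worker_labels, gold_answers):
        """Fraction of gold-standard questions this worker answered correctly."""
        graded = [worker_labels[t] == a for t, a in gold_answers.items() if t in worker_labels]
        return sum(graded) / len(graded) if graded else 0.0

    # 1) Keep only workers whose accuracy on gold questions reaches the threshold.
    trusted = {w: labels for w, labels in answers.items()
               if worker_accuracy(labels, gold) >= 0.5}

    # 2) Majority vote over the trusted workers' labels for each non-gold task.
    votes = defaultdict(Counter)
    for labels in trusted.values():
        for task, label in labels.items():
            if task not in gold:
                votes[task][label] += 1

    consensus = {task: counter.most_common(1)[0][0] for task, counter in votes.items()}
    print(consensus)  # {'t1': 'dog', 't2': 'cat'} once the unreliable worker is dropped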

Howe, J. (2006). The rise of crowdsourcing. Wired Magazine, 14(6), 1-4.

Quinn, A. J., & Bederson, B. B. (2011). Human computation: A survey and taxonomy of a growing field. Proceedings of CHI 2011.

Law, E., & von Ahn, L. (2011). Human computation. Synthesis Lectures on Artificial Intelligence and Machine Learning, 5(3), 1-121.

Marcus, A., & Parameswaran, A. (2015). Crowdsourced data management: Industry and academic perspectives. Foundations and Trends in Databases, 6(1-2), 1-161.

Kittur, A., et al. (2011). CrowdForge: Crowdsourcing complex work. Proceedings of UIST 2011.

Geiger, D., et al. (2011). Managing the crowd: Towards a taxonomy of crowdsourcing processes. Proceedings of AMCIS 2011.

Bernstein, M. S., et al. (2011). Crowds in two seconds: Enabling realtime crowd-powered interfaces. Proceedings of UIST 2011.

Sankar, S. The rise of human-computer cooperation. TED Talk video (12 min).

Kittur, A., et al. (2013). The future of crowd work. Proceedings of CSCW 2013.

Woolley, A. W., et al. (2010). Evidence for a collective intelligence factor in the performance of human groups. Science, 330(6004).

Assessment is individual and combines two components: continuous assessment (60%) and a final evaluation (40%). Further details are provided in the Learning Agreement of the course unit.
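As an illustration of the weighting only, and assuming grades on a 0-20 scale, a continuous assessment grade of 15 combined with a final evaluation grade of 12 would yield 0.6 × 15 + 0.4 × 12 = 13.8.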