Week 1:
  1/10: Foundations, terminology (no readings)
  1/12: Foundations, terminology (continued)
  Warm-up #0: self-introduction in HTML, due 1/15 (Sun)
Week 2:
  1/17: Crowd-powered systems
  1/19: Image understanding
  Warm-up #1: post a HIT via the AMT web site, due 1/22 (Sun)
Week 3:
  1/24: Interface design: forms
    Instead of a normal response, send a screenshot of an example from the web and analyze it in the context of the respective reading. This can be in the body of the email or in an attachment, at your option.
  1/26: Interface design: instructions
    Post a screenshot of a task interface—from AMT or any other online "task"—and analyze how well it follows the article's guidelines.
Week 4:
  1/31: Open problems
  2/2: Hierarchical task decomposition
  Warm-up #2: design and implement task UI, due 2/1 (Wed)
  Project: declare groups, due 2/3 (Fri)
Week 5:
  2/7: Hierarchical task decomposition (continued)
  2/9: Computer vision: overview
Week 6:
  2/14: Computer vision: overview
  2/16: Computer vision: image segmentation
  Warm-up #3: implement server backend + add UI tracking, due 2/14 (Tue)
  Project: proposal, due 2/15 (Wed)
Week 7:
  2/21: Computer vision: city streets
  2/23: Image understanding in real-time
Week 8:
  2/28: Security
    Instead of a normal response, for each principle mentioned in the first two articles, think of one example not mentioned in the articles. Try to find examples others might not think of. We are reading less formal sources to ensure that everyone gets the high-level principles as clearly as possible. There is some overlap between the first two articles.
  3/2: Security
  Project: introduction, due 3/5 (Sun)
Week 9:
  3/7: Security quiz (no reading)
  3/9: Real-time
  Project: v0.1, due 3/12 (Sun)
Week 10:
  3/23: Real-time: development (no reading)
  Project: email check-in, due 3/26 (Sun)
Week 11:
  3/28: Programming crowds: SQL (+ AMT APIs)
    This paper includes an explanation of AMT platform concepts you need for warm-up #4.
  3/30: Programming crowds: development (no reading)
Week 12:
  4/4: Programming crowds: iterative vs. parallel
  4/6: Machine learning: hybrid classifiers
  Warm-up #4: complete crowd-powered system, due 4/5 (Wed)
  Project: v0.5, due 4/7 (Fri)
Week 13:
  4/11: Quality
  4/13: Quality (relatively light reading)
  Project: email check-in, due 4/12 (Wed)
Week 14:
  4/18: Beyond AMT: human computation games (relatively light reading)
  4/20: Beyond AMT: macro-tasks
  Project: v1.0, due 4/21 (Fri)
  Project: evaluation data, due 4/23 (Sun)
Week 15:
  4/25: Beyond AMT: citizen science games
    This is about Fold.it, a game for predicting protein structures. This project was a big deal. We are reading a minor paper about its design because the Nature article—while far more cited—assumes some knowledge of biology and has less to offer for our context.
  4/27: Projects (no readings)
  Project: video, due 4/26 (Wed)
  Project: report, due 4/29 (Sat)
Readings
How to read papers for this course
Many people find it easier to read papers if they have a purpose in mind. As you read each paper, you might find it helpful to focus on a few questions:
1. What was the contribution type? (examples)
2. What do the authors claim as their key contributions?
3. What strategies, methods, and technologies were used?
4. What generalizable knowledge does the work contribute? What research questions does it address? How will this benefit other researchers?
5. Do you find the conclusions convincing? Are the results well-supported by data obtained with sound methods?
6. What aspects of the work do you find strongest? … and weakest?
7. What would be a natural next step for the work?
Note: These questions are included only to help guide your reading. They won't apply to all of the readings. Your responses may or may not include your answers to any of these questions.
How to write your response
A good response will clearly express a well-founded opinion about the work. This requires understanding the paper and thinking about what you liked or disliked about it. Your response should show that you read and understood the paper, without directly summarizing it. This might take some practice.
As a starting point, think through your answers to questions #5, #6, and #7 above. Next, look at the discussion so far (if any). Do you agree? … or disagree? Aim for 2-3 insights, including 1-2 points about the paper as a whole and 1-2 points about specific details (e.g., methods, analysis). Try to avoid duplicating existing comments.
Updated 1/12/2017:
Email reading responses to the instructor by 10am before class, with a subject line in the following format: reading YYYYMMDD Title [ece695cps]. For example, the subject for the first reading (due Jan 17) will be as follows: reading 20170110 Soylent: a word processor with a crowd inside [ece695cps]. Try to follow this exactly; that will help me keep the responses straight in my email box. Note that there are no colons or quotation marks other than whatever might be in the paper title itself, and "reading" is lowercase.
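If it helps, the subject-line format above can be built mechanically; here is a minimal sketch in Python (the function name is ours, chosen for illustration):

```python
from datetime import date

def reading_subject(class_date: date, title: str) -> str:
    """Build a reading-response subject line in the required
    format: reading YYYYMMDD Title [ece695cps]"""
    return f"reading {class_date.strftime('%Y%m%d')} {title} [ece695cps]"

print(reading_subject(date(2017, 1, 10),
                      "Soylent: a word processor with a crowd inside"))
# → reading 20170110 Soylent: a word processor with a crowd inside [ece695cps]
```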
There is no hard requirement on length, but 2-3 paragraphs with 100-200 words (total) is probably a good target. 2 points will be reserved for especially insightful comments; I expect that most responses will receive 1 point.
Your lowest 3 reading responses will be dropped. If you have a conference or another super-busy week, you're welcome to use your drops as skips. If you need to do that, please at least read the abstract and conclusion, and skim the rest before you come to class, since we will be discussing the papers. Also, send me a response in the format above with just a note saying you are skipping that one, so I'll know it wasn't simply lost.
You do not need to do a response for any paper of which the instructor is an author.
Company profiles
Find a company related to human computation or crowd labor, and give a 5-10 minute presentation about it in class. Email me the company's name, URL(s), and a short (2-3 sentence) summary by the night before your presentation. I will post your summary to the course web site.
Companies focused on crowdsourcing and/or human computation:
- Mechanical Turk - microtask
- CrowdFlower - microtask
- Appen - microtask
- ClickWorker - microtask
- CloudFactory - microtask
- CrowdSource - microtask
- LeadGenius - microtask, lead generation
- Lionbridge - microtask
- MicroTask - microtask
- Samasource - microtask, workforce extension
- oDesk - freelancing
- Elance - freelancing
- Freelancer - freelancing
- Scripted - freelancing, writing
- LiveOps - customer service, call center
- Needle - customer service
- Anydoor dba Conyac - translation
- Babelverse - translation of live conversation
- Gengo - translation
- Lingohub - translation of software, localization
- Lingotek - translation
- SpeakLike - translation of social media and web sites
- Transfluent - translation of social media, web sites, games, etc.
- VerbalizeIt - translation of live conversation
- Viki - fan translation of tv shows, movies, etc
- InnoCentive - ideation with bounties
- Be-Novative - ideation
- Cambrian House - ideation
- Chaordix - ideation
- eYeka - ideation
- IdeaScale - ideation
- Kluster - ideation
- Polisofia - ideation, Spanish
- Springleap - advertising
- Tongal - advertising?
- Zoopa - advertising
- Adtriboo - advertising, graphic design, video production
- 99designs - graphic design
- DesignCrowd - graphic design
- Minted - graphic design
- Crowdzu - graphic design
- Qukku - video production
- VoiceBunny - voice recording
- FanFootage - live video
- Snapwire - photography
- TapShield - public safety
- Zeef - information retrieval
- ImageBrief - photography
- Kanga - local delivery
- Mobee - retail analytics
- Roamler - retail analytics
- Locu - marketing, restaurant data
- Crowdmark - grading for schools
- RecruitLoop - recruiting
- RecruitiFi - recruiting
- Crowdcurity - security audits, software testing
- Darjeelin - travel planning
Major products driven by crowdsourcing and/or human computation:
- Facebook and Twitter - content moderation
- Google Translate - driven by Translate Communities
- Google Maps - via MapMaker and Waze (acquired for $1.3 billion)
- Google Books - via reCAPTCHA (university project acquired by Google)
- Amazon - Mechanical Turk
- Microsoft Bing - via Appen Butler Hill
- Microsoft Translator - via Lionbridge, Clickworker, and Appen Butler Hill
- Apple Maps - via Locationary and Hopstop
- US Postal Service - reading handwritten addresses
Patents held by big companies: