News & Insights

Basic Crowdsourcing Terminology for 2014

Like any language, crowdsourcing jargon has morphed over time. Some terms have been around for years and are very familiar, while others are relatively new and reflect crowdsourcing’s ongoing evolution. Here are some must-know terms for talking about crowdsourcing in 2014.

Crowdsourcing: Crowdsourcing uses online marketplaces, most notably Amazon Mechanical Turk, to outsource business tasks that can only be completed in a timely manner by a large group of trained workers. It’s the 21st-century version of having a network of freelancers: faster, more effective, and more cost-efficient.

Turker: A term for a crowd worker, originally referring to the 500,000+ workers signed up on Amazon Mechanical Turk (AMT). The term is now often applied to any crowd worker, whether or not the platform is AMT.

HIT: A “human intelligence task” that cannot be performed by a machine. Turkers work on HITs, and each HIT can have multiple sub-tasks associated with it.

Gold: For “gold” questions, the crowdsourcing manager already knows the correct response; these are also called “known” answers. Gold questions are incorporated into a campaign without the worker’s knowledge and are used to measure the worker’s accuracy, which is critical for ensuring quality work in any campaign.
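The accuracy check described above can be sketched in a few lines of Python. This is a minimal illustration only: the question IDs, answer format, and function name are hypothetical assumptions, not any platform’s actual API.

```python
# Hypothetical sketch of gold-question scoring: a handful of questions with
# known answers are hidden among a worker's tasks, and the worker's responses
# to just those questions are graded. All names and data here are illustrative.

def worker_accuracy(responses, gold_answers):
    """Fraction of gold questions the worker answered correctly.

    responses    -- dict mapping question_id -> the worker's answer
    gold_answers -- dict mapping question_id -> the known-correct answer
    """
    graded = [responses.get(qid) == answer for qid, answer in gold_answers.items()]
    return sum(graded) / len(graded) if graded else 0.0

# Three gold questions hidden among five tasks; the worker misses one.
gold = {"q1": "yes", "q7": "no", "q12": "yes"}
answers = {"q1": "yes", "q2": "no", "q7": "no", "q12": "no", "q15": "yes"}
accuracy = worker_accuracy(answers, gold)  # 2 of 3 gold questions correct
```

Because the worker cannot tell gold questions apart from ordinary ones, this score estimates their accuracy on the whole campaign.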

Microwork: Microwork involves the research of very simple data points such as a company name or phone number. These tasks can be performed very quickly at low cost.

Macrowork: Macrowork is far more involved and requires highly qualified workers trained for special tasks. It may include more complex assignments such as writing short biographies, reviewing a product, or geocoding (discerning the latitude and longitude of a particular operation), at a much higher pay rate for workers considered “Masters.”

Master: A master is a worker who has accurately completed a very high number of HITs. The two types are qualification masters and categorization masters.

Qualification Master: An expert at assessing whether or not something adheres to a strict set of criteria. If a datapoint, image, or location meets particular standards, then a Master will qualify or disqualify it. For example: Is a certain photograph acceptable to be posted on a specific website?

Categorization Master: An expert at assessing and categorizing particular tasks into specific lists. For example: Is this restaurant a fast food drive-thru, a food trailer, or fine dining?

Looping: Looping is the process of asking the same question multiple times to forge a consensus answer (for example, two out of three or three out of five). Looping can be dangerous, however, because a consensus can be reached based on incorrect responses. Ill-intentioned workers can also collude on looping projects, introducing false data into the campaign.
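The two-out-of-three scheme described above amounts to a simple majority vote. The sketch below illustrates the idea; the function name and `threshold` parameter are assumptions for illustration, not any platform’s API.

```python
from collections import Counter

# Illustrative sketch of "looping": ask the same question of several workers
# and accept an answer only when enough of them agree.

def consensus(answers, threshold):
    """Return the most common answer if at least `threshold` workers gave it,
    otherwise None (no consensus; the question is re-asked or escalated)."""
    if not answers:
        return None
    answer, count = Counter(answers).most_common(1)[0]
    return answer if count >= threshold else None

# "Two out of three": three workers categorize the same restaurant.
consensus(["drive-thru", "drive-thru", "food trailer"], threshold=2)  # "drive-thru"
consensus(["drive-thru", "food trailer", "fine dining"], threshold=2)  # None
```

Note that the caveat in the text applies directly: if two of three workers give the same wrong answer, the vote still passes, which is one reason looping is often paired with gold questions.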

Private crowd: Private crowds are a relatively new phenomenon where groups of qualified workers work directly for an outsourcing company essentially as freelancers, bypassing the costs and fees associated with the largest crowdsourcing platforms. More and more BPOs are moving in this direction, which presents a direct challenge to crowdsource vendors in the marketplace.

“Wisdom of the crowd”: Originally popularized by James Surowiecki’s 2004 book, this term is gaining a higher profile, particularly in financial markets. A March 2014 Wall Street Journal article (http://blogs.wsj.com/venturecapital/2014/03/19/study-crowdsourced-stock-opinions-beat-analysts-news/) detailed how crowdsourced stock analysis consistently beat Wall Street analysts and financial news articles. The beauty of crowd wisdom is that a collection of varying opinions is often far more valuable than one person’s expertise.

Co-creation: This is a new term describing a blend of crowd wisdom and macrotasking. It’s essentially collective problem-solving. Rather than relying on one supposed expert, co-creation brings together a group of people, one or more of whom has specific expertise on the particular issue or problem to be solved. These types of campaigns can be set up so that workers can choose to work on the part of the project on which they can best assist.
