Archive for Post-Meetup Reflections

My First Crowdsourcing Project

While I was sitting at the CrowdConf business summit this afternoon (the pre-conference conference), someone commented to me that since I had worked at Servio/CloudCrowd for nearly three years, I was an ‘old timer’ in crowdsourcing. That got me thinking back to when I first experimented with crowdsourcing, or at least the comment got me thinking about it later, once I was home.

The first time I used crowdsourcing for my business objectives was for Power 9 Pro in 2008.

There are lots of examples (all of Power 9 Pro was arguably “crowdsourced”), but the one that really sticks out in my mind as a proto-crowdsourcing model is the development of the writing team.

Long before the dreaded “Panda update” from Google, I knew that unique, value-driven content was king on the internet. But how could I quickly produce the volume of content needed to be found?

Even then, SEO was largely about structured, keyword-dense content, so to compete with businesses and people who’d been posting content (and building brand clout) for years before Power 9 Pro ever showed up on the scene, I knew I needed a team of distributed writers. I certainly didn’t have the budget to pay full-time writers, and I was busy making sure the company’s products were being sold to stores.

So I turned to an open call for writers.

I focused on the community of Magic: the Gathering players (a niche), knowing my content had to be authentic because crap writing about Magic is obvious to anyone who has played for longer than ten minutes. This was a key element in my learning about community motivation first-hand.

Now, while this was a small-scale operation, it gave me enough experience and insight to build my next major product/service: crowdsourced SEO content writing. This is a big service application for Servio, and it’s pretty interesting to think about how a simple experiment could be applied to a much larger system (with some tweaks, of course).

As I said, Power 9 Pro is conceivably a fully crowdsourced project, but the real lessons came from screening writers after an open call to everyone (albeit a focused open call), learning how to guide the “crowd” of writers in a writing style that is not necessarily intuitive (SEO), particularly for a culture raised on flashy journalism with pun-filled headlines, and at the same time rewarding and motivating people to keep at it.

That experience alone set the stage for me diving head first into crowdsourcing of all sorts.

And the adventure continues! :)

Presentation at CrowdConf 2011: Building Killer Crowds to Maximize ROI

Below is the presentation I gave at CrowdConf 2011. I focused on building and maintaining crowds. My general premise is that for a crowdsourcing company, one of the top assets is the crowd. I discuss a few tips for ensuring you have a strong crowd that understands expectations and holds itself accountable to them. I covered the minimum tools needed and different quality control mechanisms, and provided a few examples of lessons we learned (at Servio).

Thoughts on CrowdConf 2011

Because I was hosting a breakout session at CrowdConf 2011, I was pretty excited going into the conference. Who would be there? What cool demos would I get to see? Who would have the next big idea [that I could use at Servio]?

Below I run through each of the companies and highlight what I found to be notable.

I conclude the post with some thoughts and speculation on the industry and the future of labor.


Coffee & Power
New company started by Philip Rosedale that feels similar to an Elance, Guru, or oDesk. The marketplace was supposedly built using crowdsourcing only. The founders also created an iPhone app soon after launching Coffee & Power, using only the “platform.” I use “platform” loosely, because Rosedale refers to his site as a platform, but it feels more like a marketplace than an extensible platform…
I like the fact that it has its own built-in currency even though there are a ton of complications related to digital currencies–the proliferation of them makes me think the headaches are worth it though…

Creating a job, posting a skill, or bidding on work appears super easy, but there’s only a light layer of organization. There is a form of credibility system, but it feels pretty lightweight (“so-and-so is trusted by x number of people”).

Ultimately, I feel like I’m looking at a nicer version of Craigslist.

DST

These guys are an old-guard outsourcing company founded back in the ’60s as a back-office record-keeping service. They’re supposedly really big now; from what I can find, they’re a multi-billion-dollar-per-year business. Not bad.

What’s interesting to me is that they’re not even approaching the problem from a crowdsourcing perspective. The way they look at it, they’ve been solving complex, distributed workflows for years–and doing it in highly regulated markets (financials, mutual funds, etc).

These guys are launching what I would call a platform for getting work done. It’s the logical step for a business already working with distributed labor channels.

The product is supposed to be flexible enough to plug into companies, such that a company can design a workflow with their software and then tap into on-demand labor when needed.

There was no real demo of this software, so I’m a little skeptical of the depth of execution: Is it all talk and lots of hand-waving about the future of labor organization or do they have something that could plug in at Coca-Cola?

From a strategy perspective, these guys don’t strike me as making a strong technology play, but I do think they have the distribution channels in place (i.e., customers paying top dollar, with sustained budgets for expanding lean-labor needs).

UFOStart

These guys set up a site for crowdsourcing startups/businesses. The founders kept saying “platform,” but it also felt more like a website than a platform.

What they showed must have been mockups or screenshots from a private section of the site, because I haven’t been able to locate a login button on the site (but there is a sign-out button that doesn’t work!).

From what I can tell, they’re looking to blend the needs of different participants in a business venture (investors, idea-peeps and expertise-providers) in hopes that people on the internet can carry out the necessary steps to have successful businesses. They essentially posit that you can run a completely distributed business and that to get the services you require, you either need to provide capital, skills or ideas and UFOStart is a site for channeling each of those needs/offers. Supposedly everyone gets compensated through shares or money or both.

I definitely think there is room for collective ownership but it’s all about the details and without a login, there are basically no details. :(

LingoTech

This was a long sales pitch, in my opinion. The speaker didn’t seem particularly prepared to talk about anything but LingoTech. It may have just been a case of thinking “buyers, buyers, buyers” when that wasn’t the audience (or maybe it was just me).

LingoTech is integrated with Oracle and Jive (and probably others I didn’t see), which I thought was good because it positions them for strong distribution channels (i.e., the localization manager can say, “well, they’re integrated with the corporate infrastructure, so we should use this platform”).

Regarding Jive, he said, “companies need translation in order to understand what their customers are saying on their own websites.”
LingoTech also provides Translation Memory/Glossary Support and Fuzzy Matching.

LingoTech uses the following machine translation (MT) engines: Google, Bing, and ProMT.

Ultimately, I saw a company really focusing on the translation industry, with the tools needed to remain competitive. It should be interesting to see if LingoTech can challenge LionBridge. (My money is against LingoTech, because nothing in that breakout made me go “oooo, ahhhh, that’s a winner,” and that’s what it will take to unseat the market gorilla.)

Humanoid

My friend Jason left Servio to become the first employee at this company, so first off, it’s cool to see something public from the work he’s been doing. :)

Humanoid is a platform for doing simple tasks. They supposedly have a UI for building projects, though I wasn’t able to get a demo. What I like is that Humanoid grew out of SpeakerText’s need for a better platform option than Amazon Mechanical Turk. Basically, what AMT was providing was total crap in terms of quality (according to the founder, Matt). To circumvent this, they built their own platform. Cool.

This is a play taken right out of ReWork.

What I don’t like is that they claim it’s a platform that “anyone can use,” but they don’t have any exposed UI for it, or at least not one that I could access. This makes me feel like it’s all talk and no walk, but I know my friend Jason is good, so there’s got to be something to see under the hood!

They also claim that, through machine learning, they hope to continually reduce the need for human labor, because the “system will know how to answer the question once it’s been trained by the crowd.” That’s a mighty claim, but one that actually has some feasibility to it, in my opinion. It all depends on the type and scope of work.

For example, there are some great synergies between their parent company SpeakerText and some of the alternative offerings they have (translation, OCR, etc). It’s likely that type of work they’re referring to and not “video creation” or something complex like that.
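To make that idea concrete, here’s a minimal sketch of how a crowd-trained system could phase out human labor on repetitive questions: route each item to the crowd until enough consistent answers accumulate, then let the “model” answer on its own. This is entirely my own illustration (Humanoid hasn’t published how their system works), and every name and threshold here is made up.

```python
# Hypothetical sketch of "the crowd trains the system": send an item to
# human workers until we have seen enough agreeing answers, then answer
# automatically. The threshold and class names are illustrative only.
from collections import Counter, defaultdict

CONFIDENCE_THRESHOLD = 3  # auto-answer once 3 crowd answers agree


class CrowdTrainedRouter:
    def __init__(self):
        # maps an item to the tally of crowd answers seen so far
        self.history = defaultdict(Counter)

    def answer(self, item, ask_crowd):
        votes = self.history[item]
        if votes:
            best, count = votes.most_common(1)[0]
            if count >= CONFIDENCE_THRESHOLD:
                return best, "machine"  # confident; skip human labor
        label = ask_crowd(item)         # fall back to the crowd
        votes[label] += 1
        return label, "crowd"
```

In this toy version the first few requests for an item go to the crowd, and once the answers agree often enough, the router stops paying for human labor on that item. A real system would use a trained classifier and a statistical confidence measure rather than a raw vote count, but the routing logic is the point.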

“Industry Champions” (Panel Discussion)
Panel consisting of Trada CEO Niel Robertson, Amazon Mechanical Turk VP Sharon Chiarella, uTest CMO Matt Johnson, and LiveOps EVP of Sales Matt Fisher.

Two notable take-aways:

1) The established, focused, skill-based crowdsourcing companies (uTest, LiveOps, and Trada) think about attrition early and solve the attrition problem before trying to tackle the enterprise market. uTest thinks they’ve got this down, and from talking with their Director of Public Relations (who reports to the CMO, whom I’ve heard speak a few times), I believe they’ve put real effort into building a sustainable crowd of testers.

2) The VP of Amazon Mechanical Turk basically said, “I don’t think there’s much merit to crowdsourcing companies claiming to ‘test’ people before they can access projects and using that as a quality control measure.” Coming from a company that has absolutely no quality control process in place, that claim smacks of denial and makes Ms. Chiarella sound like she has no idea what’s going on. Maybe I can say this because I’ve seen the results of testing. I’m such a fan that crowd-focused quality control measures were a focus of my talk!

Their Advice for Companies:

  • Crowdsourcing is about humans (so treat them that way)
  • Crowds are constantly evolving and creating massive data problems (so solve the data problems)
  • Never lose sight of the fact that you have two customers: workers and vendors
  • Iterate into Existence: Test, tweak, test, tweak, solution and then scale

Common Myths about Crowdsourcing:

  • It’s easy (it takes a lot of money, a real solution to a real problem and time to develop)
  • Crowdsourcing is a “silver bullet” for all problems
  • Crowdsourcing means bad quality

These guys had a bunch of pretty cool graphics put together on the crowdsourcing industry. I really liked how they break down the different implementation types into easy-to-digest images.

I say “they” because I have no idea how many people work there, or whether it’s just Carl Esposti doing all the work.
These guys look like they do a lot of crowdsourcing consulting work, mostly on the buy side. Esposti’s breakout session focused on designing crowdsourcing solutions for companies. We need to get a bit more cozy with him and clarify that our work platform can solve all of the complex workflows he showed (because he said, “Only some of these can be solved with crowdsourcing,” which implies that his perception of the technology is in some way skewed or biased).

Their GM/VP of Sales and Marketing, Mark Allen, claims they are “crossing the chasm” right now. Still, it seems like they’re struggling with low-end work and a lack of notable customers to carry the company across the chasm.

Reflections and Speculations

Overall, I had a great time getting to spy on my industry cohorts. I certainly made some good contacts that I look forward to fostering. As for how people are solving real problems, my impression is that crowdsourcing as a business methodology is still all over the place. People are just scratching the surface of what is the future of labor.

My current attitude is “wow, we are just lightyears ahead of everyone else.” I felt like the “crowdsourcing” industry is still not thinking about the future of labor; when a company does think about labor distribution, it sees itself as a simple marketplace for labor. A marketplace for labor feels more like something we’d find in a Renaissance-era village than in a post-internet society.

I didn’t see anyone solving the really complex workflows. Most companies seem to skirt this by creating a one-to-one, task-creator to labor-provider model.

This means companies are either making Craigslist 2.0s or creating ill-defined platform plays designed to solve the same damn simple projects we had figured out two years ago, and quickly realized there is NO money in!

What I didn’t see was a company saying, “We built a platform that any company can use to mimic its own organizational structure and tap into other on-demand workflows.” DST was the closest to this vision, and he didn’t provide a demo; he showed a picture of their existing workflows (for clients) that meld in-house and crowdsourced labor. Pretty lame, but at least they get the trajectory of where labor structures will ultimately arrive.

I can also tell that companies in this space are still struggling. LiveOps and uTest struck me as the most upbeat. That isn’t to say the other companies aren’t getting deals; I’m sure they are. I just don’t think we’re seeing a group of companies expanding quickly; i.e., there’s no runaway company starting to take the lion’s share of the current market (and thus, likely, the lion’s share of the future market).

I’m encouraged that we’re getting a strong base in content creation, but it’s clear we still have a lot of experimenting to do before we arrive at the billion-dollar business. I’m also convinced that when we do arrive there, we’ll have one of the most adaptable labor forces yet: one that resembles the organized structures of the 20th century rather than a hodgepodge of “chaotic marketplace activities,” a truly on-demand labor force ranging up and down the skill spectrum.

Crowdsourcing (Done Right) is Like an Orchestra

Last night I had the pleasure to speak at a crowdsourcing meetup hosted by Crowdflower and sponsored by ClickWorker.

From my previous visits, the meetups have always had a fun mix of people with different levels of crowdsourcing experience. To address this, I kept the talk general, touching on points I thought would be interesting to everyone.

I believe the tactic worked well because after the talk I was approached by a number of people who wanted to speak with me more about the topic, get advice on projects they’re evaluating and discuss possible partnerships. Basically, people who think crowdsourcing is as awesome as I do. :)

The event was recorded, so I will post the video when it’s available. In the meantime, here’s the gist of what I aimed to discuss:

Topic: Dispelling Disbelief and Building Trust in CrowdSourcing

I firmly believe that crowdsourcing as a business methodology is early in the adoption cycle and so we as crowdsourcing companies need to be sure we position our business approach correctly.

To help provide a bit more context, this video clip from the show Archer reflects the prevailing perception of crowdsourcing:
A Common View About Crowdsourcing


As you can see, there’s a [hilarious] guy in a suit standing before a group, prompting them for ideas. There isn’t much structure, it’s chaotic, and ultimately nothing of value comes of it.

Thankfully, businesses using crowdsourcing are starting to frame analogies that better represent the reality of crowdsourcing. One common analogy is ‘The Factory.’

The Factory is a very interesting analogy because it illustrates what crowdsourcing [done correctly] involves:

  • Assembly Lines
  • Teams
  • Constant Quality Checking
  • Specialization of Workers

In my opinion, there are some serious problems with this analogy.

Crowdsourcing already has a very bad social stigma of equating to low pay (if any) and exploitative business practices. These are the same issues that people have with factories too.

The high-level baggage that factories connote includes:

  • Sweat Shops
  • Low Pay
  • Low Skill
  • Shoddy Work
  • Outsourcing (and to some extent the decline of Western Power)

So how can we distance ourselves from these social stigmas?

The answer is to use a better analogy and one that really reflects the value and potential of crowdsourcing [done correctly].

Cue the solution: “The Orchestra”

Orchestras represent:

  • Skilled Participants
  • Focused Effort
  • High Quality

This means that the musicians are the crowd. But what does that make the crowdsourcing company?

The conductor provides:

  • Synchronization
  • Teams
  • Constant Quality Checking
  • Specialization of Workers

I’ve personally used this analogy a number of times while talking with prospective business partners, and it’s been extremely effective at achieving the same “ah, I get it now” reaction as the factory analogy while simultaneously distancing crowdsourcing from the concept of ‘low pay, low skill, shitty quality.’

When building trust with businesses, it’s fine to have the high-level discussion, but we invariably have to turn to specifics. To help with that, here are three easy-to-remember points that I believe every good crowdsourcing company (including contest and game companies) absolutely needs to have in place:

1. Mimic the real world by creating social accountability:

  • Disallow future access for bad workers. This is probably the most basic form of social accountability.
  • Reputation Systems: These are becoming more and more common in the crowdsourcing space because they’re extremely powerful for maintaining “crowd accountability.” A reputation system can be as simple as what we see on eBay (number of sales, number of positive reviews) or something in-depth like those on Elance, Guru, and Trada, which all maintain a slew of metrics to help contextualize the social standing of a crowdsourced labor provider (i.e., a worker). A reputation system can also be a blend, using statistics to gauge worker reliability. Whichever route you go, be sure to maintain social accountability.

2. Be sure that quality control is part of the design process from the start, not an afterthought

  • Thinking in terms of the orchestra, we need to be sure to play the role of the conductor and maintain control over quality. As the crowdsourcing company (or participant, if we’re simply hosting a contest or creating a game), if we forget to account for methods of controlling quality, we’ll quickly run into trouble due to the magnitude (i.e., scale) of work produced by the crowd.
  • A common quality control method is peer review. This is seen at most companies ranging from Clickworker to Servio and is extremely effective.
  • Another equally effective method of controlling quality is gating users. Simple examples include ‘pre-tests,’ like you’ll see with Crowdflower tasks on Amazon Mechanical Turk, or credential tests, as used by Servio. Other curation techniques exist, and there is debate about whether this is ‘real crowdsourcing.’ It’s more a question of ‘pure’ versus ‘practical,’ in my opinion.
  • Ultimately, you need to have some form of quality control up front and make sure it’s not an afterthought.

3. Don’t rush in blindly

  • It’s easy to think that crowdsourcing is easy: “All I have to do is put the work up and people will do it. Done.”
  • The reality of crowdsourcing is that it is not straightforward. Maintaining and building crowds takes time, money, and commitment. Do not underestimate the commitment. Essentially, make sure you have a real business model in place that will sustain the crowd. If all you get is one deal, with no prospect of a future deal using the same skill set you’re developing in your crowd, then you’ll likely be wasting time and money that could be better spent elsewhere. I’ve been down this road in the past, and it’s very painful. The benefits of crowdsourcing are scale and consistency. If you don’t have a consistent flow of work to “feed the crowd,” you’ll never achieve the level of scale and consistency that crowdsourcing can provide. You’ll basically just be faking it, which means you’ll really be following old-school business processes (using freelancers or building internal teams) while claiming to be a crowdsourcing company. So, be sure to think about whether you’re truly committed to the business model and the crowd before jumping in.
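To tie the three points together, here’s a toy sketch (my own illustration, not any company’s actual system) of how gating, a reputation score, and peer review can interact: a worker must pass a pre-test and keep their approval rate above a threshold to stay eligible for work, and a majority vote of peer reviewers both decides each submission and feeds back into the reputation score. All names and thresholds are made up.

```python
# Illustrative sketch of social accountability + quality control:
# gating (pre-test + reputation threshold) and peer review (majority
# vote) feeding a simple reputation score. Thresholds are arbitrary.
from collections import Counter

GATE_SCORE = 0.6  # minimum approval rate to keep receiving work


class Worker:
    def __init__(self, name, passed_pretest=False):
        self.name = name
        self.passed_pretest = passed_pretest
        self.approved = 0
        self.total = 0

    @property
    def reputation(self):
        # new workers start at the gate so they can get a first task
        return GATE_SCORE if self.total == 0 else self.approved / self.total


def eligible(worker):
    """Gating: must pass the pre-test and stay above the threshold."""
    return worker.passed_pretest and worker.reputation >= GATE_SCORE


def peer_review(worker, votes):
    """Majority vote by a review panel decides, and feeds reputation."""
    accepted = Counter(votes).most_common(1)[0][0] == "approve"
    worker.total += 1
    worker.approved += int(accepted)
    return accepted
```

The design choice worth noticing is that rejection has a lasting cost: a failed peer review lowers the worker’s approval rate, which can drop them below the gate, so accountability is enforced by the same mechanism that controls quality.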

Discussing Crowdsourcing: A Complete Idiot’s Guide

I just got home from a crowdsourcing meetup at CrowdFlower, where we were introduced to Aliza Sherman, author of The Complete Idiot’s Guide to Crowdsourcing and consultant at MediaEgg.

I was keen to meet Aliza since we share a similar topic, crowdsourcing, and thought it would be good to get more exposure to another weltanschauung on crowdsourcing.

Aliza gave a fairly brief introduction to her book, discussed its genesis (very insightful on how the topic and author were selected), and shared her experience gathering material for the book.

One of the standouts of the introduction was her quest for a definition of crowdsourcing.

This is an issue I relate to completely.

Her story was that someone on her Facebook page said something along the lines of, “Why should I care about crowdsourcing if you can’t even give me a definition in 10 words or less?”

Completely valid question! I felt the definition of crowdsourcing was so nebulous that my first post was titled, My Definition of Crowdsourcing.

Aliza, clever author that she is, came up with a definition, and did it in fewer than 10 words. (Were I a methodical reporter, you would have the eight-word definition now.)

The emphasis of Aliza’s view on crowdsourcing was that there is a procedure or methodological approach for processing and synthesizing the data originating from a crowd. The value proposition, of course, is “scale of intellectual potential.”

(An endless topic for sure.)

After hearing her introduction, I was still interested in understanding more about where she sees the demarcation between crowdsourcing and “not-crowdsourcing-but-something-else,” so during a Q&A session I asked,

Frequently, I see companies or media outlets saying that when they post a question to Twitter, they’re “crowdsourcing for a solution.” Do you agree that companies gathering information through social media platforms are “crowdsourcing” information?

Her definitive answer was “No.” Her head was shaking no before I even finished.

She delved deeper into her perception that there is a difference between accessing a mob (my word) and leveraging crowds, and that this difference lies in the approach. From the pseudo-discussion that followed, à la questions from the “crowd,” it was apparent this is a weak footing for an argument (a stance I share with Aliza). Basically, is it a formal process that defines “crowdsourcing,” or can an informal process (posing an issue/work problem/etc. to a crowd that provides feedback, which is then synthesized and acted upon) be considered crowdsourcing? (I would love to hear your perspective, because I will explore this more in the future.)

Anecdotal note: a lot of Aliza’s research involved interviewing CEOs at crowdsourcing companies (15 in all), and every single CEO defined crowdsourcing differently. She qualified this by noting that the definitions reflected each company’s underlying business model. Go figure: a nascent business methodology with undefined market potential ($$) being defined differently by different CEOs.

I was fortunate enough to speak with a few more attendees after Aliza spoke, including the CFO/VP of Strategy for CrowdFlower, Rich Arnold. Even while Rich and I were discussing crowdsourcing, I could tell we had ever-so-slightly different definitions and paradigmatic views on crowdsourcing and its applications, and these differences were absolutely rooted in our companies’ different approaches to business solutions.