Crowdsourcing Honesty?

I set an all-time personal record this past week: my MacBook was dormant for five consecutive days. I dedicate this triumph to the delightful friends with whom I spent New Year’s. Indeed, I had the pleasure of celebrating with friends from Digital Democracy, The Fletcher School, and The Global Justice Center on a Caribbean island for some much-needed time off.

Ariely

We all brought some good reading along, and I was finally able to enjoy a number of books on my list. One of these, Dan Ariely’s “Predictably Irrational,” was recommended to me by Erik Hersman, and I’m really glad it was. MIT Professor Ariely specializes in behavioral economics, and his book gently discredits mainstream economics: far from being rational agents, we are remarkably irrational in our decision-making, and predictably so.

Ariely draws on a number of social experiments to explicate his thesis.

For social scientists, experiments are like microscopes or strobe lights. They help us slow human behavior to a frame-by-frame narration of events, isolate individual forces, and examine those forces carefully and in more detail. They let us test directly and unambiguously what makes us tick.

In a series of fascinating experiments, Ariely seeks to understand what factors influence our decisions to be honest, especially when we can get away with dishonesty. In one experiment, participants complete a very simple math exercise. When done, the first set of participants (the control group) is asked to hand in their answers for independent grading, but the second set is subsequently given the answers and asked to report their own scores. At no point does the latter group hand in its answers; hence the temptation to cheat.

In a variant of this experiment, some students are asked to list the names of 10 books they read in high school, while others are asked to write down as many of the Ten Commandments as they can recall, prior to the math exercise. Ariely wanted to know whether this would have any effect on the honesty of those participants reporting their scores. The statistically significant results surprised even him: “The students who had been asked to recall the Ten Commandments had not cheated at all.”

In fact, they averaged the same score as the control group, which could not cheat. In contrast, participants who were asked to list their 10 high-school books and then self-report their scores did cheat: they claimed scores 33% higher than the control group’s.

What especially impressed me about the experiment […] was that the students who could remember only one or two commandments were as affected by them as the students who remembered nearly all ten. This indicated that it was not the Commandments themselves that encouraged honesty, but the mere contemplation of a moral benchmark of some kind.

Ariely carried out a follow-up experiment in which he asked some of his MIT students to sign an honor code instead of listing the Commandments. The results were identical. What’s more, “the effect of signing a statement about an honor code is particularly amazing when we take into account that MIT doesn’t even have an honor code.”

In short, we are far more likely to be honest when reminded of morality, especially when temptation strikes. Ariely thus concludes that the act of taking an oath can make all the difference.

I’m intrigued by this finding and its potential application to crowdsourcing crisis information, e.g., Ushahidi‘s work in the DRC. Could some version of an honor code be introduced into the self-reporting process? Could the Ushahidi team create a control group to determine the impact on data quality? Even if impact were difficult to establish, would introducing an honor code still make sense given Ariely’s findings on basic behavioral psychology?
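To give a rough sense of how such a control-group comparison might be analyzed, here is a minimal sketch of a one-sided permutation test in plain Python. The data are entirely hypothetical (made-up self-reported scores for an imagined "no honor code" group and an "honor code" group); the function and variable names are mine, not anything from Ushahidi or Ariely's studies.

```python
import random

def permutation_test(control, treatment, n_permutations=10_000, seed=42):
    """Estimate how often a difference in group means at least as large
    as the observed one (control mean minus treatment mean) arises by
    chance when group labels are shuffled (one-sided p-value)."""
    rng = random.Random(seed)
    observed = sum(control) / len(control) - sum(treatment) / len(treatment)
    pooled = list(control) + list(treatment)
    count = 0
    for _ in range(n_permutations):
        rng.shuffle(pooled)
        perm_control = pooled[:len(control)]
        perm_treatment = pooled[len(control):]
        diff = (sum(perm_control) / len(perm_control)
                - sum(perm_treatment) / len(perm_treatment))
        if diff >= observed:
            count += 1
    return count / n_permutations

# Hypothetical self-reported scores on a 0-10 exercise: the honor-code
# group reports lower (presumably less inflated) scores than the group
# that signed nothing.
no_code    = [8, 9, 7, 9, 8, 10, 9, 8, 9, 10]  # no honor code
honor_code = [6, 7, 6, 8, 7, 7, 6, 8, 7, 6]    # signed honor code

p = permutation_test(no_code, honor_code)
print(f"one-sided p-value: {p:.4f}")
```

A permutation test like this makes no distributional assumptions, which suits small pilot samples; with real field reports one would of course also need a credible measure of "data quality" to compare in the first place.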

Patrick Philippe Meier

4 responses to “Crowdsourcing Honesty?”

  1. Is it the code, per se, that made people honest, or was it a sense of grander purpose? I agree to ToS on websites, but don’t abide by them. The 10 Commandments or a formal honor code would seem to convey more importance to the subject of the experiment, so perhaps they were more honest?

    If so, then Ushahidi reporters would probably be fairly honest, regardless of an honor code. They know the importance of what they are doing.

  2. Hi Patrick,

    Great post as usual. I’m running a series of posts over at the Humanitarian Futures Programme blog about honesty and political motivation in crowdsourced news and crisis response.

    I wonder what you and your readers might make of it, particularly the potential for manipulation and partisan use of such tools? See the examples about the IDF’s use of Twitter or the Chinese Government’s hiring of pro-Chinese bloggers to influence public opinion.

    The post is here, would love your comments:

    http://humanitarianfutures.wordpress.com/2009/01/11/ushahidi-crowd-sourcing-crisis-information/

    It’s a wild and woolly world out there online!

  3. Pingback: Ushahidi Comes to India for the Elections (Updated) « iRevolution

  4. Pingback: The Best of iRevolution: Four Years of Blogging | iRevolution
