Category Archives: Crowdsourcing

Analyzing the Veracity of Tweets during a Major Crisis

A research team at Yahoo recently completed an empirical study (PDF) on the behavior of Twitter users after the 8.8 magnitude earthquake in Chile. The study was based on 4,727,524 indexed tweets, about 20% of which were replies to other tweets. What is particularly interesting about this study is that the team also analyzed the spread of false rumors and confirmed news that were disseminated on Twitter.

The authors “manually selected some relevant cases of valid news items, which were confirmed at some point by reliable sources.” In addition, they “manually selected important cases of baseless rumors which emerged during the crisis (confirmed to be false at some point).” Their goal was to determine whether users interacted differently when faced with valid news vs false rumors.

The study shows that about 95% of tweets related to confirmed reports validated that information. In contrast, only 0.03% of tweets denied the validity of these true cases. Interestingly, the results also show that “the number of tweets that deny information becomes much larger when the information corresponds to a false rumor.” In fact, about 50% of tweets denied the validity of false reports. The table below lists the full results.

The authors conclude that “the propagation of tweets that correspond to rumors differs from tweets that spread news because rumors tend to be questioned more than news by the Twitter community. Notice that this fact suggests that the Twitter community works like a collaborative filter of information. This result suggests also a very promising research line: it could be possible to detect rumors by using aggregate analysis on tweets.”
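As a rough illustration of the aggregate analysis the authors propose, a rumor detector could start from nothing more than the deny ratio per story. The sketch below assumes tweets have already been labeled upstream as affirming, denying or questioning a story; the function name, the stance labels and the 0.25 threshold are my own illustrative assumptions, not values from the study:

```python
from collections import Counter

def flag_likely_rumors(labeled_tweets, deny_threshold=0.25):
    """Flag stories whose share of denying tweets exceeds a threshold.

    labeled_tweets: iterable of (story, stance) pairs, where stance is
    "affirm", "deny" or "question" (assumed to come from upstream
    classification; this function only aggregates).
    Returns {story: deny_ratio} for every story over the threshold.
    """
    counts = {}
    for story, stance in labeled_tweets:
        counts.setdefault(story, Counter())[stance] += 1
    flagged = {}
    for story, c in counts.items():
        total = sum(c.values())
        ratio = c["deny"] / total if total else 0.0
        if ratio >= deny_threshold:
            flagged[story] = ratio
    return flagged
```

Per the study’s figures, a confirmed news item (on the order of 0.03% denials) would fall far below any sensible threshold, while a false rumor (roughly 50% denials) would be flagged immediately.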

I think these findings are particularly important for projects like Swift River, which try to validate crowdsourced crisis information in real time. I would also be interested to see a similar study on tweets around the Haitian earthquake to explore whether this “collaborative filter” dynamic is an emergent phenomenon of this complex system or simply an artifact of something else.

Interested in learning more about “information forensics”? See this link and the articles below.

Disaster Relief 2.0: Towards a Multipolar System?

My colleague Adele Waugaman from the UN Foundation & Vodafone Foundation Technology Partnerships has kindly invited some colleagues and me to participate in the following panel at the Mashable Social Good Summit:

Disaster relief 2.0: collaborative technologies & the future of aid
In humanitarian crises, information-sharing and coordination among relief agencies is essential. But what about communications between aid groups and individuals? From Haiti to Pakistan, collaborative technologies are enabling survivors and concerned citizens alike to become important sources of information. Join innovation experts to discuss how new citizen-centered technologies are shaping the future of disaster relief.

I expect that the panel will set the stage and tone for the upcoming 2010 International Conference on Crisis Mapping (ICCM 2010) in Boston in two weeks’ time. What follows, then, is a quick recap of where we are in the field of disaster relief 2.0 and where we might be headed. The recap is based on conversations with colleagues at OCHA and the Crisis Mappers Network, particularly with Oliver Hall and Nigel Snoad.

What Happened

I think it’s fair to say that the disaster response to Haiti was a departure from the past in more ways than one. The Crisis Mappers Network, while relatively new, played an impressive role in catalyzing rapid collaboration and open information sharing, as detailed in this empirical study. In addition, the response to Haiti saw widespread and global volunteer involvement from the Haitian Diaspora and university students: more than 2,000 volunteers based in some 40 countries used their cognitive surplus to try and help those affected by the earthquake thousands of miles away. Ushahidi-Haiti and Mission4636 are both examples of volunteer-based projects.

Craig Fugate is the head of FEMA

My friend and colleague Chris Blow described the change best in his phenomenal presentation on Crisis Mapping and Interaction Design. The following slides depict the shift we are all experiencing toward a multipolar system:

In sum, the rise in informal volunteer networks is shifting the disaster response system towards a more multipolar one, not only in terms of actors but also in terms of the new technologies they employ.

Where To From Here

What does this mean for the future of disaster relief? I think this remains to be seen. The new “world order” brings new possibilities and new consequences. The shift will require both formal and informal actors to adapt and interface in different ways. If the analogy to international relations (unipolar vs multipolar world orders) is apt, then this suggests that new “institutions” are perhaps needed to manage the new constellation of actors.

The launch of the Crisis Mappers Network is a direct response to this transition. The network comprises some 900 members from both state and non-state, formal and informal actors including all the major humanitarian organizations and technology groups in the world. The newly established group Communicating with Disaster Affected Communities (CDAC) is an important member. The purpose of the Crisis Mappers Network is to catalyze information sharing, collaboration, partnerships and joint learning in this rapidly changing space—hence the Crisis Mapping Conference series, Annual Meeting of the Crisis Mappers Group, Crisis Mapping Trainings, monthly webinars, blog posts, online discussions and the dedicated Crisis Mappers Google Group.

I believe the Crisis Mappers Group provides an ideal forum to help shape the new conversations, policies and applied research necessary to improve disaster relief 2.0. We need to consider new coordination and cooperation frameworks that connect formal actors with informal networks. As a community, we also need to catalyze joint learning so that informal actors deploying new technologies can learn from more experienced actors who have established best practices in disaster response.

Our Questions

The world of disaster relief 2.0 also brings new possibilities to render humanitarian response more effective and accountable. How can formal actors and informal networks collaborate to foster and implement innovations in humanitarian technology? How do we evaluate this collaboration and the impact of individual crisis mapping initiatives? Information sharing and interoperability are two additional challenges that need to be tackled by the Crisis Mapping Community. This inevitably means that some basic data standards need to be defined—or have existing ones communicated in an accessible manner to volunteer and informal networks.

Some of the most pressing questions in this new “world order” have to do with replicability and sustainability—not to mention leadership—of new crisis mapping initiatives. The challenge of future replicability (and hence reliability) is an issue that colleagues at OCHA communicated to me just weeks after Haiti; rightly so, since Ushahidi-Haiti and Mission4636 were both self-organized, volunteer-driven efforts. I do believe there is room to professionalize some volunteer groups, hence the launch of Universities for Ushahidi (U4U) next month. It is also worth noting that Ushahidi-Chile deployed even more rapidly than Ushahidi-Haiti, even though the former was also purely volunteer-driven. The same is true of the Ushahidi-Pakistan deployment called PakReport.

Sustainability, in my opinion, is less of a challenge if these volunteer groups are well organized and linked to local communities. In the case of Ushahidi-Haiti, for example, the project was successfully transitioned to Solutions.ht, a local software company in Port-au-Prince. But the question of leadership—or governance—is one that has not been sufficiently addressed. An accessible code of conduct is needed to guide informal actors who wish to volunteer their time in aid of disaster response projects. Not all volunteers will add value. Some may actually be disruptive, if not destructive. How should state actors and informal networks manage such situations? This code of conduct should also focus on establishing standards for local ownership, data privacy and data security.

In Sum…

The level of information sharing, collaboration and volunteer involvement in the disaster response to Haiti was unprecedented. It also means the response was completely reactive, which is why the disaster relief 2.0 panel and Crisis Mapping Conference are important. They give us the opportunity to begin aligning expectations and to catalyze new, responsible partnerships between established actors and informal networks so they can be more deliberate and less reactive in future responses.

In sum, while the transition to a multipolar system may initially bring some disruption, we can all choose to collaborate, iterate and learn quickly to become a more adaptive, transparent and effective community.

Another title for this post might have been “Here Come the Crowd-Sorcerers…” I’ll be following up with Crowd-Sorcerer sequels soon (to answer many readers who have been asking) but before I do, I want to look at a prequel. In 2005, Charles Leadbeater gave what is without doubt one of my all-time favorite TED Talks. The examples he shares—mountain bikes, telescopes and computer games—provide excellent insights into the opportunities and challenges that companies like Ushahidi face. This talk foretells what may very well be the future of crisis mapping.

If you don’t have 20 minutes to watch the talk, just continue reading since I tease out the most salient points in this post. Charles gave this talk in 2005, before Jeff Howe had even coined the term “crowdsourcing”; before Brafman and Beckstrom’s book “The Starfish and the Spider: The Unstoppable Power of Leaderless Organizations”; and way before Clay Shirky wrote his book “Here Comes Everybody: The Power of Organizing without Organizations.”

Charles starts by asking: who invented the mountain bike? Not a company with a large R&D team. Nor a lone innovative genius in some garage. The mountain bike came from young users in northern California who were frustrated by heavy traditional bikes and old racing bikes. So they hacked a few bikes and voilà, the mountain bike was born. But it wasn’t until 10-15 years later that a small company thought to create a business out of these hacked bikes. Today, mountain bike sales account for some 65% of the bike market in the US alone.

And so the mountain bike was created entirely by consumers, not by the mainstream bike industry, which didn’t see the need or opportunity, nor have the incentive, to create it.

Charles argues that it is now possible to “organize without organizations: you don’t need an organization to organize, to achieve large and complex tasks like innovating new software programs” (hint hint). He notes that people (previously consumers, now producers) are increasingly becoming the source of big disruptive ideas. Some of these individuals are amateurs, so “they do what they do for the love of it but they want to do it to very high standards.” They take their leisure very seriously, they refine their skills, they invest their own time, etc. This has huge organizational implications for many sectors.

Take astronomy, for example. Some 30 years ago, only professional astronomers with huge and very expensive telescopes could see far into space. Today, individuals using “open source” telescopes and the Internet can do what only professional astronomers once could, and help discover new stars and meteors at virtually no cost. “So there is a huge competitive argument about sustaining capacity for open source and consumer-driven innovation because it is one of the greatest competitive levers against monopoly,” says Charles.

As a former journalist, Charles recounts from a personal view the significant change that has happened in his profession. He describes the thrill of seeing others in the subway reading his article. At the same time though, he notes that readers only had two places where they could contribute: letters to the editor or the op-ed page. In the case of the former, editors would select the ones they liked, cut them in half and print them three days later. As for op-eds, if readers “knew the editor, been to school with them, slept with their wife, then they could write an article for the op-ed page.”

“Shock horror now, the readers want to be writers and publishers. That’s not their role, they’re supposed to read what we write! But they don’t want to be journalists. The journalists think that the bloggers want to be journalists. They don’t want to be journalists. They just want to have a voice, they want to have a dialogue, a conversation. They want to be part of that flow of information.

So there’s going to be tremendous struggle. But also there’s going to be tremendous movement, from the closed to the open. What you’ll see is two things that are critical, and these are two challenges for the open movement. The first is, can we really survive on volunteers? If this is so critical, do we not need this funded, organized, supported in much more structured ways? Can we really organize that just on volunteers?

And finally, what you will see is the intelligent, closed organizations moving increasingly in the open direction. So it’s not going to be a contest between two camps, but in-between them you’ll find all sorts of interesting places that people will occupy. New organization models coming about, mixing closed and open in tricky ways. […] And those organizational models it turns out are incredibly powerful and the people who can understand them will be very very successful.”

Charles ends his presentation with a final example, the biggest computer games company in China with 250,000,000 subscribers. The CEO of the company employs only 500 people to service these gamers. “How can this be?” asks Charles.

“Because basically he doesn’t service them, he gives them a platform, he gives them some rules, he gives them the tools and then he kind of orchestrates the conversation, he orchestrates the action. But actually a lot of the content is created by the users themselves. And this creates a kind of stickiness between the community and the company which is really, really powerful. […] So this is about companies built on communities that provide communities with tools, resources platforms with which they can share.”

Wanted for Pakistan: A Turksourcing Plugin for Crisis Mapping

A few days after the Haiti earthquake, Ushahidi’s Brian Herbert set up a dedicated website to crowdsource the translation and geo-location of text messages from Haitian Kreyol to English. This allowed thousands of volunteers from across the globe to help out in the disaster response. We need something similar for crisis mapping Pakistan, but Mechanical Turk style.

I coined the term “turksourcing” a while back to mean crowdsourcing applied to micro-tasks. See this previous blog post for a quick introduction. A colleague from Pakistan recently launched this Crowdmap and short code to map flood related incidents. What I’d really like to see happen now is the development of a Turksourcing plugin for this and any other crisis mapping initiatives in Pakistan.

The idea would be to set up a simple website where incoming text messages could be pushed to for tagging and geo-location. Volunteers would use their email address and a password to access the platform. Once they login, they simply select an incoming SMS which they tag based on pre-set categories like those displayed on the Crowdmap for Pakistan. Volunteers would also map the location of the incident being reported. They would then press submit and move on to the next text message.

Each SMS would have to be validated by 3 or 5 volunteers before being officially mapped. This means that a given text message is only mapped if 3+ volunteers have each assigned the SMS the same tag(s) and approximate location. This is to ensure the quality of the data. If a given user consistently mis-tags/geo-locates incoming text messages, their contributions could be automatically ignored. (As opposed to barring them from the system which would prompt them to try and game it some other way).
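The validation rule described above can be sketched in a few lines. This assumes each volunteer submission is a tag plus coordinates; the function names, the equirectangular distance approximation and the 1 km tolerance are my own illustrative choices, not part of any existing plugin:

```python
import math

def _km(a, b):
    """Rough equirectangular distance in km; fine at city scale."""
    lat1, lon1 = a
    lat2, lon2 = b
    x = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
    y = math.radians(lat2 - lat1)
    return 6371 * math.hypot(x, y)

def consensus_label(annotations, min_agree=3, tolerance_km=1.0):
    """Accept a message once min_agree volunteers assign it the same tag
    and locations that cluster within tolerance_km of their centroid.

    annotations: list of (tag, (lat, lon)) tuples, one per volunteer.
    Returns (tag, (lat, lon)) for the agreed tag and mean location,
    or None if no tag has sufficient agreement yet.
    """
    by_tag = {}
    for tag, loc in annotations:
        by_tag.setdefault(tag, []).append(loc)
    for tag, locs in by_tag.items():
        if len(locs) < min_agree:
            continue
        centroid = (sum(l[0] for l in locs) / len(locs),
                    sum(l[1] for l in locs) / len(locs))
        # Only accept if every agreeing annotation sits near the centroid.
        if all(_km(l, centroid) <= tolerance_km for l in locs):
            return tag, centroid
    return None
```

A message annotated by only two volunteers, or by three volunteers with scattered locations, simply stays in the queue until enough consistent annotations accumulate.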

Volunteers could also be awarded points for each correctly tagged and geo-located SMS. A public scoreboard could display the rank of the most prolific volunteers to create further incentives to help out by rewarding turksourcing efforts. This introduces a gaming component to crisis mapping, as I blogged about here. Colleagues of mine at Revision Labs in Seattle have termed this “Playsourcing”.

The map below represents the location of volunteers who helped out with the Kreyol text messages in January. There’s no reason why we can’t rally volunteers around the world to do the same for the 20 million affected Pakistanis.

I have touched base with friends at Stanford, Crowdflower and with CrisisCommons and hope someone will be able to develop a quick turksourcing plugin for crisis mapping Pakistan and future disasters. Please do get in touch if you have bandwidth to take this on or help out. My email address is patrick at irevolution dot net.

The Crowd is Always There: A Marketplace for Crowdsourcing Crisis Response

This blog post is based on the recent presentation I gave at the Emergency Social Data Summit organized by the Red Cross this week. The title of my talk was “Collaborative Crisis Mapping” and the slides are available here.

What I want to expand on is the notion of a “marketplace for crowdsourcing” that I introduced at the Summit. The idea stems from my experience in the field of conflict early warning, the Ushahidi-Haiti deployment and my observations of the Ushahidi-DC and Ushahidi-Russia initiatives.

The crowd is always there. Paid Search & Rescue (SAR) teams and salaried emergency responders aren’t. Nor can they be on the corners of every street, whether that’s in Port-au-Prince, Haiti, Washington DC or Sukkur, Pakistan. But the real first responders, the disaster affected communities, are always there. Moreover, not all communities are equally affected by a crisis. The challenge is to link those who are most affected with those who are less affected (at least until external help arrives).

This is precisely what PIC Net and the Washington Post did when they partnered to deploy this Ushahidi platform in response to the massive snow storm that paralyzed Washington DC earlier this year. They provided a way for affected residents to map their needs and for those less affected to map the resources they could share to help others. You don’t need to be a disaster response professional to help your neighbor dig out their car.

More recently, friends at Global Voices launched the most ambitious crowdsourcing initiative in Russia in response to the massive forest fires. But they didn’t use this Ushahidi platform simply to map the fires. Instead, they customized the public map so that those who needed help could find those who wanted to help. In effect, they created an online marketplace to crowdsource crisis response. You don’t need professional certification in disaster response to drive someone’s grandparents to the next town over.

There’s a lot that disaster-affected populations can do (and already do) to help each other out in times of crisis. What may help is to combine the crowdsourcing of crisis information with what I call crowdfeeding in order to create an efficient marketplace for crowdsourcing response. By crowdfeeding, I mean taking crowdsourced information and feeding it right back to the crowd. Surely they need that information as much if not more than external, paid responders who won’t get to the scene for hours or days.

We talk about top-down and bottom-up approaches. Crowdfeeding is a “bottom-bottom” approach: horizontal, meshed communication for local rapid response. Information of the crowd, by the crowd and for the crowd. For the marketplace to work at the technical level, users should easily be able to map their needs or map the resources they have to help others. They should be able to do this via webform, SMS, Twitter, smartphone apps, phone call, etc.

But users shouldn’t have to keep looking back at the map to check whether anyone has posted offers to help in their area, or vice versa. They should get an automated email and/or text message when a potential match is found. The matching should be done by a simple algorithm, a Match.com for crowdsourcing crisis response. (Just like online dating, users should take appropriate precautions when contacting their match). On a practical level, this marketplace will work best if it draws many traders. That’s why the data should be easily shared across platforms.
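A minimal version of such a matching algorithm could greedily pair each need with the nearest unclaimed offer in the same category. Everything in this sketch (names, the greedy strategy, the 10 km cutoff) is an illustrative assumption rather than a description of any deployed system:

```python
import math

def _km(a, b):
    """Rough equirectangular distance in km; fine at city scale."""
    x = math.radians(b[1] - a[1]) * math.cos(math.radians((a[0] + b[0]) / 2))
    y = math.radians(b[0] - a[0])
    return 6371 * math.hypot(x, y)

def match_requests(needs, offers, max_km=10.0):
    """Greedily pair each need with the nearest unclaimed offer in the
    same category within max_km.

    needs, offers: lists of (id, category, (lat, lon)) tuples.
    Returns a list of (need_id, offer_id) pairs; each offer is used
    at most once.
    """
    matches = []
    available = list(offers)
    for need_id, category, loc in needs:
        best, best_d = None, max_km
        for offer in available:
            if offer[1] != category:
                continue
            d = _km(loc, offer[2])
            if d <= best_d:
                best, best_d = offer, d
        if best is not None:
            available.remove(best)  # claim the offer
            matches.append((need_id, best[0]))
    return matches
```

A production matcher would of course also handle capacity (one offer serving several needs), expiry and notification, but the core idea is just this category-plus-proximity lookup, with an email or SMS fired off for each match found.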

During the Summit, the Red Cross presented findings from this study which revealed that 75% of people now expect an almost-immediate response after posting a call for help on a social media platform during a disaster. The Red Cross and other humanitarian organizations are particularly troubled by this figure. They shouldn’t be. As the Head of FEMA noted at the summit, it is high time that crisis response organizations start viewing the public as part of the team. One way to make them part of the team is to create an open marketplace for crowdsourcing crisis response.

Is Ushahidi a Liberation Technology?

Professor Larry Diamond, one of my dissertation advisers, recently published a piece on “Liberation Technology” (PDF) in the Journal of Democracy in which he cites Ushahidi and FrontlineSMS amongst other tools. Is Ushahidi really a liberation technology?

Larry recently set up the Program on Liberation Technology at Stanford University together with colleagues Joshua Cohen and Terry Winograd to catalyze more rigorous, applied research on the role of technology in repressive environments—both in terms of liberation and repression. This explains why I’ll be joining the group as a Visiting Fellow this year. The program focuses on the core questions I’m exploring in my dissertation research and ties in technologies like Ushahidi which I’m directly working on.

What is Liberation Technology? Larry defines this technology as,

“… any form of information and communication technology (ICT) that can expand political, social, and economic freedom. In the contemporary era, it means essentially the modern, interrelated forms of digital ICT—the computer, the Internet, the mobile phone, and countless innovative applications for them, including “new social media” such as Facebook and Twitter.”

As is well known, however, technology can also be used to repress. This should not be breaking news: Liberation Technology vs. Digital Repression. My dissertation describes this competition as an arms race, a cyber game of cat-and-mouse. But the technology variable is not the most critical piece, as I argue in this recent Newsweek article:

“The technology variable doesn’t matter the most,” says Patrick Meier […] “It is the organizational structure that will matter the most. Rigid structures are unable to adapt as quickly to a rapidly changing environment as a decentralized system. Ultimately, it is a battle of organizational theory.”

As Larry writes,

“Democrats and autocrats now compete to master these technologies. Ultimately, however, not just technology but political organization and strategy and deep-rooted normative, social, and economic forces will determine who ‘wins’ the race.”

That is precisely the hypothesis I am testing in my dissertation research. As the Newsweek article put it,

“The only way to stay ahead in this cyberwar, though, is to play offense, not defense. ‘If it is a cat-and-mouse game,’ says Meier of Ushahidi, ‘by definition, the cat will adopt the mouse’s technology, and vice versa.’ His view is that activists will have to get better at adopting some of the same tactics states use. Just as authoritarian governments try to block Voice of America broadcasts, so protest movements could use newer technology to jam state propaganda on radio or TV.”

Larry rightly notes that,

“In the end, technology is merely a tool, open to both noble and nefarious purposes. Just as radio and TV could be vehicles of information pluralism and rational debate, so they could also be commandeered by totalitarian regimes for fanatical mobilization and total state control. Authoritarian states could commandeer digital ICT to a similar effect. Yet to the extent that innovative citizens can improve and better use these tools, they can bring authoritarianism down—as in several cases they have.”

A bold statement for sure. But as Larry recognizes, it is particularly challenging to disentangle political, social and technological factors. This is why more empirical research is needed in this space, which to date has been largely limited to qualitative case studies. We need to bring mixed-methods research to the study of digital activism in repressive environments. This is why I’m part of the Meta-Activism Project (MAP) and why I’m particularly excited to be collaborating on the development of a Global Digital Activism Dataset (GDADS).

Larry writes that Liberation Technology is also “Accountability Technology” in that “it provides efficient and powerful tools for transparency and monitoring.” This is where he describes the FrontlineSMS and Ushahidi platforms. In some respects, these tools have already served as liberation technologies. The question is, will innovative citizens improve these tools and use them more effectively to be able to bring down dictators? I’d love to know your thoughts.

Patrick Philippe Meier

Here Come the Crowd-Sorcerers: “No We Can’t, No We Won’t” says Muggle Master

Sigh indeed. Yawn, even.

The purpose of this series is not to make it about Paul and Patrick. That’s boring as heck. The idea behind the series was not simply to provoke and use humorous analogies but to dispel confusion about crowdsourcing and thereby provide a more informed understanding of this methodology. I fear this is getting completely lost.

Recall that it was a humanitarian colleague who came up with the label “Crowd Sorcerer”. It made me laugh so I figured we’d have a little fun by using the label Muggle in return. But that’s all it is, good fun. And of course many humanitarians see eye to eye with the Crowd Sorcerer approach, so apologies to those who felt they were wrongly placed in the Muggle category. We’ll use the Sorting Hat next time.

Henry and Erik from Ushahidi

This is not about a division between Crowd Sorcerers and Muggles. As a colleague recently noted, “the line lies somewhere else, between effective implementation of new tools and methodologies versus traditional ways of collecting crisis information.” There are plenty of humanitarians who see value in trying out new approaches. Of course, there are some who simply say “No We Can’t, No We Won’t.”

There’s no point going back and forth with Paul on every one of his issues because many of them actually have little to do with crowdsourcing and more to do with him being provoked. In this post, I’m going to stick to the debate about the ins and outs of crowdsourcing in humanitarian response.

On Verification

Muggle Master: And of course the way in which Patrick interprets those words bears little relation to what those words actually said, which is this: “Unless there are field personnel providing “ground truth” data, consumers will never have reliable information upon which to build decision support products.”

I disagree. Again, the traditional mindset here is that unless you have field personnel (your own people) in charge, then there is no way to get accurate information. This implies that the disaster affected populations are all liars, which is clearly untrue.

Verification is of course important—no one said the contrary. Why would Ushahidi be dedicating time and resources to the Swift platform if the group didn’t think that verification was important?

The reality here is that verification is not always possible regardless of which methodology is employed. So it boils down to this: is having information that is not immediately verified better than having no information at all? If your answer is yes or “it depends,” then you’re probably a Crowd Sorcerer. If your answer is, “let’s try to test some innovative ways to make rapid verification possible,” then again, you likely are a Crowd Sorcerer/ette.

Incidentally, no one I know has advocated for the use of crowdsourced data at the expense of any other information. Crowd Sorcerers (and many humanitarians) are simply suggesting that it be considered one of multiple feeds. Also, as I’ve argued before, a combined approach of bounded and unbounded crowdsourcing is the way to go.

On Impact Evaluation

The Fletcher Team has commissioned an independent evaluation of the Ushahidi deployment in Haiti to go beyond the informal testimonies of success provided by first responders. This is a four-week evaluation led by Dr. Nancy Mock, a seasoned humanitarian and M&E expert with over 30 years of experience in the humanitarian and development field.

Nathan Morrow will be working directly with Nancy. Nathan is a geographer who has worked extensively on humanitarian and development information systems. He is a member of the European Evaluation Society and like Nancy a member of the American Evaluation Association. Nathan and Nancy will be aided by a public health student who has several years of experience in community development in Haiti and is a fluent Haitian Creole speaker.

The evaluation team has already gone through much of the data and been in touch with many of the first responders as well as other partners. Their job is to do as rigorous an evaluation as possible and to do so fully transparently. Nancy plans to present her findings publicly at the 2010 Crisis Mappers Conference, where we’ve dedicated a roundtable to reviewing these findings as well as other reviews.

As for background, the ToR (available here) was drafted by graduate students specializing in M&E and reviewed closely by Professor Cheyanne Church, who teaches advanced graduate courses on M&E and is considered a leading expert on the subject. The ToR was then shared on a number of listservs, including ReliefWeb, the CrisisMappers Group and Pelican (a listserv for professional evaluators).

Nancy and Nathan are both experienced in the method known as utilization-focused evaluation (UFE), an approach chosen by The Fletcher Team to ensure that the evaluation is useful to all primary users as well as the humanitarian field. The UFE approach means that the ToR is a living document and being adapted as necessary by the evaluators to ensure that the information gathered is useful and actionable, not just interesting.

We don’t have anything to hide here, Muggles. This was a complete first in terms of live crisis mapping and mobile crowdsourcing. Unlike the humanitarian community, we weren’t prepared at all, nor trained, nor had prior experience with live crisis mapping and mobile crowdsourcing, nor with the use of crowdsourcing for near real-time translation, nor with managing hundreds of unpaid volunteers, nor did the vast majority of them have any background in disaster response, nor were most able to focus on this full time because of their under/graduate coursework and mid-term exams, nor did they have direct links or contacts with first responders prior to the deployment, nor did the many responders know they existed and/or who they were. In sum, they had all the odds stacked against them.

If the evaluation shows that the deployment and the Fletcher Team’s efforts didn’t save lives or are unlikely to have saved any lives, rescued people, had no impact, etc., none of us will dispute this. Will we give up? Of course not, Crowd Sorcerers don’t give up. We’ll learn and do better next time.

One of the main reasons for having this evaluation is not only to assess the impact of the deployment but to create a concrete list of lessons learned so that what didn’t work then is more likely to work in the future. The point here is to assess the impact just as much as it is to assess the potential added value of the approach for future deployments.

How can anyone innovate in a space riddled with a “No We Can’t, No We Won’t” mindset? Trial and error is not allowed; iterative learning and adaptation are as illegal as the dark arts. Some Muggles really need to read this post, “On Technology and Learning, or Why the Wright Brothers Did Not Create the 747.” If die-hard Muggles had had their way, they would have forced the brothers to close up shop after their very first attempt because it “failed.”

Incidentally, the majority of development, humanitarian and aid projects are never evaluated in any rigorous or meaningful way, if they are evaluated at all. But that’s ok because these are double (Muggle) standards.

On Communicating with Local Communities

Concerns over security need not always be used as an excuse for not communicating with local communities. We need to find a way not to exclude potentially important informants. A little innovation and creative thinking wouldn’t hurt. Humanitarians working with Crowd Sorcerers could use SMS to crowdsource reports, triangulate as best as possible using manual means combined with Swift River, cross-reference with official information feeds and investigate reports that appear the most clustered and critical.

That way, if you see a significant number of text messages reporting a lack of water in one area of Port-au-Prince, you at least have an indication that something more serious may be happening in that location, and you can cross-reference your other sources to check whether the issue has already been picked up. Again, it’s this clustering effect that can provide important insights into a given situation.
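The clustering idea can be sketched in a few lines. To be clear, this is a hypothetical illustration of the logic, not how Swift River or the Ushahidi platform actually works: the function name `cluster_reports`, the grid-cell size and the report threshold are all my own assumptions.

```python
from collections import defaultdict

def cluster_reports(reports, cell_size=0.05, min_cluster=10):
    """Bucket (lat, lon, keyword) reports into coarse geographic grid
    cells and return only the cell/keyword pairs with enough independent
    reports to warrant investigation."""
    clusters = defaultdict(list)
    for lat, lon, keyword in reports:
        # Snap coordinates to a grid cell roughly 5 km across
        cell = (round(lat / cell_size), round(lon / cell_size))
        clusters[(cell, keyword)].append((lat, lon))
    # Isolated reports are ignored; dense clusters are flagged
    return {key: pts for key, pts in clusters.items() if len(pts) >= min_cluster}

# e.g. a dozen "water" reports from roughly the same neighborhood
reports = [(18.54 + i * 0.001, -72.34, "water") for i in range(12)]
flagged = cluster_reports(reports)  # one cluster worth investigating
```

A cluster surfaced this way is still only a lead, of course; the point is that it tells responders where to focus their manual triangulation first.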

This would provide a mechanism to allow Haitians to report problems (or complaints for that matter) via SMS, phone, etc. Imogen Wall and other experienced humanitarians have long called for this to change. Hence the newly founded group Communicating with Disaster Affected Communities (CDAC).

Confusion to the End

Me: Despite what some Muggles may think, crowdsourcing is not actually magic. It’s just a methodology like any other, with advantages and disadvantages.

Muggle Master: That’s exactly what “Muggles” think.

Haha, well if that’s exactly what Muggles think, then this is yet more evidence of confusion in the land of Muggles. Crowdsourcing is just a methodology to collect information. There’s nothing new about non-probability sampling. Understanding the advantages and disadvantages of this methodology doesn’t require an advanced degree in statistical physics.

Muggle Master: Crowdsourcing should not form part of our disaster response plans because there are no guarantees that a crowd is going to show up. Crowdsourcing is no different from any other form of volunteer effort, and the reason why we have professional aid workers now is because, while volunteers are important, you can’t afford to make them the backbone of the operation. The technology is there and the support is welcome, but this is not the future of aid work.

This just reinforces what I’ve already observed: many in the humanitarian space are still confused about crowdsourcing. The crowd is always there. Haitians were always there. And crowdsourcing is not about volunteering. Again, crowdsourcing is just a methodology to collect information. When the UN does its rapid needs assessment, does the crowd all of a sudden vanish into thin air? Of course not.

As for volunteers, the folks at Fletcher and SIPA are joining forces to work together on deploying live crisis mapping projects in the future. They’re setting up their own protocols, operating procedures, etc. based on what they’ve learned over the past 6 months in order to replicate the “surge mapping capacity” they demonstrated in response to Haiti and Chile. (Swift River will also reduce the need for large numbers of volunteers.)

And pray tell who in the world has ever said that volunteers should be the backbone of a humanitarian operation? Please, do tell. That would be a nice magic trick.

Muggle Master: “The technology is there and the support is welcome, but this is not the future of aid work.”

The support is welcome? Great! But who said that crowdsourcing was the future of aid work? It’s just a methodology. How can one sole methodology be the future of aid work?

I’ll close with this observation. The email thread that started this Crowd-Sorcerer series ended with a second email written by the same group that wrote the first. That second email was far more constructive and conducive to building bridges. I’m excited by the prospects expressed in this second email and really appreciate the positive tone and interest they expressed in working together. I definitely look forward to working with them and learning from them as we move forward in this space and collaboration.

Patrick Philippe Meier

Here Come the Crowd-Sorcerers: Highlighting Some Misunderstandings

Welcome back, folks. Here is the third episode in our “Crowd-Sorcerers Series.” You can read the first episode, “How Technology is Disrupting the Humanitarian Space and Why It’s Easy,” right here. The second episode, which in tongue-in-cheek fashion asks “Is it Possible to Teach an Old (Humanitarian) Dog New Tech’s?”, is available here. Those episodes explain what this “Crowd-Sorcerers Series” is all about.

Oh, but just before we go to episode 3: it seems someone following this series failed to recognize the sarcasm and humorous tone in my posts and thereby missed the point entirely. I’m just using these silly analogies and metaphors to get some points across. I’m drawing a caricature, so to speak, since some of these points often get overlooked in aid/dev speak.

This is not personal at all, and I very much welcome an open conversation with all interested; that is the point of this series. The Muggles and Crowd-Sorcerers comparison is just for fun. It isn’t about classy/not-classy; it’s about getting a point or two across to more than just a narrow segment of the aid/dev industry. So again, as I wrote in my first blog post in the series, let’s please not take ourselves too seriously, ok?

Muggles: Internet-based platforms may be generating good data within a certain segment of the IT community, such as Open Street Maps, and others like Ushahidi are providing an interesting alternative to real-time news channels, but this data is not getting to where it is needed in an operational sense – the guy/gal sitting in the tent with no Internet connection trying to plan a (name your Cluster/sector/need) survey.

The Ushahidi platform allows end-users to subscribe to alerts via SMS, and that core feature is not new to the Haiti deployment; it’s been there for a good while. Not only can users get automated SMS alerts with the Haiti deployment, they can also define exactly the type of alerts they wish to receive by setting geographic parameters, tags and even keywords. Thanks to a new plugin, visual voicemail is also an option.
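As a rough illustration of how such subscriber-defined filters might work, here is a minimal sketch. The function `matches_alert`, the field names and the distance logic are my own assumptions for illustration, not the Ushahidi plugin’s actual API.

```python
import math

def matches_alert(report, sub):
    """Return True if a report falls within a subscriber's radius (km)
    and mentions at least one of their keywords (illustrative logic only)."""
    # Equirectangular distance approximation: fine for small radii
    dlat = math.radians(report["lat"] - sub["lat"])
    dlon = math.radians(report["lon"] - sub["lon"]) * math.cos(math.radians(sub["lat"]))
    dist_km = 6371 * math.hypot(dlat, dlon)
    text = report["text"].lower()
    return dist_km <= sub["radius_km"] and any(k in text for k in sub["keywords"])

# A subscriber watching a 5 km radius for "water" or "medical" reports
sub = {"lat": 18.54, "lon": -72.34, "radius_km": 5, "keywords": ["water", "medical"]}
report = {"lat": 18.55, "lon": -72.33, "text": "Urgent: no water near Champ de Mars"}
```

The design point is simply that filtering happens on the subscriber’s side of the pipeline, so end-users shape what reaches them rather than receiving every report on the map.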

In addition, a group deploying the Ushahidi platform can respond to incoming text messages directly from the same interface, allowing for near real-time, two-way communication with the disaster affected communities. See this blog post to find out how that all works.

By the way, not all guys/gals will be sitting in a tent and/or have no Internet access. Also, not all data need to go to guys/gals in tents in the first place.

On Ushahidi being an alternative to real-time news channels: the vast majority of the information mapped on the Haiti platform during the first 5 days (before the 4636 SMS short code went live) came from:

  1. Mainstream media (television, radio, online newspapers)
  2. The Haitian Diaspora
  3. Social media (Twitter, Facebook, Flickr)
  4. Humanitarian sources (emails, situation reports, skype chats, phone calls)

One member of the Diaspora had this to say: “We are the country’s middle and upper class and Haitians living abroad. We do monitor the Haiti radio, Facebook feeds and Twitter from all our contacts. Filter it and redistribute it […]. We also have a few contacts on the ground in Haiti. All the information we post has been confirmed to the best of our ability.” (Thanks to Rob Munro of Mission 4636 for sharing this.)

Muggles: The crucial link that is required, and that the [Humanitarian Information Management] community seems to be drifting farther and farther from as we are collectively distracted by shiny objects and/or the latest, greatest thing since sliced bread, is field-based NGOs equipped with proper information-sharing platform(s) that can be used even when there is no Internet connectivity or Washington-based (or London-based or Paris-based) IT, mapping and GIS skills and support available.

Mobile phones are not new and shiny. Nor is Google Maps. Integrating the two is not new either; SMS/map integration has been around for half a decade. The fact that the humanitarian community faces a challenge in innovating and keeping up with technology is certainly a problem. Free and open source platforms wouldn’t be filling a technology-information void if a gap didn’t exist in the first place.

Crowd-Sorcerers want to help (the ones I know, at least), and they realize full well that they’re new to this space and don’t have all the answers. They want to (and actually do) partner with a number of humanitarian organizations on joint projects. But are the rest of the Muggles ready to join forces with sorcerers? Or will it take a disaster like Voldemort to make that happen? (Just in case someone missed the humorous tone here, that was a joke.) Incidentally, I never mentioned the humanitarian organization (from the email thread) in my blog posts, so they are completely anonymous unless they choose otherwise.

Actually, the same Muggles who started the email exchange wrote a second email that was far, far more constructive and conducive to building bridges between Muggles and Crowd-Sorcerers. I’m excited by the prospects and really appreciate the positive tone and interest they expressed in working together. I definitely look forward to working with them and learning from them as we move forward in this space and collaboration.

Patrick Philippe Meier

Here Come the Crowd-Sorcerers: Is it Possible to Teach an Old (Humanitarian) Dog New Tech’s?

Thanks for joining us for the second episode in the new “Crowd-Sorcerers Series.” If you missed the season premiere on “How Technology is Disrupting the Humanitarian Space and Why It’s Easy,” you can read it here. For a quick synopsis of what this is all about: I’m responding to some initial “anti-crowdsourcing” remarks made by a frustrated humanitarian group in a recent email exchange. I’m referring to this group as Muggles after they christened the Crowdsourcing Community as “Crowd-Sorcerers.” The name-calling is of course all in good fun.

Here’s more from the original email exchange:

Muggles: [Our] view is that the focus [on crowdsourcing] needs to be turned around. Don’t use crowdsourcing as technology to collect data, but as a means to distribute verified, accurate and reliable information that has been collected according to recognized/accepted standards.

Well, well, well. Isn’t this interesting? Writing that “crowdsourcing is a technology” reveals how out of touch Muggles are. Crowdsourcing is a methodology, not a technology. See my blog posts on “Demystifying Crowdsourcing” and “Know What Ushahidi Is? Think Again.” Worse, to write that crowdsourcing should be used to disseminate information shows just how much confusion exists in the humanitarian space.

The importance of information dissemination has long been documented and has nothing to do with crowdsourcing! Perhaps the term they’re looking for is “crowdfeeding,” but I coined that term to highlight the need for technologies that promote information dissemination by the crowd, for the crowd.

Confession: I shudder when reading language like “according to recognized/accepted standards.” Not because standards aren’t important, but because I’m wary of the exclusive and at times elitist attitude that tends to come with this language. I get flashbacks from “Seeing Like a State.”

Perhaps an astute reader will have recognized that the title of this blog series (“Here Come the Crowd-Sorcerers”) is inspired by Clay Shirky’s book “Here Comes Everybody: The Power of Organizing Without Organizations.” I won’t try to summarize all of Clay’s many lucid observations here, but I do highly recommend the book to Muggles (along with “Seeing Like a State”).

This type of tension between regulation and innovation has been playing out in several other sectors as well, including banking (vs. mobile banking) and perhaps most notably in journalism (vs. citizen journalism). But the tensions there have matured somewhat (at least relatively). In the latter case, people are increasingly recognizing the value of citizen journalism while better understanding its limits—so much so that large media companies have themselves started to leverage crowdsourcing for content in their programming.

The journalism community’s initial reaction against bloggers is not too dissimilar to the frustration expressed by Muggles who keep hoping that crowdsourcing will just go away if they pout and stamp their feet hard enough. (Reminds me of the way that some Muggles freaked out at the invention of the printing press and later the telephone).

Here’s the bad news folks, you’ve seen nothing yet. The Crowd-Sorcerers are just getting warmed up. The level of crowdsourcing we’ve seen to date is just the tip of the wand. Haiti was a first, just a first. User-generated content is not about to vanish any time soon. In fact, it will continue growing exponentially. The vast majority of content available on the web will soon be user-generated.

The good news? Muggles can take this as an opportunity to demonstrate leadership and share their savoir-faire. What should Muggles not do? Let me share a real example from another sector: election monitoring. One of the world’s leading election monitoring groups actively discouraged local NGOs in a developing country from contributing any reports to an Ushahidi deployment run in-country by a local civil society network—let’s call them the Gryffindors.

The Gryffindors discovered this interference when they spoke with other local NGOs. They want to partner with these NGOs for the next elections, but the groups are now hesitant. So here we have a Western (i.e., external) group directly interfering by telling local NGOs they cannot participate in a local initiative to document their own elections in their own country. (Sound familiar? Recall the LogBase example from Episode 1.) Who do the elections belong to, citizens or foreigners?

Muggles have the opportunity to provide unique thought leadership here. Make Crowd-Sorcerers part of the solution, not the problem.

There’s more good news. Despite what some Muggles may think, crowdsourcing is not actually magic. It’s just a methodology like any other, with advantages and disadvantages. At the end of the day, you’re just collecting information and this information can also be triangulated and verified like any other type of information.

That’s the whole point behind Swift River, to provide a free and open source platform that can help validate large quantities of information in near real time. Is it the silver bullet that we’ve all been dreaming of? Of course not, this ain’t Hogwarts. What Swift River does, however, is make the triangulation of crowdsourced information far more efficient for Muggles than ever before. So to suggest that crowdsourced information is inherently unverifiable is rather shortsighted.
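To make the triangulation point concrete, here is a toy version of the idea: score each claim by how many distinct source types corroborate it. This is an illustrative sketch under my own assumptions, not Swift River’s actual algorithm; the function name and the sample data are invented.

```python
def triangulate(reports):
    """Score each claim by the number of *distinct* source types that
    corroborate it; more independent source types means higher confidence."""
    by_claim = {}
    for claim, source_type in reports:
        by_claim.setdefault(claim, set()).add(source_type)
    return {claim: len(sources) for claim, sources in by_claim.items()}

reports = [
    ("bridge collapsed on Route 2", "sms"),
    ("bridge collapsed on Route 2", "twitter"),
    ("bridge collapsed on Route 2", "radio"),
    ("aliens landed", "twitter"),
]
scores = triangulate(reports)  # first claim backed by 3 source types, second by 1
```

Even this crude scoring separates well-corroborated reports from one-off rumors; the real value of a platform is automating that bookkeeping at the scale of thousands of incoming reports.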

Was the technology community’s response to Haiti perfect? Not even close, hence the current M&E of the Ushahidi deployment and the blog posts I wrote earlier this year.

In fact, much of my own frustration during the emergency period stemmed from the reckless behavior of some in the technology community. In addition, some tech folks who mean well end up producing tech solutions that don’t solve anything and never get used. So as I’ve blogged about before, tech folks need to get up to speed and get their act together. Hacking away every other weekend is all fine and well as long as the tech produced is actually in line with the needs of the humanitarian and disaster affected communities.

But let’s be clear: the humanitarian community’s response to Haiti was hardly stellar (cf. John Holmes’s leaked email, etc.). No one’s perfect, of course, and that includes Crowd-Sorcerers. The volunteer community that mobilized around the Ushahidi platform had never done anything like this before (because nothing quite like this had ever happened); they had no prior training, nor much (if any) humanitarian experience to speak of. I, for one, had never launched an Ushahidi platform before. So boy did we all learn a heck of a lot.

Haiti was a complete first as far as live crisis mapping and mobile crowdsourcing go. Yet Muggles blame Crowd-Sorcerers for not getting everything right on their first try. The importance of standards is repeatedly voiced by Muggles, as noted above. Well, I call this a double standard.

Stay tuned for Episode 3 in the new series: “Here Come the Crowd-Sorcerers: Highlighting Some Misunderstandings.”

Patrick Philippe Meier

Here Come the Crowd-Sorcerers: How Technology is Disrupting the Humanitarian Space and Why It’s Easy

I’ve recently been cc’d on an email thread in which a humanitarian group has started to “air out some latent issues and frustrations” vis-à-vis the use of crowdsourcing in emergencies. I applaud them for speaking up and credit them for coining the term “Crowd-Sorcerers,” which is brilliant! The group is apparently preparing to publish a report concerning Humanitarian Information Management in Haiti. I really hope to appear in their chapter on “The Crowd-Sorcerers.”

I wonder which kind of sorcerer I am...

I too prefer candid conversations over diplomatic pillow talk. Let’s be honest: it’s not actually difficult to disrupt the humanitarian system. It’s hierarchical, overly bureaucratic, slow, often unaccountable and at times spectacularly corrupt. But I want to make sure my tone here is not misunderstood. I want to be constructive but playful and provocative at the same time, to “lighten things up” a bit. We take ourselves way too seriously, too often. That’s why I absolutely love the term Crowd-Sorcerer! Let’s use Muggles for our humanitarian friends.

I’ll first lay out some of the frustrations aired by the Muggles in their own words so I don’t misrepresent their concerns—some of which are obviously valid (but not necessarily new). I’ll be reviewing these concerns in a series of blog posts, so stay tuned for future episodes in the new Crowd-Sorcerers Series! Caution: in case it’s not yet obvious, I will be deliberately provocative and playful in this series.

Muggles: Unless there are field personnel providing “ground truth” data, consumers will never have reliable information upon which to build decision support products. Crowdsourcing may be a quick way to get a message out, but it is not good information unless there is on-the-ground verification going on.

Not sure how you’d interpret these words but what they say to me is this: unless information comes from official field personnel, i.e., Muggles, it’s absolutely useless and should be dumped in the trash. I personally find that somewhat… is colonial too provocative?

Crisis information that was crowdsourced using the distributed short code 4636 in Haiti helped save hundreds of lives according to the Marine Corps. The vast majority of this information could not be verified and yet both the Marine Corps and Coast Guard used this as one of their feeds while FEMA encouraged the crowd-sorcerers to continue mapping, calling the crisis map of Haiti the most comprehensive and up-to-date source of information available to Muggles.

There’s another extraordinary story here, and that’s the story of Mission 4636. Tens of thousands of incoming text messages from disaster affected communities in Haiti were translated from Haitian Kreyol to English in near real-time thanks to crowdsourcing. These text messages were translated by thousands of Haitian Kreyol speaking volunteers from all around the world.

Map of volunteer locations

Without this crowdsourcing, the Marine Corps, Coast Guard, FEMA and others could not have used the information streaming in from 4636 as effectively as they did. And guess what? The original platform that was used to do this translation-by-crowdsourcing was built overnight by Brian Herbert, a 20-something tech developer at Ushahidi.

Where were the Muggles then? I’m sorry to put it in these terms but if we listened to (and waited for) Muggles all the time, then perhaps several hundred more people would have needlessly lost their lives in Haiti.

A forthcoming USIP report that reviews the deployment of the Ushahidi platform found that Haitian NGOs and local civil society groups were physically barred from entering LogBase—the humanitarian community’s compound near the airport in Port-au-Prince. One Haitian NGO rep who was interviewed said he felt like a foreigner in his own country when he wasn’t allowed to enter LogBase and attend meetings where he could share vital information on urgent needs.

Now tell me, how is trashing Haitian text messages any different from physically excluding Haitians from having a voice at LogBase? Because the so-called “unwashed masses” don’t have the “right” credentials as defined by the Muggles? Either way, they are excluded from having a stake in the hierarchical system that is supposed to help them.

Incidentally, a fully independent team of three accomplished experts in M&E (monitoring and evaluation) is currently carrying out an impact assessment of the Ushahidi deployment during the emergency period. They will be in Haiti for the field work, and yes, one member of the team speaks fluent Kreyol. The PI, from Tulane University, has over 20 years of relevant experience. It would make absolutely no sense for Ushahidi to carry out this review.

Ushahidi has little to no expertise in M&E, and such a review would likely be viewed as biased if Ushahidi were authoring it. In fact, Ushahidi didn’t even commission the evaluation; The Fletcher Team did, and they should be applauded for doing so. By the way, as I have blogged here, it is misguided to assume that experts in, say, development are by definition experts at evaluating development projects. M&E is a separate area of expertise and a profession in its own right. Anyone who has taken M&E 101 will know this from the first lecture.

We’re going to a commercial break now, but stay tuned for the next episode: “Here Come the Crowd-Sorcerers: Is it Possible to Teach an Old (Humanitarian) Dog New Tech’s?”

Patrick Philippe Meier