A Research Framework for Next Generation Humanitarian Technology and Innovation

Humanitarian donors and organizations are increasingly championing innovation and the use of new technologies for humanitarian response. DfID, for example, is committed to using “innovative techniques and technologies more routinely in humanitarian response” (2011). In a more recent strategy paper, DfID confirmed that it would “continue to invest in new technologies” (2012). ALNAP’s important report on “The State of the Humanitarian System” documents the shift towards greater innovation, “with new funds and mechanisms designed to study and support innovation in humanitarian programming” (2012). A forthcoming landmark study by OCHA makes the strongest case yet for the use and early adoption of new technologies for humanitarian response (2013).


These strategic policy documents are game-changers and pivotal to ushering in the next wave of humanitarian technology and innovation. That said, the reports are limited by the very fact that the authors are humanitarian professionals and thus not necessarily familiar with the field of advanced computing. The purpose of this post is therefore to set out a more detailed research framework for next generation humanitarian technology and innovation—one with a strong focus on information systems for crisis response and management.

In 2010, I wrote this piece on “The Humanitarian-Technology Divide and What To Do About It.” This divide became increasingly clear to me when I co-founded and co-directed the Harvard Humanitarian Initiative’s (HHI) Program on Crisis Mapping & Early Warning (2007-2009). So I co-founded the annual International CrisisMappers Conference series in 2009 and have continued to co-organize this unique, cross-disciplinary forum on humanitarian technology. The CrisisMappers Network also plays an important role in bridging the humanitarian and technology divide. My decision to join Ushahidi as Director of Crisis Mapping (2009-2012) was a strategic move to continue bridging the divide—and to do so from the technology side this time.

The same is true of my move to the Qatar Computing Research Institute (QCRI) at the Qatar Foundation. My experience at Ushahidi made me realize that serious expertise in Data Science is required to tackle the major challenges appearing on the horizon of humanitarian technology. Indeed, the key words missing from the DfID, ALNAP and OCHA innovation reports include: Data Science, Big Data Analytics, Artificial Intelligence, Machine Learning, Machine Translation and Human Computing. This current divide between the humanitarian and data science space needs to be bridged, which is precisely why I joined the Qatar Computing Research Institute as Director of Innovation: to develop and prototype the next generation of humanitarian technologies by working directly with experts in Data Science and Advanced Computing.


My efforts to bridge these communities also explain why I am co-organizing this year’s Workshop on “Social Web for Disaster Management” at the 2013 World Wide Web Conference (WWW13). The WWW conference series is one of the most prestigious in the field of Advanced Computing. I have found that experts in this field are very interested and highly motivated to work on humanitarian technology challenges and crisis computing problems. As one of them recently told me: “We simply don’t know what projects or questions to prioritize or work on. We want questions, preferably hard questions, please!”

Yet the humanitarian innovation and technology reports cited above overlook the field of advanced computing, and their policy recommendations vis-a-vis future information systems for crisis response and management are vague at best. One of the major challenges the humanitarian sector faces is the rise of Big (Crisis) Data. I have already discussed this here, here and here, for example. The humanitarian community is woefully unprepared to deal with this tidal wave of user-generated crisis information. There are already more mobile phone subscriptions than people in 100+ countries. And fully 50% of the world’s population in developing countries will be using the Internet within the next 20 months; the current figure is 24%. Meanwhile, close to 250 million people were affected by disasters in 2010 alone. Since then, the number of new mobile phone subscriptions has increased by well over one billion, which means that disaster-affected communities today are increasingly likely to be digital communities as well.

In the Philippines, a country highly prone to “natural” disasters, 92% of Filipinos who access the web use Facebook. In early 2012, Filipinos sent an average of 2 billion text messages every day. When disaster strikes, some of these messages will contain information critical for situational awareness & rapid needs assessment. The innovation reports by DfID, ALNAP and OCHA emphasize time and time again that listening to local communities is a humanitarian imperative. As DfID notes, “there is a strong need to systematically involve beneficiaries in the collection and use of data to inform decision making. Currently the people directly affected by crises do not routinely have a voice, which makes it difficult for their needs to be effectively addressed” (2012). But how exactly should we listen to millions of voices at once, let alone manage, verify and respond to these voices with potentially life-saving information? Over 20 million tweets were posted during Hurricane Sandy. In Japan, over half-a-million new users joined Twitter the day after the 2011 Earthquake. More than 177 million tweets about the disaster were posted that same day, i.e., 2,000 tweets per second on average.


Of course, the volume and velocity of crisis information will vary from country to country and disaster to disaster. But the majority of humanitarian organizations do not have the technologies in place to handle even smaller tidal waves. Take the case of the recent Typhoon in the Philippines, for example. OCHA activated the Digital Humanitarian Network (DHN) and asked it to carry out a rapid damage assessment by analyzing the 20,000 tweets posted during the first 48 hours of Typhoon Pablo. In fact, one of the main reasons digital volunteer networks like the DHN and the Standby Volunteer Task Force (SBTF) exist is to provide humanitarian organizations with this kind of skilled surge capacity. But analyzing 20,000 tweets in 12 hours (mostly manually) is one thing; analyzing 20 million requires far more than a few hundred dedicated volunteers. What’s more, we do not have the luxury of months to carry out this analysis. Access to information is as important as access to food; and like food, information has a sell-by date.

We clearly need a research agenda to guide the development of next generation humanitarian technology. One such framework is proposed here. The Big (Crisis) Data challenge is composed of (at least) two major problems: (1) finding the needle in the haystack; and (2) assessing the accuracy of that needle. In other words, identifying the signal in the noise and determining whether that signal is accurate. Both of these challenges are exacerbated by serious time constraints. There are (at least) two ways to manage the Big Data challenge in real or near real-time: Human Computing and Artificial Intelligence. We know about these solutions because they have already been developed and used by other sectors and disciplines for several years now. In other words, our information problems are hardly as unique as we might think. Hence the importance of bridging the humanitarian and data science communities.
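To make the first of these two problems more concrete, here is a minimal sketch (in Python, using scikit-learn) of what the “needle in the haystack” step might look like: a supervised classifier that separates crisis-relevant messages from the noise. The training examples, labels and confidence threshold below are hypothetical and purely illustrative; they describe the general pattern, not any particular deployed system.

```python
# A minimal, illustrative sketch of the "needle in the haystack" step:
# a supervised classifier that separates crisis-relevant messages from noise.
# The training examples, labels and confidence threshold are hypothetical.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# A handful of hand-labeled examples. In practice, thousands of labels
# would come from digital volunteers or paid microtaskers.
labeled_tweets = [
    ("Bridge on the coastal highway has collapsed, cars stranded", "relevant"),
    ("Need drinking water and tarpaulins in the northern barangays", "relevant"),
    ("Thoughts and prayers for everyone affected", "not_relevant"),
    ("Cannot believe this weather, glad I stayed home today", "not_relevant"),
]
texts, labels = zip(*labeled_tweets)

# TF-IDF features plus logistic regression: a simple, fast baseline that
# can score thousands of incoming messages per second.
classifier = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
classifier.fit(texts, labels)

# Score a new, unseen message and surface it only if the model is confident.
new_tweet = "Roof torn off the evacuation center, people moving to the school"
classes = list(classifier.classes_)
probability_relevant = classifier.predict_proba([new_tweet])[0][classes.index("relevant")]
print("Estimated relevance:", round(probability_relevant, 2))
if probability_relevant > 0.8:  # hypothetical confidence threshold
    print("Route to analysts:", new_tweet)
```

The point is not this particular model but the pattern: a modest amount of labeled data, a fast baseline classifier, and a confidence threshold that decides which messages ever reach human analysts.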

In sum, the Big Crisis Data challenge can be addressed using Human Computing (HC) and/or Artificial Intelligence (AI). Human Computing includes crowdsourcing and microtasking. AI includes natural language processing and machine learning. A framework for next generation humanitarian technology and innovation must thus promote Research and Development (R&D) that applies these methodologies to humanitarian response. For example, Verily is a project that leverages HC for the verification of crowdsourced social media content generated during crises. In contrast, this is an example of an AI approach to verification. The Standby Volunteer Task Force (SBTF) has used HC (microtasking) to analyze satellite imagery (Big Data) for humanitarian response. Another novel HC approach to managing Big Data is the use of gaming, something called Playsourcing. AI for Disaster Response (AIDR) is an example of AI applied to humanitarian response. In many ways, though, AIDR combines AI with Human Computing, as does MatchApp. Such hybrid solutions should also be promoted as part of the R&D framework on next generation humanitarian technology.
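For readers from the advanced computing side who want a concrete picture of what “hybrid” means here, the following is a rough sketch of the human-in-the-loop pattern: the machine auto-tags the messages it is confident about and routes its most uncertain ones to volunteers, whose labels are then used to retrain the model. This illustrates the general idea only; it is not the actual architecture of AIDR, MatchApp or any other specific platform, and the thresholds and placeholder function are invented for the example.

```python
# A rough sketch of the hybrid Human Computing + AI pattern: the machine
# auto-tags messages it is confident about and routes its most uncertain
# ones to volunteers (microtasking); their labels retrain the model.
# Illustrative only; ask_volunteers() is a made-up placeholder.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline


def ask_volunteers(message):
    """Placeholder for a microtasking step, e.g. a simple web form that shows
    the message to several volunteers and records their majority label."""
    return "relevant"  # hypothetical label


def human_in_the_loop(seed_texts, seed_labels, incoming_stream, margin=0.35):
    texts, labels = list(seed_texts), list(seed_labels)
    model = make_pipeline(TfidfVectorizer(), LogisticRegression())
    model.fit(texts, labels)

    for message in incoming_stream:
        confidence = max(model.predict_proba([message])[0])
        if confidence < 0.5 + margin:
            # Machine is unsure: ask humans, then learn from their answer.
            texts.append(message)
            labels.append(ask_volunteers(message))
            model.fit(texts, labels)  # retrain on the growing labeled set
        else:
            # Machine is confident: tag automatically, no human time spent.
            print(model.predict([message])[0], "->", message)
    return model
```

The design choice worth noting is that scarce human attention is spent only on the messages the machine cannot yet handle, while every human answer makes the machine a little better at handling the next one.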

There is of course more to humanitarian technology than information management alone. Related is the topic of Data Visualization, for example. There are also exciting innovations and developments in the use of drones or Unmanned Aerial Vehicles (UAVs), meshed mobile communication networks, hyper low-cost satellites, etc. I am particularly interested in each of these areas and will continue to blog about them. In the meantime, I very much welcome feedback on this post’s proposed research framework for humanitarian technology and innovation.


17 responses to “A Research Framework for Next Generation Humanitarian Technology and Innovation”

  1. Hi Patrick, great summary!

    One of the things that should be on the agenda is both the social and organizational behaviors and actions of stakeholders. I am hoping to research this during my PhD at GWU. While technological achievements are undoubtedly being made that advance the humanitarian cause, society and organizations will be prompted to change how they organize themselves for collective action. Ultimately, technology is the impetus for change, but what should that change look like? How should organizations prepare? What is the future role of society in crisis? The emergency manager? What are their core competencies? How does this enable better “resilience”? Policy implications?

    Advancements in technology don’t automatically translate into better resilience. The frameworks must be in place to enable technology and innovation to become sustainable and effective.

    • Hi Brandon, many thanks for your input. Yes, I completely agree, the organizational behavior piece is hugely important. In some ways, as the forthcoming OCHA report acknowledges, new technologies are “displacing” established humanitarian organizations as more and more disaster-affected communities take matters into their own hands. Of course, they always have; they are by definition the real first responders. But now, unlike in the analog world, they can mobilize and self-organize even faster thanks to digital technologies and social media. So I would be particularly interested in understanding self-organizational behaviors as well as the organizational behaviors of established institutions. Thoughts?

      “Advancements in technology don’t automatically translate into better resilience. The frameworks must be in place to enable technology and innovation to become sustainable and effective.”

      I completely agree again. I’ve blogged about this quite a bit, and made the same arguments in the field of conflict early warning and early response. My upcoming blog post on the forthcoming OCHA report also emphasizes this point, particularly in light of the fact that according to said OCHA report only 3% of humanitarian spending is allocated to preparedness! But again, many local communities have proven themselves to be far more agile and innovative than established humanitarian organizations. They also learn more quickly and adapt new technologies faster. So they should be the ones explaining how they translate advancements in new technology into disaster resilience, methinks.

      Thanks again for taking the time to comment!

      • In reference to your point: “So I would be particularly interested in understanding self-organizational behaviors as well as the organizational behaviors of established institutions. Thoughts?”

        Right on! I totally agree as well. The question in this age is about how emergent networks and established organizations/networks can work collaboratively to fill preparedness, response and recovery gaps and achieve optimal efficiency and effectiveness. Technology, with its new architecture and capabilities, is the great enabler for this collective behavior. Command and control has to give way to networked approaches that recognize how collective cooperation can take us to so many new heights.

  2. The second best-selling book of all time is Euclid’s geometry; the first is the Torah (/Bible). I guess fewer people have read both of them than have read only one of them!
    A group that has been posting great material on drones for more than a year is: https://mailman.stanford.edu/mailman/listinfo/drone-list I have been one of its subscribers for more than a year, and I am happy to share all of the posts I have gathered since subscribing.

  3. Excellent piece! I can only hope NGO people grasp even a tiny fraction of your conviction about the importance of not just ICT and ICT4D, but the potential and necessity of crowdsourcing, artificial intelligence, big data and more.
    In March 2009, the OECD stated: “Information & Communication Technologies (ICT) improve the efficiency of emergency response. However, most international & non-governmental organizations do not go beyond using email and web sites to distribute information and organize data collection.” That understanding may have improved a bit in the past couple of years, but far more slowly than the use of mobiles, Facebook and the Internet, not to mention the pace of the technologies themselves.
    The growing interest by donors will surely be an incentive, yet I believe success stories can be the greatest inspiration. The work you are doing is so important in that matter.

    • Thanks for reading, Agnes! Yes, I think that was the 2008 OECD Report authored by David Nyheim, which I happen to have served as a reviewer for. The full quote:

      “Technological advancements have played an important role in improving the efficiency and effectiveness of early warning systems. Most inter-governmental and non-governmental systems, however, have not gone beyond the use of email and websites for dissemination, and communication technology for data collection. Governmental and some inter-governmental systems do benefit from access and resources to use satellite and GIS in their analysis and reporting. However, access to technology remains very unequal between systems.”

      Key summary:

      https://earlywarning.wordpress.com/2008/07/02/nyheim-oecd

      Yes, I agree, the understanding has improved since then, though more slowly than one would like. About two weeks after the Haiti Earthquake, colleagues at OCHA were receiving one information email per minute, which they needed to classify, analyze and forward to the relevant sectoral team. They had to do this manually! In 2010! These are not computationally difficult challenges. Solutions to this kind of problem have already been developed. Check out this screenshot of a platform built 5 years ago to parse through zillions of emails about saving polar bears! http://wp.me/aecFU-2Tp
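      Just to illustrate how computationally simple the basic triage step is, a few lines of Python along the lines of the sketch below would already do crude routing of incoming email to the relevant cluster. The clusters, keywords and example message are made up for illustration; a real system would of course use a trained classifier rather than hand-written keyword lists.

```python
# A toy sketch of automated email triage: routing an incoming message to the
# relevant humanitarian cluster. Clusters, keywords and the example email are
# invented for illustration; a real system would use a trained classifier.

CLUSTER_KEYWORDS = {
    "Health": ["cholera", "clinic", "injury", "vaccination"],
    "WASH": ["water", "sanitation", "latrine", "hygiene"],
    "Shelter": ["tent", "tarpaulin", "shelter", "camp"],
    "Logistics": ["truck", "airlift", "warehouse", "fuel"],
}

def route_email(subject, body):
    """Return the cluster(s) an email should be forwarded to."""
    text = (subject + " " + body).lower()
    matches = [
        cluster
        for cluster, keywords in CLUSTER_KEYWORDS.items()
        if any(keyword in text for keyword in keywords)
    ]
    return matches or ["Unclassified"]  # fall back to manual review

# Example: this message would be forwarded to the Health and WASH teams.
print(route_email(
    "Urgent: suspected cholera cases in Carrefour",
    "Clinic reports 40 cases; water points in the area appear contaminated.",
))
```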

  4. Reblogged this on DisasterNet and commented:
    Technology is changing how we operate and I believe we need a vision for the future in order to succeed. Patrick lays out a great intro to the technology aspects. What do you think is in our future?

  5. Hey Patrick,
    Great post, thanks very much.
    I really wonder what solutions you propose to get aid agencies to really invest in leveraging the power of tech and VTCs. There is of course important work being done, but we are far from the ideal.
    How much have aid agencies (and tech groups) advanced since Disaster 2.0?
    Thanks and speak soon.

    • Hey Jacobo, thanks for reading and sharing. I’m optimistic. The launch of the Digital Humanitarian Network will be seen as a milestone when we look back a few years from now. The ecosystem of digital volunteer communities is growing and forming important connections. In terms of other solutions, just look at the new DfID & USAID Technology Fund, for example. These are important steps forward and those innovation reports do help, in my opinion, not to mention the upcoming 2013 OCHA report. All in all, I’m actually surprised at how quickly things are shifting and remain excited for what is to come. Thanks again!

  6. Pingback: Zooniverse: The Answer to Big (Crisis) Data? | iRevolution

  7. Francesc Miralles

    Thanks for the post and the discussion / comments!
    At NITIM (www.nitim.org) we are developing a project on “Crisis Management Networks”. We plan to grapple with all these issues. Thanks again!

  8. Pingback: Humanitarianism in the Network Age: Groundbreaking Study | iRevolution

  9. Pingback: Honoring the Liebster Award… | Peace Is.......

  10. Pingback: What is Big (Crisis) Data? | iRevolution

  11. Pingback: Sentient Potential | Building Bridges: Humanitarian Efforts to Artificial Intelligence – Dr. Soenke Ziesche

  12. Pingback: World Disaster Report: Next Generation Humanitarian Technology | iRevolution
