
A 10 Year Vision: Future Trends in Geospatial Information Management


The United Nations Committee of Experts on Global Geospatial Information Management (UN-GGIM) recently published their second edition of Future Trends in Geospatial Information Management. I blogged about the first edition here. Below are some of the excerpts I found interesting or noteworthy. The report itself is a 50-page document (PDF 7.1Mb).

  • The integration of smart technologies and efficient governance models will increase and the mantra of ‘doing more for less’ is more relevant than ever before.
  • There is an increasing tendency to bring together data from multiple sources: official statistics, geospatial information, satellite data, big data and crowdsourced data among them.
  • New data sources and new data collection technologies must be carefully applied to avoid a bias that favors countries that are wealthier and have established data infrastructures. The use of innovative tools might also favor those who have greater means to access technology, thus widening the gap between the ‘data poor’ and the ‘data rich’.
  • The paradigm of geospatial information is changing; no longer is it used just for mapping and visualization, but also for integrating with other data sources, data analytics, modeling and policy-making.
  • Our ability to create data is still, on the whole, ahead of our ability to solve complex problems by using the data. Addressing this problem will rely on the development of both Big Data technologies and techniques (that is, technologies that enable the analysis of vast quantities of information within usable and practical timeframes) and artificial intelligence (AI) or machine learning technologies that will enable the data to be processed more efficiently.
  • In the future we may expect society to make increasing use of autonomous machines and robots, thanks to a combination of an aging population, rapid technological advancement in unmanned autonomous systems and AI, and the sheer volume of data being beyond a human’s ability to process it.
  • Developments in AI are beginning to transform the way machines interact with the world. Up to now machines have mainly carried out well-defined tasks such as robotic assembly, or data analysis using pre-defined criteria, but we are moving into an age where machine learning will allow machines to interact with their environment in more flexible and adaptive ways. This is a trend we expect to see major growth in over the next 5 to 10 years as the technologies, and understanding of them, become more widely recognized.
  • Processes based on these principles, and the learning of geospatial concepts (locational accuracy, precision, proximity etc.), can be expected to improve the interpretation of aerial and satellite imagery by improving the accuracy with which geospatial features can be identified (see the sketch after this list).
  • Tools may run persistently on continuous streams of data, alerting interested parties to new discoveries and events. Another branch of AI that has long been of interest has been the expert system, in which the knowledge and experience of human experts is taught to a machine.
  • The principle of collecting data once only at the highest resolution needed, and generalizing ‘on the fly’ as required, can become reality.  Developments of augmented and virtual reality will allow humans to interact with data in new ways.
  • The future of data will not be the conflation of multiple data sources into a single new dataset, rather there will be a growth in the number of datasets that are connected and provide models to be used across the world.
  • Efforts should be devoted to integrating involuntary sensors (mobile phones, RFID sensors and so on), which aside from their primary purpose may yield information that was previously difficult to collect. This leads to more real-time information being generated.
  • Many developing nations have leapfrogged in areas such as mobile communications, but the lack of core processing power may inhibit some from taking advantage of the opportunities afforded by these technologies.
  • Data collected at high levels will increasingly be disaggregated down to small area geographies. This will increase the need to evaluate and adopt alternative statistical modeling techniques to ensure that statistics can be produced at the right geographic level, whilst still maintaining the quality needed for them to be reported against.
  • The information generated through the use of social media and everyday devices will further reveal patterns and enable the prediction of behaviour. This is not a new trend, but as the use of social media for providing real-time information and expanded functionality increases, it offers new opportunities for location-based services.
  • There seems to have been a breakthrough from 2D to 3D information, and this is becoming more prevalent. Software already exists to process this information, and to incorporate the time information to create 4D products and services. It is recognized that a growth area over the next five to ten years will be the use of 4D information in a wide variety of industries.
  • The temporal element is crucial to a number of applications such as emergency service response, for simulations and analytics, and the tracking of moving objects. 4D is particularly relevant in the context of real-time information; this has been linked to virtual reality technologies.
  • Greater coverage, quality and resolution have been achieved through the availability of low-cost and affordable satellite systems and unmanned aerial vehicles (UAVs). This has not only increased the speed of collection and acquisition in remote areas, but also reduced the cost barriers to entry.
  • UAVs can provide real-time information to decision-makers on the ground, providing, for example, information for disaster management. They are an invaluable tool when additional information is needed to improve vital decision-making capabilities, and such use of UAVs will increase.
  • The licensing of data in an increasingly online world is proving to be very challenging. There is a growth in organisations adopting simple machine-readable licences, but these have not resolved the issues relating to data. Emerging technologies such as web services and the growth of big data solutions drawn from multiple sources will continue to create challenges for the licensing of data.
  • A wider issue is the training and education of a broader community of developers and users of location-enabled content. At the same time there is a need for more automated approaches to ensuring the non-geospatial professional community get the right data at the right time. Investment in formal training in the use of geospatial data and its implementation is still indispensable.
  • Both ‘open’ and ‘closed’ VGI data play an important and necessary part in the wider data ecosystem.
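
The bullet above on learning geospatial concepts describes a fairly standard supervised-learning workflow for imagery. The following is only a minimal, hypothetical sketch of that idea (using scikit-learn and entirely synthetic per-pixel features, not any tooling named in the UN-GGIM report): a classifier learns the mapping from spectral signatures to feature classes, and identification accuracy generally improves as more labelled examples are added.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Synthetic stand-in for per-pixel spectral features extracted from imagery
# (e.g. red, green, blue and near-infrared bands). A real workflow would read
# these from GeoTIFFs and use ground-truth training polygons as labels.
rng = np.random.default_rng(42)
n_pixels = 5000
X = rng.normal(size=(n_pixels, 4))            # 4 spectral bands per pixel
y = (X[:, 3] - X[:, 0] > 0).astype(int)       # toy label: "vegetation" if NIR > red

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# The classifier learns to identify the geospatial feature class from the
# spectral signature of each pixel.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

In a real pipeline the synthetic array would be replaced by bands read from actual imagery and the toy labels by curated training data; the structure of the workflow, however, is the same.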

Humanitarian Response in 2025

I’ve been invited to give a “very provocative talk” on what humanitarian response will look like in 2025 for the annual Global Policy Forum organized by the UN Office for the Coordination of Humanitarian Affairs (OCHA) in New York. I first explored this question in early 2012 and my colleague Andrej Verity recently wrote up this intriguing piece on the topic, which I highly recommend; intriguing because he focuses a lot on the future of the pre-deployment process, which is often overlooked.


I only have 7 minutes to give my talk, so I am thinking of leading with one or two of the following ideas, but I’m very interested in getting feedback from iRevolution readers and welcome additional ideas about what 2025 might look like for OCHA.

•  Situational Awareness: damage & needs assessments are instantaneous and 3D Crisis Maps are updated in real-time along with 3W’s information. Global communication networks are now hyper resilient, thus enabling uninterrupted communications after major disasters. More than 90% of the world’s population generates a readable, geo-referenced and multimedia digital footprint, which is used to augment 3D situational awareness; fully 100% of all news media and citizen journalism content is now on the web and automatically translated & analyzed every second; high-resolution satellite and aerial imagery for 90% of the planet is updated and automatically analyzed every minute; billions of physical sensors provide feedback loops on transportation, infrastructure, public health, weather-related and environmental dynamics in real-time. Big Data Analytics & advances in predictive modeling enable situational awareness to be predicted, allowing for IDP/refugee flows and disease outbreaks to be anticipated well ahead of any displacement.

•  Operational Response: disaster response is predominantly driven by local communities. The real first responders, after all, have always been the disaster-affected communities. In 2025, this grassroots response is highly networked and hyper-tech enabled, thus significantly accelerating and improving the efficiency of self-help and mutual-aid. The Digital Humanitarian Network (DHN) is no longer a purely virtual network and has local chapters (with flocks of UAVs) in over 100 countries that each contribute to local response efforts. Meanwhile, close to 90% of the world’s population has an augmented-reality Personal Context Assistant (PCA), a wearable device that provides hyper-customized information (drawn in part from Quantified Self data) on urgent needs, available resources and logistics. National humanitarian response organizations have largely replaced the need for external assistance and coordination save for extreme events. International humanitarian organizations increasingly play a financial, certification and accountability role.

•  Early Recovery: There are more 3D printers than 2D printers in 2025. The former are extensively used for rapid reconstruction and post-disaster resilient development using local resources and materials. Mobile-money is automatically disbursed to enable this recovery based on personal insurance & real-time needs assessments. In addition, the Share Economy is truly global, which means that communication, transportation, accommodation and related local services are all readily available in the vast majority of urban areas. During disasters, Share Economy companies play an active role by offering free use of their platforms.

•  Data Access & Privacy: Telecommunications companies, satellite imagery firms and large technology & social media companies have all signed up to the International Data Philanthropy Charter, enabling them to share anonymized emergency data (albeit temporarily) that is directly relevant for humanitarian response. User-generated content is owned by the user who can limit the use of this data along the lines of the Open Paths model.

If you feel like this future is a little too rosy, that’s because I’m thinking of presenting two versions of the future, one that is optimistic and the other less so. The latter would be a world riddled with ad hoc decision-making based on very subjective damage & needs-assessments, highly restrictive data-sharing licenses and even the continued use of PDFs for data dissemination. This less-than-pleasant world would also be plagued by data privacy, protection and security challenges. A new digital volunteer group called “Black Hat Humanitarians” rises to prominence and has little patience for humanitarian principles or codes of conduct. In this future world, digital data is collected and shared with no concern for informed consent. In addition, the vast majority of data relevant for saving lives in humanitarian crises remains highly proprietary. Meanwhile, open data that is publicly shared during disasters is used by tech-savvy criminals to further their own ends.

These two future worlds may be extremes but whether we lean towards one or the other will depend in part on enlightened leadership and policymaking. What do you think humanitarian response will look like in 2025? Where am I off and/or making unfounded assumptions? What aspects of the pictures I’m painting are more likely to become reality? What am I completely missing?

Update: Video of presentation available here.


Humanitarianism in the Network Age: Groundbreaking Study

My colleagues at the United Nations Office for the Coordination of Humanitarian Affairs (OCHA) have just published a groundbreaking must-read study on Humanitarianism in the Network Age; an important and forward-thinking policy document on humanitarian technology and innovation. The report “imagines how a world of increasingly informed, connected and self-reliant communities will affect the delivery of humanitarian aid. Its conclusions suggest a fundamental shift in power from capital and headquarters to the people [that] aid agencies aim to assist.” The latter is an unsettling prospect for many. To be sure, Humanitarianism in the Network Age calls for “more diverse and bottom-up forms of decision-making—something that most Governments and humanitarian organizations were not designed for. Systems constructed to move information up and down hierarchies are facing a new reality where information can be generated by anyone, shared with anyone and acted upon by anyone.”


The purpose of this blog post (available as a PDF) is to summarize the 120-page OCHA study. In this summary, I specifically highlight the most important insights and profound implications. I also fill what I believe are some of the report’s most important gaps. I strongly recommend reading the OCHA publication in full, but if you don’t have time to leaf through the study, reading this summary will ensure that you don’t miss a beat. Unless otherwise stated, all quotes and figures below are taken directly from the OCHA report.

All in all, this is an outstanding, accurate, radical and impressively cross-disciplinary study. In fact, what strikes me most about this report is how far we’ve come since the devastating Haiti Earthquake of 2010. Just three short years ago, speaking the word “crowdsourcing” was blasphemous, like “Voldemort” (for all you Harry Potter fans). This explains why some humanitarians called me the CrowdSorcerer at the time (thinking it was a derogatory term). CrisisMappers was only launched three months before Haiti. The Standby Volunteer Task Force (SBTF) didn’t even exist at the time and the Digital Humanitarian Network (DHN) was to be launched 2 years hence. And here we are, just three short years later, with this official, high-profile humanitarian policy document that promotes crowdsourcing, digital humanitarian response and next generation humanitarian technology. Exciting times. While great challenges remain, I dare say we’re trying our darned best to find some solutions, and this time through collaboration, CrowdSorcerers and all. The OCHA report is a testament to this collaboration.


Summary

The Rise of Big (Crisis) Data

Over 100 countries have more mobile phone subscriptions than they have people. One in four individuals in developing countries use the Internet. This figure will double within 20 months. About 70% of Africa’s total population are mobile subscribers. In short, “The planet has gone online, producing and sharing vast quantities of information.” Meanwhile, however, hundreds of millions of people are affected by disasters every year—more than 250 million in 2010 alone. There have been over 1 billion new mobile phone subscriptions since 2010. In other words, disaster-affected communities are becoming increasingly “digital” as a result of the information revolution. These new digital technologies are evolving into a new nervous system for our planet, taking the pulse of our social, economic and political networks in real-time.

“Filipinos sent an average of 2 billion SMS messages every day in early 2012,” for example. When disaster strikes, many of these messages are likely to relay crisis information. In Japan, over half-a-million new users joined Twitter the day after the 2011 Earthquake. More than 177 million tweets about the disaster were posted that same day—that is, 2,000 tweets per second on average. Welcome to “The Rise of Big (Crisis) Data.” Meanwhile, back in the US, 80% of the American public expects emergency responders to monitor social media; and almost as many expect them to respond within three hours of posting a request on social media (1). These expectations have been shown to increase year on year. “At the same time,” however, the OCHA report notes that “there are greater numbers of people […] who are willing and able to respond to needs.”
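
As a quick sanity check on the arithmetic, the per-second figure follows directly from the daily total:

```python
tweets_that_day = 177_000_000       # tweets about the 2011 Japan Earthquake posted that day
seconds_per_day = 24 * 60 * 60      # 86,400 seconds in a day
print(round(tweets_that_day / seconds_per_day))  # ~2,049, i.e. roughly 2,000 tweets per second
```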

Communities First

A few brave humanitarian organizations are embracing these changes and new realities, “reorienting their approaches around the essential objectives of helping people to help themselves.” That said, “the frontline of humanitarian action has always consisted of communities helping themselves before outside aid arrives.” What is new, however, is “affected people using technology to communicate, interact with and mobilize their social networks quicker than ever before […].” To this end, “by rethinking how aid agencies work and communicate with people in crisis, there is a chance that many more lives can be saved.” In sum, “the increased reach of communications networks and the growing network of people willing and able to help, are defining a new age—a network age—for humanitarian assistance.”

This stands in stark contrast to traditional notions of humanitarian assistance, which refer to “a small group of established international organizations, often based in and funded by high-income countries, providing help to people in a major crisis. This view is now out of date.” As my colleague Tim McNamara noted on the CrisisMappers list-serve (cited in the OCHA report), this is “…not simply a technological shift [but] also a process of rapid decentralization of power. With extremely low barriers to entry, many new entrants are appearing in the fields of emergency and disaster response. They are ignoring the traditional hierarchies, because the new entrants perceive that there is something they can do which benefits others.” In other words, the humanitarian “world order” is shifting towards a more multipolar system. And so, while Tim was “referring to the specific case of volunteer crisis mappers […], the point holds true across all types of humanitarian work.”

Take the case of Somalia Speaks, for example. A journalist recently asked me to list the projects I am most proud of in this field. Somalia Speaks ranks very high. I originally pitched the idea to my Al-Jazeera colleagues back in September 2011; the project was launched three months later. Together with my colleagues at Souktel, we texted 5,000 Somalis across the country to ask how they were personally affected by the crisis.


As the OCHA study notes, we received over 3,000 responses, which were translated into English and geotagged by the Diaspora and subsequently added to a crisis map hosted on the Al-Jazeera website. From the OCHA report: “effective communication can also be seen as an end in itself in promoting human dignity. More than 3,000 Somalis responded to the Somalia Speaks project, and they seemed to feel that speaking out was a worthwhile activity.” In sum, “The Somalia Speaks project enabled the voices of people from one of the world’s most inaccessible, conflict-ridden areas, in a language known to few outside their community, to be heard by decision makers from across the planet.” The project has since been replicated several times; see Uganda Speaks for example. The OCHA study refers to Somalia Speaks at least four times, highlighting the project as an example of networked humanitarianism.

Privacy, Security & Protection

The report also emphasizes the critical importance of data security, privacy and protection in the network age. OCHA’s honest and balanced approach to the topic is another reason why this report is so radical and forward thinking. “Concern over the protection of information and data is not a sufficient reason to avoid using new communications technologies in emergencies, but it must be taken into account. To adapt to increased ethical risks, humanitarian responders and partners need explicit guidelines and codes of conduct for managing new data sources.” This is precisely why I worked with GSMA’s Disaster Response Program to draft and publish the first ever Code of Conduct for the Use of SMS in Disaster Response. I have also provided extensive feedback on the International Committee of the Red Cross’s (ICRC) latest edition of the “Professional Standards for Protection Work,” which was just launched in Geneva this month. My colleagues Emmanuel Letouzé and Patrick Vinck also included a section on data security and ethics in our recent publication on the use of Big Data for Conflict Prevention. In addition, I have blogged about this topic quite a bit: here, here and here, for example.

Crisis in Decision-Making

“As the 2010 Haiti crisis revealed, the usefulness of new forms of information gathering is limited by the awareness of responders that new data sources exist, and their applicability to existing systems of humanitarian decision-making.” The fact of the matter is that humanitarian decision-making structures are simply not geared towards using Big Crisis Data let alone new data sources. More pointedly, however, humanitarian decision-making processes are often not based on empirical data in the first place, even when the data originate from traditional sources. As DfID notes in this 2012 strategy document, “Even when good data is available, it is not always used to inform decisions. There are a number of reasons for this, including data not being available in the right format, not widely dispersed, not easily accessible by users, not being transmitted through training and poor information management. Also, data may arrive too late to be able to influence decision-making in real time operations or may not be valued by actors who are more focused on immediate action.”

This is the classic warning-response gap, which has been discussed ad nauseum for decades in the field of famine early warning systems and conflict early warning systems. More data in no way implies action. Take the 2011 Somalia Famine, which was one of the best documented crises yet. So the famine didn’t occur because data was lacking. “Would more data have driven a better decision making process that could have averted disaster? Unfortunately, this does not appear to be the case. There had, in fact, been eleven months of escalating warnings emanating from the famine early warning systems that monitor Somalia. Somalia was, at the time, one of the most frequently surveyed countries in the world, with detailed data available on malnutrition prevalence, mortality rates, and many other indicators. The evolution of the famine was reported in almost real time, yet there was no adequate scaling up of humanitarian intervention until too late” (2).

At other times, “Information is sporadic,” which is why OCHA notes that “decisions can be made on the basis of anecdote rather than fact.” Indeed, “Media reports can significantly influence allocations, often more than directly transmitted community statements of need, because they are more widely read or better trusted.” (It is worth keeping in mind that the media makes mistakes; the New York Times alone makes over 7,000 errors every year). Furthermore, as acknowledged by OCHA, “The evidence suggests that new information sources are no less representative or reliable than more traditional sources, which are also imperfect in crisis settings.” This is one of the most radical statements in the entire report. OCHA should be applauded for their remarkable fortitude in plunging into this rapidly shifting information landscape. Indeed, they go on to state that, “Crowdsourcing has been used to validate information, map events, translate text and integrate data useful to humanitarian decision makers.”


The vast majority of disaster datasets are not perfect, regardless of whether they are drawn from traditional or non-traditional sources. “So instead of criticizing the lack of 100% data accuracy, we need to use it as a base and ensure our Monitoring and Evaluation (M&E) and community engagement pieces are strong enough to keep our programming relevant” (Bartosiak 2013). And so, perhaps the biggest impact of new technologies and recent disasters on the humanitarian sector is the self disrobing of the Emperor’s Clothes (or Data). “Analyses of emergency response during the past five years reveal that poor information management has severely hampered effective action, costing many lives.” Disasters increasingly serve as brutal audits of traditional humanitarian organizations; and the cracks are increasingly difficult to hide in an always-on social media world. The OCHA study makes clear that decision-makers need to figure out “how to incorporate these sources into decisions.”

Fact is, “To exploit the opportunity of the network age, humanitarians must understand how to use the new range of available data sources and have the capacity to transform this data into useful information.” Furthermore, it is imperative “to ensure new partners have a better understanding of how [these] decisions are made and what information is useful to improve humanitarian action.” These new partners include the members of the Digital Humanitarian Network (DHN), for example. Finally, decision-makers also need to “invest in building analytic capacity across the entire humanitarian network.” This analytic capacity can no longer rest on manual solutions alone. The private sector already makes use of advanced computing platforms for decision-making purposes. The humanitarian industry would be well served to recognize that their problems are hardly unique. Of course, investing in greater analytic capacity is an obvious solution but many organizations are already dealing with limited budgets and facing serious capacity constraints. I provide some creative solutions to this challenge below, which I refer to as “Data Science Philanthropy“.

Commentary

Near Perfection

OCHA’s report is brilliant, honest and forward thinking. This is by far the most important official policy document yet on humanitarian technology and digital humanitarian response—and thus on the very future of humanitarian action. The study should be required reading for everyone in the humanitarian and technology communities, which is why I plan to organize a panel on the report at CrisisMappers 2013 and will refer to the strategy document in all of my forthcoming talks and many a future blog post. In the meantime, I would like to highlight and address some of the issues that I feel need to be discussed to take this discussion further.

Ironically, some of these gaps appear to reflect a rather limited understanding of advanced computing & next generation humanitarian technology. The following topics, for example, are missing from the OCHA report: Microtasking, Sentiment Analysis and Information Forensics. In addition, the report does not relate OCHA’s important work to disaster resilience and people-centered early warning. So I’m planning to expand on the OCHA report in the technology chapter for this year’s World Disaster Report (WDR 2013). This high-profile policy document is an ideal opportunity to amplify OCHA’s radical insights and to take these to their natural and logical conclusions vis-à-vis Big (Crisis) Data. To be clear, and I must repeat this, the OCHA report is the most important forward thinking policy document yet on the future of humanitarian response. The gaps I seek to fill in no way make the previous statement any less valid. The team at OCHA should be applauded, recognized and thanked for their tremendous work on this report. So despite some of the key shortcomings described below, this policy document is by far the most honest, enlightened and refreshing look at the state of the humanitarian response today; a grounded and well-researched study that provides hope, leadership and a clear vision for the future of humanitarianism in the network age.

Big Data How

OCHA recognizes that “there is a significant opportunity to use big data to save lives,” and they also get that, “finding ways to make big data useful to humanitarian decision makers is one of the great challenges, and opportunities, of the network age.” Moreover, they realize that “While valuable information can be generated anywhere, detecting the value of a given piece of data requires analysis and understanding.” So they warn, quite rightly, that “the search for more data can obscure the need for more analysis.” To this end, they correctly conclude that “identifying the best uses of crowdsourcing and how to blend automated and crowdsourced approaches is a critical area for study.” But the report does not take these insights to their natural and logical conclusions. Nor does the report explore how to tap these new data sources let alone analyze them in real time.

Yet these Big Data challenges are hardly unique. Our problems in the humanitarian space are not that “special” or different. OCHA rightly notes that “Understanding which bits of information are valuable to saving lives is a challenge when faced with this ocean of data.” Yes. But such challenges have been around for over a decade in other disciplines. The field of digital disease detection, for example, is years ahead when it comes to real-time analysis of crowdsourced big data, not to mention private sector companies, research institutes and even new startups whose expertise is Big Data Analytics. I can also speak to this from my own professional experience. About a decade ago, I worked with a company specializing in conflict forecasting and early warning using Reuters news data (Big Data).

In sum, the OCHA report should have highlighted the fact that solutions to many of these Big Data challenges already exist, which is precisely why I joined the Qatar Computing Research Institute (QCRI). What’s more, a number of humanitarian technology projects at QCRI are already developing prototypes based on these solutions; and OCHA is actually the main partner in one such project, so it is a shame they did not get credit for this in their own report.

Sentiment Analysis

While I introduced the use of sentiment analysis during the Haiti Earthquake, this has yet to be replicated in other humanitarian settings. Why is sentiment analysis key to humanitarianism in the network age? The answer is simple: “Communities know best what works for them; external actors need to listen and model their response accordingly.” Indeed, “Affected people’s needs must be the starting point.” Actively listening to millions of voices is a Big Data challenge that has already been solved by the private sector. One such solution is real-time sentiment analysis to capture brand perception. This is a rapidly growing multimillion dollar market, which is why many companies like Crimson Hexagon exist. Numerous Fortune 500 companies have been actively using automated sentiment analysis for years now. Why? Because these advanced listening solutions enable them to better understand customer perceptions.


In Haiti, I applied this approach to tens of thousands of text messages sent by the disaster-affected population. It allowed us to track the general mood of this population on a daily basis. This is important because sentiment analysis as a feedback loop works particularly well with Big Data, which explains why the private sector is all over it. If just one or two individuals in a community are displeased with service delivery during a disaster, they may simply be “an outlier”  or perhaps exaggerating. But if the sentiment analysis at the community level suddenly starts to dip, then this means hundreds, perhaps thousands of affected individuals are now all feeling the same way about a situation. In other words, sentiment analysis serves as a triangulating mechanism. The fact that the OCHA report makes no mention of this existing solution is unfortunate since sentiment feedback loops enable organizations to assess the impact of their interventions by capturing their clients’ perceptions.
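
To make the feedback-loop idea concrete, here is a minimal sketch of community-level sentiment tracking. The messages and the keyword lexicon are purely illustrative and this is not the actual tool used on the Haiti SMS stream: each message gets an individual score, and the scores are then aggregated by day and community so that a falling daily average flags a shared problem rather than a single outlier.

```python
import re
from collections import defaultdict
from statistics import mean

# Hypothetical geo-referenced SMS stream: (date, community, text)
messages = [
    ("2010-01-20", "Carrefour", "water delivered today, thank you"),
    ("2010-01-20", "Carrefour", "no shelter, family sleeping outside"),
    ("2010-01-21", "Carrefour", "still no food distribution, situation bad"),
    ("2010-01-21", "Carrefour", "clinic closed, no medicine"),
]

POSITIVE = {"thank", "delivered", "good", "open"}
NEGATIVE = {"no", "bad", "closed", "outside"}

def score(text: str) -> int:
    """Toy lexicon-based score: +1 per positive word, -1 per negative word."""
    words = re.findall(r"[a-z]+", text.lower())
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

# Aggregate by (day, community): a dip in the daily average signals a
# community-wide problem, whereas one negative message may just be an outlier.
daily = defaultdict(list)
for date, community, text in messages:
    daily[(date, community)].append(score(text))

for (date, community), scores in sorted(daily.items()):
    print(date, community, round(mean(scores), 2))
```

In practice a trained sentiment classifier would replace the toy lexicon, but the aggregation step is what turns individual messages into a community-level feedback loop.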

Information Forensics

“When dealing with the vast volume and complexity of information available in the network age, understanding how to assess the accuracy and utility of any data source becomes critical.” Indeed, and the BBC’s User-Generated Content (UGC) Hub has been doing just this since 2005—when Twitter didn’t even exist. The field of digital information forensics may be new to the humanitarian sector, but that doesn’t mean it is new to every other sector on the planet. Furthermore, recent research on crisis computing has revealed that the credibility of social media reporting can be modeled and even predicted. Twitter has even been called a “Truth Machine” because of the self-correcting dynamic that has been empirically observed. Finally, one of QCRI’s humanitarian technology projects, Verily, focuses precisely on the issue of verifying crowdsourced information from social media. And the first organization I reached out to for feedback on this project was OCHA.
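
Credibility-prediction research typically frames the problem as supervised classification over message- and author-level signals. The function below is only a hypothetical sketch of the feature-extraction step (it is not the Verily implementation, and the field names are invented for illustration); a model trained on labelled reports would then learn how to weigh these features.

```python
def credibility_features(report: dict) -> dict:
    """Illustrative message- and author-level signals commonly discussed in
    credibility-prediction research; weights would be learned from labelled
    data, not hard-coded here."""
    text = report["text"]
    return {
        "has_url": "http" in text,                      # links often accompany verifiable claims
        "has_media": bool(report.get("media")),         # photos/videos aid verification
        "exclamation_ratio": text.count("!") / max(len(text), 1),
        "author_followers": report.get("followers", 0),
        "author_geo_enabled": report.get("geo_enabled", False),
        "retweet_count": report.get("retweets", 0),
    }

example = {"text": "Bridge on Route 2 collapsed, photo: http://t.co/xyz",
           "media": True, "followers": 320, "geo_enabled": True, "retweets": 12}
print(credibility_features(example))
```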

Microtasking

The OCHA report overlooks microtasking as well. Yes, the study does address and promote the use of crowdsourcing repeatedly, but again, this tends to focus on the collection of information rather than the processing of said information. Microtasking applications in the humanitarian space are not totally unheard of, however. Microtasking was used to translate and geolocate tens of thousands of text messages following the Haiti Earthquake. (As the OCHA study notes, “some experts estimated that 90 per cent [of the SMS’s] were ‘repetition’, or ‘white noise’, meaning useless chatter”). There have been several other high-profile uses of microtasking for humanitarian operations such as this one thanks to OCHA’s leadership in response to Typhoon Pablo. In sum, microtasking has been used extensively in other sectors to manage the big data and quality control challenge for many years now. So this important human computing solution really ought to have appeared in the OCHA report along with the immense potential of microtasking humanitarian information using massively multiplayer online games (more here).
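
For readers unfamiliar with the term, microtasking simply means slicing a large job, such as a backlog of untranslated SMS, into small independent units that many volunteers can process in parallel, with redundancy for quality control. The following is a minimal, hypothetical sketch of that slicing-and-assignment step, not the workflow of any particular platform.

```python
import itertools

def make_microtasks(messages, batch_size=5, redundancy=3):
    """Split a message stream into small batches and assign each batch to
    several volunteers so answers can be cross-checked (quality control)."""
    batches = [messages[i:i + batch_size] for i in range(0, len(messages), batch_size)]
    volunteers = itertools.cycle(["ana", "bo", "chen", "dia", "eli"])  # hypothetical roster
    assignments = []
    for batch_id, batch in enumerate(batches):
        for _ in range(redundancy):
            assignments.append({"batch": batch_id, "volunteer": next(volunteers), "items": batch})
    return assignments

sms = [f"msg-{i}" for i in range(12)]
for a in make_microtasks(sms):
    print(a["batch"], a["volunteer"], len(a["items"]))
```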

Open Data is Open Power

OCHA argues that “while information can be used by anyone, power remains concentrated in the hands of a limited number of decision makers.” So if the latter “do not use this information to make decisions in the interests of the people they serve, its value is lost.” I don’t agree that the value is lost. One of the report’s main themes is the high-impact agency and ingenuity of disaster-affected communities. As OCHA rightly points out, “The terrain is continually shifting, and people are finding new and brilliant ways to cope with crises every day.” Openly accessible crisis information posted on social media has already been used by affected populations for almost a decade now. In other words, communities affected by crises are (quite rightly) taking matters into their own hands in today’s networked world—just like they did in the analog era of yesteryear. As noted earlier, “affected people [are] using technology to communicate, interact with and mobilize their social networks quicker than ever before […].” This explains why “the failure to share [information] is no longer a matter of institutional recalcitrance: it can cost lives.”

Creative Partnerships

The OCHA study emphasizes that “Humanitarian agencies can learn from other agencies, such as fire departments or militaries, on how to effectively respond to large amounts of often confusing information during a fast-moving crisis.” This is spot on. Situational awareness is first and foremost a military term. The latest Revolution in Military Affairs (RMA) provides important insights into the future of humanitarian technology—see these recent developments, for example. Meanwhile, the London Fire Brigade has announced plans to add Twitter as a communication channel, which means city residents will have the option of reporting a fire alert via Twitter. Moreover, the 911 service in the US (999 in the UK) is quite possibly the oldest and longest running crowdsourced emergency service in the world. So there is much that humanitarians can learn from 911. But the fact of the matter is that most domestic emergency response agencies are completely unprepared to deal with the tidal wave of Big (Crisis) Data, which is precisely why the Fire Department of New York City (FDNY) and San Francisco City’s Emergency Response Team have recently reached out to me.


But some fields are way ahead of the curve. The OCHA report should thus have pointed to crime mapping and digital disease detection since these fields have more effectively navigated the big data challenge. As for the American Red Cross’s Digital Operations Center, the main technology they are using, Radian6, has been used by private sector clients for years now. And while the latter can afford the very expensive licensing fees, it is unlikely that cash-strapped domestic emergency response officers and international humanitarian organizations will ever be able to afford these advanced solutions. This is why we need more than just “Data Philanthropy“.

We also need “Data Science Philanthropy“. As the OCHA report states, decision-makers need to “invest in building analytic capacity across the entire humanitarian network.” This is an obvious recommendation, but perhaps not particularly realistic given the limited budgets and capacity constraints in the humanitarian space. This means we need to create more partnerships with Data Science groups like DataKind, Kaggle and the University of Chicago’s Data Science for Social Good program. I’m in touch with these groups and others for this reason. I’ve also been (quietly) building a global academic network called “Data Science for Humanitarian Action” which will launch very soon. Open Source solutions are also imperative for building analytic capacity, which is why the humanitarian technology platforms being developed by QCRI will all be Open Source and freely available.

Disaster Resilience

This points to the following gap in the OCHA report: there is no reference whatsoever to resilience. While the study does recognize that collective self-help behavior is typical in disaster response and should be amplified, the report does not make the connection that this age-old mutual-aid dynamic is the humanitarian sector’s own lifeline during a major disaster. Resilience has to do with a community’s capacity for self-organization. Communication technologies increasingly play a pivotal role in self-organization. This explains why disaster preparedness and disaster risk reduction programs ought to place greater emphasis on building the capacity of at-risk communities to self-organize and mitigate the impact of disasters on their livelihoods. More about this here. Creating resilience through big data is also more than an academic curiosity, as explained here.

Decentralizing Response

As more and more disaster-affected communities turn to social media in time of need, “Governments and responders will soon need answers to the questions: ‘Where were you? We Facebooked/tweeted/texted for help, why didn’t someone come?'” Again, customer support challenges are hardly unique to the humanitarian sector. Private sector companies have had to manage parallel problems by developing more advanced customer service platforms. Some have even turned to crowdsourcing to manage customer support. I blogged about this here to drive the point home that solutions to these humanitarian challenges already exist in other sectors.

Yes, that’s right, I am promoting the idea of crowdsourcing crisis response. Fact is, disaster response has always been crowdsourced. The real first responders are the disaster affected communities themselves. Thanks to new technologies, this crowdsourced response can be accelerated and made more efficient. And yes, there’s an app (in the making) for that: MatchApp. This too is a QCRI humanitarian technology project (in partnership with MIT’s Computer Science and Artificial Intelligence Lab). The purpose of MatchApp is to decentralize disaster response. After all, the many small needs that arise following a disaster rarely require the attention of paid and experienced emergency responders. Furthermore, as a colleague of mine at NYU shared based on her disaster efforts following Hurricane Sandy, “Solving little challenges can make the biggest differences” for disaster-affected communities.
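
Since MatchApp was still in the making at the time of writing, the following is only a hypothetical sketch of the core idea described above: pairing each reported need with the nearest available offer of the same kind, so that small tasks are routed to nearby community members rather than to professional responders. The locations and identifiers are invented for illustration.

```python
from math import radians, sin, cos, asin, sqrt

def km(a, b):
    """Great-circle distance between two (lat, lon) points in kilometres (haversine)."""
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(h))

needs  = [{"id": "n1", "kind": "water",   "loc": (18.54, -72.34)},
          {"id": "n2", "kind": "shelter", "loc": (18.51, -72.29)}]
offers = [{"id": "o1", "kind": "water",   "loc": (18.55, -72.33)},
          {"id": "o2", "kind": "shelter", "loc": (18.60, -72.30)},
          {"id": "o3", "kind": "water",   "loc": (18.40, -72.50)}]

# Greedy matching: each need is paired with the nearest unclaimed offer of the same kind.
claimed = set()
for need in needs:
    candidates = [o for o in offers if o["kind"] == need["kind"] and o["id"] not in claimed]
    if candidates:
        best = min(candidates, key=lambda o: km(need["loc"], o["loc"]))
        claimed.add(best["id"])
        print(need["id"], "->", best["id"], f"{km(need['loc'], best['loc']):.1f} km")
```

A production system would obviously need richer matching (capacity, availability, trust), but the point of the sketch is that routing small needs to nearby helpers is computationally simple; the hard parts are adoption and coordination.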

As noted above, more and more individuals believe that emergency responders should monitor social media during disasters and respond accordingly. This is “likely to increase the pressure on humanitarian responders to define what they can and cannot provide. The extent of communities’ desires may exceed their immediate life-saving needs, raising expectations beyond those that humanitarian responders can meet. This can have dangerous consequences. Expectation management has always been important; it will become more so in the network age.”


People-Centered

“Community early warning systems (CEWS) can buy time for people to implement plans and reach safety during a crisis. The best CEWS link to external sources of assistance and include the pre-positioning of essential supplies.” At the same time, “communities do not need to wait for information to come from outside sources, […] they can monitor local hazards and vulnerabilities themselves and then shape the response.” This sense and shaping capacity builds resilience, which explains why “international humanitarian organizations must embrace the shift of warning systems to the community level, and help Governments and communities to prepare for, react and respond to emergencies using their own resources and networks.”

This is absolutely spot on and at least 7 years old as far as UN policy goes. In 2006, the UN’s International Strategy for Disaster Risk Reduction (UNISDR) published this policy document advocating for a people-centered approach to early warning and response systems. They defined the purpose of such systems as follows:

“… to empower individuals and communities threatened by hazards to act in sufficient time and in an appropriate manner so as to reduce the possibility of personal injury, loss of life, damage to property and the environment, and loss of livelihoods.”

Unfortunately, the OCHA report does not drive these insights to their logical conclusion. Disaster-affected communities are even more ill-equipped to manage the rise of Big (Crisis) Data. Storing, let alone analyzing, Big Crisis Data in real time is a major technical challenge. As noted here vis-à-vis Big Data Analytics on Twitter, “only corporate actors and regulators—who possess both the intellectual and financial resources to succeed in this race—can afford to participate […].” Indeed, only a handful of research institutes have the technical ability and large funding base to carry out the real-time analysis of Big (Crisis) Data. My team and I at QCRI, along with colleagues at UN Global Pulse and GSMA are trying to change this. In the meantime, however, the “Big Data Divide” is already here and very real.

Information > Food

“Information is not water, food or shelter; on its own, it will not save lives. But in the list of priorities, it must come shortly after these.” While I understand the logic behind this assertion, I consider it a step back, not forward from the 2005 World Disaster Report (WDR 2005), which states that “People need information as much as water, food, medicine or shelter. Information can save lives, livelihoods and resources.” In fact, OCHA’s assertion contradicts an earlier statement in the report; namely that “information in itself is a life-saving need for people in crisis. It is as important as water, food and shelter.” Fact is: without information, how does one know where/when and from whom clean water and food might be available? How does one know which shelters are open, whether they can accommodate your family and whether the road to the shelter is safe to drive on?


OCHA writes that, “Easy access to data and analysis, through technology, can help people make better life-saving decisions for themselves and mobilize the right types of external support. This can be as simple as ensuring that people know where to go and how to get help. But to do so effectively requires a clear understanding of how information flows locally and how people make decisions.” In sum, access to information is paramount, which means that local communities should have easy access to next generation humanitarian technologies that can manage and analyze Big Crisis Data. As a seasoned humanitarian colleague recently told me, “humanitarians sometimes have a misconception that all aid and relief comes through agencies. In fact (especially with things such as shelter), people start to recover on their own or within their communities. Thus, information is vital in assuring that they do this safely and properly. Think of the Haiti build-back-better campaign and the issues with cholera outbreaks.”

Them Not Us

The technologies of the network age should not be restricted to empowering second- and third-level responders. Unfortunately, as OCHA rightly observes, “there is still a tendency for people removed from a crisis to decide what is best for the people living through that crisis.” Moreover, these paid responders cannot be everywhere at the same time. But the crowd is always there. And as OCHA points out, there are “growing groups of people willing and able to help those in need;” groups that unlike their analog counterparts of yesteryear now operate in the “network age with its increased reach of communications networks.” So information is not simply or “primarily a tool for agencies to decide how to help people, it must be understood as a product, or service, to help affected communities determine their own priorities.” Recall the above definition of people-centered early warning. This definition does not all of a sudden become obsolete in the network age. The purpose of next generation technologies is to “empower individuals and communities threatened by hazards to act in sufficient time and in an appropriate manner so as to reduce the possibility of personal injury, loss of life, damage to property and the environment, and loss of livelihoods.”


Digital humanitarian volunteers are also highly unprepared to deal with the rise of Big Crisis Data, even though they are at the frontlines and indeed the pioneers of digital response. This explains why the Standby Volunteer Task Force (SBTF), a network of digital volunteers that OCHA refers to half-a-dozen times throughout the report, are actively looking to become early adopters of next generation humanitarian technologies. Burnout is a serious issue with digital volunteers. They too require access to these next generation technologies, which is precisely why the American Red Cross equips their digital volunteers with advanced computing platforms as part of their Digital Operations Center. Unfortunately, some humanitarians still think that they can just as easily throw more (virtual) volunteers at the Big Crisis Data challenge. Not only are they terribly misguided but also insensitive, which is why, as OCHA notes, “Using new forms of data may also require empowering technical experts to overrule the decisions of their less informed superiors.” As the OCHA study concludes, “Crowdsourcing is a powerful tool, but ensuring that scarce volunteer and technical resources are properly deployed will take further research and the expansion of collaborative models, such as SBTF.”

Conclusion

So will next generation humanitarian technology solve everything? Of course not, I don’t know anyone naïve enough to make this kind of claim. (But it is a common tactic used by the ignorant to attack humanitarian innovation). I have already warned about techno-centric tendencies in the past, such as here and here (see epilogue). Furthermore, one of the principal findings from this OECD report published in 2008 is that “An external, interventionist, and state-centric approach in early warning fuels disjointed and top down responses in situations that require integrated and multilevel action.” You can throw all the advanced computing technology you want at this dysfunctional structural problem but it won’t solve a thing. The OECD thus advocates for “micro-level” responses to crises because “these kinds of responses save lives.” Preparedness is obviously central to these micro-level responses and self-organization strategies. Shockingly, however, the OCHA study reveals that, “only 3% of humanitarian aid goes to disaster prevention and preparedness,” while barely “1% of all other development assistance goes towards disaster risk reduction.” This is no way to build disaster resilience. I doubt these figures will increase substantially in the near future.

This reality makes it even more pressing to ensure that responders “listen to affected people and find ways to respond to their priorities,” which, as the report notes, “will require a mindset change.” To be sure, “If aid organizations are willing to listen, learn and encourage innovation on the front lines, they can play a critical role in building a more inclusive and more effective humanitarian system.” This need to listen and learn is why next generation humanitarian technologies are not optional. Ensuring that first, second and third-level responders have access to next generation humanitarian technologies is critical for the purposes of self-help, mutual aid and external response.


Predicting the Future of Global Geospatial Information Management

The United Nations Committee of Experts on Global Geospatial Information Management (UN-GGIM) recently organized a meeting of thought-leaders and visionaries in the geospatial world to identify the future of this space over the next 5-10 years. These experts came up with some 80+ individual predictions. I’ve included some of the more interesting ones below.

  • The use of Unmanned Aerial Vehicles (UAVs) as a tool for rapid geospatial data collection will increase.
  • 3D and even 4D geospatial information, incorporating time as the fourth dimension, will increase.
  • Technology will move faster than legal and governance structures.
  • The link between geospatial information and social media, plus other actor networks, will become more and more important.
  • Real-time info will enable more dynamic modeling & response to disasters.
  • Free and open source software will continue to grow as viable alternatives both in terms of software, and potentially in analysis and processing.
  • Geospatial computation will increasingly be non-human consumable in nature, with an increase in fully-automated decision systems.
  • Businesses and Governments will increasingly invest in tools and resources to manage Big Data. The technologies required for this will enable greater use of raw data feeds from sensors and other sources of data.
  • In ten years’ time it is likely that all smart phones will be able to film 360 degree 3D video at incredibly high resolution by today’s standards & wirelessly stream it in real time.
  • There will be a need for geospatial use governance in order to discern the real world from the virtual/modelled world in a 3D geospatial environment.
  • Free and open access to data will become the norm and geospatial information will increasingly be seen as an essential public good.
  • Funding models to ensure full data coverage even in non-profitable areas will continue to be a challenge.
  • Rapid growth will lead to confusion and lack of clarity over data ownership, distribution rights, liabilities and other aspects.
  • In ten years, there will be a clear dividing line between winning and losing nations, dependent upon whether the appropriate legal and policy frameworks have been developed that enable a location-enabled society to flourish.
  • Some governments will use geospatial technology as a means to monitor or restrict the movements and personal interactions of their citizens. Individuals in these countries may be unwilling to use LBS or applications that require location for fear of this information being shared with authorities.
  • The deployment of sensors and the broader use of geospatial data within society will force public policy and law to move into a direction to protect the interests and rights of the people.
  • Spatial literacy will not be about learning GIS in schools but will be more centered on increasing spatial awareness and an understanding of the value of understanding place as context.
  • The role of National Mapping Agencies as an authoritative supplier of high quality data and of arbitrator of other geospatial data sources will continue to be crucial.
  • Monopolies held by National Mapping Agencies in some areas of specialized spatial data will be eroded completely.
  • More activities carried out by National Mapping Agencies will be outsourced and crowdsourced.
  • Crowdsourced data will push National Mapping Agencies towards niche markets.
  • National Mapping Agencies will be required to find new business models to provide simplified licenses and meet the demands for more free data from mapping agencies.
  • The integration of crowdsourced data with government data will increase over the next 5 to 10 years.
  • Crowdsourced content will decrease cost, improve accuracy and increase availability of rich geospatial information.
  •  There will be increased combining of imagery with crowdsourced data to create datasets that could not have been created affordably on their own.
  • Progress will be made on bridging the gap between authoritative data and crowdsourced data, moving towards true collaboration.
  • There will be an accelerated take-up of Volunteer Geographic Information over the next five years.
  • Within five years the level of detail on transport systems within OpenStreetMap will exceed virtually all other data sources & will be respected/used by major organisations & governments across the globe.
  • Community-based mapping will continue to grow.
  • There is unlikely to be a market for datasets like those currently sold to power navigation and location-based services solutions in 5 years, as they will have been superseded by crowdsourced datasets from OpenStreetMap or other comparable initiatives.

Which trends have the experts missed? Do you think they’re completely off on any of the above? The full set of predictions on the future of global geospatial information management is available here as a PDF.

Does the Humanitarian Industry Have a Future in The Digital Age?

I recently had the distinct honor of being on the opening plenary of the 2012 Skoll World Forum in Oxford. The panel, “Innovation in Times of Flux: Opportunities on the Heels of Crisis” was moderated by Judith Rodin, CEO of the Rockefeller Foundation. I’ve spent the past six years creating linkages between the humanitarian space and technology community, so the conversations we began during the panel prompted me to think more deeply about innovation in the humanitarian industry. Clearly, humanitarian crises have catalyzed a number of important innovations in recent years. At the same time, however, these crises extend the cracks that ultimately reveal the inadequacies of existing organizations, particularly those resistant to change; and “any organization that is not changing is a battlefield monument” (While 1992).

These cracks, or gaps, are increasingly filled by disaster-affected communities themselves thanks in part to the rapid commercialization of communication technology. Question is: will the multi-billion dollar humanitarian industry change rapidly enough to avoid being left in the dustbin of history?

Crises often reveal that “existing routines are inadequate or even counter-productive [since] response will necessarily operate beyond the boundary of planned and resourced capabilities” (Leonard and Howitt 2007). More formally, “the ‘symmetry-breaking’ effects of disasters undermine linearly designed and centralized administrative activities” (Corbacioglu 2006). This may explain why “increasing attention is now paid to the capacity of disaster-affected communities to ‘bounce back’ or to recover with little or no external assistance following a disaster” (Manyena 2006).

But disaster-affected populations have always self-organized in times of crisis. Indeed, first responders are by definition those very communities affected by disasters. So local communities—rather than humanitarian professionals—save the most lives following a disaster (Gilbert 1998). Many of the needs arising after a disaster can often be met and responded to locally. One doesn’t need 10 years of work experience with the UN in Darfur or a Masters degree to know basic first aid or to pull a neighbor out of the rubble, for example. In fact, estimates suggest that “no more than 10% of survival in emergencies can be attributed to external sources of relief aid” (Hilhorst 2004).

This figure may be higher today since disaster-affected communities now benefit from radically wider access to information and communication technologies (ICTs). After all, a “disaster is first of all seen as a crisis in communicating within a community—that is as a difficulty for someone to get informed and to inform other people” (Gilbert 1998). This communication challenge is far less acute today because disaster-affected communities are increasingly digital, and thus more and more the primary source of information communicated following a crisis. Of course, these communities were always sources of information but being a source in an analog world is fundamentally different than being a source of information in the digital age. The difference between “read-only” versus “read-write” comes to mind as an analogy. And so, while humanitarian organizations typically faced a vacuum of information following sudden onset disasters—limited situational awareness that could only be filled by humanitarians on the ground or via established news organizations—one of the major challenges today is the Big Data produced by disaster-affected communities themselves.

Indeed, vacuums are not empty and local communities are not invisible. One could say that disaster-affected communities are joining the quantified self (QS) movement given that they are increasingly quantifying themselves. If information is power, then the shift of information sourcing and sharing from the select few—the humanitarian professionals—to the masses must also engender a shift in power. Indeed, humanitarians rarely have access to exclusive information any longer. And even though affected populations are increasingly digital, some groups believe that humanitarian organizations have largely failed at communicating with disaster-affected communities. (Naturally, there are important and noteworthy exceptions.)

So “Will Twitter Put the UN Out of Business?” (Reuters), or will humanitarian organizations cope with these radical changes by changing themselves and reshaping their role as institutions before it’s too late? Indeed, “a business that doesn’t communicate with its customers won’t stay in business very long—it’ll soon lose track of what its clients want, and clients won’t know what products or services are on offer,” whilst other actors fill the gaps (Reuters). “In the multi-billion dollar humanitarian aid industry, relief agencies are businesses and their beneficiaries are customers. Yet many agencies have muddled along for decades with scarcely a nod towards communicating with the folks they’re supposed to be serving” (Reuters).

The music and news industries were muddling along as well for decades. Today, however, they are facing tremendous pressures and are undergoing radical structural changes—none of them by choice. Of course, it would be different if affected communities were paying for humanitarian services but how much longer do humanitarian organizations have until they feel similar pressures?

Whether humanitarian organizations like it or not, disaster-affected communities will increasingly communicate their needs publicly, and many will expect a response from the humanitarian industry. This survey carried out by the American Red Cross two years ago already revealed that during a crisis the majority of the public expect a response to needs they communicate via social media. Moreover, they expect this response to materialize within an hour. Humanitarian organizations simply don’t have the capacity to deal with this surge in requests for help, nor are they organizationally structured to do so. But the fact of the matter is that humanitarian organizations have never been capable of dealing with this volume of requests in the first place. So “What Good is Crowdsourcing When Everyone Needs Help?” (Reuters). Perhaps “crowdsourcing” is finally revealing all the cracks in the system, which may not be a bad thing. Surely by now it is no longer a surprise that many people may be in need of help after a disaster, hence the importance of disaster risk reduction and preparedness.

Naturally, humanitarian organizations could very well choose to continue ignoring calls for help and decide that communicating with disaster-affected communities is simply not tenable. In the analog world of the past, the humanitarian industry was protected by the fact that their “clients” did not have a voice because they could not speak out digitally. So the cracks didn’t show. Today, “many traditional humanitarian players see crowdsourcing as an unwelcome distraction at a time when they are already overwhelmed. They worry that the noise-to-signal ratio is just too high” (Reuters). I think there’s an important disconnect here worth emphasizing. Crowdsourced information is simply user-generated content. If humanitarians are to ignore user-generated content, then they can forget about two-way communications with disaster-affected communities and drop all the rhetoric. On the other hand, “if aid agencies are to invest time and resources in handling torrents of crowdsourced information in disaster zones, they should be confident it’s worth their while” (Reuters).

This last comment is … rather problematic for several reasons (how’s that for being diplomatic?). First of all, this kind of statement continues to propel the myth that we the West are the rescuers and aid does not start until we arrive (Barrs 2006). Unfortunately, we rarely arrive: how many “neglected crises” and so-called “forgotten emergencies” have we failed to intervene in? This kind of mindset may explain why humanitarian interventions often have the “propensity to follow a paternalistic mode that can lead to a skewing of activities towards supply rather than demand” and towards informing at the expense of listening (Manyena 2006).

Secondly, the assumption that crowdsourced data would be for the exclusive purpose of the humanitarian cavalry is somewhat arrogant and ignores the reality that local communities are by definition the first responders in a crisis. Disaster-affected communities (and Diasporas) are already collecting (and yes crowdsourcing) information to create their own crisis maps in times of need as a forthcoming report shows. And they’ll keep doing this whether or not humanitarian organizations approve or leverage that information. As my colleague Tim McNamara has noted “Crisis mapping is not simply a technological shift, it is also a process of rapid decentralization of power. With extremely low barriers to entry, many new entrants are appearing in the fields of emergency and disaster response. They are ignoring the traditional hierarchies, because the new entrants perceive that there is something that they can do which benefits others.”

Thirdly, humanitarian organizations are far more open to using free and open source software than they were just two years ago. So the resources required to monitor and map crowdsourced information need not break the bank. Indeed, the Syria Crisis Map uses a free and open source data-mining platform called HealthMap, which has been monitoring some 2,000 English-language sources on a daily basis for months. The technology powering the map itself, Ushahidi, is also free and open source. Moreover, the team behind the project comprises just a handful of volunteers doing this in their own free time (for almost an entire year now). And as a result of this initiative, I am collaborating with a colleague from UNDP to pilot HealthMap’s data mining feature for conflict monitoring and peacebuilding purposes.

Fourth, other than UN Global Pulse, humanitarian agencies are not investing time and resources to manage Big (Crisis) Data. Why? Because they have neither the time nor the know-how. To this end, they are starting to “outsource” and indeed “crowdsource” these tasks—just as private sector businesses have been doing for years in order to extend their reach. Anyone actually familiar with this space and developments since Haiti already knows this. The CrisisMappers Network, Standby Volunteer Task Force (SBTF), Humanitarian OpenStreetMap (HOT) and Crisis Commons (CC) are four volunteer/technical networks that have already collaborated actively with a number of humanitarian organizations since Haiti to provide the “surge capacity” requested by the latter; this includes UN OCHA in Libya and Colombia, UNHCR in Somalia and WHO in Libya, to name a few. In fact, these groups even have their own acronym: Volunteer & Technical Communities (V&TCs).

As the former head of OCHA’s Information Services Section (ISS) noted after the SBTF launched the Libya Crisis Map, “Your efforts at tackling a difficult problem have definitely reduced the information overload; sorting through the multitude of signals on the crisis is not easy task” (March 8, 2011). Furthermore, the crowdsourced social media information mapped on the Libya Crisis Map was integrated into official UN OCHA information products. I dare say activating the SBTF was worth OCHA’s while. And it cost the UN a grand total of $0 to benefit from this support.


The rapid rise of V&TCs has catalyzed the launch of the Digital Humanitarian Network (DHN), formerly called the Humanitarian Standby Task Force (H-SBTF). Digital Humanitarians is a network-of-networks catalyzed by the UN and comprising some of the most active members of the volunteer & technical community. The purpose of the Digital Humanitarian platform (powered by Ning) is to provide a dedicated interface for traditional humanitarian organizations to outsource and crowdsource important information management tasks during and in-between crises. OCHA has also launched the Communities of Interest (COIs) platform to further leverage volunteer engagement in other areas of humanitarian response.

These are not isolated efforts. During the massive Russian fires of 2010, volunteers launched their own citizen-based disaster response agency that was seen by many as more visible and effective than the Kremlin’s response. In Egypt, volunteers used IntaFeen.com to crowdsource and coordinate their own humanitarian convoys to Libya, for example. LinkedIn has also taken innovative steps to enable the matching of volunteers with various needs. It recently added a “Volunteer and Causes” field to its member profile pages, which is now available to 150 million LinkedIn users worldwide. Sparked.com is yet another group engaged in matching volunteers with needs. The company is the world’s first micro-volunteering network, sending challenges to registered volunteers that are targeted to their skill set and the causes they are most passionate about.

It is not farfetched to envisage how these technologies could be repurposed or simply applied to facilitate and streamline volunteer management following a disaster. Indeed, researchers at the University of Queensland in Australia have already developed a new smartphone app to help mobilize and coordinate volunteer efforts during and following major disasters. The app not only provides information on preparedness but also gives real-time updates on volunteering opportunities by local area. For example, volunteers can register for a variety of tasks including community response to extreme weather events.

Meanwhile, the American Red Cross just launched a Digital Operations Center in partnership with Dell Labs, which allows them to leverage digital volunteers and Dell’s social media monitoring platforms to reduce the noise-to-signal ratio. This is a novel “social media-based operation devoted to humanitarian relief, demonstrating the growing importance of social media in emergency situations.” As part of this center, the Red Cross also “announced a Digital Volunteer program to help respond to questions from and provide information to the public during disasters.”

While important challenges do exist, there are many positive externalities to leveraging digital volunteers. As the deputy high commissioner of UNHCR noted about this UNHCR-volunteer project in Somalia, these types of projects create more citizen engagement and raise awareness of humanitarian organizations and their work. This in part explains why UNHCR wants more, not less, engagement with digital volunteers. Indeed, these volunteers also develop important skills that will be increasingly sought after by humanitarian organizations recruiting for junior full-time positions. Humanitarian organizations are likely to become smarter and more up to speed on humanitarian technologies and digital humanitarian skills as a result. This change should be embraced.

So given the rise of “self-quantified” disaster-affected communities and digitally empowered volunteer communities, is there a future for traditional humanitarian organizations? Of course there is; anyone who suggests otherwise is seriously misguided and out of touch with innovation in the humanitarian space. Twitter will not put the UN out of business. Humanitarian organizations will continue to play some very important roles, especially those relating to logistics and coordination. These organizations will continue outsourcing some roles but will also take on some new roles. The issue here is simply one of comparative advantage. Humanitarian organizations used to have a comparative advantage in some areas, but this has shifted for all the reasons described above. So outsourcing in some cases makes perfect sense.

Interestingly, organizations like UN OCHA are also changing some of their own internal information management processes as a result of their collaboration with volunteer networks like the SBTF, which they expect will lead to a number of efficiency gains. Furthermore, OCHA is behind the Digital Humanitarians initiative and has also been developing a check-in app for humanitarian professionals to use in disaster response—clear signs of innovation and change. Meanwhile, the UK’s Department for International Development (DfID) has just launched a $75+ million fund to leverage new technologies in support of humanitarian response; this includes mobile phones, satellite imagery, Twitter as well as other social media technologies, digital mapping and gaming technologies. Given that crisis mapping integrates these new technologies and has been at the cutting edge of innovation in the humanitarian space, I’ve invited DfID to participate in this year’s International Conference on Crisis Mapping (ICCM 2012).

In conclusion, and as argued two years ago, the humanitarian industry is shifting towards a more multi-polar system. The rise of new actors, from digitally empowered disaster-affected communities to digital volunteer networks, has been driven by the rapid commercialization of communication technology—particularly the mobile phone and social networking platforms. These trends are unlikely to change soon and crises will continue to spur innovations in this space. This does not mean that traditional humanitarian organizations are becoming obsolete. Their roles are simply changing and this change is proof that they are not battlefield monuments. Of course, only time will tell whether they change fast enough.

Detecting Emerging Conflicts with Web Mining and Crisis Mapping

My colleague Christopher Ahlberg, CEO of Recorded Future, recently got in touch to share some exciting news. We had discussed our shared interests a while back at Harvard University. It was clear then that his ideas and existing technologies were very closely aligned to those we were pursuing with Ushahidi’s Swift River platform. I’m thrilled that he has been able to accomplish a lot since we last spoke. His exciting update is captured in this excellent co-authored study entitled “Detecting Emergent Conflicts Through Web Mining and Visualization” which is available here as a PDF.

The study combines almost all of my core interests: crisis mapping, conflict early warning, conflict analysis, digital activism, pattern recognition, natural language processing, machine learning, data visualization, etc. The study describes a semi-automatic system that collects information from pre-specified sources and then applies linguistic analysis to extract user-specified events and entities, i.e., structured data for quantitative analysis.

Natural Language Processing (NLP) and event-data extraction applied to crisis monitoring and analysis is of course nothing new. Back in 2004-2005, I worked for a company that was at the cutting edge of this field vis-a-vis conflict early warning. (The company subsequently joined the Integrated Crisis Early Warning System (ICEWS) consortium supported by DARPA.) Just a year later, Larry Brilliant told TED 2006 how the Global Public Health Information Network (GPHIN) had leveraged NLP and machine learning to detect an outbreak of SARS three months before the WHO. I blogged about this, Global Incident Map, European Media Monitor (EMM), Havaria, HealthMap and Crimson Hexagon back in 2008. Most recently, my colleague Kalev Leetaru showed how applying NLP to historical data could have predicted the Arab Spring. Each of these initiatives represents an important effort in leveraging NLP and machine learning for early detection of events of interest.

The RecordedFuture system works as follows. A user first selects a set of data sources (websites, RSS feeds, etc.) and determines the rate at which to update the data. Next, the user chooses one or several existing “extractors” to find specific entities and events (or constructs a new type). Then a taxonomy is selected to specify exactly how the data is to be grouped. The data is then automatically harvested and passed through a linguistic analyzer, which extracts useful information such as event types, names, dates and places. Finally, the reports are clustered and visualized on a crisis map, in this case using an Ushahidi platform. This allows all kinds of other datasets to be imported, compared and analyzed, such as high-resolution satellite imagery and crowdsourced data.
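To make that workflow concrete, here is a minimal sketch of the harvest-extract-map loop in Python. The feed URL, the keyword-based “extractor” and the post_to_crisis_map() endpoint are illustrative stand-ins of my own; RecordedFuture’s linguistic analysis and Ushahidi’s reporting API are of course far richer than this.

```python
# A minimal sketch of the harvest -> extract -> map pipeline described above.
# The source list, the keyword "extractor" and post_to_crisis_map() are
# illustrative stand-ins, not RecordedFuture's or Ushahidi's actual APIs.
import feedparser   # pip install feedparser
import requests

SOURCES = ["https://example.com/news.rss"]      # user-selected sources (hypothetical)
PROTEST_KEYWORDS = {"protest", "demonstration", "rally", "strike"}

def extract_events(entry):
    """Crude placeholder for the linguistic-analysis step: flag an item as a
    protest event if its title or summary mentions a protest-related keyword."""
    text = f"{entry.get('title', '')} {entry.get('summary', '')}".lower()
    if any(kw in text for kw in PROTEST_KEYWORDS):
        return [{"type": "protest",
                 "title": entry.get("title", ""),
                 "published": entry.get("published", ""),
                 "link": entry.get("link", "")}]
    return []

def post_to_crisis_map(event):
    """Hypothetical reporting endpoint; a real deployment would call whatever
    report-submission API its crisis-mapping platform exposes."""
    requests.post("https://crisismap.example.org/api/reports", json=event, timeout=10)

def run_once():
    for url in SOURCES:                              # harvesting
        for entry in feedparser.parse(url).entries:
            for event in extract_events(entry):      # "linguistic analysis" (placeholder)
                post_to_crisis_map(event)            # clustering/visualization would follow

if __name__ == "__main__":
    run_once()
```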

A key feature of the RecordedFuture system is that it extracts and estimates the time of the event described rather than the publication time of the newspaper article being parsed. As such, the harvested data can include both historic and future events.
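As a toy illustration of that distinction, the snippet below prefers a date mentioned in the article text over the publication timestamp. The dateparser library and the simple first-match fallback are my own simplification, not the system’s actual temporal analysis.

```python
# Prefer the event time mentioned in the text over the article's publication
# time; dateparser.search_dates does the heavy lifting and only approximates
# RecordedFuture's temporal analysis.
from dateparser.search import search_dates   # pip install dateparser

def event_time(text, published):
    """Return the first date mentioned in the text, else the publication time."""
    hits = search_dates(text, languages=["en"])
    return hits[0][1] if hits else published

# e.g. event_time("Protests are planned for 14 March in Cairo", published)
# yields 14 March even if the article ran weeks earlier.
```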

In sum, the RecordedFuture system is composed of the following five features as described in the study:

1. Harvesting: a process in which text documents are retrieved from various sources and stored in the database. The documents are stored long-term if permitted by terms of use and IPR legislation; otherwise they are only stored temporarily for the needed analysis.

2. Linguistic analysis: the process in which the retrieved texts are analyzed in order to extract entities, events, time and location, etc. In contrast to other components, the linguistic analysis is language dependent.

3. Refinement: additional information can be obtained in this process by synonym detection, ontology analysis, and sentiment analysis.

4. Data analysis: application of statistical and AI-based models such as Hidden Markov Models (HMMs) and Artificial Neural Networks (ANNs) to generate predictions about the future and detect anomalies in the data (a much simpler statistical stand-in is sketched just after this list).

5. User experience: a web interface for ordinary users to interact with, and an API for interfacing to other systems.
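The data analysis step (feature 4) is the part most readers will want to picture. The fragment below is a deliberately simple statistical stand-in for the HMM/ANN models mentioned in the study: it just flags days whose event count spikes well above the trailing average, which is enough to convey the anomaly-detection idea.

```python
# A much simpler statistical stand-in for the study's HMM/ANN-based data
# analysis: flag days whose event count sits far above the recent mean.
from statistics import mean, stdev

def detect_spikes(daily_counts, window=14, threshold=3.0):
    """Return indices of days whose count exceeds the trailing mean by more
    than `threshold` standard deviations (a basic z-score rule)."""
    spikes = []
    for i in range(window, len(daily_counts)):
        history = daily_counts[i - window:i]
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and (daily_counts[i] - mu) / sigma > threshold:
            spikes.append(i)
    return spikes

# e.g. detect_spikes(counts) on daily protest counts would flag a sudden
# surge relative to the preceding two weeks.
```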

The authors ran a pilot that “manually” integrated the RecordedFuture system with the Ushahidi platform. The result is depicted in the figure below. In the future, the authors plan to automate the creation of reports on the Ushahidi platform via the RecordedFuture system. Intriguingly, the authors chose to focus on protest events to demo their Ushahidi-coupled system. Why is this intriguing? Because my dissertation analyzed whether access to new information and communication technologies (ICTs) is a statistically significant predictor of protest events in repressive states. Moreover, the protest data I used in my econometric analysis came from an automated NLP algorithm that parsed Reuters Newswires.

Using RecordedFuture, the authors extracted some 6,000 protest event records for the first quarter of 2011. These events were identified and harvested using a “trained protest extractor” constructed with the system’s event-extractor framework. Note that many of the 6,000 records are duplicates: they describe the same events but are reported by different sources. Not surprisingly, Christopher and team plan to develop a duplicate-detection algorithm that will also double as a triangulation and veracity-scoring feature. I would be particularly interested to see them do this kind of triangulation and validation of crowdsourced data on the fly.
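The study does not spell out how that duplicate detection will work, but a first pass could look something like the sketch below: merge records that share a date and location and whose titles overlap heavily, counting distinct sources as a crude triangulation score. The record fields and the 0.6 similarity cut-off are assumptions for illustration, not the authors’ planned algorithm.

```python
# One simple way a duplicate-detection pass could work: treat two protest
# records as the same event when they fall on the same day, in the same
# place, and their titles share most of their words.
def _tokens(text):
    return set(text.lower().split())

def jaccard(a, b):
    ta, tb = _tokens(a), _tokens(b)
    return len(ta & tb) / len(ta | tb) if ta | tb else 0.0

def deduplicate(events, sim_threshold=0.6):
    """Merge near-duplicate records and count distinct sources as a crude
    triangulation/veracity score. Each event is assumed to be a dict with
    "date", "location", "title" and "source" keys (hypothetical schema)."""
    merged = []
    for ev in events:
        for kept in merged:
            if (ev["date"], ev["location"]) == (kept["date"], kept["location"]) \
                    and jaccard(ev["title"], kept["title"]) >= sim_threshold:
                kept["sources"].add(ev["source"])
                break
        else:
            merged.append({**ev, "sources": {ev["source"]}})
    return merged
```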

Below are the protest events picked up by RecordedFuture for both Tunisia and Egypt. From these two figures, it is possible to see how the Tunisian protests preceded those in Egypt.

The authors argue that if the platform had been set up earlier this year, a user would have seen the sudden rise in the number of protests in Egypt. However, the authors acknowledge that their data is a function of media interest and attention—the same issue I had with my dissertation. One way to overcome this challenge might be to complement the harvested reports with crowdsourced data from social media and Crowdmap.

In the future, the authors plan to have the system auto-detect major changes in trends and to add support for the analysis of media in languages beyond English. They also plan to test the reliability and accuracy of their conflict early warning algorithm by comparing their forecasts of historical data with existing conflict data sets. I have several ideas of my own about next steps and look forward to speaking with Christopher’s team about ways to collaborate.