Tag Archives: Disaster Response

JRC: Geo-Spatial Analysis for Global Security

The European Commission’s Joint Research Centre (JRC) is doing some phenomenal work on Geo-Spatial Information Analysis for Global Security and Stability. I’ve had several meetings with JRC colleagues over the years and have always been very impressed with their projects.

The group is not very well known outside Europe, so the purpose of this blog post is to highlight some of the Centre’s projects.

  • Enumeration of Refugee Camps: The project developed an operational methodology to estimate refugee populations using very high resolution (VHR) satellite imagery. “The methodology relies on a combination of machine-assisted procedures, photo-interpretation and statistical sampling.” (A rough sketch of the sampling arithmetic follows this list.)

  • Benchmarking Hand Held Equipment for Field Data Collection: This project tested new devices for the collection of geo-referenced information. “The assessment of the instruments considered their technical characteristics, like the availability of necessary instruments or functionalities, technical features, hardware specifics, software compatibility and interfaces.”

  • GEOCREW – Study on Geodata and Crisis Early Warning: This project analyzed the use of geo-spatial technology in the decision-making process of institutions dealing with international crises. The project also aimed to show best practice in the use of geo-spatial technologies in the decision-making process.
  • Support to Peacekeeping Operations in the Sudan: Maps are generally unavailable or out of date for most of the conflict areas in which peacekeeping personnel are deployed. This UNDPKO Darfur mapping initiative aimed to create an alliance of partners to address this gap and share the results.

  • Temporary Settlement Analysis by Remote Sensing: The project analyzes different types of refugee and IDP settlements to identify single structures inside refugee settlements. “The objective of the project is to establish the first comprehensive catalog of image interpretation keys, based on last-generation satellite data and related to the analysis of transitional settlements.”
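
To make the enumeration methodology above more concrete, here is a minimal sketch of the statistical sampling step, with entirely hypothetical tile counts and an assumed average occupancy per shelter; the actual JRC procedure is considerably more sophisticated:

```python
import random
import statistics

# Hypothetical example: a camp imaged as 400 VHR tiles; analysts
# photo-interpret a random sample of 40 tiles and count shelters in each.
random.seed(42)
shelter_counts = [random.randint(0, 30) for _ in range(400)]  # unknown in reality

sampled = random.sample(shelter_counts, 40)   # the 40 tiles actually interpreted
mean_per_tile = statistics.mean(sampled)
stdev_per_tile = statistics.stdev(sampled)

total_tiles = 400
persons_per_shelter = 5.0                     # assumed average occupancy

est_population = mean_per_tile * total_tiles * persons_per_shelter

# Rough 95% interval on the total (simple random sampling, no
# finite-population correction).
se_total = total_tiles * stdev_per_tile / (40 ** 0.5)
print(f"Estimated population: {est_population:,.0f} "
      f"(+/- {1.96 * se_total * persons_per_shelter:,.0f})")
```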

JRC colleagues often publish papers on their work, and I highly recommend having a look at their forthcoming book when it comes out in June 2009.

Patrick Philippe Meier

Video Introduction to Crisis Mapping

I’ve given many presentations on crisis mapping over the past two years, but these were never filmed. So I decided to create this video presentation with narration in order to share my findings more widely and hopefully get a lot of feedback in the process. The presentation is not meant to be exhaustive, although the video does run to about 30 minutes.

The topics covered in this presentation include:

  • Crisis Map Sourcing – information collection;
  • Mobile Crisis Mapping – mobile technology;
  • Crisis Mapping Visualization – data visualization;
  • Crisis Mapping Analysis – spatial analysis.

The presentation references several blog posts of mine in addition to several operational projects to illustrate the main concepts behind crisis mapping. The individual blog posts featured in the presentation are listed below:

This research is the product of a 2-year grant provided by Humanity United (HU) to the Harvard Humanitarian Initiative’s (HHI) Program on Crisis Mapping and Early Warning, where I am a doctoral fellow.

I look forward to any questions/suggestions you may have on the video primer!

Patrick Philippe Meier

Internews, Ushahidi and Communication in Crises

I had the pleasure of participating in two Internews sponsored meetings in New York today. Fellow participants included OCHA, Oxfam, Red Cross, Save the Children, World Vision, BBC World Service Trust, Thomson Reuters Foundation, Humanitarian Media Foundation, International Media Support and several others.

The first meeting was a three-hour brainstorming session on “Improving Humanitarian Information for Affected Communities” organized in preparation for the second meeting on “The Unmet Need for Communication in Humanitarian Response,” which was held at the UN General Assembly.

The meetings presented an ideal opportunity for participants to share information on current initiatives that focus on communications with crisis-affected populations. Ushahidi naturally came to mind, so I introduced the concept of crowdsourcing crisis information. I should have expected the immediate pushback on the issue of data validation.

Crowdsourcing and Data Validation

While I have already blogged about overcoming some of the challenges of data validation in the context of crowdsourcing here, there is clearly more to add, since the demand for “fully accurate information” a.k.a. “facts and only facts” was echoed during the second meeting in the General Assembly. I’m hoping this blog post will help move the discourse beyond the black-and-white terms that characterize current discussions of data accuracy.

Having worked in the field of conflict early warning and rapid response for the past seven years, I fully understand the critical importance of accurate information. Indeed, a substantial component of my consulting work on CEWARN in the Horn of Africa specifically focused on the data validation process.

To be sure, no one in the humanitarian and human rights community is asking for inaccurate information. We all subscribe to the notion of “Do No Harm.”

Does Time Matter?

What was completely missing from today’s meetings, however, was any reference to time. Nobody noted the importance of timely information during crises, which is rather ironic since both meetings focused on sudden-onset emergencies. I suspect that our demand for (and partly Western obsession with) fully accurate information has clouded some of our thinking on this issue.

This is particularly ironic given that evidence-based policy-making and data-driven analysis are still the exception rather than the rule in the humanitarian community. Field-based organizations frequently make decisions on coordination, humanitarian relief and logistics without complete and fully accurate, real-time information, especially right after a crisis strikes.

So why is this same community holding crowdsourcing to a higher standard?

Time versus Accuracy

Timely information when a crisis strikes is a critical element for many of us in the humanitarian and human rights communities. Surely then we must recognize the tradeoff between accuracy and timeliness of information. Crisis information is perishable!

The more we demand fully accurate information, the longer the data validation process typically takes, and thus the more likely the information is to become useless. Our public health colleagues who work in emergency medicine know this only too well.

The figure below represents the perishable nature of crisis information. Data validation makes sense during time-periods A and B. Continuing to carry out data validation beyond time B may be beneficial to us, but hardly to crisis affected communities. We may very well have the luxury of time. Not so for at-risk communities.

[Figure: the relevance of crisis information decays over time; data validation adds value during time-periods A and B and little thereafter.]
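
To make the tradeoff concrete, here is a toy model of the figure’s logic. This is my own illustration with made-up validation and decay rates, not an operational formula: the accuracy gained by validation saturates over time while the relevance of the information decays, so the usefulness of acting on a report peaks early and then falls.

```python
import math

def accuracy(t, k=0.5):
    """Share of reports correctly validated after t hours (saturates)."""
    return 1 - math.exp(-k * t)

def relevance(t, half_life=12.0):
    """Usefulness of crisis information, decaying with a 12-hour half-life."""
    return 0.5 ** (t / half_life)

def usefulness(t):
    return accuracy(t) * relevance(t)

# Usefulness peaks early and then falls: validating past that point
# improves accuracy but no longer helps the affected community.
best = max(range(73), key=usefulness)
for t in sorted({1, 6, best, 24, 48, 72}):
    print(f"t={t:2d}h  accuracy={accuracy(t):.2f}  "
          f"relevance={relevance(t):.2f}  usefulness={usefulness(t):.2f}")
```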

This point often gets overlooked when anxieties about inaccurate information surface. Of course we need to ensure that the information we produce or relay is as accurate as possible. Of course we want to prevent dangerous rumors from spreading. To this end, the Thomson Reuters Foundation clearly spelled out that their new Emergency Information Service (EIS) would only focus on disseminating facts and only facts. (See my previous post on EIS here.)

Yes, we can focus all our efforts on disseminating facts, but are facts communicated after time-period B above really useful to crisis-affected communities? (Incidentally, since EIS will be based on verifiable facts, their approach may well be likened to Wikipedia’s rules for corrective editing. In any event, I wonder how EIS might define the term “fact.”)

Why Ushahidi?

Ushahidi was created within days of the Kenyan elections in 2007 because both the government and the national media were seriously under-reporting widespread human rights violations. I was in Nairobi visiting my parents at the time, and it was also frustrating to see the majority of international and national NGOs on the ground suffering from “data hugging disorder,” i.e., they had no interest whatsoever in sharing information with each other, or with the public for that matter.

This left the Ushahidi team with few options, which is why they decided to develop a transparent platform that would allow Kenyans to report directly, thereby circumventing the government, media and NGOs, who were working against transparency.

Note that the Ushahidi team consists entirely of tech experts. Here’s a question: why didn’t the human rights or humanitarian community set up a platform like Ushahidi? Why were a few tech-savvy Kenyans without a humanitarian background able to set up and deploy the platform within a week, and not the humanitarian community? Where were we? Shouldn’t we be the ones pushing for better information collection and sharing?

In a recent study for the Harvard Humanitarian Initiative (HHI), I mapped and time-stamped reports of the post-election violence from the mainstream media, citizen journalists and Ushahidi. I then created a Google Earth layer of this data and animated the reports over time and space. I recommend reading the conclusions.
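
For readers who would like to build a similar time-animated layer, here is a minimal sketch of how time-stamped reports can be written out as KML so that Google Earth’s time slider animates them. The three reports and the output file name are hypothetical:

```python
# Hypothetical reports: (name, longitude, latitude, date); KML wants lon,lat.
reports = [
    ("Report A", 36.82, -1.29, "2008-01-01"),
    ("Report B", 35.28,  0.52, "2008-01-03"),
    ("Report C", 34.75, -0.10, "2008-01-07"),
]

placemarks = "".join(
    "  <Placemark>\n"
    f"    <name>{name}</name>\n"
    f"    <TimeStamp><when>{when}</when></TimeStamp>\n"
    f"    <Point><coordinates>{lon},{lat},0</coordinates></Point>\n"
    "  </Placemark>\n"
    for name, lon, lat, when in reports
)

kml = ('<?xml version="1.0" encoding="UTF-8"?>\n'
       '<kml xmlns="http://www.opengis.net/kml/2.2">\n'
       f"<Document>\n{placemarks}</Document>\n</kml>\n")

with open("post_election_reports.kml", "w") as f:
    f.write(kml)  # open in Google Earth and drag the time slider
```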

Accuracy is a Luxury

Having worked in humanitarian settings, we all know that accuracy is more often a luxury than a reality, particularly right after a crisis strikes. Accuracy is not black and white, yes or no. Rather, we need to start thinking in terms of likelihood, i.e., how likely is this piece of information to be accurate? All of us already do this every day, albeit subjectively. Why not think of ways to complement or triangulate our personal subjectivities to determine the accuracy of information?

At CEWARN, we included a “Source of Information” field for each incident report. A field reporter could select from three choices: (1) direct observation, (2) media and (3) rumor. This gave us a three-point weighted scale that could be used in subsequent analysis.
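
As a rough illustration of how such a scale can feed into analysis, the sketch below discounts incident counts by source type; the numeric weights are mine for illustration, not CEWARN’s actual values:

```python
# Weight incident reports by how the information was sourced. The numeric
# weights are illustrative only, not CEWARN's actual values.
SOURCE_WEIGHTS = {"direct observation": 1.0, "media": 0.6, "rumor": 0.3}

def weighted_incident_count(reports):
    """Sum reports of an event, discounted by source reliability."""
    return sum(SOURCE_WEIGHTS[r["source"]] for r in reports)

reports = [
    {"event": "cattle raid", "source": "direct observation"},
    {"event": "cattle raid", "source": "media"},
    {"event": "cattle raid", "source": "rumor"},
]
print(weighted_incident_count(reports))  # 1.9 rather than a raw count of 3
```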

At Ushahidi, we are working on Swift River, a platform that applies human crowdsourcing and machine analysis (natural language parsing) to filter crisis information produced in real time, i.e., during time-periods A and B above. Colleagues at WikiMapAid are developing similar solutions for data on disease outbreaks. See my recent post on WikiMapAid and data validation here.
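
Swift River is still under development, but the underlying idea can be sketched roughly as follows: combine a machine-derived language score with the number of independent crowd confirmations, and only surface reports that clear a threshold. The scoring formula, weights and threshold below are my own simplification, not Swift River’s actual design:

```python
def credibility(nlp_score, confirmations, trusted_source=False):
    """Toy credibility score in [0, 1].

    nlp_score: 0-1 confidence from machine language analysis.
    confirmations: independent reports describing the same event.
    """
    crowd = 1 - 0.5 ** confirmations          # saturates with more reports
    score = 0.4 * nlp_score + 0.6 * crowd
    return min(1.0, score + (0.2 if trusted_source else 0.0))

incoming = [
    ("roadblock reported on highway", 0.8, 4, False),
    ("unconfirmed shooting",          0.5, 1, False),
    ("camp flooding",                 0.7, 2, True),
]
for text, nlp, n, trusted in incoming:
    c = credibility(nlp, n, trusted)
    print(f"{'PASS' if c >= 0.6 else 'hold'}  {c:.2f}  {text}")
```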

Conclusion

In sum, there are various ways to rate the likelihood that a reported event is true. But again, we are not looking to develop a platform that ensures 100% reliability. If full accuracy were the gold standard of humanitarian response (or military action, for that matter), the entire enterprise would come to a grinding halt. The intelligence community has also recognized this, as I have blogged about here.

The purpose of today’s meetings was for us to think more concretely about communication in crises from the perspective of at-risk communities. Yet as soon as I mentioned crowdsourcing, the discussion turned to our own demand for fully accurate information, with no concern raised about the importance of timely information for crisis-affected communities.

Ironic, isn’t it?

Patrick Philippe Meier

GIS and GPS for Dangerous Environments

A colleague of mine recently pointed me to SAIC’s IKE 504, a GIS-integrated, encrypted GPS targeting and data-capture device. IKE captures the GPS coordinates and other geospatial data for any target from a safe distance (up to 1,000 meters) and provides a verifiable digital image of the target. As such, IKE can be used for specialized mapping in dangerous environments.
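
The geometry behind this kind of device is straightforward: given the observer’s GPS fix, a laser-measured range and a compass bearing, the target’s coordinates follow from the standard destination-point formula on a sphere. Here is a minimal sketch of that calculation (my own, not SAIC’s implementation; the observer fix is hypothetical):

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius

def target_position(lat_deg, lon_deg, bearing_deg, range_m):
    """Project a target's lat/lon from an observer fix, bearing and range."""
    lat1, lon1 = math.radians(lat_deg), math.radians(lon_deg)
    theta = math.radians(bearing_deg)
    d = range_m / EARTH_RADIUS_M  # angular distance

    lat2 = math.asin(math.sin(lat1) * math.cos(d) +
                     math.cos(lat1) * math.sin(d) * math.cos(theta))
    lon2 = lon1 + math.atan2(math.sin(theta) * math.sin(d) * math.cos(lat1),
                             math.cos(d) - math.sin(lat1) * math.sin(lat2))
    return math.degrees(lat2), math.degrees(lon2)

# Observer at a hypothetical fix; target 1,000 m away on a bearing of 45 deg.
print(target_position(9.55, 27.99, 45.0, 1000.0))
```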

Patrick Philippe Meier

Web4Dev: Innovation Track Day 2

The second day of the Innovation Track at Web4Dev focused on monitoring and evaluation. Robert Kirkpatrick from InSTEDD, Erik Hersman from Ushahidi and Christopher Strebel from UNESCO each gave a presentation.

Robert introduced InSTEDD’s Mesh4X and GeoChat, which I’ve already blogged about here, so I won’t expand on them. But Robert also introduced a new project I was not aware of called Evolve. The tool helps synthesize data into actionable information and lets teams collaborate around diverse data streams to detect, analyze, triage and track critical events as they unfold.

Erik introduced Ushahidi and described our increasing capacity to crowdsource eyewitness crisis data. However, the challenge is increasingly how to consume and make sense of the incoming data stream. There were thousands of Tweets per minute during the Mumbai attacks. Ushahidi is working on Swift River to explore ways to use crowdsourcing as a filter for data validation.

Christopher Strebel introduced GigaPan, a robotic camera system that captures gigapixel images. The technology was developed for the Mars Rover program to take very high resolution images of Mars. UNESCO is introducing it for education purposes. I’m not sure I’m entirely convinced by this project, not just because the camera costs $300-$400 but because I don’t see what such a sophisticated tool adds over regular cameras in terms of education and participation.

In any case, while I found all three presentations interesting, none of them actually addressed the second topic of today’s workshop, namely evaluation. I spent most of December and January working with a team of monitoring and evaluation (M&E) experts to develop a framework for a multi-year project in Liberia. I can conclude from this experience that those of us who don’t have expertise in M&E have a huge amount to learn. Developing serious M&E frameworks is a rigorous process.

Patrick Philippe Meier

Towards an Emergency News Service?

The 2005 World Disasters Report stated that “information is a vital form of aid in itself [since] disaster affected people need information as much as water, food, medicine or shelter. Information can save lives, livelihoods and resources.”

As we know only too well, information is often one of the first casualties in crises: crippled communications and shattered transportation links present significant obstacles. Communication with beneficiaries is rarely high on the priority lists of many relief responders.

The Thomson Reuters Foundation is therefore proposing to tackle this problem with the creation of an Emergency Information Service (EIS). The concept is simple:

Deploy highly mobile reporting teams to disaster zones to disseminate fast, reliable information to affected populations. The EIS will untangle the often chaotic information flows from governments, international agencies and domestic aid players, producing trustworthy material in local languages for distribution by domestic media, cell phone networks and other methods appropriate to circumstances.

Thomson Reuters wants to send out teams of specialist reporters to cover unfolding disaster zones and channel vital information directly to affected communities. The teams would also interface with governments, the military, the United Nations, international NGOs and local charities.

I can see how this would address some of the current shortcomings, but I’m not convinced about sending in teams of reporters. For one, how will EIS deal with governments that refuse entry into their disaster-affected regions? We need a less “egocentric” approach, one that seeks “the proper balance between the need for external assistance and the capacity of local people to deal with the situation” (Cardona 2004).

EIS’s reporting strategy appears to replicate the top-down, external approach to humanitarian response instead of empowering local at-risk communities directly so they can conduct their own reporting and communicating. For example, why not include a strategy to improve and expand citizen journalism in crisis zones?

In any case, I think EIS’s dissemination strategy includes some good ideas. For example:

  • Remote information terminals: These are low-cost computer terminals to be distributed en masse to affected villages, local NGOs, media outlets etc. Terminals will be wind-up or solar-powered laptops capable of being connected to mobile or satellite phones. With minimal training, local people can set up the terminals and use them to gain critical information about relief efforts or trace relatives.
  • Mobile phone distributions: In many crisis situations, SMS messaging is possible even when other communications are destroyed or overloaded. The distribution of thousands of low-cost handsets to community leaders, NGO volunteers and members of the local media could create a “bush telegraph” effect and allow two-way interaction with beneficiaries.
  • Recorded information bulletins: Mobile phone users will be able to dial in to regularly updated, local-language bulletins giving the latest information on health, shelter, government response and so on.
  • Zero-tech solutions: Megaphones, posters, leaflet drops, bulletin boards, community newsletters.

For this approach to catch on, EIS should be set up to provide services during non-disaster times as well.

Trust.org

In terms of next steps:

“The Thomson Reuters Foundation will soon launch a new website – www.trust.org – to serve as a gateway for all its activities. The Emergency News Agency and AlertNet will be core components of the new site. Trust.org will provide a single point of access for aid professionals, journalists, pro bono service providers, donors and members of the public. Telecommunications allowing, it could also serve as a powerful resource centre for communities affected by disasters.”

For more information on the Emergency News Service, please see this thinkpaper (PDF).

Patrick Philippe Meier

3D Crisis Mapping for Disaster Simulation Training

I recently had a fascinating meeting in Seattle with Larry Pixa, Senior Manager of Microsoft’s Disaster Preparedness & Response Program. What I thought would be a half-hour meeting turned into an engaging two-hour conversation. I wish we had had even more time.

Harvard Humanitarian Initiative (HHI) co-Chair Dr. Jennifer Leaning and I had a conversation two years ago about the need to merge disaster training with serious gaming and 3-D crisis mapping. Her vision, for example, was to recreate downtown Monrovia as a virtual world with avatars and have both live and manual data feeds drive the virtual environment, a.k.a. immersive realism meets reality mining.

This world would then be used to create scenarios for disaster preparedness and response training, much like my colleagues at ICNC have done by developing a serious game called “A Force More Powerful,” which uses artificial intelligence and real-world scenarios to train nonviolent activists.

Larry and his colleagues at Acron are pushing the envelope of disaster simulation for training purposes. They are integrating Acron’s serious-games know-how and dynamic crisis mapping with Microsoft ESP, the simulation engine behind Microsoft’s Flight Simulator platform, to develop a pilot that closely resembles the vision Jennifer set out back in 2006. I personally thought we were still a year or two away from having a pilot. Not so. Larry will be presenting the pilot at HHI’s Humanitarian Health Summit in March 2009.

The goal for the pilot, or the “software-based proof of concept” as we and the United Nations World Food Programme (WFP) call it, is to develop it into a “training platform,” combined with training materials, that can be used to demonstrate the tool to governments and international organizations worldwide, particularly for training that builds preparedness and response capabilities and for informed decision-making on the adoption of technologies to enable or improve disaster response and crisis management.

So our goal is to engage with any and all appropriate agencies to provide training on the “training platform”; the training will be based on key scenarios in Bangladesh acquired through the partnership between WFP and Microsoft.

Successful training requires that we actually remember the training. But we all know from conventional classroom learning that we retain little of what we read. On the other hand, our memory retains almost all of what we do, and that, according to Larry, is what his new disaster simulation platform seeks to achieve.

What I find particularly compelling about Larry’s work is that the tool he is developing can be used for both disaster training and actual disaster response. That is, once trainees become familiar with the platform, they can use it for in situ disaster response thanks to live data feeds rendering the “virtual” world in quasi-real time. This should eventually enable disaster responders to test out several response scenarios and thereby select the most effective one, all in quasi-real time.
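
Architecturally, that dual use amounts to swapping the event source: the same simulation loop can consume scripted scenario events for training or a live feed for response. Below is a minimal sketch of the pattern, with all names, types and data hypothetical rather than taken from the actual pilot:

```python
import time
from dataclasses import dataclass
from typing import Callable, Iterator

@dataclass
class GeoEvent:
    lat: float
    lon: float
    kind: str      # e.g. "flood_gauge" or "road_blocked"
    value: float

def scripted_feed() -> Iterator[GeoEvent]:
    """Training mode: replay a prepared flood scenario."""
    yield GeoEvent(23.7, 90.4, "flood_gauge", 4.2)
    yield GeoEvent(23.8, 90.3, "road_blocked", 1.0)

def run_simulation(feed: Iterator[GeoEvent],
                   render: Callable[[GeoEvent], None]) -> None:
    """The same loop serves training (scripted) and response (live) feeds."""
    for event in feed:
        render(event)      # update the 3-D world in quasi-real time
        time.sleep(0.1)    # pacing; a live feed would block on I/O instead

run_simulation(scripted_feed(), render=print)
```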

Patrick Philippe Meier