Category Archives: Humanitarian Technologies

Muḥammad ibn Mūsā al-Khwārizmī: An Update from the Qatar Computing Research Institute

I first heard of al-Khwārizmī in my ninth-grade computer science class at the American International School of Vienna (AIS) back in 1993. Dr. Herman Prossinger, who taught the course, is exactly the kind of person one describes when answering the question: which teacher had the most impact on you while growing up? I wonder how many other ninth graders in the world had the good fortune of being taught computer science by a full-fledged professor with a PhD dissertation entitled “Isothermal Gas Spheres in General Relativity Theory” (1976) and numerous peer-reviewed publications in top-tier scientific journals, including Nature?

Muḥammad ibn Mūsā al-Khwārizmī was a brilliant mathematician & astronomer who spent his time as a scholar in the House of Wisdom in Baghdad (possibly the best name of any co-working space in history). “Al-Khwārizmī” was initially transliterated into Latin as Algoritmi. The manuscript above, for example, begins with “DIXIT algorizmi,” meaning “Says al-Khwārizmī.” And thus was born the word “algorithm.” But al-Khwārizmī’s fundamental contributions were not limited to the fields of mathematics and astronomy; he is also praised for his important work on geography and cartography. Published in 833, his Kitāb ṣūrat al-Arḍ (Arabic: كتاب صورة الأرض), or “Book on the Appearance of the Earth,” was a revised and corrected version of Ptolemy’s Geography. al-Khwārizmī’s book comprised an impressive list of 2,402 coordinates of cities and other geographical features. The only surviving copy of the book can be found at Strasbourg University. I’m surprised the item has not yet been purchased by Qatar and relocated to Doha.

View of the bay from QCRI in Doha, Qatar.

This brings me to the Qatar (Foundation) Computing Research Institute (QCRI), which was almost called the al-Khwārizmī Computing Research Institute. I joined QCRI exactly two weeks ago as Director of Social Innovation. My first impression? QCRI is Doha’s “House of Whizzkids”. The team is young, dynamic, international and super smart. I’m already working on several exploratory research and development (R&D) projects that could potentially lead to initial prototypes by the end of the year. These have to do with the application of social computing and big data analysis for humanitarian response. So I’ve been in touch with several colleagues at the United Nations (UN) Office for the Coordination of Humanitarian Affairs (OCHA) to bounce these early ideas off of them, and I’m thrilled that all responses thus far have been very positive.

My QCRI colleagues and I are also looking into collaborative platforms for “smart microtasking” which may be useful for the Digital Humanitarian Network. In addition, we’re just starting to explore potential solutions for quantifying veracity in social media, a rather non-trivial problem, as Dr. Prossinger would often say with a sly smile in relation to NP-hard problems. In terms of partnership building, I will be in New York, DC and Boston next month for official meetings with the UN, World Bank and MIT to explore possible collaborations on specific projects. The team in Doha is particularly strong on big data analytics, social computing, data cleaning, machine learning and translation. In fact, most of the whizzkids here come with very impressive track records from Microsoft, Yahoo, the Ivy Leagues, etc. So I’m excited by the potential.

View of Tornado Tower (purple lights) where QCRI is located.

The reason I’m not going into specifics vis-a-vis these early R&D efforts is not because I want to be secretive or elusive. Not at all. We’re still refining the ideas ourselves and simply want to manage expectations. There is a very strong and genuine interest within QCRI to contribute meaningfully to the humanitarian technology space. But we’re really just getting started, still hiring left, center and right, and we’ll be in R&D mode for a while. Plus, we don’t want to rush just for the sake of launching a new product. All too often, humanitarian technologies are developed without the benefit (and luxury) of advanced R&D. But if QCRI is going to help shape next-generation humanitarian technology solutions, we should do this in a way that is deliberate, cutting-edge and strategic. That is our comparative advantage.

In sum, our R&D efforts may not always lead to a full-fledged prototype, but the research and findings we produce will definitely be shared publicly so we can move the field forward. We’re also committed to developing free and open source software as part of our prototyping efforts. Finally, we have no interest in re-inventing the wheel and far prefer working in partnership to working in isolation. So there we go, time to R&D like al-Khwārizmī.

Back to the Future: On National Geographic and Crisis Mapping

[Cross-posted from National Geographic Newswatch]

Published in October 1888, the first issue of National Geographic “was a modest looking scientific brochure with an austere terra-cotta cover” (NG 2003). The inaugural publication comprised a dense academic treatise on the classification of geographic forms by genesis. But that wasn’t all. The first issue also included a riveting account of “The Great White Hurricane” of March 1888, which still ranks as one of the worst winter storms ever in US history.

Wreck at Coleman’s Station, New York & Harlem R. R., March 13, 1888. Photo courtesy NOAA Photo Library.

I’ve just spent a riveting week myself at the 2012 National Geographic Explorers Symposium in Washington DC, the birthplace of the National Geographic Society. I was truly honored to be recognized as a 2012 Emerging Explorer along with such an amazing and accomplished cadre of explorers. So it was with excitement that I began reading up on the history of this unique institution whilst on my flight to Doha following the Symposium.

I’ve been tagged as the “Crisis Mapper” of the Emerging Explorers Class of 2012. So imagine my astonishment when I discovered that National Geographic has a long history of covering and mapping natural disasters, humanitarian crises and wars, starting from the very first issue of the magazine in 1888. And when World War I broke out:

“Readers opened their August 1914 edition of the magazine to find an up-to-date map of ‘The New Balkan States and Central Europe’ that allowed them to follow the developments of the war. Large maps of the fighting fronts continued to be published throughout the conflict […]” (NG 2003).

Map of ‘The New Balkan States and Central Europe’ from the August 1914 “National Geographic Magazine.” Image courtesy NGS.

National Geographic even established a News Service Bureau to provide bulletins on the geographic aspects of the war for the nation’s newspapers. As the respected war strategist Carl von Clausewitz noted half a century before the launch of Geographic, “geography and the character of the ground bear a close and ever present relation to warfare […] both as to its course and to its planning and exploitation.”

“When World War II came, the Geographic opened its vast files of photographs, more than 300,000 at that time, to the armed forces. By matching prewar aerial photographs against wartime ones, analysts detected camouflage and gathered intelligence” (NG 2003).

During the 1960s, National Geographic “did not shrink from covering the war in Vietnam.” Staff writers and photographers captured all aspects of the war from “Saigon to the Mekong Delta to villages and rice fields.” In the years and decades that followed, Geographic continued to capture unfolding crises, from occupied Palestine and Apartheid South Africa to war-torn Afghanistan and the drought-stricken Sahel of Africa.

Geographic also covered the tragedy of the Chernobyl nuclear disaster and the dramatic eruption of Mount Saint Helens. The gripping account of the latter would in fact become the most popular article in all of National Geographic history. Today,

“New technologies–remote sensing, lasers, computer graphics, x-rays and CT scans–allow National Geographic to picture the world in new ways.” This is equally true of maps. “Since the first map was published in the magazine in 1888, maps have been an integral component of many magazine articles, books and television programs […]. Originally drafted by hand on large projections, today’s maps are created by state-of-the-art computers to map everything from the Grand Canyon to the outer reaches of the universe” (NG 2003). And crises.

“Pick up a newspaper and every single day you’ll see how geography plays a dominant role in giving a third dimension to life,” wrote Gil Grosvenor, the former Editor in Chief of National Geographic (NG 2003). And as we know only too well, many of the headlines in today’s newspapers relay stories of crises the world over. National Geographic has a tremendous opportunity to add a third dimension to emerging crises around the globe using new live mapping technologies. Indeed, to map the world is to know it, and to map the world live is to change it live before it’s too late. The next post in this series will illustrate why with an example from the 2010 Haiti Earthquake.

Patrick Meier is a 2012 National Geographic Emerging Explorer. He is an internationally recognized thought leader on the application of new technologies for positive social change. He currently serves as Director of Social Innovation at the Qatar Foundation’s Computing Research Institute (QCRI). Patrick also authors the respected iRevolution blog & tweets at @patrickmeier. This piece was originally published here on National Geographic.

Marketing Peace using SMS Mobile Advertising: A New Approach to Conflict Prevention

I was just in Kenya working on the next phase of the PeaceTXT project with my colleague Rachel Brown from Sisi ni Amani. I’m finally getting to implement an approach to conflict early warning and early response that I have been advocating for since 2006. I came close in 2008 whilst working on a conflict early warning and response project in Timor-Leste. But I wasn’t in Dili long enough to see the project through, and the country’s limited mobile phone coverage presented an important obstacle. Long story short, I’ve been advocating for a people-centered and preparedness-based approach to conflict early warning systems for half a decade and am finally implementing one with PeaceTXT.

Conflicts are often grounded in the stories and narratives that people tell themselves and the emotions that these stories generate. Narratives shape identity and the social construct of reality—we interpret our lives through stories. These have the power to transform relationships and communities. The purpose of PeaceTXT is to leverage mobile messaging (SMS) to market peace in strategic ways and thereby generate alternative narratives. SMS reminders have been particularly effective in catalyzing behavior change in several important public health projects. In addition, marketing to the “Bottom of the Pyramid” is increasingly big business and getting more sophisticated. We believe that lessons learned from these sectors can be combined and applied to catalyze behavior change vis-a-vis peace and conflict issues by amplifying new narratives using timely and strategically targeted SMS campaigns.

Last year, Sisi ni Amani sent the following SMS to 10,000 subscribers across Kenya: “A good leader initiates and encourages peace and development among all people and is not tribal.” “In a nation divided along ethnic lines, where a winner-takes-all mindset fuels rampant corruption and political violence, changing perceptions of good leadership is a daunting endeavor. And yet, according to post-campaign data, 90 percent of respondents said they changed their understanding of ‘what makes a good leader’ in response to the organization’s messaging. As one respondent commented: ‘I used to think a good leader is one who has the most votes, but now I know a good leader is one who thinks of the people who voted for him, not himself’” (NextBillion Blog Post).

PeaceTXT is about marketing peace using mobile advertising by leveraging user-generated content for said text messages. We’re in the business of selling peace for free by countering other narratives that tend to incite violent behavior. Preparedness is core to the PeaceTXT model. To be sure, local mobile-based advertising is hardly reactive or random. Indeed, billions of dollars go into marketing campaigns for a reason. To this end, we’re busy developing an agile SMS protocol that will allow us to send pre-determined customized text messages to specific groups (demographics) in targeted locations within minutes of an incident occurring. The content for said text messages will come from local communities themselves.
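
To give a sense of what such a protocol might look like in code, here is a minimal sketch of a message bank keyed by incident type, location and target group. The messages, groups and place names are hypothetical, and this is a rough illustration rather than the actual PeaceTXT system:

```python
# A minimal sketch (not the actual PeaceTXT codebase) of the kind of lookup
# the SMS protocol describes: pre-approved, community-generated messages keyed
# by incident type, location and target group, dispatched when an incident
# is reported. All names and data below are hypothetical.

from dataclasses import dataclass

@dataclass
class Incident:
    incident_type: str   # e.g. "rumor", "police_tension"
    location: str        # e.g. a neighborhood or ward name

# Pre-determined messages drafted in advance by local focus groups.
MESSAGE_BANK = {
    ("rumor", "Ward A", "youth"): "Before you forward that message, ask: who benefits if we fight?",
    ("police_tension", "Ward A", "elders"): "Leaders are meeting the police today. Urge calm and share facts only.",
}

def messages_for(incident: Incident, groups: list[str]) -> list[tuple[str, str]]:
    """Return (group, message) pairs to send for a reported incident."""
    out = []
    for group in groups:
        msg = MESSAGE_BANK.get((incident.incident_type, incident.location, group))
        if msg:
            out.append((group, msg))
    return out

# Example: an incident report triggers targeted messages within minutes.
print(messages_for(Incident("rumor", "Ward A"), ["youth", "elders"]))
```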

The next step is for Rachel and her team to organize and hold several local focus groups in July to begin generating appropriate content for text messages to de-escalate and/or counter police-community tensions, rumors and insecurity. I’ll be back in Kenya in August to review this user-generated content so we can add the text messages to our SMS protocol and customized SMS platform. I’m thrilled and can’t wait to work on this next phase.

What United Airlines can Teach the World Bank about Mobile Accountability

Flight delays can sometimes lead to interesting discoveries. As my flight to DC was delayed for a third frustrating hour, I picked up the United Airlines in-flight magazine and saw this:

United just launched a novel feedback program that the World Bank and other development organizations may want to emulate given their interest in promoting upward accountability. From the United Press Release:

“Behind every great trip is an airline of great people. Now, when you receive excellent customer service from an eligible United […] employee, you can enter him or her in United’s Outperform Recognition Program. If the employee you enter is a winner in our random drawing for cash prizes, you win, too. With just a few clicks on the United mobile app, you could have the chance to win MileagePlus award miles or even roundtrip tickets.”

“Eligible MileagePlus members can participate in the recognition program using the United mobile app, available for Apple and Android devices, to nominate eligible employees. MileagePlus members simply nominate the employee of their choice through the United mobile app.”

This participatory and crowdsourced recognition program is brilliant for several reasons. First, the focus is on identifying positive deviance rather than generating negative feedback. In other words, it is not a complaints system but a rewards system. Second, the program is incentive-based with shared proceeds. Not only do United employees have the chance to make some extra cash (the average salary of a flight attendant is $36,128), but those who nominate employees for outstanding service also share in the proceeds in the form of free tickets and airline miles.

Third, United didn’t develop a new, separate smartphone app or technology for this recognition program; they added the feature directly into the existing United app instead. (That said, they ought to give passengers the option of submitting an entry via United’s website as well since not everyone will be comfortable using a smartphone app.) I’d also recommend they make some of the submissions available on a dedicated section of the United website to give users the option to browse through some of the feedback (and even “digg up” those they like the most).

I wonder whether other airlines in the StarAlliance network will adopt the same (or similar) recognition program. I also wonder whether donors like the World Bank ought to develop a similar solution (perhaps SMS-based) and require the use of this service for all projects funded by the Bank.

Big Data for Development: Challenges and Opportunities

The UN Global Pulse report on Big Data for Development ought to be required reading for anyone interested in humanitarian applications of Big Data. The purpose of this post is not to summarize this excellent 50-page document but to relay the most important insights contained therein. In addition, I question the motivation behind the unbalanced commentary on Haiti, which is my only major criticism of this otherwise authoritative report.

Real-time “does not always mean occurring immediately. Rather, ‘real-time’ can be understood as information which is produced and made available in a relatively short and relevant period of time, and information which is made available within a timeframe that allows action to be taken in response, i.e. creating a feedback loop. Importantly, it is the intrinsic time dimensionality of the data, and that of the feedback loop, that jointly define its characteristic as real-time.” (One could also add that the real-time nature of the data is ultimately contingent on the analysis being conducted in real-time and, by extension, where action is required, used in real-time.)

Data privacy “is the most sensitive issue, with conceptual, legal, and technological implications.” To be sure, “because privacy is a pillar of democracy, we must remain alert to the possibility that it might be compromised by the rise of new technologies, and put in place all necessary safeguards.” Privacy is defined by the International Telecommunication Union as the “right of individuals to control or influence what information related to them may be disclosed.” Moving forward, “these concerns must nurture and shape on-going debates around data privacy in the digital age in a constructive manner in order to devise strong principles and strict rules—backed by adequate tools and systems—to ensure ‘privacy-preserving analysis.’”

Non-representative data is often dismissed outright since findings based on such data cannot be generalized beyond that sample. “But while findings based on non-representative datasets need to be treated with caution, they are not valueless […].” Indeed, while the “sampling selection bias can clearly be a challenge, especially in regions or communities where technological penetration is low […], this does not mean that the data has no value. For one, data from ‘non-representative’ samples (such as mobile phone users) provide representative information about the sample itself—and do so in close to real time and on a potentially large and growing scale, such that the challenge will become less and less salient as technology spreads across and within developing countries.”

Perceptions rather than reality is what social media captures. Moreover, these perceptions can also be wrong. But only those individuals “who wrongfully assume that the data is an accurate picture of reality can be deceived. Furthermore, there are instances where wrong perceptions are precisely what is desirable to monitor because they might determine collective behaviors in ways that can have catastrophic effects.” In other words, “perceptions can also shape reality. Detecting and understanding perceptions quickly can help change outcomes.”

False data and hoaxes are part and parcel of user-generated content. While the challenges around reliability and verifiability are real, some media organizations, such as the BBC, stand by the utility of citizen reporting of current events: “there are many brave people out there, and some of them are prolific bloggers and Tweeters. We should not ignore the real ones because we were fooled by a fake one.” They have thus “devised internal strategies to confirm the veracity of the information they receive and choose to report, offering an example of what can be done to mitigate the challenge of false information.” See, for example, my 20-page study on how to verify crowdsourced social media data, a field I refer to as information forensics. In any event, “whether false negatives are more or less problematic than false positives depends on what is being monitored, and why it is being monitored.”

“The United States Geological Survey (USGS) has developed a system that monitors Twitter for significant spikes in the volume of messages about earthquakes,” and as it turns out, 90% of user-generated reports that trigger an alert have turned out to be valid. “Similarly, a recent retrospective analysis of the 2010 cholera outbreak in Haiti conducted by researchers at Harvard Medical School and Children’s Hospital Boston demonstrated that mining Twitter and online news reports could have provided health officials a highly accurate indication of the actual spread of the disease with two weeks lead time.”
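
The internals of the USGS system are more sophisticated than this, but the basic idea of flagging a spike in keyword volume can be sketched in a few lines; the counts and threshold below are invented for illustration:

```python
# A toy illustration (not the USGS system itself) of spike detection on
# keyword counts: flag an alert when the latest count far exceeds a rolling
# baseline. The counts below are invented for the example.

from statistics import mean, stdev

def detect_spike(counts, window=12, threshold=3.0):
    """Flag a spike if the newest count exceeds mean + threshold * stdev
    of the preceding `window` observations."""
    if len(counts) < window + 1:
        return False
    baseline = counts[-window - 1:-1]
    mu, sigma = mean(baseline), stdev(baseline)
    return counts[-1] > mu + threshold * max(sigma, 1.0)

# Per-minute counts of tweets containing "earthquake" (hypothetical data).
counts = [3, 2, 4, 3, 2, 5, 3, 4, 2, 3, 4, 3, 187]
print(detect_spike(counts))  # True: the last minute is a clear spike
```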

This leads to the other Haiti example raised in the report, namely the finding that SMS data was correlated with building damage. Please see my previous blog posts here and here for context. What the authors seem to overlook is that Benetech apparently did not submit their counter-findings for independent peer-review whereas the team at the European Commission’s Joint Research Center did—and the latter passed the peer-review process. Peer-review is how rigorous scientific work is validated. The fact that Benetech never submitted their blog post for peer-review is actually quite telling.

In sum, while this Big Data report is otherwise strong and balanced, I am really surprised that they cite a blog post as “evidence” while completely ignoring the JRC’s peer-reviewed scientific paper published in the Journal of the European Geosciences Union. Until counter-findings are submitted for peer review, the JRC’s results stand: unverified, non-representative crowdsourced text messages from the disaster-affected population in Port-au-Prince, translated from Haitian Creole to English via a novel crowdsourced volunteer effort and subsequently geo-referenced by hundreds of volunteers, all without any quality control, produced a statistically significant, positive correlation with building damage.
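
For readers unfamiliar with what such a finding involves, here is a schematic of the kind of test behind it: correlate per-area SMS counts with per-area damage assessments and check significance. It uses SciPy (an assumed dependency) with invented numbers; it is not the JRC's actual analysis:

```python
# A schematic re-creation (with invented numbers, not the JRC dataset) of the
# kind of test behind the finding: correlate per-area counts of crowdsourced
# SMS with per-area building-damage assessments and check significance.

from scipy.stats import pearsonr

# Hypothetical per-grid-cell values for Port-au-Prince.
sms_counts    = [12, 45, 7, 30, 52, 3, 19, 41, 25, 9]
damaged_bldgs = [15, 60, 10, 35, 70, 5, 22, 55, 28, 12]

r, p_value = pearsonr(sms_counts, damaged_bldgs)
print(f"Pearson r = {r:.2f}, p = {p_value:.4f}")
# A positive r with a small p-value is what a "statistically significant,
# positive correlation" means in this context.
```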

In conclusion, “any challenge with utilizing Big Data sources of information cannot be assessed divorced from the intended use of the information. These new, digital data sources may not be the best suited to conduct airtight scientific analysis, but they have a huge potential for a whole range of other applications that can greatly affect development outcomes.”

One such application is disaster response. Earlier this year, FEMA Administrator Craig Fugate gave a superb presentation on “Real Time Awareness” in which he relayed an example of how he and his team used Big Data (Twitter) during a series of devastating tornadoes in 2011:

“Mr. Fugate proposed dispatching relief supplies to the long list of locations immediately and received pushback from his team who were concerned that they did not yet have an accurate estimate of the level of damage. His challenge was to get the staff to understand that the priority should be one of changing outcomes, and thus even if half of the supplies dispatched were never used and sent back later, there would be no chance of reaching communities in need if they were in fact suffering tornado damage already, without getting trucks out immediately. He explained, “if you’re waiting to react to the aftermath of an event until you have a formal assessment, you’re going to lose 12-to-24 hours…Perhaps we shouldn’t be waiting for that. Perhaps we should make the assumption that if something bad happens, it’s bad. Speed in response is the most perishable commodity you have…We looked at social media as the public telling us enough information to suggest this was worse than we thought and to make decisions to spend [taxpayer] money to get moving without waiting for formal request, without waiting for assessments, without waiting to know how bad because we needed to change that outcome.”

Fugate also emphasized that using social media as an information source isn’t a precise science and the response isn’t going to be precise either. “Disasters are like horseshoes, hand grenades and thermonuclear devices; you just need to be close—preferably more than less.”

Big Data Philanthropy for Humanitarian Response

My colleague Robert Kirkpatrick from Global Pulse has been actively promoting the concept of “data philanthropy” within the context of development. Data philanthropy involves companies sharing proprietary datasets for social good. I believe we urgently need big (social) data philanthropy for humanitarian response as well. Disaster-affected communities are increasingly the source of big data, which they generate and share via social media platforms like Twitter. Processing this data manually, however, is very time-consuming and resource-intensive. Indeed, large numbers of digital humanitarian volunteers are often needed to monitor and process user-generated content from disaster-affected communities in near real-time.

Meanwhile, companies like Crimson Hexagon, Geofeedia, NetBase, Netvibes, RecordedFuture and Social Flow are defining the cutting edge of automated methods for media monitoring and analysis. So why not set up a Big Data Philanthropy group for humanitarian response in partnership with the Digital Humanitarian Network? Call it Corporate Social Responsibility (CSR) for digital humanitarian response. These companies would benefit from the publicity of supporting such positive and highly visible efforts. They would also receive expert feedback on their tools.

This “Emergency Access Initiative” could be modeled along the lines of the International Charter whereby certain criteria vis-a-vis the disaster would need to be met before an activation request could be made to the Big Data Philanthropy group for humanitarian response. These companies would then provide a dedicated account to the Digital Humanitarian Network (DHNet). These accounts would be available for 72 hours only and also be monitored by said companies to ensure they aren’t being abused. We would simply need to have relevant members of the DHNet trained on these platforms and draft the appropriate protocols, data privacy measures and MoUs.
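
To make the proposed protocol concrete, here is a rough sketch of what an activation check and a 72-hour, monitored account grant could look like. The criteria, names and thresholds are placeholders rather than anything agreed with partners:

```python
# A rough sketch (names and thresholds are placeholders, not an agreed MoU)
# of the activation logic described above: check Charter-style criteria,
# then grant DHNet a time-limited account on a partner platform.

from datetime import datetime, timedelta, timezone

ACTIVATION_CRITERIA = {
    "disaster_declared": True,        # e.g. national declaration or UN flash appeal
    "dhnet_request": True,            # formal request from the Digital Humanitarian Network
    "data_privacy_mou_signed": True,
}

def can_activate(request: dict) -> bool:
    """All criteria must hold before partner accounts are unlocked."""
    return all(request.get(k) for k in ACTIVATION_CRITERIA)

def grant_access(platform: str, hours: int = 72) -> dict:
    """Issue a dedicated, monitored account that expires automatically."""
    now = datetime.now(timezone.utc)
    return {"platform": platform, "granted": now, "expires": now + timedelta(hours=hours)}

request = dict(ACTIVATION_CRITERIA)  # a hypothetical, fully-qualifying request
if can_activate(request):
    print(grant_access("example-media-monitoring-platform"))
```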

I’ve had preliminary conversations with humanitarian colleagues from the United Nations and DHNet who confirm that “this type of collaboration would be seen very positively from the coordination area within the traditional humanitarian sector.” On the business development end, this setup would enable companies to get their foot in the door of the humanitarian sector—a multi-billion dollar industry. Members of the DHNet are early adopters of humanitarian technology and are ideally placed to demonstrate the added value of these platforms since they regularly partner with large humanitarian organizations. Indeed, DHNet operates as a partnership model. This would enable humanitarian professionals to learn about new Big Data tools, see them in action and, possibly, purchase full licenses for their organizations. In sum, data philanthropy is good for business.

I have colleagues at most of the companies listed above and thus plan to actively pursue this idea further. In the meantime, I’d be very grateful for any feedback and suggestions, particularly on the suggested protocols and MoUs. So I’ve set up this open and editable Google Doc for feedback.

Big thanks to the team at the Disaster Information Management Research Center (DIMRC) for planting the seeds of this idea during our recent meeting. Check out their very neat Emergency Access Initiative.

Geofeedia: Next Generation Crisis Mapping Technology?

My colleague Jeannine Lemaire from the Core Team of the Standby Volunteer Task Force (SBTF) recently pointed me to Geofeedia, which may very well be the next generation in crisis mapping technology. So I spent over an hour talking with Geofeedia’s CEO, Phil Harris, to learn more about the platform and discuss potential applications for humanitarian response. The short version: I’m impressed; not just with the technology itself and its potential, but also by Phil’s deep intuition and genuine interest in building a platform that enables others to scale positive social impact.

Situational awareness is absolutely key to emergency response, hence the rise of crisis mapping. The challenge? Processing and geo-referencing Big Data from social media sources to produce live maps has largely been a manual (and arduous) task for many in the humanitarian space. In fact, a number of humanitarian colleagues I’ve spoken to recently have complained that the manual labor required to create (and maintain) live maps is precisely why they aren’t able to launch their own crisis maps. I know this is also true of several international media organizations.

There have been several attempts at creating automated live maps. Take Havaria and Global Incidents Map, for example. But neither of these provide the customizability necessary for users to apply the platforms in meaningful ways. Enter Geofeedia. Let’s take the recent earthquake and 800 aftershocks in Emilia, Italy. Simply type in the place name (or an exact address) and hit enter. Geofeedia automatically parses Twitter, YouTube, Flickr, Picasa and Instagram for the latest updates in that area and populates the map with this content. The algorithm pulls in data that is already geo-tagged and designated as public.
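
Geofeedia's internals are not public, but the pattern just described (resolve a place name to an area, query each social platform for public geo-tagged content there, and merge the results by time) can be sketched roughly as follows; the geocoder and fetcher functions are stubs standing in for the real platform APIs:

```python
# A schematic of the geo-first search pattern described above, not Geofeedia's
# own code. The fetcher functions are stubs standing in for real APIs.

from datetime import datetime

def geocode(place_name: str) -> tuple[float, float]:
    """Stub geocoder: return a (lat, lon) centre point for the place."""
    return {"Emilia, Italy": (44.8, 11.2)}.get(place_name, (0.0, 0.0))

def fetch_twitter(lat, lon):   # stand-in for a real geo-search endpoint
    return [{"source": "twitter", "time": datetime(2012, 5, 29, 9, 3), "text": "Scossa fortissima"}]

def fetch_flickr(lat, lon):    # stand-in for a real geo-search endpoint
    return [{"source": "flickr", "time": datetime(2012, 5, 29, 9, 10), "text": "Collapsed roof"}]

def geo_feed(place_name: str):
    lat, lon = geocode(place_name)
    items = fetch_twitter(lat, lon) + fetch_flickr(lat, lon)
    return sorted(items, key=lambda i: i["time"], reverse=True)  # newest first

for item in geo_feed("Emilia, Italy"):
    print(item["time"], item["source"], item["text"])
```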

The geo-tagging happens on the smartphone, laptop or desktop when an image or Tweet is generated. The platform then allows you to pivot between the map and a collage of the automatically harvested content. Note that each entry includes a time stamp. Of course, since the search function is purely geo-based, the results will not be restricted to earthquake-related updates, hence the picture of friends at a picnic.

But let’s click on the picture of the collapsed roof directly to the left. This opens up a new page with the following: the original picture and a map displaying where this picture was taken.

In between these, you’ll note the source of the picture, the time it was uploaded and the author. Directly below this you’ll find the option to query the map further by geographic distance. Let’s click on the 300-meter option. The result is the updated collage below.
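
For those curious how a "within 300 meters" filter works under the hood, the great-circle (haversine) distance between two coordinates is all that is needed. This is a generic sketch with invented coordinates, not Geofeedia's own code:

```python
# How a "within 300 metres" filter can be computed (not Geofeedia's own code):
# the haversine formula gives the great-circle distance between two points.

from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    R = 6_371_000  # mean Earth radius in metres
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * R * asin(sqrt(a))

# Hypothetical geo-tagged items near the photo's location.
centre = (44.8342, 11.0614)
items = [
    {"text": "collapsed roof", "lat": 44.8351, "lon": 11.0622},
    {"text": "picnic photo",   "lat": 44.8610, "lon": 11.0900},
]
nearby = [i for i in items if haversine_m(*centre, i["lat"], i["lon"]) <= 300]
print([i["text"] for i in nearby])  # only the earthquake-related item remains
```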

We now see a lot more content relevant to the earthquake than we did after the initial search. Geofeedia only parses for recently published information, which adds temporal relevance to the geographic search. The result of combining these two dimensions is a more filtered result. Incidentally, Geofeedia allows you to save and very easily share these searches and results. Now let’s click on the first picture on the top left.

Geofeedia allows you to create collections (top right-hand corner).  I’ve called mine “Earthquake Damage” so I can collect all the relevant Tweets, pictures and video footage of the disaster. The platform gives me the option of inviting specific colleagues to view and help curate this new collection by adding other relevant content such as tweets and video footage. Together with Geofeedia’s multi-media approach, these features facilitate the clustering and triangulation of multi-media data in a very easy way.

Now let’s pivot from these search results in collage form to the search results in map view. This display can also be saved and shared with others.

One of the clear strengths of Geofeedia is the simplicity of the user-interface. Key features and functions are esthetically designed. For example, if we wish to view the YouTube footage that is closest to the circle’s center, simply click on the icon and the video can be watched in the pop-up on the same page.

Now notice the menu just to the right of the YouTube video. Geofeedia allows you to create geo-fences on the fly. For example, we can click on “Search by Polygon” and draw a “digital fence” of that shape directly onto the map with just a few clicks of the mouse. Say we’re interested in the residential area just north of Via Statale. Simply trace the area, double-click to finish and then press the magnifying glass icon to search for the latest social media updates; Geofeedia will return all content with relevant geo-tags.
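
A polygon geo-fence ultimately reduces to a point-in-polygon test. Here is a minimal ray-casting sketch (again, a generic illustration rather than Geofeedia's implementation), with a hypothetical fence north of Via Statale:

```python
# Under the hood, a "search by polygon" geo-fence reduces to a point-in-polygon
# test. A simple ray-casting version (a sketch, not Geofeedia's implementation):

def point_in_polygon(lat, lon, polygon):
    """polygon is a list of (lat, lon) vertices; returns True if inside."""
    inside = False
    n = len(polygon)
    for i in range(n):
        lat1, lon1 = polygon[i]
        lat2, lon2 = polygon[(i + 1) % n]
        crosses = (lon1 > lon) != (lon2 > lon)
        if crosses and lat < (lat2 - lat1) * (lon - lon1) / (lon2 - lon1) + lat1:
            inside = not inside
    return inside

# A hypothetical fence around a residential block north of Via Statale.
fence = [(44.851, 11.060), (44.851, 11.070), (44.845, 11.070), (44.845, 11.060)]
print(point_in_polygon(44.848, 11.065, fence))  # True: inside the fence
print(point_in_polygon(44.840, 11.065, fence))  # False: outside
```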

The platform allows us to filter these results further via the “Settings” menu as displayed below. On the technical side, the tool’s API supports ATOM/RSS, JSON and GeoRSS formats.
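
Since the API exposes GeoRSS, a consumer could pull coordinates out of such a feed with nothing more than the Python standard library; the sample XML below is invented for illustration:

```python
# GeoRSS is just RSS/Atom with location tags; a consumer of a GeoRSS feed
# could extract coordinates like this (the sample XML is invented).

import xml.etree.ElementTree as ET

SAMPLE = """<feed xmlns="http://www.w3.org/2005/Atom"
                  xmlns:georss="http://www.georss.org/georss">
  <entry>
    <title>Collapsed roof near Via Statale</title>
    <georss:point>44.8351 11.0622</georss:point>
  </entry>
</feed>"""

NS = {"atom": "http://www.w3.org/2005/Atom",
      "georss": "http://www.georss.org/georss"}

root = ET.fromstring(SAMPLE)
for entry in root.findall("atom:entry", NS):
    title = entry.findtext("atom:title", namespaces=NS)
    lat, lon = map(float, entry.findtext("georss:point", namespaces=NS).split())
    print(title, lat, lon)
```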

Geofeedia has a lot of potential vis-a-vis humanitarian applications, which is why the Standby Volunteer Task Force (SBTF) is partnering with the group to explore this potential further. A forthcoming blog post on the SBTF blog will outline this partnership in more detail.

In the meantime, below are a few thoughts and suggestions for Phil and team on how they can make Geofeedia even more relevant and compelling for humanitarian applications. A quick qualifier is in order beforehand, however. I often have a tendency to ask for the moon when discovering a new platform I’m excited about. The suggestions that follow are thus not criticism at all but rather the result of my imagination gone wild. So big congrats to Phil and team for having built what is already a very, very neat platform!

  • Topical search feature that enables users to search by location and a specific theme or topic.
  • Delete function that allows users to delete content that is not relevant to them either from the Map or Collage interface. In the future, perhaps some “basic” machine learning algorithms could be added to learn what types of content the user does not want displayed or prioritized.
  • Add function that gives users the option of adding relevant multi-media content, say perhaps from a blog post, a Wikipedia entry, news article or (Geo)RSS feed. I would be particularly interested in seeing a Storyful feed integrated into Geofeedia, for example. The ability to add KML files could also be interesting, e.g., a KML of an earthquake’s epicenter and estimated impact.
  • Commenting function that enables users to comment on individual data points (Tweets, pictures, etc) and a “discussion forum” feature that enables users to engage in text-based conversation vis-a-vis a specific data point.
  • Storify feature that gives users the ability to turn their curated content into a Storify-like storyboard with narrative. A Storify plugin perhaps.
  • Ushahidi feature that enables users to export an item (Tweet, picture, etc) directly to an Ushahidi platform with just one click. This feature should also allow for the automatic publishing of said item on an Ushahidi map.
  • Alerts function that allows one to turn a geo-fence into an automated alert feature. For example, once I’ve created my geo-fence, having an option that allows me (and others) to subscribe to this geo-fence for future updates could be particularly interesting. These alerts would be sent out as emails (and maybe SMS) with a link to the new picture or Tweet that has been geo-tagged within the geographical area of the geo-fence. Perhaps each geo-fence could tweet updates directly to anyone subscribed to that Geofeedia deployment.
  • Trends alert feature that gives users the option of subscribing to specific trends of interest. For example, I’d like to be notified if the number of data points in my geo-fence increases by more than 25% within a 24-hour time period, or, more specifically, whether the number of pictures has suddenly increased (see the sketch after this list). These meta-level trends can provide important insights vis-a-vis early detection & response.
  • Analytics function that produces summary statistics and trends analysis for a geo-fence of interest. This is where Geofeedia could better capture temporal dynamics by including charts, graphs and simple time-series analysis to depict how events have been unfolding over the past hour vs 12 hours, 24 hours, etc.
  • Sentiment analysis feature that enables users to have an at-a-glance understanding of the sentiments and moods being expressed in the harvested social media content.
  • Augmented Reality feature … just kidding (sort-of).
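
To illustrate the trends alert idea from the list above, here is a minimal sketch of the 25%-in-24-hours rule, assuming hourly counts of geo-fenced items; all numbers are invented:

```python
# A minimal version of the suggested trend-alert rule (a 25% jump in
# geo-fence activity over 24 hours), assuming hourly counts of items.

def trend_alert(hourly_counts, threshold=0.25):
    """Compare the last 24 hours with the previous 24 and flag large jumps."""
    if len(hourly_counts) < 48:
        return False
    previous = sum(hourly_counts[-48:-24])
    latest = sum(hourly_counts[-24:])
    return previous > 0 and (latest - previous) / previous > threshold

# Hypothetical counts: a quiet day followed by a day with a burst of activity.
counts = [2] * 24 + [2] * 20 + [9, 11, 14, 10]
print(trend_alert(counts))  # True: activity rose well over 25%
```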

Naturally, most or all of the above may not be in line with Geofeedia’s vision, purpose or business model. But I very much look forward to collaborating with Phil & team vis-a-vis our SBTF partnership. A big thanks to Jeannine once again for pointing me to Geofeedia, and equally big thanks to my SBTF colleague Timo Luege for his blog post on the platform. I’m thrilled to see more colleagues actively blog about the application of new technologies for disaster response.

On this note, anyone familiar with this new Iremos platform (above picture) from France? They recently contacted me to offer a demo.

The Future of Crisis Mapping? Full-Sized Arcade Pinball Machines

Remember those awesome pinball machines (of the analog kind)? You’d launch the ball and see it bounce all over, reacting wildly to various fun objects as you accumulate bonus points. The picture below hardly does it justice, so have a look on YouTube for some neat videos. I wish today’s crisis maps were that dynamic. Instead, they’re still largely static and hardly as interactive or user-friendly.

Do we live in an inert, static universe? No, obviously we don’t, and yet the state of our crisis mapping platforms would seem to suggest otherwise; a rather linear and flat world, which reminds me more of this game:

Things are always changing and interacting around us. So we need maps with automated geo-fencing alerts that can trigger kinetic and non-kinetic actions. To this end, dynamic check-in features should be part and parcel of crisis mapping platforms as well. My check-in at a certain location and time of day should trigger relevant messages to certain individuals and things (cue the Internet of Things) both nearby and at a distance based on the weather and latest crime statistics, for example. In addition, crisis mapping platforms need to have more gamification options and “special effects”. Indeed, they should be more game-like in terms of consoles and user-interface design. They also ought to be easier to use and be more rewarding.
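
As a toy illustration of the check-in idea, a sketch like the following could match a check-in against simple contextual rules; the rules, messages and data sources here are entirely hypothetical:

```python
# A toy sketch of the kind of check-in trigger imagined here: a check-in at a
# place and time of day fires messages whose rules reference contextual data
# (weather, recent crime reports). All rules and data are invented.

def checkin_triggers(checkin, context, rules):
    """Return messages whose rule conditions match this check-in and context."""
    return [r["message"] for r in rules if r["condition"](checkin, context)]

rules = [
    {"condition": lambda c, ctx: c["hour"] >= 21 and ctx["recent_thefts"] > 3,
     "message": "Several thefts reported near here this week; consider a lit route home."},
    {"condition": lambda c, ctx: ctx["storm_warning"],
     "message": "Storm warning in effect for this area tonight."},
]

checkin = {"place": "Bus station", "hour": 22}
context = {"recent_thefts": 5, "storm_warning": False}
print(checkin_triggers(checkin, context, rules))
```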

This explains why I blogged about the “Fisher Price Theory of Crisis Mapping” back in 2008. We’ve made progress over the past four years, for sure, but the ultimate pinball machine of crisis mapping still seems to be missing from the arcade of humanitarian technology.

State of the Art in Digital Disease Detection

Larry Brilliant’s TED Talk back in 2006 played an important role in catalyzing my own personal interest in humanitarian technology. Larry spoke about the use of natural language processing and computational linguistics for the early detection and early response to epidemics. So it was with tremendous honor and deep gratitude that I delivered the first keynote presentation at Harvard University’s Digital Disease Detection (DDD) conference earlier this year.

The field of digital disease detection has remained way ahead of the curve since 2006 in terms of leveraging natural language processing, computational linguistics and now crowdsourcing for the purposes of early detection of critical events. I thus highly, highly recommend watching the videos of the DDD Ignite Talks and panel presentations, which are all available here. Topics include “Participatory Surveillance,” “Monitoring Rumors,” “Twitter and Disease Detection,” “Search Query Surveillance,” “Open Source Surveillance,” “Mobile Disease Detection,” etc. The presentation on BioCaster is also well worth watching. I blogged about BioCaster here over three years ago and the platform is as impressive as ever.

These public health experts are really operating at the cutting-edge and their insights are proving important to the broader humanitarian technology community. To be sure, the potential added value of cross-fertilization between fields is tremendous. Just take this example of a public health data mining platform (HealthMap) being used by Syrian activists to detect evidence of killings and human rights violations.

Disaster Response, Self-Organization and Resilience: Shocking Insights from the Haiti Humanitarian Assistance Evaluation

Tulane University and the State University of Haiti just released a rather damning evaluation of the humanitarian response to the 2010 earthquake that struck Haiti on January 12th. The comprehensive assessment, which takes a participatory approach and applies a novel resilience framework, finds that despite several billion dollars in “aid”, humanitarian assistance did not make a detectable contribution to the resilience of the Haitian population and in some cases increased certain communities’ vulnerability and even caused harm. Welcome to supply-side humanitarian assistance directed by external actors.

“All we need is information. Why can’t we get information?” This quote was taken from one of many focus groups conducted by the evaluators. “There was little to no information exchange between the international community tasked with humanitarian response and the Haitian NGOs, civil society or affected persons / communities themselves.” Information is critical for effective humanitarian assistance, which should include two objectives: “preventing excess mortality and human suffering in the immediate, and in the longer term, improving the community’s ability to respond to potential future shocks.” This longer-term objective thus focuses on resilience, which the evaluation team defines as follows:

“Resilience is the capacity of the affected community to self-organize, learn from and vigorously recover from adverse situations stronger than it was before.”

This link between resilience and capacity for self-organization is truly profound and incredibly important. To be sure, the evaluation reveals that “the humanitarian response frequently undermined the capacity of Haitian individuals and organizations.” This completely violates the Hippocratic Oath of Do No Harm. The evaluators thus “promote the attainment of self-sufficiency, rather than the ongoing dependency on standard humanitarian assistance.” Indeed, “focus groups indicated that solutions to help people help themselves were desired.”

I find it particularly telling that many aid organizations interviewed for this assessment were reluctant to assist the evaluators in fully capturing and analyzing resource flows, which are critical for impact evaluation. “The lack of transparency in program dispersal of resources was a major constraint in our research of effective program evaluation.” To this end, the evaluation team argue that “by strengthening Haitian institutions’ ability to monitor and evaluate, Haitians will more easily be able to track and monitor international efforts.”

I completely disagree with this remedy. The institutions are part of the problem, and besides, institution-building takes years if not decades. To assume there is even political will and the resources for such efforts is at best misguided. If resilience is about strengthening the capacity of affected communities to self-organize, then I would focus on just that, applying existing technologies and processes that both catalyze and facilitate demand-side, people-centered self-organization. My previous blog post on “Technology and Building Resilient Societies to Mitigate the Impact of Disasters” elaborates on this point.

In sum, “resilience is the critical link between disaster and development; monitoring it will ensure that relief efforts are supporting, and not eroding, household and community capabilities.” This explains why crowdsourcing and data mining efforts like those of Ushahidi, HealthMap and UN Global Pulse are important for disaster response, self-organization and resilience.