Tag Archives: Damage

New Findings: Rapid Assessment of Disaster Damage Using Social Media

The latest peer-reviewed scientific research on social media & crisis computing has just been published in the prestigious journal Science. The authors pose a question that many of us in the international humanitarian space have been asking, debating and answering since 2009: Can social media data aid in disaster response and damage assessment?

To answer this question, the authors of the new study carry out “a multiscale analysis of Twitter activity before, during, and after Hurricane Sandy” and “examine the online response of 50 metropolitan areas of the US.” They find a “strong relationship between proximity to Sandy’s path and hurricane-related social media activity.” In addition, they “show that real and perceived threats, together with physical disaster effects, are directly observable through the intensity and composition of Twitter’s message stream.”


What’s more, they actually “demonstrate that per-capita Twitter activity strongly correlates with the per-capita economic damage inflicted by the hurricane.” The authors found these results to hold true for a “wide range of [US-based] disasters and suggest that massive online social networks can be used for rapid assessment of damage caused by a large-scale disaster.”


Unlike the vast majority of crisis computing studies in the scientific literature, this is one of the few studies of its kind (perhaps the only one?) that uses actual post-disaster damage data, i.e., actual ground-truthing, to demonstrate that “the per-capita number of Twitter messages corresponds directly to disaster-inflicted monetary damage.” What’s more, “The correlation is especially pronounced for persistent post-disaster activity and is weakest at the peak of the disaster.”
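
To make the normalization concrete, here is a minimal sketch (in Python, not the authors’ code) of how one might correlate per-capita Twitter activity with per-capita damage across metro areas. The input files and column names are hypothetical placeholders.

```python
# Minimal sketch (not the study's code): correlate per-capita tweet volume
# with per-capita monetary damage across metro areas. The CSVs and column
# names are hypothetical.
import pandas as pd
from scipy.stats import pearsonr

tweets = pd.read_csv("sandy_tweets_by_metro.csv")   # columns: metro, tweet_count
damage = pd.read_csv("damage_by_metro.csv")         # columns: metro, damage_usd, population

df = tweets.merge(damage, on="metro")
df["tweets_per_capita"] = df["tweet_count"] / df["population"]
df["damage_per_capita"] = df["damage_usd"] / df["population"]

r, p = pearsonr(df["tweets_per_capita"], df["damage_per_capita"])
print(f"Pearson r = {r:.2f} (p = {p:.3g}) across {len(df)} metro areas")
```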


The authors thus conclude that social media is a “viable platform for preliminary rapid damage assessment in the chaotic time immediately after a disaster.” As such, their results suggest that “officials should pay attention to normalized activity levels, rates of original content creation, and rates of content rebroadcast to identify the hardest hit areas in real time. Immediately after a disaster, they should focus on persistence in activity levels to assess which areas are likely to need the most assistance.”
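
As a rough illustration of the three signals mentioned above (normalized activity, original-content rate, rebroadcast rate), here is a hedged sketch of how an analyst might compute them from a hypothetical table of geo-tagged tweets; the file and column names are made up.

```python
# Sketch only: compute per-area activity and composition metrics from a
# hypothetical tweet table with columns: area, is_retweet (bool).
import pandas as pd

tweets = pd.read_csv("geo_tagged_tweets.csv")   # hypothetical input
pop = pd.read_csv("area_population.csv")        # columns: area, population

summary = (
    tweets.groupby("area")
    .agg(total=("is_retweet", "size"),
         rebroadcasts=("is_retweet", "sum"))
    .reset_index()
    .merge(pop, on="area")
)
summary["activity_per_capita"] = summary["total"] / summary["population"]
summary["rebroadcast_rate"] = summary["rebroadcasts"] / summary["total"]
summary["original_rate"] = 1 - summary["rebroadcast_rate"]

# Areas with unusually high normalized activity and original content are
# candidate hardest-hit areas, per the study's recommendation above.
print(summary.sort_values("activity_per_capita", ascending=False).head(10))
```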


In sum, the authors found that “Twitter activity during a large-scale natural disaster—in this instance Hurricane Sandy—is related to the proximity of the region to the path of the hurricane. Activity drops as the distance from the hurricane increases; after a distance of approximately 1200 to 1500 km, the influence of proximity disappears. High-level analysis of the composition of the message stream reveals additional findings. Geo-enriched data (with location of tweets inferred from users’ profiles) show that the areas close to the disaster generate more original content […].”
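
The distance-decay finding is also easy to sanity-check on one’s own data. The sketch below (again, not the authors’ code) computes each metro area’s distance to the nearest point on the hurricane track and bins per-capita activity by that distance; all inputs are hypothetical.

```python
# Sketch: per-capita activity as a function of distance to the hurricane track.
# Inputs (metro coordinates, track points) are hypothetical.
import numpy as np
import pandas as pd

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two points."""
    lat1, lon1, lat2, lon2 = map(np.radians, (lat1, lon1, lat2, lon2))
    a = (np.sin((lat2 - lat1) / 2) ** 2
         + np.cos(lat1) * np.cos(lat2) * np.sin((lon2 - lon1) / 2) ** 2)
    return 6371 * 2 * np.arcsin(np.sqrt(a))

metros = pd.read_csv("metros.csv")        # columns: metro, lat, lon, tweets_per_capita
track = pd.read_csv("sandy_track.csv")    # columns: lat, lon (points along the path)

# Distance from each metro to the nearest point on the hurricane track.
metros["dist_km"] = metros.apply(
    lambda row: haversine_km(row["lat"], row["lon"],
                             track["lat"].values, track["lon"].values).min(),
    axis=1,
)

# Bin by distance to see where the proximity effect levels off
# (roughly 1200-1500 km in the study).
bins = pd.cut(metros["dist_km"], bins=range(0, 2100, 300))
print(metros.groupby(bins)["tweets_per_capita"].mean())
```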

Five years ago, professional humanitarians were still largely dismissive of social media’s added value in disasters. Three years ago, it was the turn of ivory tower academics in the social sciences to dismiss the added value of social media for disaster response. The criticisms focused on the notion that reports posted on social media were simply untrustworthy and hardly representative. The peer-reviewed scientific study above suggests that these limitations are largely inconsequential.


Increasing the Reliability of Aerial Imagery Analysis for Damage Assessments

In March 2015, I was invited by the World Bank to spearhead an ambitious humanitarian aerial robotics (UAV) mission to Vanuatu following Cyclone Pam, a devastating Category 5 cyclone. The mission was coordinated with Heliwest and X-Craft, two outstanding UAV companies identified through the Humanitarian UAV Network (UAViators) Roster of Pilots. You can learn more about the mission and see pictures here. Lessons learned from this mission (and many others) are available here.


The World Bank and partners were unable to immediately analyze the aerial imagery we had collected because they faced a Big Data challenge. So I suggested that the Bank activate the Digital Humanitarian Network (DHN) to request digital volunteer assistance. As a result, the Humanitarian OpenStreetMap Team (HOT) analyzed some of the orthorectified mosaics while MicroMappers focused on analyzing the oblique images (more on both here).

This in turn produced a number of challenges. To cite just one: the Bank needed digital humanitarians to identify which houses or buildings were completely destroyed, which were partially damaged and which were largely intact. But there was little guidance on how to distinguish fully destroyed from partially damaged structures, or on what such structures in Vanuatu look like after a cyclone. As a result, data quality was not as high as it could have been. In my capacity as consultant for the World Bank’s UAVs for Resilience Program, I decided to do something about this lack of guidelines for imagery interpretation.
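
For illustration, a guide like this can be backed by a structured labeling schema so that every volunteer chooses from the same small set of categories. The sketch below is a hypothetical example, not the schema actually used by the Bank or HHI; the rubric comments are illustrative only.

```python
# Hypothetical sketch of a structured label schema for aerial damage tagging.
# The three categories mirror the classes discussed above; the rubric comments
# are illustrative, not the HHI guide's wording.
from dataclasses import dataclass
from enum import Enum

class DamageCategory(Enum):
    DESTROYED = "destroyed"                    # e.g. roof and most walls gone
    PARTIALLY_DAMAGED = "partially_damaged"    # e.g. roof breached, walls standing
    LARGELY_INTACT = "largely_intact"          # no visible structural damage

@dataclass
class BuildingTag:
    image_id: str
    lat: float
    lon: float
    category: DamageCategory
    annotator_id: str
    notes: str = ""

# Example record (coordinates and IDs are made up).
tag = BuildingTag(image_id="oblique_0042.jpg", lat=-17.74, lon=168.31,
                  category=DamageCategory.PARTIALLY_DAMAGED,
                  annotator_id="volunteer_17")
print(tag)
```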

I turned to my colleagues at the Harvard Humanitarian Initiative (where I had previously co-founded and co-directed the HHI Program on Crisis Mapping) and invited them to develop a rigorous guide that could inform the consistent interpretation of aerial imagery of disaster damage in Vanuatu (and nearby island states). Note that Vanuatu is number one on the World Bank’s Risk Index of most disaster-prone countries. The imagery analysis guide has just been published (PDF) by the Signal Program on Human Security and Technology at HHI.

Big thanks to the HHI team for their work on this guide, and to my Bank colleagues and other reviewers for their detailed feedback on earlier drafts. The guide is another important step towards improving data quality for satellite and aerial imagery analysis in the context of damage assessments. Better data quality is also important for the use of Artificial Intelligence (AI) and computer vision, as explained here. If a humanitarian UAV mission does happen in response to the recent disaster in Fiji, the guide may also be of assistance there, depending on how similar the building materials and architecture are. For now, many thanks to HHI for producing this imagery guide.

Using Aerial Robotics and Virtual Reality to Inspect Earthquake Damage in Taiwan (Updated)

A tragic 6.4-magnitude earthquake struck southern Taiwan shortly before 4am on Saturday, February 6th. Later that day, aerial robots were used to capture videos and images of the disaster damage from the air.


Within 10 hours of the earthquake, Dean Hosp at Taiwan’s National Cheng Kung University used screenshots of aerial videos posted on YouTube by various media outlets to create a first 3D model of the damage. In other words, Dean worked from second-hand data, which is why the model is low resolution. Having the original imagery first-hand would enable a far higher-resolution 3D model. Says Dean: “If I can fly myself, results can produce more fine and faster.”
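
To give a sense of the workflow, here is a minimal sketch of the kind of pre-processing step involved: extracting still frames from a downloaded aerial video so they can be fed into a photogrammetry (structure-from-motion) tool. The filename and sampling rate are assumptions, and this is not Dean’s actual pipeline.

```python
# Sketch: pull still frames out of an aerial video for photogrammetry.
# Filenames and frame-sampling rate are hypothetical.
import os
import cv2  # pip install opencv-python

VIDEO = "tainan_aerial_news_clip.mp4"   # hypothetical local copy of the footage
EVERY_N_FRAMES = 15                     # ~2 frames/sec for 30 fps video

os.makedirs("frames", exist_ok=True)
cap = cv2.VideoCapture(VIDEO)
saved, idx = 0, 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    if idx % EVERY_N_FRAMES == 0:
        cv2.imwrite(f"frames/frame_{saved:05d}.jpg", frame)
        saved += 1
    idx += 1
cap.release()
print(f"Wrote {saved} frames for photogrammetry")
```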


Update: About 48 hours after the earthquake, Dean and his team used their own UAV to create a much higher-resolution version of the model, which they also annotated.



These 3D models were processed using AgiSoft PhotoScan and then uploaded to Sketchfab on the same day the earthquake struck. I’ve blogged about Sketchfab in the past—see this first-ever 3D model of a refugee camp, for example. A few weeks ago, Sketchfab added a Virtual Reality feature to their platform, so I just tried this out on the above model.


The model appears equally crisp when viewed in VR mode on a mobile device (using Google Cardboard in my case). Simply open this page on your mobile device to view the disaster damage in Virtual Reality; it works rather well.


This is a good first step vis-a-vis VR applications. As a second step, we need to develop 3D disaster ontologies to ensure that imagery analysts actually interpret 3D models in the same way. As a third step, we need to combine VR headsets with wearable technology that enables the end-user to annotate (or draw on) the 3D models directly within the same VR environment. This would make the damage assessment process more intuitive while also producing 3D training data for the purposes of machine learning—and thus automated feature detection.
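
As a thought experiment, the output of such a VR annotation tool might look something like the record below: a damage label paired with geometry in the model’s coordinate frame, which could later double as machine-learning training data. Every field name here is hypothetical.

```python
# Sketch: a possible record format for annotations drawn on a 3D model in VR.
# Field names and values are hypothetical; the point is simply to pair
# mesh-space geometry with a damage label for later machine learning.
import json

annotation = {
    "model_id": "tainan_collapsed_building_v2",   # hypothetical model ID
    "annotator": "analyst_03",
    "label": "partial_collapse",
    "polygon": [                                  # vertices in model coordinates (x, y, z)
        [12.4, 3.1, 0.8],
        [13.0, 3.1, 0.8],
        [13.0, 4.2, 0.9],
        [12.4, 4.2, 0.9],
    ],
    "confidence": 0.8,
}

# Append one annotation per line to a JSON Lines file.
with open("vr_annotations.jsonl", "a") as f:
    f.write(json.dumps(annotation) + "\n")
```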

I’m still actively looking for a VR platform that would enable this kind of in-VR annotation, so please do get in touch if you know of any group, company, research institute, etc., that would be interested in piloting the 3D analysis of disaster damage from the Taiwan or Nepal Earthquakes entirely within a VR solution. Thank you.

Click here to view 360° aerial panoramas of the disaster damage.


Many thanks to Sebastien Hodapp for pointing me to the Taiwan model.

UAV/Aerial Video of Gaza Destruction (updated)

Aerial footage captured by a small civilian UAV/drone shows the scale of the devastation caused by Israeli bombardment during the recent conflict:

Media Town, a Palestinian production company, flew its DJI Phantom 2 quadcopter with a GoPro Hero3+ camera over Gaza City’s eastern suburb of Al-Shejaiya just a few days ago. Al-Shejaiya saw some of the heaviest fighting of the conflict and bore the full force of Israel’s shelling in July. The footage is a short excerpt from a 40-minute aerial video captured in full high definition. You can also compare aerial footage taken before the shelling with post-bombardment footage in this edited video.


We will see a rapid increase in aerial footage of post-conflict and post-disaster areas as more local media companies around the world turn to UAVs to support their journalism. Humanitarian organizations are also exploring the use of UAVs to accelerate their damage assessment efforts following major disasters. The UN Office for the Coordination of Humanitarian Affairs (OCHA), for example, recently published this Policy Brief on UAVs and is also experimenting with a DJI Phantom 2 of its own.


My team & I at QCRI have thus launched this Crisis Map of Aerial Videos (which will soon include pictures) to collect disaster footage taken by UAVs across the globe. We have also developed a crowdsourcing platform called MicroMappers to make sense of aerial videos (and soon pictures). Eventually, we hope to combine this crowdsourced analysis of aerial imagery with automated methods. We also plan to integrate actionable content taken from aerial footage with social media reports from crisis areas.
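
For context, one common way to turn many volunteer tags into a single judgment per clip is simple majority voting with an agreement threshold. The sketch below illustrates the idea only; it is not MicroMappers’ actual aggregation logic.

```python
# Sketch (not MicroMappers' actual pipeline): consolidate crowdsourced tags on
# aerial video clips by majority vote, keeping only clips with enough agreement.
from collections import Counter

# Hypothetical raw tags: clip_id -> labels from different volunteers.
raw_tags = {
    "clip_001": ["severe_damage", "severe_damage", "mild_damage"],
    "clip_002": ["no_damage", "no_damage", "no_damage"],
    "clip_003": ["severe_damage", "no_damage"],
}

MIN_VOTES, MIN_AGREEMENT = 3, 0.66

for clip_id, labels in raw_tags.items():
    label, votes = Counter(labels).most_common(1)[0]
    if len(labels) >= MIN_VOTES and votes / len(labels) >= MIN_AGREEMENT:
        print(f"{clip_id}: {label} ({votes}/{len(labels)} volunteers agree)")
    else:
        print(f"{clip_id}: needs more volunteer review")
```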


See Also:

  • Crisis Map of UAV Videos for Disaster Response [link]
  • Official UN Policy Brief on Humanitarian UAVs [link]
  • Humanitarians in the Sky: Using UAVs for Disaster Response [link]
  • Humanitarian UAV Missions During Balkan Floods [link]
  • UAVs, Community Mapping & Disaster Risk Reduction in Haiti [link]

Live: Crowdsourced Crisis Map of UAV/Aerial Photos & Videos for Disaster Response (Updated)

Update: Crisis Map now includes features to post photos in addition to videos!

The latest version of the Humanitarian UAV Network’s Crisis Map of UAV/aerial photos & videos is now live on the Network’s website. The crowdsourced map already features dozens of aerial videos of recent disasters. Now users can also post aerial photographs of disaster-affected areas. Like the use of social media for emergency management, this new medium—user-generated (aerial) content—can be used by humanitarian organizations to complement their damage assessments and thus improve situational awareness.


The purpose of this Humanitarian UAV Network (UAViators) map is not only to provide humanitarian organizations and disaster-affected communities with an online repository of aerial information on disaster damage to augment their situational awareness; the crisis map also serves to raise awareness of how to safely & responsibly use small UAVs for rapid damage assessments. This is why users who upload new content to the map must confirm that they have read the UAViators Code of Conduct. They also have to confirm that their photos & videos conform to the Network’s mission and do not violate privacy or copyrights. In sum, the map seeks to crowdsource both aerial footage and critical thinking for the responsible use of UAVs in humanitarian settings.
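
The workflow described above implies some server-side validation before a post goes live. The sketch below shows one possible shape of that check; the field names are invented for illustration and are not the actual UAViators platform schema.

```python
# Hypothetical sketch of the validation implied by the submission workflow.
REQUIRED_CONFIRMATIONS = (
    "read_code_of_conduct",
    "content_fits_mission",
    "no_privacy_or_copyright_violation",
)

def validate_submission(submission: dict) -> list:
    """Return a list of problems; an empty list means the post can be published."""
    problems = [c for c in REQUIRED_CONFIRMATIONS if not submission.get(c)]
    if not submission.get("media_url"):
        problems.append("media_url missing")
    if submission.get("media_type") not in ("photo", "video"):
        problems.append("media_type must be 'photo' or 'video'")
    return problems

example = {
    "media_url": "https://example.org/aerial_clip.mp4",
    "media_type": "video",
    "read_code_of_conduct": True,
    "content_fits_mission": True,
    "no_privacy_or_copyright_violation": True,
}
print(validate_submission(example) or "submission accepted")
```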


As noted above, this is the first version of the map, which means several other features are currently in the works. These new features will be rolled out incrementally over the next weeks and months. In the meantime, feel free to suggest any features you’d like to see in the comments section below. Thank you.


See Also:

  • Humanitarian UAV Network: Strategy for 2014-2015 [link]
  • Humanitarians in the Sky: Using UAVs for Disaster Response [link]
  • Humanitarian UAV Missions During Balkan Floods [link]
  • Using UAVs for Disaster Risk Reduction in Haiti [link]
  • Using MicroMappers to Make Sense of UAV/Aerial Imagery During Disasters [link]

Rapid Disaster Damage Assessments: Reality Check

The Multi-Cluster/Sector Initial Rapid Assessment (MIRA) is the methodology used by UN agencies to assess and analyze humanitarian needs within two weeks of a sudden onset disaster. A detailed overview of the process, methodologies and tools behind MIRA is available here (PDF). These reports are particularly insightful when compared with the processes and methodologies used by digital humanitarians to carry out their own rapid damage assessments (typically done within 48-72 hours of a disaster).


Take the November 2013 MIRA report for Typhoon Haiyan in the Philippines. I am really impressed by how transparent the report is vis-à-vis the very real limitations behind the assessment. For example:

  • “The barangays [districts] surveyed do not constitute a representative sample of affected areas. Results are skewed towards more heavily impacted municipalities […].”
  • “Key informant interviews were predominantly held with barangay captains or secretaries and they may or may not have included other informants including health workers, teachers, civil and worker group representatives among others.”
  • “Barangay captains and local government staff often needed to make their best estimate on a number of questions and therefore there’s considerable risk of potential bias.”
  • “Given the number of organizations involved, assessment teams were not trained in how to administrate the questionnaire and there may have been confusion on the use of terms or misrepresentation on the intent of the questions.”
  • “Only in a limited number of questions did the MIRA checklist contain before and after questions. Therefore to correctly interpret the information it would need to be cross-checked with available secondary data.”

In sum: the data collected was not representative; the process of selecting interviewees was biased, since selection was based on a convenience sample; interviewees had to estimate (guesstimate?) the answers to several questions, introducing additional bias into the data; because assessment teams were not trained to administer the questionnaire, inter-coder reliability is limited, which in turn limits the ability to compare survey results; and the data still needs to be validated against secondary data.
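
On the inter-coder reliability point: when two teams code the same sites, their agreement can be quantified with a statistic such as Cohen’s kappa. The sketch below uses made-up labels purely to show the calculation.

```python
# Sketch: quantify inter-coder reliability with Cohen's kappa, assuming two
# assessment teams coded the same set of sites. The labels are made up.
from sklearn.metrics import cohen_kappa_score

team_a = ["severe", "severe", "moderate", "minor", "severe", "moderate"]
team_b = ["severe", "moderate", "moderate", "minor", "severe", "severe"]

kappa = cohen_kappa_score(team_a, team_b)
print(f"Cohen's kappa = {kappa:.2f}")  # 1.0 = perfect agreement, 0 = chance level
```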

I do not share the above to criticize, only to relay what the real world of rapid assessments looks like when you peek “under the hood.” What is striking is how similar these challenges are to those that digital humanitarians have been facing when carrying out rapid damage assessments. And yet, I distinctly recall rather pointed criticisms leveled by professional humanitarians against groups using social media and crowdsourcing for humanitarian response back in 2010 & 2011. These criticisms dismissed social media reports as unrepresentative, unreliable, fraught with selection bias, etc. (Some myopic criticisms continue to this day.) I find it rather interesting that many of the shortcomings attributed to crowdsourced social media reports are also true of traditional information collection methodologies like MIRA.

The fact is this: no data or methodology is perfect. The real world is messy, both off- and online. Being transparent about these limitations is important, especially for those who seek to combine both off- and online methodologies to create more robust and timely damage assessments.
