Category Archives: Humanitarian Technologies

The Planetary Response Network: Using Citizen Science and Next Generation Satellites to Accelerate Disaster Damage Assessments

Note: The Planetary Response Network was formerly called Zoomanitarians

The Planetary Response Network (PRN) has been in the works for well over a year, so we’re excited to be going fully public for the first time. PRN is a joint initiative between Zooniverse (Brooke Simmons), Planet Labs (Alex Bakir) and myself at QCRI. The purpose of PRN is to accelerate disaster damage assessments by leveraging Planet Labs’ unique constellation of 28 satellites and Zooniverse’s highly scalable microtasking platform. As I noted in this earlier post, digital volunteers from Zooniverse tagged well over 2 million satellite images (of Mars, below) in just 48 hours. So why not invite Zooniverse volunteers to tag millions of images taken by Planet Labs following major disasters (on Earth) to help humanitarians accelerate their damage assessments?

Zooniverse Planet 4

That was the question I posed to Brooke and Alex in early 2013. “Why not indeed?” was our collective answer. So we reached out to several knowledgeable colleagues of mine including Kate Chapman from Humanitarian OpenStreetMap and Lars Bromley from UNOSAT for their feedback and guidance on the idea.

We’ll be able to launch our first pilot project later this year thanks to Kate, who kindly provided us with very high-resolution UAV/aerial imagery of downtown Tacloban in the Philippines. Why do we want said imagery when the plan is to use Planet Labs imagery? Because Planet Labs imagery is currently available at 3-5 meter resolution, we’ll be “degrading” the resolution of the aerial imagery to determine just what level and type of damage can be captured at various resolutions, as compared to the imagery from Planet Labs. The pilot project will therefore serve to (1) customize and test The Planetary Response Network’s microtasking platform and (2) determine what level of detail can be captured at various resolutions.
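The "degrading" step amounts to resampling: averaging blocks of fine pixels so the image looks like it was captured at a coarser ground resolution. A minimal sketch of that idea in plain Python; real workflows would operate on geo-referenced rasters with GIS tooling, and the function name and factor here are illustrative assumptions, not our actual pipeline:

```python
def degrade(grid, factor):
    """Downsample a 2-D grid of pixel values by block-averaging.

    Averaging factor x factor blocks mimics viewing, say, 5 cm/pixel
    aerial imagery at 3 m/pixel (factor = 60): fine damage detail is
    lost, which is exactly what the pilot sets out to measure.
    """
    h, w = len(grid), len(grid[0])
    out = []
    for r in range(0, h - h % factor, factor):
        row = []
        for c in range(0, w - w % factor, factor):
            # Average every pixel in the factor x factor block
            block = [grid[r + i][c + j]
                     for i in range(factor) for j in range(factor)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out
```

Comparing volunteer tags on the original versus the degraded version of the same scene is what tells us which damage categories survive at 3-5 meter resolution.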

PlanetLabs

We’ll then spend the remainder of the year improving the platform based on the results of the pilot project, during which time I will continue to seek input from humanitarian colleagues. Zooniverse’s microtasking platform has already been stress-tested extensively over the years, which is one reason why I approached Zooniverse last year. The other reason is that they have over 1 million digital volunteers on their list-serve. Couple this with Planet Labs’ unique constellation of 28 satellites, and you’ve got the potential for near real-time satellite imagery analysis for disaster response. Our plan is to produce “heat maps” based on the results and to share shape files as well for overlay on other maps.
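A heat map of this kind boils down to binning volunteer tags into grid cells and counting. A hedged sketch, assuming tags arrive as simple (lat, lon) pairs; the function name and cell size are my own, and real output would be rendered as a raster or exported as shapefile polygons:

```python
from collections import Counter

def heatmap_counts(tags, cell_deg=0.01):
    """Aggregate volunteer damage tags (lat, lon) into grid-cell counts.

    Each cell is roughly 1 km across at the equator for cell_deg = 0.01.
    The resulting counts per cell are what a heat-map renderer colours.
    """
    counts = Counter()
    for lat, lon in tags:
        # Snap each tag to the grid cell containing it
        cell = (int(lat // cell_deg), int(lon // cell_deg))
        counts[cell] += 1
    return counts
```

Cells with many independent tags indicate likely damage clusters; agreement across volunteers doubles as a quality filter.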

It took imagery analysts well over 48 hours to acquire and analyze satellite imagery following Typhoon Yolanda. While Planet Labs imagery is not (yet) available at high-resolutions, our hope is that PRN will be able to acquire and analyze relevant imagery within 12-24 hours of a request. Several colleagues have confirmed to me that the results of this rapid analysis will also prove invaluable for subsequent, higher-resolution satellite imagery acquisition and analysis. On a related note, I hope that our rapid satellite-based damage assessments will also serve as a triangulation mechanism (ground-truthing) for the rapid social-media-driven damage assessments carried out using the Artificial Intelligence for Disaster Response (AIDR) platform & MicroMappers.

While much work certainly remains, and while The Planetary Response Network is still in the early phases of research and development, I’m nevertheless excited and optimistic about the potential impact—as are my colleagues Brooke and Alex. We’ll be announcing the date of the pilot later this summer, so stay tuned for updates!

Humanitarian UAV Network: Strategy for 2014-2015

The purpose of the Humanitarian UAV Network (UAViators) is to guide the responsible and safe use of small UAVs in humanitarian settings while promoting information sharing and enlightened policymaking. As I’ve noted in the past, UAVs are already being used to support a range of humanitarian efforts. So the question is not if, but rather how to facilitate the inevitable expanded use of UAVs in a responsible and safe manner. This is just one of many challenging questions that UAViators was created to manage.

UAViators Logo

UAViators has already drafted a number of documents, including a Code of Conduct and an Operational Check-List for the use of UAVs in humanitarian settings. These documents will continue to be improved throughout the year, so don’t expect a final and perfect version tomorrow. This space is still too new to have all the answers in a first draft. So our community will aim to improve these documents over time. By the end of 2014, we hope to have a solid version of the Code of Conduct for organizations and companies to publicly endorse.

OCHA UAV

In the meantime, my three Research Assistants (RAs) and I are working on a comprehensive evaluation (the first ever?) of 1) Small UAVs; 2) Cameras; 3) Payload Units; and 4) Imagery Software specifically for humanitarian field-workers. The purpose of this evaluation is to rate which technologies are best suited to the needs of humanitarians in the field. We will carry out this research through interviews with seasoned UAV experts coupled with secondary, online research. Our aim is to recommend 2-3 small UAVs, cameras, payload units and software solutions for imagery processing and analysis that make the most sense for humanitarians as end users. These suggestions will then be “peer reviewed” by members of the Humanitarian UAV Network.

Following this evaluation, my three RAs and I will create a detailed end-to-end operational model for the use of UAVs in humanitarian settings. The model will include pre-flight guidance on several key issues including legislation, insurance, safety and coordination. The pre-flight section will also include guidance on how to program the flight-path of the UAVs recommended in the evaluation. But the model does not end with the safe landing of a UAV. The operational model will also include post-flight guidance on imagery processing and analysis for decision support, as well as guidelines on information sharing with local communities. Once completed, this operational model will also be “peer reviewed” by members of UAViators.

Credit Drone Adventures

Both deliverables—the evaluation and model—will be further reviewed by the Advisory Board of UAViators and by field-based humanitarians. We hope to have this review completed during the Humanitarian UAV Experts Meeting, which I am co-organizing with OCHA in New York this November. Feedback from this session will be integrated into both deliverables.

Our plan is to subsequently convert these documents into training materials for both online and onsite training. We have thus far identified two sites for this training, one in Southeast Asia and the other in southern Africa. We’re considering a potential third site in South America depending on the availability of funding. These trainings will enable us to further improve our materials and to provide minimum level certification to humanitarians participating in said trainings. To this end, our long-term strategy for the Humanitarian UAV Network is not only to facilitate the coordination of small UAVs in humanitarian settings but also to provide both training and certification in collaboration with multiple humanitarian organizations.

I recognize that the above is highly ambitious. But all the signals I’m getting from humanitarian organizations clearly demonstrate that the above is needed. So if you have some expertise in this space and wish to join my Research Assistants and me in this applied and policy-focused research, then please do get in touch. In addition, if your organization or company is interested in funding any of the above, then do get in touch as well. We have the initial funding for the first phase of the 2014-2015 strategy and are in the process of applying for funding to complete the second phase.

One final but important point: while the use of many small and large UAVs in complex airspaces in which piloted (manned) aircraft are also flying poses a major challenge in terms of safety, collision avoidance and coordination, this obviously doesn’t mean that small UAVs should be grounded in humanitarian contexts with far simpler airspaces. Indeed, to argue that small UAVs cannot be responsibly and safely operated in simpler airspaces ignores the obvious fact that they already have been, and continue to be, responsibly and safely used. Moreover, I for one don’t see the point of flying small UAVs in areas already covered by larger UAVs and piloted aircraft. I’m far more interested in the rapid and local deployment of small UAVs to cover areas that are overlooked or have not yet been reached by mainstream response efforts. In sum, while it will take years to develop effective solutions for large UAV-use in dense and complex airspaces, small UAVs are already being used responsibly and safely by a number of humanitarian organizations and their partners.


See Also:

  • Welcome to the Humanitarian UAV Network [link]
  • How UAVs are Making a Difference in Disaster Response [link]
  • Humanitarians Using UAVs for Post Disaster Recovery [link]
  • Grassroots UAVs for Disaster Response [link]
  • Using UAVs for Search & Rescue [link]
  • Debrief: UAV/Drone Search & Rescue Challenge [link]
  • Crowdsourcing Analysis of UAV Imagery for Search/Rescue [link]
  • Check-List for Flying UAVs in Humanitarian Settings [link]

Using MicroMappers to Make Sense of UAV Imagery During Disasters

Aerial imagery will soon become a Big Data problem for humanitarian response, particularly oblique imagery. This was confirmed to me by a number of imagery experts in both the US (FEMA) and Europe (JRC). Aerial imagery taken at an angle is referred to as oblique imagery, in contrast to vertical imagery, which is taken by cameras pointing straight down (like satellite imagery). The team from Humanitarian OpenStreetMap (HOT) is already well equipped to make sense of vertical aerial imagery. They do this by microtasking the tracing of said imagery, as depicted below. So how do we rapidly analyze oblique images, which often provide more detail vis-a-vis infrastructure damage than vertical pictures?

HOTosm PH

One approach is to microtask the tagging of oblique images. This was carried out very successfully after Hurricane Sandy (screenshot below).

This solution did not include any tracing and was not designed to inform the development of machine learning classifiers that automatically identify features of interest, such as damaged buildings. Making sense of Big (Aerial) Data will ultimately require the combined use of human computing (microtasking) and machine learning. As volunteers trace features of interest, such as damaged buildings, in oblique aerial imagery, machine learning algorithms could learn to detect these features automatically, provided enough examples are available. There is obviously value in doing automated feature detection with vertical imagery as well. So my team and I at QCRI have been collaborating with a local Filipino UAV start-up (SkyEye) to develop a new “Clicker” for our MicroMappers collection. We’ll be testing the “Aerial Clicker” below with our Filipino partners this summer. Incidentally, SkyEye is on the Advisory Board of the Humanitarian UAV Network (UAViators).

Aerial Clicker

Aerial Clicker 2

SkyEye is interested in developing a machine learning classifier to automatically identify coconut trees, for example. Why? Because coconut trees are an important source of livelihood for many Filipinos. Being able to rapidly identify trees that are still standing versus uprooted would enable SkyEye to quickly assess the impact of a Typhoon on local agriculture, which is important for food security & long-term recovery. So we hope to use the Aerial Clicker to microtask the tracing of coconut trees in order to significantly improve the accuracy of the machine learning classifier that SkyEye has already developed.
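The basic loop here, volunteer traces in, classifier out, can be illustrated with a toy nearest-centroid learner. This is only a sketch under assumed inputs; SkyEye's actual classifier is surely more sophisticated, and the feature vectors and labels below are hypothetical:

```python
import math

def train_centroids(examples):
    """Nearest-centroid classifier trained on volunteer-traced patches.

    `examples` maps a label ("standing", "uprooted") to feature vectors
    extracted from traced image patches (e.g. mean colour, texture).
    This only illustrates how microtasked traces become training data.
    """
    centroids = {}
    for label, vectors in examples.items():
        n = len(vectors)
        # Centroid = per-dimension mean of that label's training vectors
        centroids[label] = [sum(v[i] for v in vectors) / n
                            for i in range(len(vectors[0]))]
    return centroids

def classify(centroids, vector):
    """Assign the label whose centroid is closest to the new patch."""
    return min(centroids, key=lambda lbl: math.dist(centroids[lbl], vector))
```

The more traced examples volunteers supply, the better the centroids (or any stronger learner) approximate the real classes, which is precisely why the Aerial Clicker feeds SkyEye's classifier.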

Will this be successful? One way to find out is by experimenting. I realize that developing a “visual version” of AIDR is anything but trivial. While AIDR was developed to automatically identify tweets (i.e., text) of interest during disasters by using microtasking and machine learning, what if we also had a free and open source platform to microtask and then automatically identify visual features of interest in both vertical and oblique imagery captured by UAVs? To be honest, I’m not sure how feasible this is vis-a-vis oblique imagery. As an imagery analyst at FEMA recently told me, this is still a research question for now. So I’m hoping to take this research on at QCRI but I do not want to duplicate any existing efforts in this space. So I would be grateful for feedback on this idea and any related research that iRevolution readers may recommend.

In the meantime, here’s another idea I’m toying with for the Aerial Clicker:

Aerial Clicker 3

I often see this in the aftermath of major disasters: affected communities turning to “analog social media” to communicate when cell phone towers are down. The aerial imagery above was taken following Typhoon Yolanda in the Philippines. And this is just one of several dozen images with analog media messages that I came across. So what if our Aerial Clicker were to ask digital volunteers to transcribe or categorize these messages? This would enable us to quickly create a crisis map of needs based on said content, since every image is already geo-referenced. Thoughts?
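Because each image is geo-referenced, the transcriptions map naturally onto a point layer. A minimal sketch of that idea, assuming volunteers supply (lat, lon, text, category) records; the schema and category names are hypothetical:

```python
import json

def messages_to_geojson(messages):
    """Turn transcribed 'analog social media' messages into GeoJSON.

    `messages` is a list of (lat, lon, text, category) tuples produced
    by digital volunteers; the output is a FeatureCollection that any
    web map can display as a crisis-map layer of needs.
    """
    features = [
        {
            "type": "Feature",
            # GeoJSON positions are [longitude, latitude]
            "geometry": {"type": "Point", "coordinates": [lon, lat]},
            "properties": {"message": text, "category": category},
        }
        for lat, lon, text, category in messages
    ]
    return json.dumps({"type": "FeatureCollection", "features": features})
```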


See Also:

  • Welcome to the Humanitarian UAV Network [link]
  • How UAVs are Making a Difference in Disaster Response [link]
  • Humanitarians Using UAVs for Post Disaster Recovery [link]
  • Grassroots UAVs for Disaster Response [link]
  • Using UAVs for Search & Rescue [link]
  • Debrief: UAV/Drone Search & Rescue Challenge [link]
  • Crowdsourcing Analysis of UAV Imagery for Search/Rescue [link]
  • Check-List for Flying UAVs in Humanitarian Settings [link]

Debrief: UAV/Drone Search and Rescue Challenge

I had the pleasure of helping co-organize the first UAV/Drone Search and Rescue Challenge in the DC Area last Saturday. This was the first time that members of the DC Area Drone User Group participated in an event like this, so it was an ideal opportunity for everyone involved to better understand how UAVs might be used in a real-world emergency in support of professional first responders. The challenge was held at the 65-acre MadCap Farm in The Plains, Virginia. For perspective, 65 acres is equal to about 30 full-size football (soccer) fields.

Madcap Farm

Satellite view of MadCap Farm above versus aerial view below during the UAV Search and Rescue Challenge.

Madcap Farm 2

Big thanks to our host and to Timothy Reuter who organized and ran the challenge; and of course many thanks indeed to all five teams who participated in the challenge. One colleague even flew in from Texas to compete in the event, which was sponsored by UAViators, the Humanitarian UAV Network. I described the rules of the challenge in this post but let me briefly summarize these here. Teams were notified of the following “alert” the night before the challenge:

“We have received reports of three lost campers in the vicinity of MadCap Farms. Local Search & Rescue professionals have requested our help to find them. Please report to the front field of MadCap no later than 9:15am for additional details on the campers and efforts to locate them. You will receive a laminated map of the area upon your arrival as well as a wax pen. We ask that you use your drones to identify objects that may help local responders determine where the campers are and ideally find the campers themselves. You will mark on the maps you receive what items you find, their color, and any people you identify. If any of the campers are trapped, you may need to deliver some form of medicine or other relief to them in advance of first responders being able to aid them in person.”

UAVteams

Upon reporting to the farm the following morning, the teams (pictured above) were notified that the campers were teenagers who were carrying sleeping bags and tents. In reality, our three lost campers were the cardboard stand-ups below, but Timothy had already hidden these and scattered their belongings by the time participants arrived at the farm. By the way, notice all the clouds in the picture above? This would have hampered efforts to use satellite imagery in the search and rescue efforts. UAVs, in contrast, fly below the cloud canopy and can provide far cheaper and more up-to-date imagery at far higher spatial resolutions than is available even from the best commercial satellites.

LostCampers2

As a side note, I was really pleased to see the Civil Air Patrol (CAP) at the Search and Rescue Challenge. The Civil Air Patrol is a federally supported non-profit volunteer-based organization that serves as the official civilian auxiliary of the US Air Force. CAP members share a passion for aviation and come from all backgrounds and walks of life.

UAV_CAP

Back to the Challenge. Each team had an hour to fly their UAVs and another hour to search through their aerial videos and/or images post-flight. Two of the five teams used fixed-wing UAVs, like the group below, which kicked off our Search & Rescue Challenge.

UAVgroup1

They decided to program their UAV for autonomous flight. You can see the flight path below with specified altitude and the different waypoints (numbers) in the top-right screen (click to enlarge).

UAVgroup1 autonomous

Here’s a short 20-second video of the hand-held launch of the fixed-wing UAV. Once airborne, the team simply switches to auto-pilot and the UAV does the rest, accurately following the pre-programmed flight path.
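Survey routes like this are typically "lawnmower" patterns: parallel passes spaced by the camera's ground footprint. A simplified generator, assuming a rectangular search area measured in local metres; the function and parameters are illustrative, not any autopilot's actual API:

```python
def lawnmower_waypoints(width_m, height_m, swath_m, altitude_m):
    """Generate a back-and-forth ('lawnmower') survey route over a
    rectangular area, as (x, y, altitude) waypoints in local metres.

    swath_m is the camera's ground footprint width, which fixes the
    spacing between passes so imagery from adjacent passes adjoins.
    """
    waypoints = []
    x, direction = 0.0, 1
    while x <= width_m:
        # Alternate pass direction so the UAV sweeps back and forth
        if direction == 1:
            waypoints += [(x, 0.0, altitude_m), (x, height_m, altitude_m)]
        else:
            waypoints += [(x, height_m, altitude_m), (x, 0.0, altitude_m)]
        x += swath_m
        direction *= -1
    return waypoints
```

A ground-station then uploads this list to the autopilot, which is what lets the pilot switch to auto and let the UAV "do the rest."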

The team decided to analyze their streaming aerial video in real-time, as you can observe in the second video below. While this certainly expedites the analysis and the search for the missing campers, it is also challenging since the team has to pivot back and forth between the live video and the flight path of the UAV in order to pin-point the location of a potential camper or their tent. Unlike rotary-wing UAVs, fixed-wing UAVs obviously cannot hover over one area but need to circle back to fly over the same area.

My colleague Michael and his co-pilot programmed a quadcopter to fly to designated waypoints at a specified altitude. They too used live-streaming to look for clues that could reveal the location of the three missing campers. But they also recorded the video-feed for later analysis, which proved far more effective at identifying said clues. In any event, they used First Person View (FPV) goggles to see exactly what the quadcopter’s camera was seeing, as depicted below.

FPV Goggles Quadcopter

In addition to searching for the whereabouts of the missing campers, Timothy and I decided to add a bit more focus on the “rescue” part of Search & Rescue. My colleague Euan kindly gave us a number of his new payload units, which are basically a pair of magnets that can be demagnetized by passing a small electric current through said magnets, thus acting as a release mechanism. Euan posted this short video of his prototype payload units in action during a first test earlier this year. Competing teams could earn additional points if they were able to carry a small payload (of their choice) and release this near any of the cardboard stand-ups they could find.

UAV payload unit

Some teams used Euan’s units while others used their own, like the device pictured above. Here’s a short video of a payload release (with parachute) during the competition.

At the end of the competition, we all came together for a full debrief and of course to count up points. Timothy asked each team to share what they felt went well and what some of the major challenges were. The picture below shows some of the items (sleeping bags, clothing, etc.) that were scattered around the farm.

UAV debrief

Perhaps the single biggest challenge was altitude. Given that we were in a valley surrounded by rolling hills, it was difficult for competing teams to judge at what altitude their UAVs should be programmed to fly since we couldn’t see over the next hill to determine whether there were taller trees in the distance.

UAVtrees

Flying too high would make it more difficult to identify the potential campers on the ground, while flying too low would mean running into trees. Unfortunately, two teams encountered the latter problem, but both UAVs were eventually recovered. This highlights the importance of developing automatic collision avoidance systems (ACAS) specifically for UAVs. In addition, if UAVs are to be used for Search and Rescue efforts in forested areas, it would pay to have a back-up quadcopter to rescue any UAVs caught in taller trees. One could attach a hanger to said quadcopter to unhook UAVs out of trees. The picture below was taken by a camera fixed to a quadcopter that hit the top of a tree. Yes, we all had a good laugh about the irony of sending UAVs to rescue other UAVs.

StuckQuadUAV

The debrief also revealed that most teams were able to find more items post-flight after going through their recorded video footage. My colleague Michael noted that finding signs of the campers was “like looking for a needle in a haystack.” One team noted that live video feeds have a limited range, which hampered their efforts. Another team remarked that one can never have enough batteries on hand. Indeed, wind conditions can very easily affect the endurance of UAV batteries, for example. The importance of pre-flight check-lists was reiterated, as was the need to clearly spell out safety protocols before a challenge.

UAViators Logo

I’ll be sharing this debrief and lessons learned with my humanitarian colleagues at the United Nations and the Red Cross, as well as with members of the Advisory Board of the Humanitarian UAV Network (UAViators). Keep in mind that groups like UNICEF, UNHCR and the UN Office for the Coordination of Humanitarian Affairs (OCHA) have not yet begun to experiment hands-on with UAVs to support their relief efforts, so all of the above will be very new to them, just as it was to most teams who participated in the challenge. So this kind of hands-on learning will be of interest to humanitarian groups looking to explore this space.

Counting Points UAVs

We counted up the points from each team’s map (like the one above) after the debrief and congratulated the winning team pictured below. They were the only team that found all three missing campers along with some of their belongings.

Winning UAV

Big thanks again to our hosts at MadCap Farm, to Timothy Reuter and all participants for spending a fun Saturday outdoors trying something new. We certainly learned some valuable lessons and in the process made new friends.

The short video above was produced by CCTV America, a news television channel that reported on the Search & Rescue Challenge.


Acknowledgements: Many thanks to Timothy Reuter and Michael Ender for their feedback on an earlier draft of this blog post.

See also:

  • How UAVs are Making a Difference in Disaster Response [link]
  • Humanitarians Using UAVs for Post Disaster Recovery [link]
  • Grassroots UAVs for Disaster Response [link]
  • Using UAVs for Search & Rescue [link]
  • Crowdsourcing Analysis of UAV Imagery for Search/Rescue [link]

An Introduction to Humanitarian UAVs and their Many Uses 

UAViators Logo

Satellite images have been used to support humanitarian efforts for decades. Why? A bird’s eye view of a disaster-affected area simply captures far more information than most Earth-based data-collection technologies can. In short, birds have more situational awareness than we do. Compared to satellite imagery, UAV imagery offers significantly higher resolution, is unobstructed by clouds, and can be captured more quickly, by more groups and more often, at a fraction of the cost and with far fewer licensing and data-sharing restrictions.

Introduction to UAVs

There are basically three types of UAVs: 1) the balloon/kite variety; 2) fixed-wing UAVs; and 3) rotary-wing UAVs. While my forthcoming book looks at humanitarian applications of each type, I’ll focus on fixed-wing and rotary-wing UAVs here since these are of greatest interest to humanitarian organizations. These types of UAVs differ from traditional remote control planes and helicopters because they are programmable and intelligent. UAVs can be programmed to take off, fly and land completely autonomously, for example. They often include intelligent flight stabilization features that adapt to changing wind speeds and other weather-related conditions. They also have a number of built-in fail-safe mechanisms; some of the newer UAVs even include automated collision avoidance systems.

Screen Shot 2014-05-01 at 5.32.11 AM

Fixed-wing UAVs like senseFly’s eBees (above) are launched by hand and can land on a wide variety of surfaces, requiring only a few meters of landing space. They fly autonomously along pre-programmed routes and also land themselves automatically. My colleague Adam from senseFly recently flew eBees to support recovery efforts in the Philippines following Typhoon Yolanda. Adam is also on the Advisory Board of the Humanitarian UAV Network (UAViators). Other fixed-wing UAVs are flown manually and require an airstrip for both manual take-off and landing. Rotary-wing UAVs, in contrast, are “helicopters” with three or more propellers. Quadcopters, for example, have four propellers, like the Huginn X1 below, which my colleague Liam Dawson, another Advisory Board member, flew following Typhoon Yolanda in the Philippines. One advantage of rotary-wing UAVs is that they take off and land vertically. They can also hover in one fixed place and be programmed to fly over designated waypoints.

Huginn x1

Rotary-wing UAVs cannot glide like their fixed-wing counterparts, which means their batteries get used up fairly quickly. So they can’t stay airborne for very long (~25 minutes, with a ~2 kilometer range, depending on the model) compared to fixed-wing UAVs like the eBee (~45 minutes, ~3 kilometers). Advisory Board member Shane Coughlan is designing fixed-wing humanitarian UAVs that will have a range of several hundred kilometers. Fixed-wing UAVs, however, cannot hover in one place over time. So both types of UAVs come with their advantages and disadvantages. Most UAV experts agree that fixed-wing and rotary-wing UAVs can serve complementary purposes, however. You can quickly use a quadcopter to do an initial aerial recon and then send a fixed-wing UAV for more detailed, high-resolution imagery capture.
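Endurance figures like these translate directly into how much ground a single flight can cover. A back-of-the-envelope estimate, with an assumed cruise speed, camera swath and a 30% overhead for turns, climb-out and landing (all illustrative numbers, not any vendor's specification):

```python
def survey_area_km2(speed_m_s, endurance_min, swath_m, overhead=0.3):
    """Rough upper bound on the area a UAV can survey in one flight.

    overhead discounts time spent on turns, climb-out and landing
    (assumed 30%). For example, a fixed-wing UAV cruising at 12 m/s
    with ~45 min endurance and a 200 m camera swath covers roughly
    4.5 square kilometers per flight under these assumptions.
    """
    usable_s = endurance_min * 60 * (1 - overhead)
    # Area = distance flown (speed x usable time) x swath width
    return speed_m_s * usable_s * swath_m / 1e6
```

Even a crude estimate like this helps decide whether one flight, or a fleet of UAVs with battery swaps, is needed to cover a disaster-affected area.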

Humanitarian Uses of UAVs

So what are some specific examples of humanitarian UAVs in action? Let’s take the response to Typhoon Yolanda, which saw an unprecedented number of UAVs in operation. Rotary-wing UAVs were used to support search and rescue efforts, looking for survivors amongst massive fields of debris left behind by the unprecedented storm surge (see below). More specifically, Liam captured very high-resolution imagery of large debris-riddled areas that could not be reached by car and then analyzed this imagery for any signs of trapped survivors. Most of this imagery was taken with an oblique, high-resolution camera; oblique simply means that the camera was not pointing straight down but at an angle.

Yolanda Storm Surge

These cameras can be set to capture continuous very high-resolution (VHR) video footage or to take VHR photographs multiple times a second or minute. Some cameras like GoPros can do both. (In the US, the majority of search and rescue operations supported by UAVs use fixed-wing UAVs like the Spectra, which carry down-facing vertical cameras. Advisory Board member Gene Robinson is one of the leading experts in UAV search & rescue missions in North America). Rotary-wing UAVs were also used in the wake of Yolanda to help clear roads. Again, very high-resolution (oblique) aerial imagery was used to determine which roads to prioritize for clearance and what equipment would be needed to clear said roads given the widely varying levels of debris. In the same way, aerial imagery was also used to identify sites for humanitarian organizations to set up their base-camps.

Fixed-wing UAVs like the eBee were used to survey disaster damage in Tacloban, with the resulting imagery uploaded to Humanitarian OpenStreetMap’s (HOT’s) Task Manager to trace up-to-date digital maps of the area. This is not the first time that HOT has used aerial imagery—the team used aerial imagery back in 2010 following the devastating Haiti earthquake. Beyond building damage, VHR aerial imagery can also be used to assess the impact of a disaster on powerlines, cell phone towers, agriculture and farmland.

CorePhil DSI

In addition, VHR images can also be used to estimate populations and the number of displaced persons. In Haiti, the International Organization for Migration (IOM) used UAVs to map over 40 square kilometers of dense urban territory, which included several shantytowns. The imagery was used to count the number of tents and to organize a “door-to-door” census of the population, an important step to identify needs and organize more permanent infrastructure.

A few months after Yolanda, the Swiss humanitarian organization Medair used fixed-wing UAVs to support their post-disaster recovery operations. They took hundreds of VHR aerial photographs to create very detailed 2D and 3D digital maps of Tacloban and Leyte. This required flying the UAV at the same altitude along a pre-programmed route. The UAV team that captured the images used the Pix4D software to stitch these together and create the 3D maps, which capture elevation, an important piece of information for disaster-risk analyses such as floods and storm surges. The VHR maps enabled Medair to identify, and thus advocate for, those residential areas that were falling behind vis-a-vis reconstruction and recovery. The imagery was also used by the local mayor’s office in Tacloban to identify appropriate, safe and dignified areas to resettle Filipinos who had been forced to live in informal shelters right after the Typhoon.
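Flying at one constant altitude keeps the ground sample distance (GSD), and hence the map scale, uniform across the whole survey, which is what makes the stitched mosaic consistent. The standard GSD relation, with illustrative camera numbers (assumptions for the example, not Medair's actual setup):

```python
def ground_sample_distance_cm(altitude_m, focal_mm, sensor_width_mm,
                              image_width_px):
    """Ground sample distance (cm per pixel) for a nadir photograph.

    GSD = (sensor width x altitude) / (focal length x image width).
    Halving the altitude halves the GSD, i.e. doubles the detail,
    at the cost of covering less ground per photo.
    """
    # Factor of 100 converts metres on the ground to centimetres
    return (sensor_width_mm * altitude_m * 100) / (focal_mm * image_width_px)
```

For instance, a camera with an assumed 10 mm wide sensor, 5 mm focal length and 4000-pixel images flown at 100 m yields a 5 cm/pixel map.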

To learn more about past and ongoing humanitarian UAV projects, please see the “Directory of Projects” on the Humanitarian UAV Network. If you know of other projects that are not listed in this directory, then please do add these directly to the document, thank you.

Other Potential Use Cases

There’s been increasing talk of using UAVs for transportation (small payloads of 1-5 kilos). Perhaps these payloads could carry flyers with important information for disaster-affected communities who are without electricity, radio access or cell phone coverage. Along those lines, some entrepreneurial groups in the US are starting to use UAVs for advertising purposes by flying banners. This may be another way to “communicate” with disaster-affected communities, although it would be limited to one-way communication. Obviously, payloads could also include satellite phones or first-aid kits, like the one below, which we were testing earlier this week for our upcoming UAV Search & Rescue Challenge.

UAVpayload

In addition, Google and Facebook are betting hundreds of millions of dollars that UAVs will provide remote areas with Internet connectivity (as is DARPA). This may come in handy when traditional communication infrastructure is affected after major disasters. Some telecommunications companies may follow in these footsteps; sending out a fleet of UAVs to serve as temporary “aerial cell phone towers” when their Earth-based towers go down. A related idea is to extend the concept of meshed mobile communication technologies (like those developed by The Serval Project) to aerial meshed communication networks.

UAVs can also carry a host of sensors other than a regular camera. In Haiti, UAV sensors were used to assess water quality and identify areas of standing water where mosquitoes and epidemics could thrive. Highly sensitive audio sensors can be used to listen for signs of trapped survivors. Other sensors can be used to identify whether radio transmitters and other ground-based communication facilities (like cell phone towers) still work. This use-case was brought to my attention earlier this year by a member of the UAViators Advisory Board.

Conclusion

UAVs can add value in a number of areas but are obviously not the solution to every problem. In many cases, the use of UAVs won’t be appropriate; and when all you have is a hammer, everything looks like a nail. So my advocacy around the use of humanitarian UAVs should not be taken to suggest that UAVs are the answer to any and all humanitarian problems; UAVs, like other technologies used in humanitarian settings, come with a host of advantages and disadvantages. As always, the key is to accurately identify and describe the challenge first, and then to assess which potential technology solutions are most appropriate, if any. Finally, flying UAVs is just part of the challenge. Coordination, safety, privacy, information sharing, imagery analysis, legislation and operational response are just some of the other challenges that the Humanitarian UAV Network is set up to address.

bio

See Also:

  • Welcome to the Humanitarian UAV Network [link]
  • How UAVs are Making a Difference in Disaster Response [link]
  • Humanitarians Using UAVs for Post Disaster Recovery [link]
  • Grassroots UAVs for Disaster Response [link]
  • Using UAVs for Search & Rescue [link]
  • Crowdsourcing Analysis of UAV Imagery for Search/Rescue [link]
  • Check-List for Flying UAVs in Humanitarian Settings [link]

Using AIDR to Collect and Analyze Tweets from Chile Earthquake

Wish you had a better way to make sense of Twitter during disasters than this?

Type in a keyword like #ChileEarthquake in Twitter’s search box above and you’ll see more tweets than you can possibly read in a day, let alone keep up with for more than a few minutes. Wish there were an easy, free and open source solution? Well, you’ve come to the right place. My team and I at QCRI are developing the Artificial Intelligence for Disaster Response (AIDR) platform to do just this. Here’s how it works:

First you login to the AIDR platform using your own Twitter handle (click images below to enlarge):

AIDR login

You’ll then see your collection of tweets (if you already have any). In my case, you’ll see I have three. The first is a collection of English language tweets related to the Chile Earthquake. The second is a collection of Spanish tweets. The third is a collection of more than 3,000,000 tweets related to the missing Malaysia Airlines plane. A preliminary analysis of these tweets is available here.

AIDR collections

Let’s look more closely at my Chile Earthquake 2014 collection (see below, click to enlarge). I’ve collected about a quarter of a million tweets in the past 30 hours or so. The label “Downloaded tweets (since last re-start)” simply refers to the number of tweets I’ve collected since adding a new keyword or hashtag to my collection. I started the collection yesterday at 5:39am my time (yes, I’m an early bird). Under “Keywords” you’ll see all the hashtags and keywords I’ve used to search for tweets related to the earthquake in Chile. I’ve also specified the geographic region I want to collect tweets from. Don’t worry, you don’t actually have to enter geographic coordinates when you set up your own collection; you simply highlight (on a map) the area you’re interested in and AIDR does the rest.

AIDR - Chile Earthquake 2014

You’ll also note in the above screenshot that I’ve chosen to collect only tweets in English, but you can collect tweets in all languages if you’d like, or just a select few. Finally, the Collaborators section simply lists the colleagues I’ve added to my collection. This gives them the ability to add new keywords/hashtags and to download the tweets collected as shown below (click to enlarge). More specifically, collaborators can download the most recent 100,000 tweets (and also share the link with others). The 100K tweet limit is based on Twitter’s Terms of Service (ToS). If collaborators want all the tweets, Twitter’s ToS allows for sharing the TweetIDs for an unlimited number of tweets.

AIDR download CSV
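The sharing rule above (full tweets capped at 100K, TweetIDs unlimited) can be sketched in a few lines of Python. This is purely illustrative, not AIDR’s actual export code; the `export` function and file layout are assumptions:

```python
import csv

TWEET_EXPORT_CAP = 100_000  # full-tweet export limit under Twitter's ToS


def export(tweets, csv_path, ids_path):
    """Write the most recent 100K full tweets to CSV, and all TweetIDs to a text file.

    tweets: list of dicts with "id" and "text" keys, oldest first.
    """
    recent = tweets[-TWEET_EXPORT_CAP:]
    with open(csv_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["tweet_id", "text"])
        writer.writerows((t["id"], t["text"]) for t in recent)
    # TweetIDs alone may be shared without limit; recipients can re-hydrate them
    with open(ids_path, "w") as f:
        f.writelines(f"{t['id']}\n" for t in tweets)
```

The asymmetry mirrors the ToS: the content of a tweet is capped, but the IDs, from which others can re-fetch the tweets themselves, are not.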

So that’s the AIDR Collector. We also have the AIDR Classifier, which helps you make sense of the tweets you’re collecting (in real-time). That is, your collection of tweets doesn’t stop, it continues growing, and as it does, you can make sense of new tweets as they come in. With the Classifier, you simply teach AIDR to classify tweets into whatever topics you’re interested in, like “Infrastructure Damage”, for example. To get started with the AIDR Classifier, simply return to the “Details” tab of our Chile collection. You’ll note the “Go To Classifier” button on the far right:

AIDR go to Classifier

Clicking on that button allows you to create a Classifier, say on the topic of disaster damage in general. So you simply create a name for your Classifier, in this case “Disaster Damage” and then create Tags to capture more details with respect to damage-related tweets. For example, one Tag might be, say, “Damage to Transportation Infrastructure.” Another could be “Building Damage.” In any event, once you’ve created your Classifier and corresponding tags, you click Submit and find your way to this page (click to enlarge):

AIDR Classifier Link

You’ll notice the public link for volunteers. That’s basically the interface you’ll use to teach AIDR. If you want to teach AIDR by yourself, you can certainly do so. You also have the option of “crowdsourcing the teaching” of AIDR. Clicking on the link will take you to the page below.

AIDR to MicroMappers

So, I called my Classifier “Message Contents,” which is not particularly insightful; I should have labeled it something like “Humanitarian Information Needs.” But bear with me and let’s click on that Classifier. This will take you to the following Clicker on MicroMappers:

MicroMappers Clicker

Now this is not the most awe-inspiring interface you’ve ever seen (at least I hope not); the reason being that this is simply our very first version. We’ll be providing different “skins,” like the official MicroMappers skin (below), as well as a skin that allows you to upload your own logo, for example. In the meantime, note that AIDR shows every tweet to at least three different volunteers. Only if all three volunteers agree on how to classify a given tweet does AIDR take that classification into consideration when learning. In other words, AIDR wants to ensure that humans are really sure about how to classify a tweet before it decides to learn from that lesson. Incidentally, the MicroMappers smartphone app for iPhone and Android will be available in the next few weeks. But I digress.

Yolanda TweetClicker4
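The three-volunteer agreement rule is a simple consensus check. Here is a minimal sketch of the idea (not AIDR’s internal code; the function name and the unanimity requirement as coded here are assumptions based on the description above):

```python
from collections import Counter


def consensus_label(votes, required=3):
    """Return a label only if enough volunteers voted and they all agree.

    votes: list of labels assigned by different volunteers to one tweet.
    Returns the agreed label, or None (meaning: don't learn from this tweet yet).
    """
    if len(votes) < required:
        return None  # wait for more volunteers
    label, count = Counter(votes).most_common(1)[0]
    # Only unanimous agreement becomes a training example
    return label if count == len(votes) else None


print(consensus_label(["damage", "damage", "damage"]))  # -> damage
print(consensus_label(["damage", "damage", "other"]))   # -> None
```

Requiring unanimity trades labeling speed for label quality: the machine only learns from examples humans were certain about.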

As you and/or your volunteers classify tweets based on the Tags you created, AIDR starts to learn, hence the AI (Artificial Intelligence) in AIDR. AIDR begins to recognize that all the tweets you classified as “Infrastructure Damage” are indeed similar. Once you’ve tagged enough tweets, AIDR will decide that it’s time to leave the nest and fly on its own. In other words, it will start to auto-classify incoming tweets in real time. (At present, AIDR can auto-classify some 30,000 tweets per minute; compare this to the peak rate of 16,000 tweets per minute observed during Hurricane Sandy.)

Of course, AIDR’s first solo “flights” won’t always go smoothly. But not to worry, AIDR will let you know when it needs a little help. Every tweet that AIDR auto-tags comes with a confidence level. That is, AIDR will tell you: “I am 80% sure that I correctly classified this tweet.” If AIDR has trouble with a tweet, i.e., if its confidence level is 65% or below, it will send the tweet to you (and/or your volunteers) so it can learn from how you classify that particular tweet. In other words, the more tweets you classify, the more AIDR learns, and the higher AIDR’s confidence levels get. Fun, huh?
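This confidence-based routing is a classic active-learning pattern: auto-tag when the model is sure, ask humans when it isn’t. A minimal sketch of the loop (the classifier and queues here are toy stand-ins, not AIDR internals):

```python
CONFIDENCE_THRESHOLD = 0.65  # at or below this, the tweet goes to humans


def route_tweet(tweet, classifier, human_queue, auto_tagged):
    """Auto-tag a tweet if the classifier is confident; otherwise ask volunteers."""
    label, confidence = classifier(tweet)
    if confidence > CONFIDENCE_THRESHOLD:
        auto_tagged.append((tweet, label, confidence))
    else:
        # Low confidence: humans classify it, and the model learns from them
        human_queue.append(tweet)


def toy_classifier(tweet):
    """Toy stand-in for AIDR's learned model."""
    return ("Infrastructure Damage", 0.8 if "bridge" in tweet else 0.5)


auto, humans = [], []
route_tweet("bridge collapsed downtown", toy_classifier, humans, auto)
route_tweet("feeling shaken, hope all ok", toy_classifier, humans, auto)
print(len(auto), len(humans))  # -> 1 1
```

Every human answer on a low-confidence tweet becomes a fresh training example, which is why confidence levels rise the more you classify.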

To view the results of the machine tagging, simply click on the View/Download tab, as shown below (click to enlarge). The page shows you the latest tweets that have been auto-tagged along with the Tag label and the confidence score. (Yes, this too is the first version of that interface; we’ll make it more user-friendly in the future, not to worry.) In any event, you can download the auto-tagged tweets in a CSV file and also share the download link with your colleagues for analysis and so on. At some point in the future, we hope to provide a simple data visualization output page so that you can easily see interesting data trends.

AIDR Results

So that’s basically all there is to it. If you want to learn more about how it all works, you might fancy reading this research paper (PDF). In the meantime, I’ll simply add that you can re-use your Classifiers. If (when?) another earthquake strikes Chile, you won’t have to start from scratch. You can auto-tag incoming tweets immediately with the Classifier you already have. Plus, you’ll be able to share your classifiers with your colleagues and partner organizations if you like. In other words, we’re envisaging an “App Store” of Classifiers based on different hazards and different countries. The more we re-use our Classifiers, the more accurate they will become. Everybody wins.

And voila, that is AIDR (at least our first version). If you’d like to test the platform and/or want the tweets from the Chile Earthquake, simply get in touch!

bio

Note:

  • We’re adapting AIDR so that it can also classify text messages (SMS).
  • AIDR Classifiers are language specific. So if you speak Spanish, you can create a classifier to tag all Spanish language tweets/SMS that refer to disaster damage, for example. In other words, AIDR does not only speak English : )

Welcome to the Humanitarian UAV Network

UAViators Logo

The Humanitarian UAV Network (UAViators) is now live. Click here to access and join the network. Advisors include representatives from 3D Robotics, AirDroids, senseFly & DroneAdventures, OpenRelief, ShadowView Foundation, ICT4Peace Foundation, the United Nations and more. The website provides a unique set of resources, including the most comprehensive case study of humanitarian UAV deployments, a directory of organizations engaged in the humanitarian UAV space and a detailed list of references to keep track of ongoing research in this rapidly evolving area. All of these documents, along with the network’s Code of Conduct (the only one of its kind), are easily accessible here.

UAViators 4 Teams

The UAViators website also includes 8 action-oriented Teams, four of which are displayed above. The Flight Team, for example, includes both new and highly experienced UAV pilots while the Imagery Team comprises members interested in imagery analysis. Other teams include the Camera, Legal and Policy Teams. In addition to this Team page, the site also has a dedicated Operations page to facilitate & coordinate safe and responsible UAV deployments in support of humanitarian efforts. In between deployments, the website’s Global Forum is a place where members share information about relevant news, events and more. One such event, for example, is the upcoming Drone/UAV Search & Rescue Challenge that UAViators is sponsoring.

When first announcing this initiative, I duly noted that launching such a network would at first raise more questions than answers, but I welcome the challenge and believe that members of UAViators are well placed to facilitate the safe and responsible use of UAVs in a variety of humanitarian contexts.

Acknowledgements: Many thanks to colleagues and members of the Advisory Board who provided invaluable feedback and guidance in the lead-up to this launch. The Humanitarian UAV Network is the result of a collective vision and effort.

bio

See also:

  • How UAVs are Making a Difference in Disaster Response [link]
  • Humanitarians Using UAVs for Post Disaster Recovery [link]
  • Grassroots UAVs for Disaster Response [link]
  • Using UAVs for Search & Rescue [link]
  • Crowdsourcing Analysis of UAV Imagery for Search and Rescue [link]

Humanitarians Using UAVs for Post Disaster Recovery

I recently connected with senseFly’s Adam Klaptocz who founded the non-profit group DroneAdventures to promote humanitarian uses of UAVs. I first came across Adam’s efforts last year when reading about his good work in Haiti, which demonstrated the unique role that UAV technology & imagery can play in post-disaster contexts. DroneAdventures has also been active in Japan and Peru. In the coming months, the team will also be working on “aerial archeology” projects in Turkey and Egypt. When Adam emailed me last week, he and his team had just returned from yet another flying mission, this time in the Philippines. I’ll be meeting up with Adam in a couple weeks to learn more about their recent adventures. In the meantime, here’s a quick recap of what they were up to in the Philippines this month.

MedAir

Adam and team snapped hundreds of aerial images using their “eBee drones” to create a detailed set of 2D maps and 3D terrain models of the disaster-affected areas where partner Medair works. This is the first time that the Swiss humanitarian organization Medair is using UAVs to inform their recovery and rehabilitation programs. They plan to use the UAV maps & models of Tacloban and hard-hit areas in Leyte to assist in assessing “where the greatest need is” and what level of “assistance should be given to affected families as they continue to recover” (1). To this end, having accurate aerial images of these affected areas will allow the Swiss organization to “address the needs of individual households and advocate on their behalf when necessary” (2). 

ebee

An eBee Drone also flew over Dulag, north of Leyte, where more than 80% of the homes and croplands were destroyed following Typhoon Yolanda. Medair is providing both materials and expertise to build new shelters in Dulag. As one Medair representative noted during the UAV flights, “Recovery from a disaster of this magnitude can be complex. The maps produced from the images taken by the drones will give everyone, including community members themselves, an opportunity to better understand not only where the greatest needs are, but also their potential solutions” (3). The partners are also committed to Open Data: “The images will be made public for free online, enabling community leaders and humanitarian organizations to use the information to coordinate reconstruction efforts” (4). The pictures of the Philippines mission below were very kindly shared by Adam who asked that they be credited to DroneAdventures.

Credit: DroneAdventures

At the request of the local Mayor, DroneAdventures and MedAir also took aerial images of a relatively undamaged area some 15 kilometers north of Tacloban, which is where the city government is looking to relocate families displaced by Typhoon Yolanda. During the deployment, Adam noted that “Lightweight drones such as the eBee are safe and easy to operate and can provide crucial imagery at a precision and speed unattainable by satellite imagery. Their relatively low cost of deployment make the technology attainable even by small communities throughout the developing world. Not only can drones be deployed immediately following a disaster in order to assess damage and provide detailed information to first-responders like Medair, but they can also assist community leaders in planning recovery efforts” (5). As the Medair rep added, “You can just push a button or launch them by hand to see them fly, and you don’t need a remote anymore—they are guided by GPS and are inherently safe” (6).

Credit: DroneAdventures

I really look forward to meeting up with Adam and the DroneAdventures team at the senseFly office in Lausanne next month to learn more about their recent work and future plans. I will also be asking the team for their feedback and guidance on the Humanitarian UAV Network (UAViators) that I am launching. So stay tuned for updates!

Bio

See also:

  • Calling All UAV Pilots: Want to Support Humanitarian Efforts? [link]
  • How UAVs are Making a Difference in Disaster Response [link]
  • Grassroots UAVs for Disaster Response (in the Philippines) [link]

 

Launching a Search and Rescue Challenge for Drone / UAV Pilots

My colleague Timothy Reuter (of AirDroids fame) kindly invited me to co-organize the Drone/UAV Search and Rescue Challenge for the DC Drone User Group. The challenge will take place on May 17th near Marshall in Virginia. The rules for the competition are based on the highly successful Search/Rescue challenge organized by my new colleague Chad with the North Texas Drone User Group. We’ll pretend that a person has gone missing by scattering various clues, such as pieces of clothing and personal effects, over a wide area. Competitors will use their UAVs to collect imagery of the area and will have 45 minutes after flying to analyze the imagery for clues. The full set of rules for our challenge is listed here but may change slightly as we get closer to the event.

searchrescuedrones

I want to try something new with this challenge. While previous competitions have focused exclusively on the use of drones/UAVs for the “Search” component of the challenge, I want to introduce the option of also engaging in the “Rescue” part. How? If UAVs identify a missing person, then why not provide that person with immediate assistance while waiting for the Search and Rescue team to arrive on site? The UAV could drop a small, lightweight first aid kit, a small water bottle, or even a small walkie-talkie. Enter my new colleague Euan Ramsay, who has been working on a UAV payloader solution for Search and Rescue; see the video demo below. Euan, who is based in Switzerland, has very kindly offered to share several payloader units for our UAV challenge. So I’ll be meeting up with him next month to take the units back to DC for the competition.

Another area I’d like to explore for this challenge is the use of crowdsourcing to analyze the aerial imagery & video footage. As noted here, the University of Central Lancashire used crowdsourcing in their UAV Search and Rescue pilot project last summer. This innovative “crowdsearching” approach is also being used to look for Malaysia Flight 370 that went missing several weeks ago. I’d really like to have this crowdsourcing element be an option for the DC Search & Rescue challenge.

UAV MicroMappers

My team and I at QCRI have developed a platform called MicroMappers, which can easily be used to crowdsource the analysis of UAV pictures and videos. The United Nations (OCHA) used MicroMappers in response to Typhoon Yolanda last year to crowdsource the tagging of pictures posted on Twitter. Since then we’ve added video tagging capability. So one scenario for the UAV challenge would be for competitors to upload their imagery/videos to MicroMappers and have digital volunteers look through this content for clues of our fake missing person.

In any event, I’m excited to be collaborating with Timothy on this challenge and will share updates on iRevolution on how all this pans out.

bio

See also:

  • Using UAVs for Search & Rescue [link]
  • Crowdsourcing Analysis of UAV Imagery for Search and Rescue [link]
  • How UAVs are Making a Difference in Disaster Response [link]
  • Grassroots UAVs for Disaster Response [link]

Grassroots UAVs for Disaster Response

I was recently introduced to a new initiative that seeks to empower grassroots communities to deploy their own low-cost xUAVs. The purpose of this initiative? To support locally-led disaster response efforts and in so doing transfer math, science and engineering skills to local communities. The “x” in xUAV refers to expendable. The initiative is a partnership between California State University (Long Beach), University of Hawaii, Embry Riddle, The Philippine Council for Industry, Energy & Emerging Technology Research & Development, Skyeye, Aklan State University and Ateneo de Manila University in the Philippines. The team is heading back to the Philippines next week for their second field mission. This blog post provides a short overview of the project’s approach and the results from their first mission, which took place during December 2013-February 2014.

xUAV1

The xUAV team is specifically interested in a new category of UAVs, those that are locally available, locally deployable, low-cost, expendable and extremely easy to use. Their first field mission to the Philippines focused on exploring the possibilities. The pictures above/below (click to enlarge) were kindly shared by the Filipinos engaged in the project—I am very grateful to them for allowing me to share these publicly. Please do not reproduce these pictures without their written permission, thank you.

xUAV2

I spoke at length with one of the xUAV team leads, Ted Ralston, who is heading back to the Philippines for the second field mission. The purpose of this follow-up visit is to shift the xUAV concept from experimental to deployable. One area that his students will be focusing on with Ateneo de Manila University is the development of a very user-friendly interface (using a low-cost tablet) to pilot the xUAVs, so that local communities can simply tag way-points on a map that the xUAV will then automatically fly to. Indeed, this is where civilian UAVs are headed: full automation. A good example of this trend is the new DroidPlanner 2.0 app just released by 3DRobotics. This free app provides powerful features to very easily plan autonomous flights. You can even create new flight plans on the fly and edit them onsite.

DroidPlanner.png

So the xUAV team will focus on developing software for automated take-off and landing as well as automated adjustments for wind conditions when the xUAV is airborne, etc. The software will also automatically adjust the xUAV’s flight parameters for any added payloads. Any captured imagery would then be made easily viewable via touch-screen directly from the low-cost tablet.
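At its core, the tap-to-fly interface described above reduces to translating map taps into an ordered waypoint mission with automated takeoff and landing bookending the route. A minimal sketch of that data flow (all names, coordinates and the plan format are hypothetical illustrations, not the xUAV team’s code):

```python
from dataclasses import dataclass, field


@dataclass
class Waypoint:
    lat: float
    lon: float
    alt_m: float = 50.0  # default survey altitude


@dataclass
class Mission:
    waypoints: list = field(default_factory=list)

    def add_tap(self, lat, lon, alt_m=50.0):
        """Each tap on the tablet's map becomes the next waypoint in the route."""
        self.waypoints.append(Waypoint(lat, lon, alt_m))

    def to_plan(self):
        """Flight plan: auto takeoff, visit waypoints in order, auto land."""
        return ["TAKEOFF"] + [
            f"GOTO {wp.lat:.5f},{wp.lon:.5f} alt={wp.alt_m}m"
            for wp in self.waypoints
        ] + ["LAND"]


m = Mission()
m.add_tap(11.2433, 125.0036)  # illustrative coordinates near Tacloban
m.add_tap(11.2550, 125.0100)
print(m.to_plan())
```

Everything below this layer, such as stabilization, wind correction and payload-dependent flight parameters, is what the autopilot software handles, which is precisely why a non-expert operator only ever needs to tap the map.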

xUAV3

One of the team’s top priorities throughout this project is to transfer their skills to young Filipinos, giving them hands-on training in science, math and engineering. An equally important and related priority is their focus on developing local partnerships with multiple partners. We’re familiar with the ideas behind Public Participatory GIS (PPGIS) vis-a-vis the participatory use of geospatial information systems and technologies. The xUAV team seeks to extend this grassroots approach to Public Participatory UAVs.

xUAV4

I’m supporting this xUAV initiative in a number of ways and will be uploading the team’s UAV imagery (videos & still photos) from their upcoming field mission to MicroMappers for some internal testing. I’m particularly interested, however, in user-generated (aerial) content that is raw rather than pre-processed or stitched together. Why? Because I expect this type of imagery to grow in volume given the very rapid growth of the personal micro-UAV market. For more professionally produced and stitched-together aerial content, an ideal platform is Humanitarian OpenStreetMap’s Tasking Server, which is tried and tested for satellite imagery and which was recently used to trace processed UAV imagery of Tacloban.

Screen Shot 2014-03-12 at 1.03.20 PM

I look forward to following the xUAV team’s efforts and hope to report on the outcome of their second field mission. The xUAV initiative fits very nicely with the goals of the Humanitarian UAV Network (UAViators). We’ll be learning a lot in the coming weeks and months from our colleagues in the Philippines.

bio