Category Archives: Drones/UAVs

MicroMappers: Towards Next Generation Humanitarian Technology

The MicroMappers platform has come a long way and still has a ways to go. Our vision for MicroMappers is simple: combine human computing (smart crowdsourcing) with machine computing (artificial intelligence) to filter, fuse and map a variety of different data types such as text, photo, video and satellite/aerial imagery. To do this, we have created a collection of “Clickers” for MicroMappers. Clickers are simply web-based crowdsourcing apps used to make sense of “Big Data”. The “Text Clicker” is used to filter tweets and SMS; the “Photo Clicker” to filter photos; the “Video Clicker” to filter videos; and yes, the Satellite and Aerial Clickers to filter both satellite and aerial imagery. These are the Data Clickers. We also have a collection of Geo Clickers that digital volunteers use to geo-tag the tweets, photos and videos filtered by the Data Clickers. Note that these Geo Clickers automatically display the results of the crowdsourced geo-tagging on our MicroMaps like the one below.

MM Ruby Tweet Map

Thanks to our Artificial Intelligence (AI) engine AIDR, the MicroMappers “Text Clicker” already combines human and machine computing. This means that tweets and text messages can be automatically filtered (classified) after some initial crowdsourced filtering. The filtered tweets are then pushed to the Geo Clickers for geo-tagging purposes. We want to do the same (semi-automation) for photos posted to social media as well as videos; although this is still a very active area of research and development in the field of computer vision.
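
For readers curious about what this hybrid workflow looks like in practice, here is a minimal, purely illustrative sketch of how crowdsourced labels from the Text Clicker could bootstrap an automatic tweet classifier. This is not AIDR's actual code or architecture; the model, features and label names are my own assumptions for the example.

```python
# A minimal human-in-the-loop sketch: volunteers label a seed batch of tweets,
# a classifier is trained on those labels, and the remaining stream is filtered
# automatically. Illustrative only; not AIDR's actual architecture or features.
from sklearn.feature_extraction.text import HashingVectorizer
from sklearn.linear_model import SGDClassifier

vectorizer = HashingVectorizer(n_features=2**16, alternate_sign=False)
classifier = SGDClassifier(loss="log_loss")

def train_on_crowd_labels(tweets, labels):
    """labels come from the Text Clicker, e.g. 'relevant' / 'not relevant'."""
    X = vectorizer.transform(tweets)
    classifier.partial_fit(X, labels, classes=["relevant", "not relevant"])

def auto_filter(tweets, threshold=0.8):
    """Keep only tweets the model is confident are relevant; the rest can be
    routed back to volunteers for another round of clicking."""
    X = vectorizer.transform(tweets)
    probs = classifier.predict_proba(X)
    relevant_idx = list(classifier.classes_).index("relevant")
    return [t for t, p in zip(tweets, probs[:, relevant_idx]) if p >= threshold]

# Example: seed labels from volunteers, then machine filtering takes over.
train_on_crowd_labels(
    ["Bridge collapsed near the river", "Great weather today"],
    ["relevant", "not relevant"],
)
print(auto_filter(["Road to the hospital is flooded", "Lunch was amazing"]))
```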

So we are prioritizing our next hybrid human-machine computing efforts on aerial imagery instead. Just like the “Text Clicker” above, we want to semi-automate feature detection in aerial imagery by adding an AI engine to the “Aerial Clicker”. We’ve just started to explore this with computer vision experts in Switzerland and Canada. Another development we’re eyeing vis-à-vis UAVs is live video streaming. To be sure, UAVs will increasingly transmit live video feeds directly to the web. This means we may eventually need to develop a “Streaming Clicker”, which would in some respects resemble our existing “Video Clicker” except that the video would be broadcast live rather than played back from YouTube, for example. The “Streaming Clicker” will come later, however, or at least not until a prospective partner organization approaches us with an immediate and compelling social innovation use case.

In the meantime, my team & I at QCRI will continue to improve our maps (data visualizations) along with the human computing component of the Clickers. The MicroMappers smartphone apps, for example, need more work. We also need to find partners to help us develop apps for tablets like the iPad. In addition, we’re hoping to create a “Translate Clicker” with Translators Without Borders (TWB). The purpose of this Clicker would be to rapidly crowdsource the translation of tweets, text messages, etc. This could open up rather interesting possibilities for machine translation, which is certainly an exciting prospect.

MM All Map

Ultimately, we want to have one and only one map to display the data filtered via the Data and Geo Clickers. This map, using (Humanitarian) OpenStreetMap as a base layer, would display filtered tweets, SMS’s, photos, videos and relevant features from satellite and UAV imagery. Each data type would simply be a different layer on this fused “Meta-Data Crisis Map”; and end-users would simply turn individual layers on and off as needed. Note also the mainstream news feeds (CNN and BBC) depicted in the above image. We’re working with our partners at UN/OCHA, GDELT & SBTF to create a “3W Clicker” to complement our MicroMap. As noted in my forthcoming book, GDELT is the ultimate source of data for the world’s digitized news media. The 3Ws refers to Who, What, Where; an important spreadsheet that OCHA puts together and maintains in the aftermath of major disasters to support coordination efforts.

In response to Typhoon Ruby in the Philippines, Andrej Verity (OCHA) and I collaborated with Kalev Leetaru from GDELT to explore how the MicroMappers “3W Clicker” might work. The result is the Google Spreadsheet below (click to enlarge) that is automatically updated every 15 minutes with the latest news reports that refer to one or more humanitarian organizations in the Philippines. GDELT includes the original URL of the news article as well as the list of humanitarian organizations referenced in the article. In addition, GDELT automatically identifies the locations referred to in the articles, key words (tags) and the date of the news article. The spreadsheet below is already live and working. So all we need now is the “3W Clicker” to crowdsource the “What”.
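
To illustrate the kind of plumbing involved, here is a hypothetical sketch of a 15-minute polling loop that pulls a news feed and keeps only the articles mentioning organizations of interest in the Philippines. The feed URL, column names and organization list below are placeholders, not GDELT's actual schema.

```python
# Hypothetical polling sketch: fetch a CSV feed of recent articles, keep the
# rows that mention watched humanitarian organizations and the Philippines,
# and hand them on (in practice: append to the shared spreadsheet).
import csv
import io
import time
import urllib.request

FEED_URL = "https://example.org/latest-articles.csv"   # placeholder URL
WATCHED_ORGS = {"UNICEF", "WFP", "IOM", "Red Cross"}    # placeholder list

def fetch_matching_articles():
    with urllib.request.urlopen(FEED_URL) as resp:
        reader = csv.DictReader(io.TextIOWrapper(resp, encoding="utf-8"))
        for row in reader:
            orgs = {o.strip() for o in row.get("organizations", "").split(";")}
            if orgs & WATCHED_ORGS and "Philippines" in row.get("locations", ""):
                yield {
                    "date": row.get("date"),
                    "url": row.get("url"),
                    "who": sorted(orgs & WATCHED_ORGS),
                    "where": row.get("locations"),
                    "tags": row.get("tags"),
                }

def poll_forever(interval_seconds=15 * 60):
    while True:
        for article in fetch_matching_articles():
            print(article)          # placeholder for the spreadsheet update
        time.sleep(interval_seconds)
```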

MM GDELT output

The first version of the mock-up we’ve created for the “3W Clicker” is displayed below. Digital volunteers are presented with an interface that includes a news article with the names of humanitarian organizations highlighted in red for easy reference. GDELT auto-populates the URL, the organization name (or names if there is more than one) and the location. Note that both the “Who” and “Where” information can be edited directly by the volunteer in case GDELT’s automated algorithm gets those wrong. The main role of digital volunteers, however, would simply be to identify the “What” by quickly skimming the article.

MM 3W Clicker v2

The output of the “3W Clicker” would simply be another MicroMap layer. As per Andrej’s suggestion, the resulting data could also be automatically pushed to another Google Spreadsheet in HXL format. We’re excited about the possibilities and plan to move forward on this sooner rather than later. In addition to GDELT, pulling in feeds from CrisisNET may be worth exploring. I’m also really keen on exploring ways to link up with the Global Disaster Alert & Coordination System (GDACS) as well as GeoFeedia.
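
For the technically inclined, here is a rough sketch of what pushing 3W Clicker output as HXL-tagged rows could look like: a human-readable header row, a hashtag row, then one row per record. The specific hashtags and attributes an OCHA workflow would expect are assumptions on my part; see the HXL standard for the authoritative list.

```python
# Sketch of writing 3W records as an HXL-tagged CSV. The hashtag choices
# (#org, #activity, #loc, #date, #meta+url) are illustrative assumptions.
import csv

HEADER = ["Organisation", "Activity", "Location", "Date", "Source URL"]
HXL_TAGS = ["#org", "#activity", "#loc", "#date", "#meta+url"]

def write_3w_hxl(records, path="3w_output.csv"):
    """records: dicts with 'who', 'what', 'where', 'date', 'url' keys,
    i.e. the fields GDELT pre-fills plus the crowdsourced 'what'."""
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(HEADER)
        writer.writerow(HXL_TAGS)
        for r in records:
            writer.writerow([r["who"], r["what"], r["where"], r["date"], r["url"]])

# Example record with a hypothetical article URL.
write_3w_hxl([{
    "who": "WFP",
    "what": "Food distribution",
    "where": "Tacloban",
    "date": "2014-12-08",
    "url": "http://example.org/article",
}])
```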

In the meantime, we’re hoping to pilot our “Satellite Clicker” thanks to recent conversations with Planet Labs and SkyBox Imaging. Overlaying user-generated content such as tweets and images on top of both satellite and aerial imagery can go a long way to helping verify (“ground truth”) social media during disasters and other events. This is evidenced by recent empirical studies such as this one in Germany and this one in the US. On this note, as my QCRI colleague Heather Leson recently pointed out, the above vision for MicroMappers is still missing one important data feed; namely sensors—the Internet of Things. She is absolutely spot on, so we’ll be sure to look for potential pilot projects that would allow us to explore this new data source within MicroMappers.

The above vision is a tad ambitious (understatement). We really can’t do this alone. To this end, please do get in touch if you’re interested in joining the team and getting MicroMappers to the next level. Note that MicroMappers is free and open source and in no way limited to disaster response applications. Indeed, we recently used the Aerial Clicker for this wildlife protection project in Namibia. This explains why our friends over at National Geographic have also expressed an interest in potentially piloting the MicroMappers platform for some of their projects. And of course, one need not use all the Clickers for a project, simply the one(s) that make sense. Another advantage of MicroMappers is that the Clickers (and maps) can be deployed very rapidly (since the platform was initially developed for rapid disaster response purposes). In any event, if you’d like to pilot the platform, then do get in touch.


See also: Digital Humanitarians – The Book

Digital Jedis: There Has Been An Awakening…

New: List of Software for UAVs and Aerial Imagery

My research team and I at the Humanitarian UAV Network (UAViators) have compiled a list of more than 30 common software platforms used to operate UAVs and analyze resulting aerial imagery. We carried out this research to provide humanitarian organizations with a single repository where they can review existing software platforms (including free & open source solutions) for their humanitarian UAV missions. The results, available here, provide a brief description of each software platform along with corresponding links for additional information and download. We do realize that this list is not (yet) comprehensive, so we hope you’ll help us fill remaining gaps. This explains why we’ve made our research available as an open, editable Google Doc.

UAV software

Many thanks to my research assistant Peter Mosur for taking the lead on this. We have additional research documents available here on the UAViators website.


See also:

  • Humanitarian UAV Network: Strategy for 2014-2015 [link]
  • Humanitarians in the Sky: Using UAVs for Disaster Response [link]
  • Low-Cost UAV Applications for Post-Disaster Damage Assessments: A Streamlined Workflow [link]

Humanitarian UAV/Drones in Conflict Zones: Fears, Concerns and Opportunities

My research team and I at the Humanitarian UAV Network (UAViators) have compiled a list of fears and concerns expressed by humanitarians and others on the use of UAVs in humanitarian settings. To do this, we closely reviewed well over 50 different documents, reports, articles, etc., on humanitarian UAVs as part of this applied research project. The motivation behind this research is to better understand the different and overlapping concerns that humanitarian organizations have over the use of non-lethal drones in crises, particularly crises mired in violent conflict.

Research Table

The results of this research are available in this open & editable spreadsheet and summarized in the table above. We identified a total of 9 different categories of concerns and tallied the unique instances in which these appear in the official humanitarian reports, articles, papers, studies, etc., that we reviewed. The top 3 concerns are: Military Association, Data Privacy and Consent. We very much welcome feedback, so feel free to get in touch via the comments section below and/or add additional content directly to the spreadsheet. This research will feed into an upcoming workshop that my colleague Kristin Sandvik (on the Advisory Board of UAViators) and I are looking to organize in the Spring of 2015. The workshop will address the most pressing issues around the use of civilian UAVs in conflict zones.

I tend to believe that UAV initiatives like the Syria Airlift Project (video above) can play a positive role in conflict settings. In fact, I wrote about this exact use of UAVs back in 2008 for PeaceWork Magazine (scroll down) and referred to previous (conventional) humanitarian airlift examples, from the Berlin Airlift in the 1940s to the Biafran Airlift in the 1960s, as a basis for remotely piloted aircraft systems. As such, I suggested that UAVs could be used in Burma at the time to transport relief supplies in response to the complex emergency. While fraught with risks, these risks can at times be managed when approached with care, integrity and professionalism, especially if a people-centered, community-based approach is taken, one that prioritizes both safety and direct empowerment.

While some humanitarians may be categorically against any and all uses of non-lethal UAVs in conflict zones regardless of the circumstances, the fact is that their opinions won’t prevent affected communities and others from using UAVs anyway. More and more individuals will have access to cheaper and cheaper UAVs in the months and years ahead. As a UN colleague noted with respect to the Syria Airlift Project, initiatives like these may well be a sign of things to come. This sentiment is also shared by my colleague Jules Frost at World Vision. See her recent piece entitled: “Eyes in the Sky are Inevitable: UAVs and Humanitarian Response.”


Acknowledgements: Many thanks to my research assistants Peter Mosur and Jus Mackinnon for taking the lead in this research.

See also:

  • On UAVs for Peacebuilding and Conflict Prevention [link]
  • The Use of Drones for Nonviolent Civil Resistance [link]
  • Drones for Human Rights: Brilliant or Foolish [link]

UN Experts Meeting on Humanitarian UAVs

Updated: The Experts Meeting Summary Report is now available here (PDF) and also here as an open, editable Google Doc for comments/questions.

The Humanitarian UAV Network (UAViators) and the United Nations Office for the Coordination of Humanitarian Affairs (OCHA) are co-organizing the first ever “Experts Meeting on Humanitarian UAVs” on November 6th at UN Headquarters in New York. This full-day strategy meeting, which is co-sponsored by the ICT for Peace Foundation (ICT4Peace) and QCRI, will bring together leading UAV experts (including several members of the UAV Network’s Advisory Board, such as DJI) with seasoned humanitarian professionals from OCHA, WFP, UNICEF, UNHCR, UNDAC, IOM, American Red Cross, European Commission and several other groups that are also starting to use civilian UAVs or have a strong interest in leveraging this technology.

The strategy session, which I’ll be running with my colleague Dan Gilman from OCHA (who authored this Policy Brief on Humanitarian UAVs), will provide an important opportunity for information sharing between UAV experts and humanitarian professionals with the explicit goal of catalyzing direct collaboration on the operational use of UAVs in humanitarian settings. UAV experts seek to better understand humanitarian information needs (e.g. UNDAC needs) while humanitarians seek to better understand the challenges and opportunities regarding the rapid deployment of UAVs. In sum, this workshop will bring together 30 experts from different disciplines to pave the way forward for the safe and effective use of humanitarian UAVs.


The Experts Meeting will include presentations from select participants such as Gene Robinson (leading expert in the use of UAVs for Search & Rescue), Kate Chapman (director of Humanitarian OpenStreetMap), Peter Spruyt (European Commission’s Joint Research Center), Jacob Petersen (Anthea Technologies), Charles Devaney (University of Hawaii), Adam Klaptocz (Drone Adventures & senseFly) and several others. Both Matternet and Google’s Project Wing have been formally invited to present on the latest in UAV payload transportation. (Representatives from the Small UAV Coalition have also been invited to attend).

In addition to the above, the strategy meeting will include dedicated sessions on Ethics, Legislation and Regulation facilitated by Brendan Schulman (a leading UAV lawyer) and Kristin Sandvik (Norwegian Center for Humanitarian Studies). Other sessions are expected to focus on Community Engagement, Imagery Analysis as well as Training and Certification. The final session of the day will be dedicated to identifying potential joint pilot projects between UAV pros and humanitarian organizations as well as the Humanitarian UAV Network.

UAViators Logo

We will be writing up a summary of the Experts Meeting and making this report publicly available via the Humanitarian UAV Network website. In addition, we plan to post videos of select talks given during the strategy meeting along with accompanying slides. This first meeting at UN Headquarters serves as a springboard for two future strategy meetings scheduled for 2015. One of these will be a three-day, high-level and policy-focused international workshop on Humanitarian UAVs, which will be held at the Rockefeller Foundation’s Center in Bellagio, Italy (pictured below in a UAV/aerial image I took earlier this year). This workshop will be run by myself, Dan Gilman and Kristin Sandvik (both of whom are on the Advisory Board of the Humanitarian UAV Network).

[Aerial image of the Rockefeller Foundation’s Bellagio Center]

Kristin and I are also looking to co-organize another workshop in 2015 to focus specifically on the use of non-lethal UAVs in conflict zones. We are currently talking to prospective donors to make this happen. So stay tuned for more information on all three Humanitarian UAV meetings as one of our key goals at the Humanitarian UAV Network is to raise awareness about humanitarian UAVs by publicly disseminating results & findings from key policy discussions and UAV missions. In the meantime, big thanks to UN/OCHA, ICT4Peace and the Rockefeller Foundation for their crucial and most timely support.


See also:

  • Humanitarians in the Sky: Using UAVs for Disaster Response [link]
  • Low-Cost UAV Applications for Post-Disaster Damage Assessments: A Streamlined Workflow [link]
  • Humanitarian UAVs Fly in China After Earthquake [link]
  • Humanitarian UAV Missions During Balkan Floods [link]
  • Humanitarian UAVs in the Solomon Islands [link]
  • UAVs, Community Mapping & Disaster Risk Reduction in Haiti [link]

Low-Cost UAV Applications for Post-Disaster Assessments: A Streamlined Workflow

Colleagues Matthew Cua, Charles Devaney and others recently co-authored this excellent study on their latest use of low-cost UAVs/drones for post-disaster assessments, environmental development and infrastructure development. They describe the “streamlined workflow—flight planning and data acquisition, post-processing, data delivery and collaborative sharing,” that they created “to deliver acquired images and orthorectified maps to various stakeholders within [their] consortium” of partners in the Philippines. They conclude from direct hands-on experience that “the combination of aerial surveys, ground observations and collaborative sharing with domain experts results in richer information content and a more effective decision support system.”

[Figure: Aerial imaging workflow developed by the co-authors]

“The rapid development of unmanned aerial vehicle (UAV) technology has enabled greater use of UAVs as remote sensing platforms to complement satellite and manned aerial remote sensing systems.” As such, UAVs have become “an effective tool for targeted remote sensing operations in areas that are inaccessible to conventional manned aerial platforms due to logistic and human constraints.” The figure above (click to enlarge) depicts the aerial imaging workflow developed by the co-authors to generate and disseminate post-processed images. This workflow, the main components of which are “Flight Planning & Data Acquisition,” “Data Post-Processing” and “Data Delivery,” will “continuously be updated, with the goal of automating more activities in order to increase processing speed, reduce cost and minimize human error.”

[Figure: UAV flight plan of the coastal section of Tacloban city, generated with APM Mission Planner]

Flight Planning simply means developing a flight plan based on clearly defined data needs. The screenshot above (click to enlarge) is a “UAV flight plan of the coastal section of Tacloban city, Leyte generated using APM Mission Planner. The [flight] plan involved flying a small UAV 200 meters above ground level. The raster scan pattern indicated by the yellow line was designed to take images with 80% overlap & 75% side overlap. The waypoints indicating a change in direction of the UAV are shown as green markers.” The purpose of the overlap is to stitch and accurately geo-reference the images during post-processing. A video on how to program UAV flights is available here; this video specifically focuses on post-disaster assessments in the Philippines.
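
As a back-of-the-envelope illustration of the geometry behind such a flight plan, the sketch below computes the ground footprint of a single photo and the spacing needed to achieve 80% forward and 75% side overlap at a given altitude. The camera parameters are illustrative assumptions, not the team's actual setup.

```python
def photo_footprint_m(altitude_m, sensor_w_mm, sensor_h_mm, focal_mm):
    """Ground footprint (width, height) of a nadir photo, simple pinhole model."""
    return (sensor_w_mm / focal_mm * altitude_m,
            sensor_h_mm / focal_mm * altitude_m)

def waypoint_spacing(altitude_m, sensor_w_mm, sensor_h_mm, focal_mm,
                     forward_overlap=0.80, side_overlap=0.75):
    """Photo trigger distance along track and spacing between flight lines,
    assuming the sensor's long side lies across track (an assumption here)."""
    w, h = photo_footprint_m(altitude_m, sensor_w_mm, sensor_h_mm, focal_mm)
    return {
        "footprint_m": (round(w, 1), round(h, 1)),
        "trigger_distance_m": round(h * (1 - forward_overlap), 1),
        "line_spacing_m": round(w * (1 - side_overlap), 1),
    }

# Example: 200 m above ground with an assumed 7.6 x 5.7 mm sensor and 6 mm lens.
print(waypoint_spacing(200, 7.6, 5.7, 6.0))
# -> roughly a 253 x 190 m footprint, ~38 m between photos, ~63 m between lines
```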

“Once in the field, the team verifies the flight plans before the UAV is flown by performing a pre-flight survey [which] may be done through ground observations of the area, use of local knowledge or short range aerial observations with a rotary UAV to identify launch/recovery sites and terrain characteristics. This may lead to adjustment in the flight plans. After the flight plans have been verified, the UAV is deployed for data acquisition.”

[Figure: Components of the team’s custom fixed-wing UAV]

Matthew, Charles and team initially used a Micropilot MP-Vision UAV for data acquisition. “However, due to increased cost of maintenance and significant skill requirements of setting up the MP-Vision,” they developed their own custom UAV instead, which “uses semi-professional and hobby-grade components combined with open-source software” as depicted in the above figure (click to enlarge). “The UAV’s airframe is the Super SkySurfer fixed-wing EPO foam frame.” The team used the “ArduPilot Mega (APM) autopilot system consisting of an Arduino-based microprocessor board, airspeed sensor, pressure and temperature sensor, GPS module, triple-axis gyro and other sensors. The firmware for navigation and control is open-source.”

The custom UAV, which costs approximately $2,000, has “an endurance of about 30-50 minutes, depending on payload weight and wind conditions, and is able to survey an area of up to 4 square kilometers.” The custom platform was “easier to assemble, repair, maintain, modify & use. This allowed faster deployability of the UAV. In addition, since the autopilot firmware is open-source, with a large community of developers supporting it, it became easier to identify and address issues and obtain software updates.” That said, the custom UAV was “more prone to hardware and software errors, either due to assembly of parts, wiring of electronics or bugs in the software code.” Despite these drawbacks, “use of the custom UAV turned out to be more feasible and cost effective than use of a commercial-grade UAV.”
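
As a rough sanity check on the quoted coverage figure, the snippet below estimates how much area a fixed-wing UAV can survey per flight from its endurance, cruise speed and the spacing between flight lines (ignoring turns and transit). The cruise speed and line spacing are my assumptions, not published specifications.

```python
def survey_area_km2(endurance_min, cruise_speed_mps, line_spacing_m):
    """Area ~ total track length (speed x endurance) x spacing between lines."""
    track_length_m = cruise_speed_mps * endurance_min * 60
    return track_length_m * line_spacing_m / 1e6

# e.g. 40 minutes at ~15 m/s with ~60 m between flight lines:
print(round(survey_area_km2(40, 15, 60), 1))   # -> about 2.2 km^2 per flight
```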

In terms of payloads (cameras), three different kinds were used: Panasonic Lumix LX3, Canon S100, and GoPro Hero 3. These cameras come with both advantages and disadvantages for aerial mapping. The LX3 has better image quality but the servo triggering the shutter would often fail. The S100 is GPS-enabled and does not require mechanical triggering. The Hero-3 was used for video reconnaissance specifically.

[Figure: Sample orthomosaic created from 785 images taken during two UAV flights]

“The workflow at [the Data-Processing] stage focuses on the creation of an orthomosaic—an orthorectified, georeferenced and stitched map derived from aerial images and GPS and IMU (inertial measurement unit values, particularly yaw, pitch and roll) information.” In other words, “orthorectification is the process of stretching the image to match the spatial accuracy of a map by considering location, elevation, and sensor information.”

Transforming aerial images into orthomosaics involves: (1) manually removing take-off/landing, blurry and oblique images; (2) applying contrast enhancement to images that are either over- or under-exposed using commercial image-editing software; (3) geo-referencing the resulting images; and (4) creating an orthomosaic from the geo-tagged images. The geo-referencing step is not needed if the images are already geo-referenced (i.e., have GPS coordinates), like those taken with the Canon S100. “For non-georeferenced images, georeferencing is done by a custom Python script that generates a CSV file containing the mapping between images and GPS/IMU information. In this case, the images are not embedded with GPS coordinates.” The sample orthomosaic above uses 785 images taken during two UAV flights (click to enlarge).
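
Here is a minimal sketch of the kind of georeferencing script described above: match each image, by capture time, to the nearest autopilot log entry and write out a CSV of image name, position and attitude for the photomapping step. This is not the team's actual script; the log format and field names are assumptions.

```python
import bisect
import csv

def load_log(path):
    """Autopilot log exported as CSV: timestamp, lat, lon, alt, yaw, pitch, roll."""
    entries = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            entries.append((float(row["timestamp"]), row))
    entries.sort(key=lambda e: e[0])
    return entries

def nearest_log_entry(entries, t):
    """Return the log row whose timestamp is closest to image capture time t."""
    times = [e[0] for e in entries]
    i = min(bisect.bisect_left(times, t), len(entries) - 1)
    if i > 0 and abs(times[i - 1] - t) < abs(times[i] - t):
        i -= 1
    return entries[i][1]

def write_geotag_csv(images, log_entries, out_path="geotags.csv"):
    """images: list of (filename, capture_timestamp) pairs."""
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["image", "lat", "lon", "alt", "yaw", "pitch", "roll"])
        for name, t in images:
            e = nearest_log_entry(log_entries, t)
            writer.writerow([name, e["lat"], e["lon"], e["alt"],
                             e["yaw"], e["pitch"], e["roll"]])
```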

Matthew, Charles and team used the Pix4Dmapper photo-mapping software developed by Pix4D to render their orthomosaics. “The program can use either geotagged or non-geotagged images. For non-geotagged images, the software accepts other inputs such as the CSV file generated by the custom Python script to georeference each image and generate the photomosaic. Pix4D also outputs a report containing information about the output, such as total area covered and ground resolution. Quantum GIS, an open-source GIS software, was used for annotating and viewing the photomosaics, which can sometimes be too large to be viewed using common photo viewing software.”

[Figure: A rendered orthomosaic displayed in the VEDA platform]

Data Delivery involves uploading the orthomosaics to a common, web-based platform that stakeholders can access. Orthomosaics “generally have large file sizes (e.g., around 300 MB for a 2 sq. km. render),” so the team created a web-based geographic information system (GIS) to facilitate sharing of aerial maps. “The platform, named VEDA, allows viewing of rendered maps and adding metadata. The key advantage of using this platform is that the aerial imagery data is located in one place & can be accessed from any computer with a modern Internet browser. Before orthomosaics can be uploaded to the VEDA platform, they need to be converted into an appropriate format supported by the platform. The current format used is MBTiles developed by Mapbox. The MBTiles format specifies how to partition a map image into smaller image tiles for web access. Once uploaded, the orthomosaic map can then be annotated with additional information, such as markers for points of interest.” The screenshot above (click to enlarge) shows the layout of a rendered orthomosaic in VEDA.
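
For those wondering what the MBTiles step involves, an MBTiles file is essentially a SQLite database with a metadata table and a tiles table keyed by zoom, column and row. The sketch below only shows how pre-cut PNG tiles might be packed for upload; cutting the orthomosaic into tiles would normally be done with a dedicated tiling tool, and all paths, coordinates and metadata values here are placeholders.

```python
import sqlite3

def create_mbtiles(path, name, bounds="124.9,11.2,125.1,11.3"):
    """Create an empty MBTiles file: a SQLite DB with 'metadata' and 'tiles' tables."""
    db = sqlite3.connect(path)
    db.execute("CREATE TABLE metadata (name TEXT, value TEXT)")
    db.execute("CREATE TABLE tiles (zoom_level INTEGER, tile_column INTEGER, "
               "tile_row INTEGER, tile_data BLOB)")
    db.executemany("INSERT INTO metadata VALUES (?, ?)", [
        ("name", name), ("format", "png"), ("bounds", bounds), ("type", "overlay"),
    ])
    return db

def add_tile(db, zoom, col, row, png_bytes):
    # The MBTiles spec stores tile rows in the TMS scheme (Y axis flipped
    # relative to the usual XYZ web-map addressing).
    db.execute("INSERT INTO tiles VALUES (?, ?, ?, ?)",
               (zoom, col, row, sqlite3.Binary(png_bytes)))

db = create_mbtiles("tacloban_ortho.mbtiles", "Tacloban orthomosaic")
# In practice png_bytes would be read from the tile cutter's output files
# (e.g. tiles/16/54123/38456.png); a dummy payload keeps the sketch self-contained.
add_tile(db, 16, 54123, 38456, b"\x89PNG placeholder")
db.commit()
```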

Matthew, Charles and team have applied the above workflow in various mission-critical UAV projects in the Philippines including damage assessment work after Typhoon Haiyan in 2013. This also included assessing the impact of the Typhoon on agriculture, which was an ongoing concern for local government during the recovery efforts. “The coconut industry, in particular, which plays a vital role in the Philippine economy, was severely impacted due to millions of coconut trees being damaged or flattened after the storm hit. In order to get an accurate assessment of the damage wrought by the typhoon, and to make a decision on the scale of recovery assistance from national government, aerial imagery coupled with a ground survey is a potentially promising approach.”

So the team received permission from local government to “fly several missions over areas in Eastern Visayas that [were] devoted to coconut stands prior to Typhoon Haiyan.” (As such, “The UAV field team operated mostly in rural areas and wilderness, which reduced the human risk factor in case of aircraft failure. Also, as a safety guideline, the UAV was not flown within 3 miles from an active airport”). The partners in the Philippines are developing image processing techniques to distinguish “coconut trees from wild forest and vegetation for land use assessment and carbon source and sink estimates. One technique involved use of superpixel classification, wherein the image pixels are divided into homogeneous regions (i.e. collection of similar pixels) called superpixels which serve as the basic unit for classification.”
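
As an illustration of the superpixel idea, the sketch below segments an image with SLIC and assigns each superpixel a label from a simple per-region feature. The toy "greenness" rule stands in for a trained classifier and is not the partners' actual method; the synthetic input is a stand-in for a patch of the orthomosaic.

```python
import numpy as np
from skimage import segmentation

# Stand-in for a patch of the orthomosaic; in practice this would be loaded
# with skimage.io.imread("ortho_patch.jpg") or similar.
image = np.random.randint(0, 255, size=(200, 300, 3), dtype=np.uint8)

# Divide the image into roughly homogeneous regions (superpixels).
labels = segmentation.slic(image, n_segments=400, compactness=10, start_label=1)

def classify_superpixels(image, labels):
    """Assign each superpixel a class from a simple per-region colour feature."""
    classes = {}
    for sp in np.unique(labels):
        mask = labels == sp
        r, g, b = image[mask].reshape(-1, 3).mean(axis=0)
        # Toy rule: strongly green regions -> vegetation / possible canopy.
        classes[int(sp)] = "vegetation" if g > r and g > b else "other"
    return classes

print(classify_superpixels(image, labels))
```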

[Figure: Aerial image of an area containing coconut trees]

The image below shows the “results of the initial test run,” in which the areas containing coconut trees (pictured above) have been segmented.

[Figure: Initial superpixel segmentation results, with coconut tree areas segmented]

“Similar techniques could also be used for crop damage assessment after a disaster such as Typhoon Haiyan, where for example standing coconut trees could be distinguished from fallen ones in order to determine capacity to produce coconut-based products.” This is an area that my team and I at QCRI are exploring in partnership with Matthew, Charles and company. In particular, we’re interested in assessing whether crowdsourcing can be used to facilitate the development of machine learning classifiers for image feature detection. More on this here, here and on CNN here. In addition, since “aerial imagery augmented with ground observations would provide a richer source of information than either one could provide alone,” we are also exploring the integration of social media data with aerial imagery (as described here).

In conclusion, Matthew, Charles and team are looking to further develop the above framework by automating more processes, “such as image filtering and image contrast enhancement. Autonomous take-off & landing will be configured for the custom UAV in order to reduce the need for a skilled pilot. A catapult system will be created for the UAV to launch in areas with a small clearing and a parachute system will be added in order to reduce the risk of damage due to belly landings.” I very much look forward to following the team’s progress and to collaborating with them on imagery analysis for disaster response.


See Also:

  • Official UN Policy Brief on Humanitarian UAVs [link]
  • Common Misconceptions About Humanitarian UAVs [link]
  • Humanitarians in the Sky: Using UAVs for Disaster Response [link]
  • Humanitarian UAVs Fly in China After Earthquake [link]
  • Humanitarian UAV Missions During Balkan Floods [link]
  • Humanitarian UAVs in the Solomon Islands [link]
  • UAVs, Community Mapping & Disaster Risk Reduction in Haiti [link]

Using UAVs to Estimate Crowd Populations

Russian friends of mine recently pointed me to a very interesting computer vision program called “White Counter”. The purpose of this mysterious-sounding algorithm is to automatically count people in a dense crowd from video footage. The program was developed in early 2012 by two experts in computer vision and artificial intelligence: Anatoliy Katz and Igor Khuraskin. I recently spoke with Anatoliy to learn more about his work, given potential applications for counting refugees and internally displaced persons using UAVs.

He and Igor created their “White Counter” in order to counter government figures on the number of protestors who join demonstrations. As is typical, the Kremlin will always downplay the numbers. So Anatoliy and Igor used insights from fluid dynamics to create their algorithm, measuring average speed of flow as well as density, for example. Note that “White Counter” is not a fully automated solution. The algorithm requires manual counts every 30 seconds in order to estimate overall crowd figures. But the results of the algorithm are impressive: the error margin at this point is less than 3%. Anatoliy and Igor used their algorithm during the “March of Millions” on September 15, 2012 (video above). Their code is Python-based and open source, so if you’re interested in experimenting with it, simply email whitecounter2012@gmail.com.
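
Based on the description above, here is my own rough reconstruction of the flow-based counting idea, not White Counter's actual code: treat the marching crowd like a fluid passing a fixed line and integrate density times speed times corridor width over time, with the periodic manual counts supplying the density calibration.

```python
def estimate_crowd_total(intervals, corridor_width_m):
    """intervals: list of dicts with
       'duration_s'     -- length of the interval (e.g. 30 seconds),
       'avg_speed_mps'  -- mean flow speed from video (e.g. optical flow),
       'density_per_m2' -- people per square metre, from the manual count."""
    total = 0.0
    for iv in intervals:
        flow_rate = iv["density_per_m2"] * iv["avg_speed_mps"] * corridor_width_m
        total += flow_rate * iv["duration_s"]   # people passing the line
    return round(total)

# Example: a 10 m wide street, three 30-second intervals (illustrative numbers).
print(estimate_crowd_total([
    {"duration_s": 30, "avg_speed_mps": 0.8, "density_per_m2": 1.5},
    {"duration_s": 30, "avg_speed_mps": 0.7, "density_per_m2": 1.8},
    {"duration_s": 30, "avg_speed_mps": 0.9, "density_per_m2": 1.2},
], corridor_width_m=10))
```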

My colleague Austin Choi-Fitzpatrick and his team are also working on a similar challenge. They too are interested in estimating the size of social movements. As Austin rightly notes, “Establishing the size of a protest event is critical for social movements as they signal their legitimacy to the media, to the general public & as they demonstrate their strength to the authorities that they’re challenging.” But the methods we use to estimate how large a protest is haven’t changed in more than half-a-century. So Austin & team are looking to update these methods using UAVs and aerial imagery. They take a given image, identify the total area covered by protestors, then slice up the image into a grid of micro-images. They then assess the density level of the crowd in each micro-image. The video below introduces the project and research in more detail.
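
A stripped-down sketch of that grid-based approach might look like the following: each micro-image gets a density category, and the estimate is the sum of category density times the ground area each cell covers. The density values per category are illustrative assumptions, not figures from Austin's project.

```python
DENSITY_PER_M2 = {"empty": 0.0, "light": 0.5, "medium": 2.0, "dense": 4.0}

def estimate_from_grid(cell_labels, cell_ground_area_m2):
    """cell_labels: 2D list of density categories, one per micro-image."""
    people = 0.0
    for row in cell_labels:
        for label in row:
            people += DENSITY_PER_M2[label] * cell_ground_area_m2
    return round(people)

# Example: a 3x4 grid where each micro-image covers 25 m x 25 m on the ground.
grid = [
    ["dense", "dense", "medium", "light"],
    ["dense", "medium", "light", "empty"],
    ["medium", "light", "empty", "empty"],
]
print(estimate_from_grid(grid, cell_ground_area_m2=625))
```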

Perhaps in the near future humanitarian UAVs will be able to draw on these advances in computer vision to assess refugee populations and the number of displaced persons following major disasters. In the meantime, we can use simple crowdsourcing solutions like MicroMappers to estimate populations. There is a precedent for applying crowdsourcing to compute population counts; see this UN Refugee Project, for example. But if you know of any related work other than the JRC’s efforts that draws on automated techniques, then please let me know, thank you!


See Also:

  • The Use of Drones for Nonviolent Civil Resistance [link]
  • Drones for Human Rights: Brilliant or Foolish? [link]
  • Crisis Map of UAV Videos for Disaster Response [link]
  • Official UN Policy Brief on Humanitarian UAVs [link]

Humanitarian UAVs in the Solomon Islands

The Solomon Islands experienced heavy rains and flash floods following Tropical Cyclone Ita earlier this year. Over 50,000 people were affected and dozens killed, according to ReliefWeb. Infrastructure damage was extensive; entire houses were washed away and thousands lost their food gardens.

[Image: Flash flooding in the Solomon Islands]

Disaster responders used a rotary-wing UAV (an “Oktocopter”) to assist with the damage assessment efforts. More specifically, the UAV was used to assess the extent of the flood damage in the most affected area along Mataniko River.

Solomons UAV

The UAV was also used to map an area proposed for resettlement. In addition, the UAV was flown over a dam to assess potential damage. These flights were pre-programmed and thus autonomous. (Here’s a quick video demo on how to program UAV flights for disaster response). The UAV was flown at 110 meters altitude in order to capture very high-resolution imagery. “The altitude of 110m also allowed for an operation below the traditional air space and ensured a continuous visibility of the UAV from the starting / landing position.”


While responders faced several challenges with the UAV, they nevertheless stated that “The UAV was extremely useful for the required mapping” (PDF). Some of these challenges included the limited availability of batteries, which limited the number of UAV flights. The wind also posed a challenge.

Solomons Analysis

Responders took more than 800 pictures during one 17-minute flight over the above area, which was proposed for resettlement. About 10% of these images were then stitched together to form the mosaic displayed above. The result below depicts flooded areas along the Mataniko River. According to responders, “This image data can be utilized to demonstrate the danger of destruction to people who start to resettle in the Mataniko River Valley. These very high resolution images (~ 3 to 5 cm) show details such as destroyed cars, parts of houses, etc. which demonstrate the force of the high water.” In sum, “The maps together with the images of the river could be utilized to raise awareness not to settle again in these areas.”

[Figure: Stitched aerial imagery showing flooded areas along the Mataniko River]

Images taken of the dam were used to create the Digital Terrain Model (DTM) below. This enables responders to determine areas where the dam is most likely to overflow due to damage or future floods.

[Figure: Digital Terrain Model (DTM) of the dam area]

The result of this DTM analysis enables responders to target the placement of rubber mats fixed with sand bags around the dam’s most vulnerable points.
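
To make the idea concrete, here is a hedged sketch of how a DTM might be screened for likely overflow points: sample the crest elevation along the dam from the terrain model and flag the stretches that fall below an assumed flood water level. All numbers are illustrative, not measurements from this deployment.

```python
import numpy as np

def overflow_segments(crest_elevations_m, flood_level_m, sample_spacing_m):
    """crest_elevations_m: DTM elevations sampled at regular spacing along the crest.
    Returns (from_m, to_m) stretches that sit below the assumed flood level,
    i.e. where mats and sandbags would be placed first."""
    z = np.asarray(crest_elevations_m)
    low = z < flood_level_m
    segments, start = [], None
    for i, flag in enumerate(low):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            segments.append((start * sample_spacing_m, i * sample_spacing_m))
            start = None
    if start is not None:
        segments.append((start * sample_spacing_m, len(z) * sample_spacing_m))
    return segments

# Example: crest sampled every 5 m, assumed flood level of 11.9 m.
print(overflow_segments([12.4, 12.1, 11.8, 11.7, 12.0, 12.3],
                        flood_level_m=11.9, sample_spacing_m=5))
```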

Solomons Dam

In conclusion, disaster responders write that the use of “UAVs for data acquisition can be highly recommended. The flexibility of an UAV can be of high benefit for mapping purposes, especially in cases where fast data acquisition is desired, e.g. natural hazards. An important advantage of a UAV as platform is that image data recording is performed at low height and not disturbed by cloud cover. In theory a fixed-wing UAV might be more efficient for rapid mapping. However, the DTM applications would not be possible in this resolution with a fixed wing UAV. Notably due to the flexibility for potential starting and landing areas and the handling of the topography characterized by steep valleys and obstacles such as power lines between mountain tops within the study area. Especially within the flooded areas a spatially sufficient start and land area for fixed wing UAVs would have been hard to identify.”


See Also:

  • Official UN Policy Brief on Humanitarian UAVs [link]
  • Common Misconceptions About Humanitarian UAVs [link]
  • Humanitarians in the Sky: Using UAVs for Disaster Response [link]
  • Crisis Map of UAV Videos for Disaster Response [link]
  • Humanitarian UAV Missions During Balkan Floods [link]
  • UAVs, Community Mapping & Disaster Risk Reduction in Haiti [link]

Piloting MicroMappers: How to Become a Digital Ranger in Namibia (Revised!)

Many thanks to all of you who have signed up to search and protect Namibia’s beautiful wildlife! (There’s still time to sign up here; you’ll receive an email on Friday, September 26th with the link to volunteer).

Our MicroMappers Wildlife Challenge will launch on Friday, September 26th and run through Sunday, September 28th. More specifically, we’ll begin the search for Namibia’s wildlife at 12noon Namibia time that Friday (which is 12noon Geneva, 11am London, 6am New York, 6pm Shanghai, 7pm Tokyo, 8pm Sydney). You can join the expedition at any time after this. Simply add yourself to this list-serve to participate. Anyone who can get online can be a digital ranger, no prior experience necessary. We’ll continue our digital search until sunset on Sunday evening.

Namibia Map 1

As noted here, rangers at Kuzikus Wildlife Reserve need our help to find wild animals in their reserve. This will help our ranger friends to better protect these beautiful animals from poachers and other threats. According to the rangers, “Rhino poaching continues to be a growing problem that threatens to extinguish some rhino species within a decade or two. Rhino monitoring is thus important for their protection. Using digital maps in combination with MicroMappers to trace aerial images of rhinos could greatly improve rhino monitoring efforts.”

NamibiaMap2
At 12noon Namibia time on Friday, September 26th, we’ll send an email to the above list-serve with the link to our MicroMappers Aerial Clicker, which we’ll use to crowdsource the search for Namibia’s wildlife. We’ll also publish a blog post on MicroMappers.org with the link. Here’s what the Clicker looks like (click to enlarge the Clicker):

MM Aerial Clicker Namibia

When we find animals, we’ll draw “digital shields” around them. Before we show you how to draw these shields and what types of animals we’ll be looking for, here are examples of helpful shields (versus unhelpful ones); note that we’ve had to change these instructions, so please review them carefully! 

MM Rhino Zoom

This looks like two animals! So let’s draw two shields.

MM Rhino New YES

The white outlines are the shields that we drew using the Aerial Clicker above. Notice that our shields include the shadows of the animals; this is important. If the animals are close to each other, the shields can overlap, but there can only be one shield per animal (one shield per rhino in this case : )

MM Rhino New NO

These shields are too close to the animals, please give them more room!

MM Rhino No
These shields are too big.

If you’ve found something that may be an animal but you’re not sure, then please draw a shield anyway just in case. Don’t worry if most pictures don’t have any animals. Knowing where the animals are not is just as important as knowing where they are!

MM Giraffe Zoom

This looks like a giraffe! So let’s draw a shield.

MM Giraffe No2

This shield does not include the giraffe’s shadow! So let’s try again.

MM Giraffe No

This shield is too large. Let’s try again!

MM Giraffe New YES

Now that’s perfect!

Here are some more pictures of animals that we’ll be looking for. As a digital ranger, you’ll simply need to draw shields around these animals, that’s all there is to it. The shields can overlap if need be, but remember: one shield per animal, include their shadows and give them some room to move around : )

MM Ostrich

Can you spot the ostriches? Click the picture above to enlarge. You’ll be able to zoom in with the Aerial Clicker during the Wildlife Challenge.

MM Oryx

Can you spot the five oryxes in the above? (Actually, there may be a 6th one, can you see it in the shadows?).

MM Impala

And the impalas in the left of the picture? Again, you’ll be able to zoom in with the Aerial Clicker.

So how exactly does this Aerial Clicker work? Here’s a short video that shows just how easy it is to draw a digital shield using the Clicker (note that we’ve had to change the instructions, so please review this video carefully!):

Thanks for reading and for watching! The results of this expedition will help rangers in Namibia make sure they have found all the animals, which is important for their wildlife protection efforts. We’ll have thousands of aerial photographs to search through next week, which means that our ranger friends in Namibia need as much help as possible! So this is where you come in: please spread the word and invite your friends, families and colleagues to search and protect Namibia’s beautiful wildlife.

MicroMappers is a joint project with the United Nations (OCHA), and the purpose of this pilot is also to test the Aerial Clicker for future humanitarian response efforts. More here. Any questions or suggestions? Feel free to email me at patrick@iRevolution.net or add them in the comments section below.

Thank you!

And the UAV/Drone Social Innovation Award Goes To?

The winner of the Drone Social Innovation Award has just been announced! The $10,000 prize is awarded to the most socially beneficial, documented use of a UAV platform that costs less than $3,000. The purpose of this award is to “spur innovation, investment, and attention to the positive role this technology can play in our society.” Many thanks to my colleague Timothy Reuter for including me on the panel of judges for this novel Social Innovation Award, which was kindly sponsored by NEXA Capital Partners.

Here’s a quick look at our finalists!

Disaster Relief in the Philippines

Using low-cost UAVs to take high-resolution imagery of disaster-affected areas following Typhoon Haiyan in the Philippines. The team behind these efforts is also working with NGOs from around the world to enable them to use this simple technology for situational awareness in times of crisis. The team is also developing platforms to deliver critical items to locations that are difficult to access in post-disaster scenarios.

Taking Autism to the Sky

Teaching young children with autism how to build and fly their own UAVs. The team behind this initiative plans to scale its work and teach autistic kids both better social skills and “concrete skills in drone technology that could get them a job one day. It’s just one of the many proposed uses of drones in schools and in science and technology education.”

Crowd Estimation with UAVs

While this entry focuses specifically on the use of UAVs and algorithms to determine the size of social movements (e.g., rallies & protests), there may be application to estimating population flows in refugee and IDP settings. I have a blog post on this topic coming up, stay tuned!

Drones for environmental conservation

Aerial photos and videos helped to direct millions in funding to acquire and preserve hundreds of acres of valuable habitat and strategic additions to public park space. “In a single glance, an aerial photo allows you to understand so much more about a location than a view from the ground.”

Landmine detection

Bosnia-Herzegovina has one of the highest densities of land mines in the world. So this team from Barcelona is exploring how UAVs might accelerate the process of land mine detection. See this post to learn about another UAV land mine detection effort following the massive flooding in the region this summer.

Whale Research and Conservation

Using benign research tools like UAVs to prove you can study whales without killing them. This allows conservationists to study whales’ DNA along with their stress hormones without disturbing them or requiring the use of loud and expensive airplanes.


And the award goes to… (drum roll please)… not one but two entries (yes it really was a tie)!  Big congratulations to the teams behind the Land Mine Detection and Disaster Response projects! We really look forward to following your progress. Thank you for your important contributions to social innovation!

We are hoping to make this new “Drone Social Innovation Award” an annual competition (if the stars align again next year). So stay tuned. In the meantime, many thanks again to Timothy Reuter for spearheading this social innovation challenge, to my fellow judges, and most importantly to all participants for taking the time to share their remarkable projects!


See Also:

  • Crowdsourcing the Analysis of Aerial Imagery for Wildlife Protection and Disaster Response [link]
  • Humanitarians in the Sky: Using UAVs for Disaster Response [link]
  • Official UN Policy Brief on Humanitarian UAVs [link]
  • Crisis Map of UAV Videos for Disaster Response [link]
  • Humanitarian UAV Missions During Balkan Floods [link]
  • UAVs, Community Mapping & Disaster Risk Reduction in Haiti [link]