Tag Archives: MicroMappers

Crowdsourcing Point Clouds for Disaster Response

Point Clouds, or 3D models derived from high-resolution aerial imagery, are nothing new. Several software platforms already exist to reconstruct a series of 2D aerial images into fully fledged 3D fly-through models. Check out these very neat examples from my colleagues at Pix4D and SenseFly:

What do a castle, Jesus and a mountain have to do with humanitarian action? As noted in my previous blog post, there’s only so much disaster damage one can glean from nadir (that is, vertical) imagery and oblique imagery. Let’s suppose that the nadir image below was taken by an orbiting satellite or flying UAV right after an earthquake, for example. How can you possibly assess disaster damage from this one picture alone? Even if you had nadir imagery for these houses before the earthquake, your ability to assess structural damage would be limited.

Screen Shot 2015-04-09 at 5.48.23 AM

This explains why we also captured oblique imagery for the World Bank’s UAV response to Cyclone Pam in Vanuatu (more here on that humanitarian mission). But even with oblique photographs, you’re stuck with one fixed perspective. Who knows what these houses below look like from the other side; your UAV may have simply captured this side only. And even if you had pictures from all possible angles, you’d literally have hundreds of pictures to leaf through and make sense of.

Screen Shot 2015-04-09 at 5.54.34 AM

What’s that famous quote by Henry Ford again? “If I had asked people what they wanted, they would have said faster horses.” We don’t need faster UAVs, we simply need to turn what we already have into Point Clouds, which I’m indeed hoping to do with the aerial imagery from Vanuatu, by the way. The Point Cloud below was made solely from 2D aerial images.

It isn’t perfect, but we don’t need perfection in disaster response; we need good enough. So when we as humanitarian UAV teams go into the next post-disaster deployment and ask humanitarians what they need, they may say “faster horses” because they’re not (yet) familiar with what’s really possible with the imagery processing solutions available today. That obviously doesn’t mean that we should ignore their information needs. It simply means we should seek to expand their imaginations vis-a-vis the art of the possible with UAVs and aerial imagery. Here is a 3D model of a village in Vanuatu constructed using 2D aerial imagery:

Now, the title of my blog post does lead with the word crowdsourcing. Why? For several reasons. First, it takes some decent computing power (and time) to create these Point Clouds. But if the underlying 2D imagery is made available to hundreds of Digital Humanitarians, we could use this distributed computing power to rapidly crowdsource the creation of 3D models. Second, each model can then be pushed to MicroMappers for crowdsourced analysis. Why? Because having a dozen eyes scrutinizing one Point Cloud is better than two. Note that for quality control purposes, each Point Cloud would be shown to 5 different Digital Humanitarian volunteers; we already do this with MicroMappers for tweets, pictures, videos, satellite images and of course aerial images as well. Each digital volunteer would then trace areas in the Point Cloud where they spot damage. If the traces from the different volunteers match, then bingo, there’s likely damage at those x, y and z coordinates. Here’s the idea:
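To make the trace-matching idea concrete, here is a minimal sketch of how consensus among volunteers might be computed. It is not an existing MicroMappers feature; it uses axis-aligned 3D bounding boxes as a stand-in for free-form traces, and all names and thresholds are illustrative.

```python
# Sketch: consensus damage detection from volunteer traces.
# Each trace is an axis-aligned 3D box (xmin, ymin, zmin, xmax, ymax, zmax);
# a traced region counts as "likely damage" when traces from at least
# `min_votes` different volunteers overlap it.

def boxes_overlap(a, b):
    """True if two 3D boxes intersect on all three axes."""
    return all(a[i] <= b[i + 3] and b[i] <= a[i + 3] for i in range(3))

def consensus_damage(traces_by_volunteer, min_votes=3):
    """traces_by_volunteer: {volunteer_id: [box, ...]}.
    Returns boxes supported by at least min_votes volunteers."""
    confirmed = []
    volunteers = list(traces_by_volunteer)
    for vid in volunteers:
        for box in traces_by_volunteer[vid]:
            votes = sum(
                1 for other in volunteers
                if any(boxes_overlap(box, t) for t in traces_by_volunteer[other])
            )
            if votes >= min_votes and box not in confirmed:
                confirmed.append(box)
    return confirmed

# Three volunteers trace roughly the same damaged roof; one trace is an outlier.
traces = {
    "v1": [(0, 0, 0, 2, 2, 2)],
    "v2": [(1, 1, 1, 3, 3, 3)],
    "v3": [(0.5, 0.5, 0.5, 2.5, 2.5, 2.5)],
    "v4": [(10, 10, 10, 11, 11, 11)],  # no support from other volunteers
}
print(consensus_damage(traces))  # the three overlapping boxes, not the outlier
```

A production version would compare actual polygon overlap rather than bounding boxes, but the voting logic would be the same.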

We could easily use iPads to turn the process into a Virtual Reality experience for digital volunteers. In other words, you’d be able to move around and above the actual Point Cloud by simply changing the position of your iPad accordingly. This technology already exists and has for several years now. Tracing features in the 3D models that appear to be damaged would be as simple as using your finger to outline the damage on your iPad.

What about the inevitable challenge of Big Data? What if thousands of Point Clouds are generated during a disaster? Sure, we could try to scale our crowdsourcing efforts by recruiting more Digital Humanitarian volunteers, but wouldn’t that just be asking for a “faster horse”? Just like we’ve already done with MicroMappers for tweets and text messages, we would seek to combine crowdsourcing and Artificial Intelligence to automatically detect features of interest in 3D models. This sounds to me like an excellent research project for a research institute engaged in advanced computing R&D.

I would love to see the results of this applied research integrated directly within MicroMappers. This would allow us to combine the results of social media analysis via MicroMappers (e.g., tweets, Instagram pictures, YouTube videos) directly with the results of satellite imagery analysis as well as 2D and 3D aerial imagery analysis.

Anyone interested in working on this?

How Digital Jedis Are Springing to Action In Response To Cyclone Pam

Digital Humanitarians sprang to action just hours after the Category 5 Cyclone collided with Vanuatu’s many islands. This first deployment focused on rapidly assessing the damage by analyzing multimedia content posted on social media and in the mainstream news. This request came directly from the United Nations (OCHA), which activated the Digital Humanitarian Network (DHN) to carry out the rapid damage assessment. So the Standby Task Force (SBTF), a founding member of the DHN, used QCRI’s MicroMappers platform to produce a digital, interactive Crisis Map of some 1,000+ geo-tagged pictures of disaster damage (screenshot below).

MM_ImageMap_Vanuatu

Within days of Cyclone Pam making landfall, the World Bank (WB) activated the Humanitarian UAV Network (UAViators) to quickly deploy UAV pilots to the affected islands. UAViators has access to a global network of 700+ professional UAV pilots in some 70+ countries worldwide. The WB identified two UAV teams from the Humanitarian UAV Network and deployed them to capture very high-resolution aerial photographs of the damage to support the Government’s post-disaster damage assessment efforts. Pictures from these early UAV missions are available here. Aerial images & videos of the disaster damage were also posted to the UAViators Crowdsourced Crisis Map.

Last week, the World Bank activated the DHN (for the first time ever) to help analyze the many, many gigabytes of aerial imagery from Vanuatu. So Digital Jedis from the DHN are now using Humanitarian OpenStreetMap (HOT) and MicroMappers (MM) to crowdsource the search for partially damaged and fully destroyed houses in the aerial imagery. The HOT team is specifically looking at the “nadir imagery” captured by the UAVs while MM is exclusively reviewing the “oblique imagery”. More specifically, digital volunteers are using MM to trace destroyed houses in red, partially damaged houses in orange, and houses that appear to have little to no damage in blue. Below is an early screenshot of the Aerial Crisis Map for the island of Efate. The live Crisis Map is available here.

Screen Shot 2015-04-06 at 10.56.09 AM

Clicking on one of these markers will open up the high resolution aerial pictures taken at that location. Here, two houses are traced in blue (little to no damage) and two on the upper left are traced in orange (partial damage expected).

Screen Shot 2015-04-06 at 10.57.17 AM

The cameras on the UAVs captured the aerial imagery in very high resolution, as you can see from the close-up below. You’ll note two traces for the house. These two traces were done by two independent volunteers (for the purposes of quality control). In fact, each aerial image is shown to at least 3 different Digital Jedis.

Screen Shot 2015-04-06 at 10.58.31 AM

Once this MicroMappers deployment is over, we’ll be using the resulting traces to create automated feature detection algorithms; just like we did here for the MicroMappers Namibia deployment. This approach, combining crowdsourcing with Artificial Intelligence (AI), is explored in more detail here vis-a-vis disaster response. The purpose of this hybrid human-machine computing approach is to accelerate (semi-automate) future damage assessment efforts.

Meanwhile, back in Vanuatu, the HOT team has already carried out some tentative, preliminary analysis of the damage based on the aerial imagery provided. They are also updating their OSM maps of the affected islands thanks to this imagery. Below is an initial damage assessment carried out by HOT for demonstration purposes only. Please visit their deployment page on the Vanuatu response for more information.

2015-04-04_18h04_00

So what’s next? Combining both the nadir and oblique imagery to interpret disaster damage is ultimately what is needed, so we’re actually hoping to make this happen (today) by displaying the nadir imagery directly within the Aerial Crisis Map produced by MicroMappers. (Many thanks to the MapBox team for their assistance on this). We hope this integration will help HOT and our World Bank partners better assess the disaster damage. This is the first time that we as a group are doing anything like this, so there is obviously lots of learning going on, which should improve future deployments. Ultimately, we’ll need to create 3D models (point clouds) of disaster-affected areas (already easy to do with high-resolution aerial imagery) and then simply use MicroMappers to crowdsource the analysis of these 3D models.

And here’s a 3D model of a village in Vanuatu constructed using 2D aerial photos taken by UAV:

For now, though, Digital Jedis will continue working very closely with the World Bank to ensure that the latter have the results they need in the right format to deliver a comprehensive damage assessment to the Government of Vanuatu by the end of the week. In the meantime, if you’re interested in learning more about digital humanitarian action, then please check out my new book, which features UAViators, HOT, MM and lots more.

Aerial Imagery Analysis: Combining Crowdsourcing and Artificial Intelligence

MicroMappers combines crowdsourcing and artificial intelligence to make sense of “Big Data” for Social Good. Why artificial intelligence (AI)? Because regular crowdsourcing alone is no match for Big Data. The MicroMappers platform can already be used to crowdsource the search for relevant tweets as well as pictures, videos, text messages, aerial imagery and soon satellite imagery. The next step is therefore to add artificial intelligence to this crowdsourced filtering platform. We have already done this with tweets and SMS. So we’re now turning our attention to aerial and satellite imagery.

Our very first deployment of MicroMappers for aerial imagery analysis was in Africa for this wildlife protection project. We crowdsourced the search for wild animals in partnership with rangers from the Kuzikus Wildlife Reserve based in Namibia. We were very pleased with the results, and so were the rangers. As one of them noted: “I am impressed with the results. There are at times when the crowd found animals that I had missed!” We were also pleased that our efforts caught the attention of CNN. As noted in that CNN report, our plan for this pilot was to use crowdsourcing to find the wildlife and to then combine the results with artificial intelligence to develop a set of algorithms that can automatically find wild animals in the future.

To do this, we partnered with a wonderful team of graduate students at EPFL, the well-known polytechnic in Lausanne, Switzerland. While these students were pressed for time due to a number of deadlines, they were nevertheless able to deliver some interesting results. Their applied computer vision research is particularly useful given our ultimate aim: to create an algorithm that can learn to detect features of interest in aerial and satellite imagery in near real-time (as we’re interested in applying this to disaster response and other time-sensitive events). For now, however, we need to walk before we can run. This means carrying out the tasks of crowdsourcing and artificial intelligence in two (not-yet-integrated) steps.

MM Oryx

As the EPFL students rightly note in their preliminary study, the use of thermal imaging (heat detection) to automatically identify wildlife in the bush is somewhat problematic since “the temperature difference between animals and ground is much lower in savannah […].” This explains why the research team used the results of our crowdsourcing efforts instead. More specifically, they focused on automatically detecting the shadows of gazelles and ostriches by using an object-based support vector machine (SVM). The whole process is summarized below.

Screen Shot 2015-02-09 at 12.46.38 AM

The above method produces results like the one below (click to enlarge). The circles represent the objects used to train the machine learning classifier. The discerning reader will note that the algorithm has correctly identified all the gazelles save for one instance in which two gazelles standing close together were identified as a single gazelle. But no other objects were mislabeled as a gazelle. In other words, EPFL’s gazelle algorithm is very accurate. “Hence the classifier could be used to reduce the number of objects to assess manually and make the search for gazelles faster.” Ostriches, on the other hand, proved more difficult to automatically detect. But the students are convinced that this could be improved if they had more time.
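For readers curious what object-based classification looks like in practice, here is a rough illustration of the pipeline. This is not the students’ code: they used an SVM, whereas this sketch uses a simpler nearest-centroid stand-in, and the per-object shadow features (area, elongation) and their values are invented for illustration.

```python
# Illustrative stand-in for the EPFL object-based SVM (not their code):
# a nearest-centroid classifier over simple per-object shadow features.
import math

def centroid(rows):
    """Mean feature vector of a list of feature vectors."""
    return [sum(col) / len(rows) for col in zip(*rows)]

def train(labeled):
    """labeled: {class_name: [feature_vector, ...]} -> per-class centroids."""
    return {cls: centroid(rows) for cls, rows in labeled.items()}

def classify(model, features):
    """Assign the class whose centroid is nearest in feature space."""
    return min(model, key=lambda cls: math.dist(model[cls], features))

# Features: (shadow area in pixels, elongation ratio), derived from
# crowdsourced traces. Values here are made up for the example.
training = {
    "gazelle": [(120, 3.0), (140, 2.8), (110, 3.2)],
    "bush":    [(400, 1.1), (350, 1.3), (500, 1.0)],
}
model = train(training)
print(classify(model, (130, 2.9)))  # → "gazelle"
```

The real system segments the image into objects first and uses a proper SVM with many more features, but the train-on-crowd-labels, classify-new-objects loop is the same.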

Screen Shot 2015-02-09 at 12.56.17 AM

In conclusion, more work certainly needs to be done, but I am pleased by these preliminary and encouraging results. In addition, the students at EPFL kindly shared some concrete features that we can implement on the MicroMappers side to improve the crowdsourced results for the purposes of developing automated algorithms in the future. So a big thank you to Briant, Millet and Rey for taking the time to carry out the above research. My team and I at QCRI very much look forward to continuing our collaboration with them and colleagues at EPFL.

In the meantime, more on all this in my new book, Digital Humanitarians: How Big Data is Changing the Face of Humanitarian Response, which has already been endorsed by faculty at Harvard, MIT, Stanford, Oxford, etc., and by experts at the UN, World Bank, Red Cross, Twitter, etc.

MicroMappers: Towards Next Generation Humanitarian Technology

The MicroMappers platform has come a long way and still has a ways to go. Our vision for MicroMappers is simple: combine human computing (smart crowdsourcing) with machine computing (artificial intelligence) to filter, fuse and map a variety of different data types such as text, photo, video and satellite/aerial imagery. To do this, we have created a collection of “Clickers” for MicroMappers. Clickers are simply web-based crowdsourcing apps used to make sense of “Big Data”. The “Text Clicker” is used to filter tweets & SMS’s; the “Photo Clicker” to filter photos; the “Video Clicker” to filter videos; and yes, the Satellite & Aerial Clickers to filter both satellite and aerial imagery. These are the Data Clickers. We also have a collection of Geo Clickers that digital volunteers use to geo-tag tweets, photos and videos filtered by the Data Clickers. Note that these Geo Clickers automatically display the results of the crowdsourced geo-tagging on our MicroMaps like the one below.

MM Ruby Tweet Map

Thanks to our Artificial Intelligence (AI) engine AIDR, the MicroMappers “Text Clicker” already combines human and machine computing. This means that tweets and text messages can be automatically filtered (classified) after some initial crowdsourced filtering. The filtered tweets are then pushed to the Geo Clickers for geo-tagging purposes. We want to do the same (semi-automation) for photos posted to social media as well as videos, although this is still a very active area of research and development in the field of computer vision.

So we are prioritizing our next hybrid human-machine computing efforts on aerial imagery instead. Just like the “Text Clicker” above, we want to semi-automate feature detection in aerial imagery by adding an AI engine to the “Aerial Clicker”. We’ve just started to explore this with computer vision experts in Switzerland and Canada. Another development we’re eyeing vis-a-vis UAVs is live video streaming. To be sure, UAVs will increasingly be transmitting live video feeds directly to the web. This means we may eventually need to develop a “Streaming Clicker”, which would in some respects resemble our existing “Video Clicker” except that the video would be broadcast live rather than played back from YouTube, for example. The “Streaming Clicker” is for later, however, or at least until a prospective partner organization approaches us with an immediate and compelling social innovation use-case.

In the meantime, my team & I at QCRI will continue to improve our maps (data visualizations) along with the human computing component of the Clickers. The MicroMappers smartphone apps, for example, need more work. We also need to find partners to help us develop apps for tablets like the iPad. In addition, we’re hoping to create a “Translate Clicker” with Translators Without Borders (TWB). The purpose of this Clicker would be to rapidly crowdsource the translation of tweets, text messages, etc. This could open up rather interesting possibilities for machine translation, which is certainly an exciting prospect.

MM All Map

Ultimately, we want to have one and only one map to display the data filtered via the Data and Geo Clickers. This map, using (Humanitarian) OpenStreetMap as a base layer, would display filtered tweets, SMS’s, photos, videos and relevant features from satellite and UAV imagery. Each data type would simply be a different layer on this fused “Meta-Data Crisis Map”; and end-users would simply turn individual layers on and off as needed. Note also the mainstream news feeds (CNN and BBC) depicted in the above image. We’re working with our partners at UN/OCHA, GDELT & SBTF to create a “3W Clicker” to complement our MicroMap. As noted in my forthcoming book, GDELT is the ultimate source of data for the world’s digitized news media. The 3Ws refers to Who, What, Where; an important spreadsheet that OCHA puts together and maintains in the aftermath of major disasters to support coordination efforts.

In response to Typhoon Ruby in the Philippines, Andrej Verity (OCHA) and I collaborated with Kalev Leetaru from GDELT to explore how the MicroMappers “3W Clicker” might work. The result is the Google Spreadsheet below (click to enlarge) that is automatically updated every 15 minutes with the latest news reports that refer to one or more humanitarian organizations in the Philippines. GDELT includes the original URL of the news article as well as the list of humanitarian organizations referenced in the article. In addition, GDELT automatically identifies the locations referred to in the articles, key words (tags) and the date of the news article. The spreadsheet below is already live and working. So all we need now is the “3W Clicker” to crowdsource the “What”.
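The filtering behind that spreadsheet can be pictured roughly as follows. This is a sketch, not the actual GDELT pipeline; the record fields and the list of organization names are assumptions for illustration.

```python
# Sketch of the "3W" spreadsheet filtering: given GDELT-style article
# records, keep those that mention a known humanitarian organization.
# Field names and the organization list are illustrative assumptions.

KNOWN_ORGS = {"UNICEF", "WFP", "ICRC", "OCHA"}

def filter_3w(records):
    """records: [{'url': ..., 'orgs': [...], 'locations': [...], 'date': ...}].
    Returns Who/Where rows; the 'What' is left for the crowd to fill in."""
    rows = []
    for rec in records:
        mentioned = [org for org in rec["orgs"] if org in KNOWN_ORGS]
        if mentioned:
            rows.append({
                "who": mentioned,
                "where": rec["locations"],
                "url": rec["url"],
                "what": None,  # to be crowdsourced via the 3W Clicker
            })
    return rows

articles = [
    {"url": "http://example.com/a", "orgs": ["WFP"],
     "locations": ["Tacloban"], "date": "2014-12-08"},
    {"url": "http://example.com/b", "orgs": ["Acme Corp"],
     "locations": ["Manila"], "date": "2014-12-08"},
]
print(filter_3w(articles))  # only the WFP article survives the filter
```

In production this filter would re-run every 15 minutes against the latest GDELT output and append new rows to the shared spreadsheet.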

MM GDELT output

The first version of the mock-up we’ve created for the “3W Clicker” is displayed below. Digital volunteers are presented with an interface that includes a news article with the names of humanitarian organizations highlighted in red for easy reference. GDELT auto-populates the URL, the organization name (or names if there are more than one) and the location. Note that both the “Who” & “Where” information can be edited directly by the volunteer in case GDELT’s automated algorithm gets those wrong. The main role of digital volunteers, however, would simply be to identify the “What” by quickly skimming the article.

MM 3W Clicker v2

The output of the “3W Clicker” would simply be another MicroMap layer. As per Andrej’s suggestion, the resulting data could also be automatically pushed to another Google Spreadsheet in HXL format. We’re excited about the possibilities and plan to move forward on this sooner rather than later. In addition to GDELT, pulling in feeds from CrisisNET may be worth exploring. I’m also really keen on exploring ways to link up with the Global Disaster Alert & Coordination System (GDACS) as well as GeoFeedia.

In the meantime, we’re hoping to pilot our “Satellite Clicker” thanks to recent conversations with Planet Labs and SkyBox Imaging. Overlaying user-generated content such as tweets and images on top of both satellite and aerial imagery can go a long way to helping verify (“ground truth”) social media during disasters and other events. This is evidenced by recent empirical studies such as this one in Germany and this one in the US. On this note, as my QCRI colleague Heather Leson recently pointed out, the above vision for MicroMappers is still missing one important data feed; namely sensors—the Internet of Things. She is absolutely spot on, so we’ll be sure to look for potential pilot projects that would allow us to explore this new data source within MicroMappers.

The above vision is a tad ambitious (understatement). We really can’t do this alone. To this end, please do get in touch if you’re interested in joining the team and getting MicroMappers to the next level. Note that MicroMappers is free and open source and in no way limited to disaster response applications. Indeed, we recently used the Aerial Clicker for this wildlife protection project in Namibia. This explains why our friends over at National Geographic have also expressed an interest in potentially piloting the MicroMappers platform for some of their projects. And of course, one need not use all the Clickers for a project, simply the one(s) that make sense. Another advantage of MicroMappers is that the Clickers (and maps) can be deployed very rapidly (since the platform was initially developed for rapid disaster response purposes). In any event, if you’d like to pilot the platform, then do get in touch.

bio

See also: Digital Humanitarians – The Book

Digital Jedis Complete Response to Typhoon Ruby

Thank you, Digital Jedis!

Every Click you made on MicroMappers was a gift. Typhoon Ruby (Hagupit) disrupted the lives of many and caused damage in regions already affected by previous disasters. As MicroMappers, you gave your time, clicks and skills to make a difference. Catherine, the Head of the UN’s Information Management Unit in the Philippines had this to say: “I would like to thank all the volunteers […] for their invaluable contribution over the past few days. We are lucky that Hagupit [Ruby] made less damages than expected and that the emergency quickly scaled down.”

MM Ruby Tweet Map

MicroMappers and our partners at the Standby Task Force (SBTF) were activated by the United Nations Office for the Coordination of Humanitarian Affairs (OCHA). The Mission?

To augment the situational awareness of humanitarian actors on the ground by making sense of social media generated following the Typhoon.

Over the course of 72 hours, these Digital Jedis united to MicroMap one Click at a time. By reviewing tweets and images, each MicroMapper built collective intelligence and insights that were used to build comprehensive situational awareness reports and maps for the UN. Many hands, and in this case, Clicks, make light work.

As Catherine rightly notes, there was thankfully less damage than many feared. This explains why our MicroMaps (above and below) are thankfully not riddled with hundreds of markers. In addition, we prioritize quality over quantity at MicroMappers. Our UN partners had specifically asked for tweets related to:

(1) Requests for Help / Needs
(2) Infrastructure Damage
(3) Humanitarian Aid Provided

Together, these tweets—which are mapped above—represented less than 5% of the Ruby-related tweets that were collected during the first 72 hours of the Typhoon making landfall. This doesn’t mean that only 5% of the information on Twitter was relevant for emergency response, however. Indeed, we also tagged tweets that were not related to the above 3 categories but that were still informative. These constituted more than 20% of all tweets collected (which are not included in the map above). In the analysis provided to UN partners, we did include a review of those other relevant tweets.

MM Ruby Tweet Clicker

Some 700 Digital Jedis joined the response online, a new record for MicroMappers! An astounding 50,394 Clicks were made using the Text Clicker pictured above (each tweet was reviewed by at least 3 digital volunteers for quality assurance purposes). And a further 3,555 Clicks were carefully made by the SBTF to geo-locate (map) relevant tweets. In other words, close to 55,000 Clicks went into making the high quality map displayed above! That’s over 12 Clicks per minute non-stop for more than 4,300 consecutive minutes!
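For the curious, the click-rate figure above is easy to verify with some back-of-envelope arithmetic over the 72-hour deployment window:

```python
# Back-of-envelope check of the click rate quoted above.
text_clicks = 50394   # Text Clicker clicks
geo_clicks = 3555     # SBTF geo-location clicks
total = text_clicks + geo_clicks      # 53,949 Clicks in total
minutes = 72 * 60                     # 72-hour deployment = 4,320 minutes
print(total, round(total / minutes, 1))  # → 53949 12.5 (over 12 Clicks/minute)
```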

MM Ruby Image Map

The United Nations also asked Digital Jedis to identify pictures posted on Twitter that showed disaster damage. Over 30,000 Clicks went into this operation with a further 7,413 Clicks made by the SBTF to map images that showed severe and mild damage. In sum, over 40,000 Clicks went into the MicroMap above. Overall, the entire MicroMappers response was powered by close to 100,000 Clicks!

Screen Shot 2014-12-10 at 8.36.04 AM
MM Infographic 2
MM Infographic 3

Digital Jedis have yet again shown that together, we can help people get positively involved in their world, even when half-a-globe and many timezones away. Yes, we can and should donate $$ to support relief efforts and good causes around the world but we can also get directly involved by donating our time, or what we call M&M’s, Minutes and Mouse clicks. This year MicroMappers have mobilized to support wildlife protection in Namibia, food security efforts in the Philippines and of course this most recent response to Typhoon Ruby. On that note, thanks again to all volunteers who supported the MicroMappers response to the Typhoon in partnership with the United Nations. You truly are Digital Jedis! And the UK Guardian certainly agrees, check out their article on our digital response.

So what’s next? We will continue to solicit your feedback on how to improve the Clickers and will get started right away. (Add your MicroMappers feedback here). In the meantime, we will leave the Clickers online for newcomers who wish to practice. We are also in touch with the UN and UAV partners in the Philippines as they may soon fly their small, remote-control planes to take aerial photographs over disaster affected areas. If they do, they will send us the photographs for analysis via MicroMappers, so stay tuned.

In closing, MicroMappers was developed by QCRI in partnership with SBTF/OCHA. So a million thanks to the QCRI team and SBTF for deploying MicroMappers in support of these digital humanitarian efforts. Special thanks go to Ji Lucas, Jus Mackinnon, ChaTo Castillo, Muhammad Imran, Heather Leson, Sarah Vieweg and last but certainly not least Peter Mosur.

(Ed. note: Blog post was cross-posted from MicroMappers.org. Infographic uses Infogr.am software)

Calling All Digital Jedis: Support UN Response to Super Typhoon Ruby!

The United Nations has officially activated the Digital Humanitarian Network (DHN) in response to Typhoon Ruby. The DHN serves as the official interface between formal humanitarian organizations and digital volunteer groups from all around the world. These digital volunteers—also known as Digital Jedis—provide humanitarian organizations like the UN and the Red Cross with the “surge” capacity they need to make sense of the “Big Data” that gets generated during disasters. This “Big Data” includes large volumes of social media reports and satellite imagery, for example. And there is a lot of this data being generated right now as a result of Super Typhoon Ruby.

Typhoon Ruby

To make sense of this flash flood of information, Digital Jedis use crowdsourcing platforms like MicroMappers, which was developed in partnership with the UN Office for the Coordination of Humanitarian Affairs (OCHA). In their activation of the Digital Humanitarian Network, the UN has requested that Digital Jedis look for Ruby-related tweets that refer to needs, damage & response efforts. They have also asked digital volunteers to identify pictures of damage caused by the Typhoon. These tweets and pictures will then be added to a live crisis map to augment the UN’s own disaster damage and needs assessment efforts.

You too can be a Digital Jedi. Trust me, MicroMappers is far easier to use than a lightsaber. All it takes is a single Click of the mouse. Yes, it really is that simple. So, if a Digital Jedi you want to be, let your first Click be this one! That click will set you on the path to helping the United Nations’ important relief efforts in the Philippines. So if you’ve got a bit of time on your hands—even 2 minutes goes a long way—then help us make a meaningful difference in the world, join the Force! And may the Crowd be with Us!

bio

See also: Digital Humanitarians – The Path of the Digital Jedis

Piloting MicroMappers: Crowdsourcing the Analysis of UAV Imagery for Disaster Response

New update here!

UAVs are increasingly used in humanitarian response. We have thus added a new Clicker to our MicroMappers collection. The purpose of the “Aerial Clicker” is to crowdsource the tagging of aerial imagery captured by UAVs in humanitarian settings. Trying out new technologies during major disasters can pose several challenges, however. So we’re teaming up with Drone Adventures, Kuzikus Wildlife Reserve, Polytechnic of Namibia, and l’École Polytechnique Fédérale de Lausanne (EPFL) to try out our new Clicker using high-resolution aerial photographs of wild animals in Namibia.

Kuzikus1

As part of their wildlife protection efforts, rangers at Kuzikus want to know how many animals (and what kinds) are roaming about their wildlife reserve. So Kuzikus partnered with Drone Adventures and EPFL’s Cooperation and Development Center (CODEV) and the Laboratory of Geographic Information Systems (LASIG) to launch the SAVMAP project, which stands for “Near real-time ultrahigh-resolution imaging from unmanned aerial vehicles for sustainable land management and biodiversity conservation in semi-arid savanna under regional and global change.” SAVMAP was co-funded by CODEV through LASIG. You can learn more about their UAV flights here.

Our partners are interested in experimenting with crowdsourcing to make sense of this aerial imagery and raise awareness about wildlife in Namibia. As colleagues at Kuzikus recently told us, “Rhino poaching continues to be a growing problem that threatens to extinguish some rhino species within a decade or two. Rhino monitoring is thus important for their protection. One problematic is to detect rhinos in large areas and/or dense bush areas. Using digital maps in combination with MicroMappers to trace aerial images of rhinos could greatly improve rhino monitoring efforts.” 

So our pilot project serves two goals: 1) Trying out the new Aerial Clicker for future humanitarian deployments; 2) Assessing whether crowdsourcing can be used to correctly identify wild animals.

MM Aerial Clicker

Can you spot the zebras in the aerial imagery above? If so, you’re already a digital ranger! No worries, you won’t need to know that those are actually zebras, you’ll simply outline any animals you find (using your mouse) and click on “Add my drawings.” Yes, it’s that easy : )

We’ll be running our Wildlife Challenge from September 26th-28th. To sign up for this digital expedition to Namibia, simply join the MicroMappers list-serve here. We’ll be sure to share the results of the Challenge with all volunteers who participate and with our partners in Namibia. We’ll also be creating a wildlife map based on the results so our friends know where the animals have been spotted (by you!).

MM_Rhino

Given that rhino poaching continues to be a growing problem in Namibia (and elsewhere), we will obviously not include the location of rhinos in our wildlife map. You’ll still be able to look for and trace rhinos (like those above) as well as other animals like ostriches, oryxes & giraffes, for example. Hint: shadows often reveal the presence of wild animals!

MM_Giraffe

Drone Adventures hopes to carry out a second mission in Namibia early next year. So if we’re successful in finding all the animals this time around, then we’ll have the opportunity to support the Kuzikus Reserve again in their future protection efforts. Either way, we’ll be better prepared for the next humanitarian disaster thanks to this pilot. MicroMappers is developed by QCRI and is a joint project with the United Nations Office for the Coordination of Humanitarian Affairs (OCHA).

Any questions or suggestions? Feel free to email me at patrick@iRevolution.net or add them in the comments section below. Thank you!