Category Archives: Robotics

Aerial Robotics and Agriculture: Opportunities for the Majority World

The majority of studies and articles on the use of drones/UAVs for agriculture seem to focus on examples and opportunities in the US, Europe or Japan. These reports talk about the need for large-scale aerial surveys over massive farms, machine learning algorithms for automated crop detection, and the development of sophisticated forecasting models to inform decisions at the very micro level. This is the realm of precision agriculture. But what about smallholder and family farms in the Majority World? Do flying robots make sense for them? Yes, in some cases, but not in the same way that this technology makes sense for large farms in highly industrialized countries.

First things first: smallholder and family farms won’t have as much need for long-range fixed-wing UAVs as ranchers in the US do. According to this FAO study (PDF), smallholder farms typically cover less than 0.02 square kilometers (2 hectares). Secondly, these farms do not necessarily need access to very high-resolution, orthorectified mosaics or fancy 3D models. Mosaics and 3D models require data-processing software; software requires a computer to run it, not to mention the time to learn how to use it. Without software, a farmer could still upload her aerial images to the cloud for processing, but that requires a reliable and relatively fast Internet connection. Data processing also means having to store that data before and after processing. So now the farmer needs software, a computer and a hard disk (or two, for backup).

I’m not suggesting that very high-resolution orthorectified mosaics, 3D models and multi-spectral sensors cannot add value to smallholder and family farms. Of course they can. Farmers have always needed accurate as well as up-to-date information on their crops and on the environmental conditions of the land on which their crops grow. I’m just suggesting a more practical approach to begin with. Simply getting a live video feed from a bird’s eye view can already reveal patterns that show everything from irrigation problems to soil variation and even pest and fungal infestations that aren’t apparent at eye level.

At the end of the day, farming is an input-output problem. Local farmers can use a live video feed from the sky to reduce their inputs—water and pesticides—while maintaining the same output. But let’s unpack that a bit. Once a farmer detects an irrigation problem from the video feed, they don’t need another piece of robotics tech to intervene. They can easily see where the drone is hovering and simply walk over to the area with basic tools to intervene as needed. Smallholder and family farms do not have access to variable-rate sprayers and other fancy tractor tools that can take precision data and respond with precision interventions. So very high-res mosaics and 3D models may add little value in the context of smallholder farms in developing countries.

rededge-cam-nepal-rice-fields

Of course, some farmers may prefer to pay consultants or local companies to carry out these aerial surveys instead of leasing or owning a drone and carrying out the surveys themselves. In fact, some companies actually found it too “tedious to teach farmers how to use the drones they had made. Instead, they decided to focus on providing the much-needed service of mapping out farms and sites” (1). In stark contrast, my team and I at WeRobotics have really enjoyed training local partners in Nepal and Tanzania. We don’t find it tedious at all but rather highly rewarding. Building local capacity around the use of appropriate robotics solutions goes to the heart of our mission.

This explains why we’re creating local robotics labs—we call them Flying Labs—to transfer the skills and technologies that our partners need to scale the positive impact of their local efforts. So we’re especially keen to work with smallholder and family farms so they can use robotics solutions to improve their yields. They could lease small drones from the labs for a nominal fee, and I’m willing to bet that some savvy young men and women working on these farms will be keen to learn a new set of skills that could lead to an increase in income. We’re also keen to work with local drone consultants or local companies to enable them to expand their services to include agriculture. The key, either way, is to design and deliver effective trainings to local farmers, consultants and/or companies while providing each with long-term support through the Flying Labs. 


Thanks to colleagues from WeRobotics for feedback on an earlier version of this post. I’m keen to receive additional input from iRevolution readers.

What Happens When the Media Sends Drone Teams to Disasters?

Media companies like AFP, CNN and others are increasingly capturing dramatic aerial footage following major disasters around the world. These companies can be part of the solution when it comes to adding value to humanitarian efforts on the ground. But they can also be a part of the problem.

screenshot-2016-10-26-04-51-44

Media teams are increasingly showing up to disasters with small drones (UAVs) to document the damage. They’re at times the first on the scene with drones and are thus able to quickly capture dramatic aerial footage of the devastation below. These media assets lead to more views and thus more traffic on news websites, which increases the probability that more readers click on ads. Cue Noam Chomsky’s Manufacturing Consent: The Political Economy of the Mass Media, my favorite book whilst in high school.

Aerial footage can also increase situational awareness for disaster responders if that footage is geo-located. Labeling individual scenes in video footage with the name of the towns or villages being flown over would go a long way. This is what I asked one journalist to do in the aftermath of the Nepal Earthquake after he sent me dozens of his aerial videos. I also struck up an informal agreement with CNN to gain access to their raw aerial footage in future disasters. On a related note, I was pleased when my CNN contact expressed an interest in following the Humanitarian UAV Code of Conduct.

In an ideal world, there would be a network of professional drone journalists with established news agencies that humanitarian organizations could quickly contact for geo-tagged video footage after major disasters to improve their situational awareness. Perhaps the Professional Society of Drone Journalists (PSDJ) could be part of the solution. In any case, the network would either have its own Code of Conduct or follow the humanitarian one. Perhaps they could post their footage and pictures directly to the Humanitarian UAV Network (UAViators) Crisis Map. Either way, the media has long played an important role in humanitarian disasters, and their increasing use of drones makes them even more valuable partners to increase situational awareness.

The above scenario describes the ideal world. But the media can be (and has been) part of the problem as well. “If it bleeds, it leads,” as the saying goes. Increased competition between media companies to be the first to capture dramatic aerial video that goes viral means that they may take shortcuts. They may not want to waste time getting formal approval from a country’s civil aviation authority. In Nepal after the earthquake, one leading company’s drone team was briefly detained by authorities for not getting official permission.

screenshot-2016-10-26-05-07-05

Media companies may not care to engage with local communities. They may be on a tight deadline and thus dispense with getting community buy-in. They may not have the time to reassure traumatized communities about the robots flying overhead. Media companies may overlook or ignore potential data privacy repercussions of publishing their aerial videos online. They may also not venture out to isolated and rural areas, thus biasing the video footage towards easy-to-access locations.

So how do we in the humanitarian space make media drone teams part of the solution rather than part of the problem? How do we make them partners in these efforts? One way forward is to start a conversation with these media teams and their relevant networks. Perhaps we start with a few informal agreements and learn by doing. If anyone is interested in working with me on this and/or has any suggestions on how to make this happen, please do get in touch. Thanks!

Why Robots Are Flying Over Zanzibar and the Source of the Nile

An expedition in 1858 revealed that Lake Victoria was the source of the Nile. We found ourselves on the shores of Africa’s majestic lake this October, a month after a 5.9 magnitude earthquake struck Tanzania’s Kagera Region. Hundreds were injured and dozens killed. This was the biggest tragedy in decades for the peaceful lakeside town of Bukoba. The Ministry of Home Affairs invited WeRobotics to support the recovery and reconstruction efforts by carrying out aerial surveys of the affected areas. 

2016-10-10-08-14-57-hdr

The mission of WeRobotics is to build local capacity for the safe and effective use of appropriate robotics solutions. We do this by co-creating local robotics labs that we call Flying Labs. We use these Labs to transfer the professional skills and relevant robotics solutions to outstanding local partners. Our explicit focus on capacity building explains why we took the opportunity whilst in Kagera to train two Tanzanian colleagues. Khadija and Yussuf joined us from the State University of Zanzibar (SUZA). They were both wonderful to work with and quick learners too. We look forward to working with them and other partners to co-create our Flying Labs in Tanzania. More on this in a future post.

Aerial Surveys of Kagera Region After The Earthquake

We surveyed multiple areas in the region based on the priorities of our local partners as well as reports provided by local villagers. We used the Cumulus One UAV from our technology partner DanOffice to carry out the flights. The Cumulus has a stated 2.5 hour flight time and 50 kilometer radio range. We’re using software from our partner Pix4D to process the 3,000+ very high resolution images captured during our 2 days around Bukoba.
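As a rough sanity check on how much ground a platform like this can map per flight, here is a back-of-envelope sketch. The cruise speed and endurance come from the figures above; the image swath width and side overlap are illustrative assumptions, not Cumulus specifications:

```python
# Rough estimate of the area a fixed-wing mapping UAV can cover per flight.
# Speed and endurance are taken from the Cumulus figures quoted above;
# the swath width and side overlap are assumed values for illustration.

cruise_speed_kmh = 65.0   # km/h, the speed flown around Bukoba
flight_time_h = 2.5       # h, stated Cumulus endurance
swath_width_km = 0.3      # km, assumed image footprint at ~265 m altitude
side_overlap = 0.7        # assumed 70% overlap between adjacent flight lines

distance_km = cruise_speed_kmh * flight_time_h            # total track length
effective_swath_km = swath_width_km * (1 - side_overlap)  # new ground per line
area_km2 = distance_km * effective_swath_km               # approx. mapped area

print(f"track length: {distance_km:.1f} km")
print(f"approx. mapped area: {area_km2:.1f} km^2 per flight")
```

In practice turns, climb-out and battery reserves eat into this, but it shows why a long-endurance fixed-wing platform suits surveys of this scale.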

img_6753

Above, Khadija and Yussuf on the left with a local engineer and a local member of the community on the right, respectively. The video below shows how the Cumulus takes off and lands. The landing is automatic and simply involves the UAV stalling and gently gliding to the ground.

We engaged directly with local communities before our flights to explain our project and get their permissions to fly. Learn more about our Code of Conduct.

img_6807

Aerial mapping with fixed-wing UAVs can identify large-scale damage over large areas and serve as a good base map for reconstruction. A lot of the damage, however, can be limited to large cracks in walls, which cannot be seen with nadir (vertical) imagery. We thus flew over some areas using a Parrot Bebop2 to capture oblique imagery and to get closer to the damage. We then took dozens of geo-tagged images from ground-level with our phones in order to ground-truth the aerial imagery.

img_6964

We’re still processing the resulting imagery, so the results below are simply low-resolution previews of one of the three surveys we carried out.

ortho1_bukoba

Below are more pictures documenting our recent work in Kagera. You can follow all our trainings and projects live via our Twitter feed (@werobotics) and our Facebook page. Sincerest thanks to both Linx Global Intelligence and UR Group for making our work in Kagera possible. Linx provided the introduction to the Ministry of Home Affairs while the UR Group provided invaluable support on logistics and permissions.

img_6827

Yussuf programming the flight plan of the Cumulus

img_6875

Khadija is setting up the Cumulus for a full day of flying around Bukoba area

img_6756

Khadija wants to use aerial robots to map Zanzibar, which is where she’s from

img_6787

Community engagement is absolutely imperative

img_6791

Local community members inspecting the Parrot’s Bebop2

From the shores of Lake Victoria to the coastlines of Zanzibar

Together with the outstanding drone team from the State University of Zanzibar, we mapped Jozani Forest and part of the island’s eastern coastline. This allowed us to further field-test our long-range platform and to continue our local capacity building efforts following our surveys near the Ugandan border. Here’s a picture-based summary of our joint efforts.

2016-10-14-09-09-48

Flying Labs Coordinator Yussuf sets up the Cumulus UAV for flight

2016-10-13-14-44-27-hdr

Turns out selfie sticks are popular in Zanzibar and kids love robots.

2016-10-14-10-01-25

Khairat from Team SUZA is operating the mobile air traffic control tower. Team SUZA uses senseFly eBees for other projects on the island.

2016-10-15-09-03-10

Another successful takeoff, courtesy of Flying Labs Coordinator Yussuf.

2016-10-15-11-11-20

We flew the Cumulus at a speed of 65km/h and at an altitude of 265m.

2016-10-15-13-11-13

The Cumulus flew for 2 hours, making this our longest UAV flight in Zanzibar so far.

2016-10-15-10-38-51-hdr

Khadija from Team SUZA explains to local villagers how and why she maps Zanzibar using flying robots.

2016-10-15-17-26-23

Tide starts rushing back in. It’s important to take the moon into account when mapping coastlines, as the tide can change drastically during a single flight and thus affect the stitching process.

The content above is cross-posted from WeRobotics.

Using Swimming Robots to Warn Villages of Himalayan Tsunamis

Cross-posted from National Geographic 

Climate change is having a devastating impact on the Himalaya. On the Ngozumpa glacier, one of the largest and longest in the region, hundreds of supraglacial lakes dot the glacier surface. One lake in particular is known for its continuous volume purges on an annual basis. Near the start of the monsoon this summer, in less than 48 hours, it lost enough water to fill over 40 Olympic-sized swimming pools. To make matters worse, these glacial lakes act like cancers: they consume Himalayan glaciers from the inside out, making some of them melt twice as fast. As a result, villages down-valley from these glacial lakes are becoming increasingly prone to violent flash floods, which locals call Himalayan Tsunamis.

To provide early warnings of these flash floods requires that we collect a lot more geophysical and hydrologic information on these glacial lakes. So scientists like Ulyana (co-author) are racing to understand exactly how these glacial lakes form and grow, and how they’re connected to each other through seemingly secret subterranean channels. We need to know how deep and steep these lakes are, what the lake floors look like and of what materials they are composed (e.g., mud, rock, bare ice).

Ulyana, her colleagues and a small local team of Sherpa have recently started using autonomous swimming robots to automatically map lake floors and look for cracks that may trigger mountain tsunamis. Using robotics to do this is both faster and more accurate than having humans take the measurements. What’s more, robots are significantly safer. Indeed, even getting near these lakes (let alone in them!) is dangerous enough due to unpredictable collapses of ice, called calving, and large boulders rolling off surrounding ice cliffs into the lakes below. Just imagine being on a small inflatable boat floating on ice-cold water when one of those icefalls happens.

We (Ulyana and Patrick) are actively looking to utilize diving robots as well—specifically the one in the video footage below. This OpenROV Trident robot will enable us to get to the bottom of these glacial lakes to identify deepening ‘hotspots’ before they’re visible from the lake’s surface or from the air. Our plan next year is to pool our efforts, bringing diving, swimming and flying robots to Nepal so we can train our partners—Sherpas and local engineers—on how to use these robotic solutions to essentially take the ‘pulse’ of the changing Himalaya. This way they’ll be able to educate as well as warn nearby villages before the next mountain floods hit.

We plan to integrate these efforts with WeRobotics (co-founded by co-author Patrick) and in particular with the local robotics lab that WeRobotics is already setting up in Kathmandu. This lab has a number of flying robots and trained Nepali engineers. To learn more about how these flying robots are being used in Nepal, check out the pictures here.

We’ll soon be adding diving robots to the robotics lab’s portfolio in Nepal thanks to WeRobotics’s partnership with OpenROV. What’s more, all WeRobotics labs have an express goal of spinning off local businesses that offer robotics as a service. Thus, the robotics start-up that spins off from our lab in Nepal will offer a range of mapping services using both flying and diving robots. As such, we want to create local jobs that use robotics (jobs that local partners want!) so that our Nepali friends can make a career out of saving their beautiful mountains.

Please do get in touch if you’d like to get involved or support us in other ways! Email us at ulyana@scienceinthewild.com and patrick@werobotics.org.

Could These Swimming Robots Help Local Communities?

Flying robots are all over the news these days. But UAVs/drones are hardly the only autonomous robotics solutions out there. Driving robots like the one below by Starship Technologies have already driven more than 5,000 miles in 39 cities across 12 countries, gently navigating around hundreds of thousands of people in the process of delivering small packages. I’m already in touch with Starship and related companies to explore a range of humanitarian applications. Perhaps less well known, however, are the swimming robots that can be found floating and diving in oceans, lakes and rivers around the world.

These swimming robots are often referred to as maritime or marine robots, aquatic robots, remotely operated vehicles and autonomous surface water (or underwater) vehicles. I’m interested in swimming robots for the same reason I’m interested in flying and driving robots: they allow us to collect data, transport cargo and take samples in more efficient and productive ways. Flying robots, for example, can be used to transport essential vaccines and medicines. They can also collect data by taking pictures to support precision agriculture, and they can take air samples to test for pollution. The equivalent is true for swimming and diving robots.

So I’d like to introduce you to this cast of characters and will then elaborate on how they can be, and have been, used to make a difference. Do please let me know if I’m missing any major ones—robots and use-cases.

OpenROV Trident
This tethered diving robot can reach depths of up to 100 meters with a maximum speed of 2 meters per second (7.2 km/hour). The Trident has a maximum run-time of 3 hours, weighs just under 3 kg and easily fits in a backpack. It comes with a 25 meter tether (although longer tethers are also available). The robot, which relays a live video feed back to the surface, can be programmed to swim in long straight lines (transects) over a given area to generate a continuous map of the seafloor. The OpenROV software is open source. More here.
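Transect surveys like the one described above are typically planned as a simple “lawnmower” pattern: parallel lines across the area, alternating direction so the robot sweeps everything in one continuous path. A minimal sketch of how such waypoints might be generated (the area dimensions and line spacing are made-up values for illustration, not OpenROV parameters):

```python
# Generate waypoints for a lawnmower (boustrophedon) survey pattern:
# parallel transect lines across a rectangle, alternating direction.

def lawnmower_waypoints(width_m, height_m, spacing_m):
    """Return (x, y) waypoints covering a width_m x height_m rectangle
    with transect lines spacing_m apart."""
    waypoints = []
    x = 0.0
    going_up = True
    while x <= width_m:
        # Each transect is two waypoints: start and end of the line.
        ys = (0.0, height_m) if going_up else (height_m, 0.0)
        waypoints.append((x, ys[0]))
        waypoints.append((x, ys[1]))
        going_up = not going_up  # reverse direction for the next line
        x += spacing_m
    return waypoints

# Example: a 100 m x 40 m lake section surveyed with 20 m line spacing.
for wp in lawnmower_waypoints(100, 40, 20):
    print(wp)
```

The line spacing would in practice be chosen from the sonar or camera footprint so that adjacent transects overlap slightly.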

MidWest ROV
This remotely operated swimming robot has a maximum cruising speed of just under 5 km per hour and weighs 25 kg. The ROV is a meter long and has a run time of approximately 4 hours. The platform has full digital and audio recording capabilities, with a sonar scanner that can record a swath ~60 meters wide at a depth of 180 meters. This sturdy robot has been swimming in one of the fastest-changing glacier lakes in the Himalaya to assess flood hazards. More here. See also MarineTech.

Hydromea Vertex AUV

This small swimming robot can survey an area of several square kilometers at depths of up to 300 meters with a maximum speed of 1 meter per second (3.6 km/hour). The Vertex can automatically scan vertically, horizontally or at any other angle, and from multiple locations. The platform, which weighs only 7 kg and has a length of 70 cm, can be used to create 3D scans of the seafloor with up to 10 robots operating simultaneously in parallel, thanks to communication and localization technology that enables them to cooperate as a team. More here.

Liquid Robotics Wave Glider
LiquidRobotics
The Wave Glider is an autonomous swimming robot powered by both wave and solar energy, enabling it to cruise at 5.5 km/hour. The surface component, which measures 3 meters in length, contains solar panels that power the platform and onboard sensors. The tether and underwater component enables the platform to use waves for thrust. This Glider operates individually or in fleets to deliver real-time data for up to a year with no fuel. The platform has already traveled well over one million kilometers and through a range of weather conditions including hurricanes and typhoons. More here.

SeaDrone
SeaDrone
This tethered robot weighs 5 kg and can operate at a depth of 100 meters with a maximum speed of 1.5 meters per second (5.4 km/hour). Tethers are available in lengths of 30 to 50 meters. The platform has a battery life of 3 hours and provides a live, high-definition video feed. The SeaDrone platform can be easily controlled from an iOS tablet. More here.

Clear Path Robotics Heron
ClearPath
This surface water swimming robot can cruise at a maximum speed of 1.7 meters per second (6 km/hour) for around 2 hours. The Heron, which weighs 28 kg, offers a payload bay for submerged sensors and a mounting system for those above water. The robot can carry a maximum payload of 10 kg. A single operator can control multiple Herons simultaneously. The platform, like others described below, is ideal for ecosystem assessments and bathymetry surveys (to map the topography of lakes and ocean floors). More here.

SailDrone
SailDrone
The Saildrone navigates to its destination using wind power alone, typically cruising at an average speed of 5.5 km/hour. The robot can then stay at a designated spot or perform survey patterns. Like other robots introduced here, the Saildrone can carry a range of sensors for data collection. The data is then transmitted back to shore via satellite. The Saildrone is also capable of carrying an additional 100 kg worth of payload. More here.

EMILY
EMILY, an acronym for Emergency Integrated Lifesaving Lanyard, is a robotic device used by lifeguards for rescuing swimmers. It runs on battery power and is steered by remote control after being dropped into the water from shore, a boat, a pier or a helicopter. EMILY has a maximum speed of 35 km per hour (much faster than a human lifeguard can swim) and functions as a flotation device for 4-6 people. The platform was used in Greece to assist in ocean rescues of refugees crossing the Aegean Sea from Turkey. More here. The same company has also created Searcher, an autonomous marine robot that I hope to learn more about soon.

Platypus
Platypus
Platypus manufactures four different types of swimming robots, one of which is depicted above. Called the Serval, this platform has a maximum speed of 15 km/hour with a runtime of 4 hours. The Serval weighs 35 kg and can carry a payload of 70 kg. The Serval can use either wireless, 3G or Edge to communicate. Platypus also offers a base-station package that includes a wireless router and antenna with a range of up to 2.5 km. The Felis, another Platypus robot, has a max speed of 30 km/hour and a max payload of 200 kg. The platform can operate for 12 hours. These platforms can be used for autonomous mapping. More here.

AquaBot
AquaBot
The aim of the AquaBot project is to develop an underwater tethered robot that automates the visual inspection of fish farm nets and mooring systems. There is little up-to-date information on this project, so it is unclear how many prototypes were built and tested. Specs for this diving robot don’t seem to be readily available online. More here.

There are of course many more marine robots out there. Have a look at these other companies: Bluefin Robotics, Ocean Server, Riptide Autonomous Systems, Seabotix, Blue Robotics, YSI, AC-CESS and Juice Robotics, for example. The range of applications for maritime robotics is also growing. At WeRobotics, we’re actively exploring a wide number of use-cases to determine if and where maritime robots might be able to add value to the work of our local partners in developing countries.

aquaculture

Take aquaculture (also known as aquafarming), for example. Aquaculture is the fastest growing sector of global food production. But many types of aquaculture remain labor intensive. In addition, a combination of “social and environmental pressures and biological necessities are creating opportunities for aquatic farms to locate in more exposed waters further offshore,” which increases both risks and costs, “particularly those associated with the logistics of human maintenance and intervention activities.” These and other factors make “this an excellent time to examine the possibilities for various forms of automation to improve the efficiency and cost-effectiveness of farming the oceans.”

Just like land-based agriculture, aquaculture can be devastated by major disasters. Aquaculture thus represents an important food security issue for local communities directly dependent on seafood for their livelihoods. As such, restoring aquafarms can be a vital element of disaster recovery. After the 2011 Japan Earthquake and Tsunami, for example, maritime robots were used to “remediate fishing nets and accelerate the restarting of the fishing economy.” As further noted in the book Disaster Robotics, the robots “cleared fishing beds from debris and pollution” by mapping the “remaining debris in the prime fishing and aquaculture areas, particularly looking for cars and boats leaking oil and gas and for debris that would snag and tear fishing nets.”

JapanTsunami

Ports and shipping channels in both Japan and Haiti were also reopened using marine robots following the major earthquakes in 2011 and 2010. The robots mapped the debris fields that could damage ships or prevent them from entering port. The clearance of this debris allowed “relief supplies to enter devastated areas and economic activities to resume.” To be sure, coastal damage caused by earthquakes and tsunamis can render ports inoperable. Marine robots can thus accelerate both response and recovery efforts by reopening ports that serve as primary routes for relief supplies, as noted in the book Disaster Robotics.

In sum, marine robotics can be used for aquaculture, structural inspections, estimation of debris volume and type, victim recovery, forensics, environmental monitoring as well as search, reconnaissance and mapping. While marine robots remain relatively expensive, new types of low-cost solutions are starting to enter the market. As these become cheaper and more sophisticated in terms of their autonomous capabilities, I am hopeful that they will become increasingly useful and accessible to local communities around the world. Check out WeRobotics to learn about how appropriate robotics solutions can support local livelihoods.

Reverse Robotics: A Brief Thought Experiment

Imagine a world in which manually controlled technologies simply do not exist. The very thought of manual technologies is, in actual fact, hardly conceivable let alone comprehensible. Instead, this seemingly alien world is seamlessly powered by intelligent and autonomous robotics systems. Let’s call this world Planet AI.

PlanetAI

Planet AI’s versions of airplanes, cars, trains and ships are completely unmanned. That is, they are fully autonomous—a silent symphony of large and small robots waltzing around with no conductor in sight. On one fateful night, a young PhD student awakens in a sweat, momentarily unable to breathe. The nightmare: all the swirling robots of Planet AI were no longer autonomous. Each of them had to be told exactly what to do by the Planet’s inhabitants. Madness.

She couldn’t go back to sleep. The thought of having to tell her robotics transport unit (RTU) in the morning how to get from her studio to the university gave her a panic attack. She would inevitably get lost or, worse, crash, maybe even hurt someone. She’d need weeks of practice to manually control her RTU. And even if she could somehow master manual steering, she wouldn’t be able to steer and work on her dissertation at the same time during the 36-minute drive. What’s more, that drive would easily become a 100-minute drive since there’s no way she would manually steer the RTU at 100 kilometers an hour—the standard autonomous speed of RTUs; more like 30 km/h.

And what about the other eight billion inhabitants of Planet AI? The thought of billions of manually controlled RTUs flying, driving and swimming through the massive metropolis of New AI was surely the ultimate horror story. Indeed, civilization would inevitably come to an end. Millions would die in horrific RTU collisions. Transportation would slow to a crawl before collapsing. And the many billions of hours spent working, resting or playing in automated RTUs every day would quickly evaporate into billions of hours of total stress and anxiety. The Planet’s global GDP would go into free fall. RTUs carrying essential cargo automatically from one side of the planet to the other would need to be steered manually. Where would the millions of workers needed for such extensive manual labor come from? Who in their right mind would even want to take on such a dangerous and dull assignment? Who would provide the training and certification? And who in the world would be able to pay all the salaries anyway?

At this point, the PhD student was on her feet. “Call RTU,” she instructed her personal AI assistant. An RTU swung by while she was putting on her shoes. Good, so far so good, she told herself. She got in slowly and carefully, studying the RTU’s behavior suspiciously. No, she thought to herself, nothing out of the ordinary here either. It was just a bad dream. The RTU’s soft purring power source put her at ease; she had always enjoyed the RTU’s calming sound. For the first time since she awoke from her horrible nightmare, she started to breathe more easily. She took an extra deep and long breath.

starfleet

The RTU was already waltzing with ease at 100 km per hour through the metropolis, the speed barely noticeable from inside the cocoon. Forty-six, forty-seven and forty-eight; she was counting the number of other RTUs that were speeding right alongside hers, below and above as well. She arrived on campus in 35 minutes and 48 seconds—exactly the time it had taken the RTU during her 372 earlier rides. She breathed a deep sigh of relief and said “Home Please.” It was just past 3am and she definitely needed more sleep.

She thought of her fiancée on the way home. What would she think about her crazy nightmare, given her work in the humanitarian space? Oh no. Her heart began to race again. Just imagine the impact that manually steered RTUs would have on humanitarian efforts. Talk about a total horror story. Life-saving aid, essential medicines, food, water, shelter; each of these would have to be transported manually to disaster-affected communities. The logistics would be near impossible to manage manually. Everything would grind to a halt. Damage assessments would have to be carried out manually as well, by somehow steering hundreds of robotics data units (RDUs) to collect data on affected areas. Goodness, it would take days if not weeks to assess disaster damage. Those in need would be left stranded. “Call Fiancée,” she instructed, shivering at the thought of her fiancée having to carry out her important life-saving relief work entirely manually.


The point of this story and thought experiment? While some on Planet Earth may find the notion of autonomous robotics systems insane and worry about accidents, it is worth noting that the inhabitants of a future world like Planet AI would feel exactly the same way about our manually controlled technologies. Over 80% of airplane accidents are due to human pilot error and 90% of car accidents are the result of human driver error. Our PhD student on Planet AI would describe our use of manually controlled technologies as suicidal, not to mention a massive waste of precious human time.

An average person in the US spends 101 minutes per day driving (which adds up to more than 4 years over a lifetime). There are 214 million licensed car drivers in the US. This means that over 360 million hours of human time in the US alone are spent manually steering a car from point A to point B every day. This results in more than 30,000 people killed per year. And again, that’s just for the US. There are over 1 billion manually controlled motor vehicles on Earth. Imagine what we could achieve with an additional billion hours every day if we had Planet AI’s autonomous systems to free up this massive cognitive surplus. And let’s not forget the devastating environmental impact of individually-owned, manually controlled vehicles.
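For what it’s worth, these figures check out on the back of an envelope. The following Python sketch simply re-derives them from the numbers cited above (101 minutes per day, 214 million licensed drivers); the 60-year driving span is my own assumption:

```python
# Back-of-envelope check of the US driving-time figures cited above.
minutes_per_day = 101           # average daily driving time per person
licensed_drivers = 214_000_000  # licensed US car drivers

# Total human time spent driving in the US per day, in hours.
daily_hours = licensed_drivers * minutes_per_day / 60
print(f"{daily_hours / 1e6:.0f} million hours per day")  # ~360 million

# Lifetime spent driving, assuming (hypothetically) ~60 years of driving.
lifetime_years = (minutes_per_day / (24 * 60)) * 60
print(f"~{lifetime_years:.1f} years behind the wheel")   # ~4.2 years
```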

If you had the choice, would you prefer to live on Earth or on Planet AI if everything else were held equal?

Humanitarian Robotics: The $15 Billion Question?

The International Community spends around $25 Billion per year to provide life-saving assistance to people devastated by wars and natural disasters. According to the United Nations, this is $15 Billion short of what is urgently needed; that’s $15 Billion short every year. So how do we double the impact of humanitarian efforts and do so at half the cost?

Perhaps one way to deal with this stunning funding gap of nearly 40% is to scale the positive impact of the aid industry. How? By radically increasing the efficiency (time-savings) and productivity (cost-savings) of humanitarian efforts. This is where Artificial Intelligence (AI) and Autonomous Robotics come in. The World Economic Forum refers to this powerful new combination as the 4th Industrial Revolution. Amazon, Facebook, Google and other Fortune 100 companies are powering this revolution with billions of dollars in R&D. So whether we like it or not, the robotics arms race will impact the humanitarian industry just like it is impacting other industries: through radical gains in efficiency & productivity.
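The size of that gap as a share of total need follows directly from the figures above ($25B spent per year, $15B short). A quick sketch:

```python
# The funding gap as a share of what is actually needed.
spent_billions = 25      # annual humanitarian spending
shortfall_billions = 15  # annual shortfall, per the UN
needed_billions = spent_billions + shortfall_billions

gap_share = shortfall_billions / needed_billions
print(f"{gap_share:.1%} of the ${needed_billions}B needed")  # 37.5%, i.e. nearly 40%
```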

Take Amazon, for example. The company uses some 30,000 Kiva robots in its warehouses across the globe (pictured below). These ground-based, terrestrial robotics solutions have already reduced Amazon’s operating expenses by no less than 20%. And each new warehouse that integrates these self-driving robots will save the company around $22 million in fulfillment expenses alone. According to Deutsche Bank, “Bringing the Kivas to the 100 or so distribution centers that still haven’t implemented the tech would save Amazon a further $2.5 billion.” As is well known, the company is also experimenting with aerial robotics (drones). A recent study by PwC (PDF) notes that “the labor costs and services that can be replaced by the use of these devices account for about $127 billion today, and that the main sectors that will be affected are infrastructure, agriculture, and transportation.” Meanwhile, Walmart and others are finally starting to enter the robotics arms race. Walmart is using ground-based robots to ship apparel and is actively exploring the use of aerial robotics to “photograph warehouse shelves as part of an effort to reduce the time it takes to catalogue inventory.”

Amazon Robotics
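The Deutsche Bank estimate quoted above is broadly consistent with the per-warehouse figure: roughly $22 million saved per fulfillment center, multiplied across the ~100 centers that have not yet adopted the robots. A quick cross-check:

```python
# Rough cross-check of the Kiva savings figures cited above.
savings_per_warehouse = 22_000_000  # ~$22M in fulfillment expenses per warehouse
warehouses_remaining = 100          # distribution centers without the tech

total_savings = savings_per_warehouse * warehouses_remaining
print(f"${total_savings / 1e9:.1f}B")  # $2.2B, in the same ballpark as
                                       # Deutsche Bank's $2.5B estimate
```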

What makes this new industrial revolution different from those that preceded it is the fundamental shift from manually controlled technologies—a world we’re all very familiar with—to a world powered by increasingly intelligent and autonomous systems—an entirely different kind of world. One might describe this as a shift towards extreme automation. And whether extreme automation powers aerial robotics, terrestrial robotics or maritime robots (pictured below) is beside the point. The disruption here is the one-way shift towards increasingly intelligent and autonomous systems.

All_Robotics

Why does this fundamental shift matter to those of us working in humanitarian aid? For at least two reasons: the collection of humanitarian information and the transportation of humanitarian cargo. Whether we like it or not, the rise of increasingly autonomous systems will impact both the way we collect data and transport cargo by making these processes faster, safer and more cost-effective. Naturally, this won’t happen overnight: disruption is a process.

Humanitarian organizations cannot stop the 4th Industrial Revolution. But they can apply their humanitarian principles and ideals to inform how autonomous robotics are used in humanitarian contexts. Take the importance of localizing aid, for example, a priority that gained unanimous support at the recent World Humanitarian Summit. If we apply this priority to humanitarian robotics, the question becomes: how can access to appropriate robotics solutions be localized so that local partners can double the positive impact of their own humanitarian efforts? In other words, how do we democratize the 4th Industrial Revolution? Doing so may be an important step towards closing the $15 billion gap. It could render the humanitarian industry more efficient and productive while localizing aid and creating local jobs in new industries.

This is What Happens When You Send Flying Robots to Nepal

In September 2015, we were invited by our partner Kathmandu University to provide them and other key stakeholders with professional hands-on training to help them scale the positive impact of their humanitarian efforts following the devastating earthquakes. More specifically, our partners were looking to get trained on how to use aerial robotics solutions (drones) safely and effectively to support their disaster risk reduction and early recovery efforts. So we co-created Kathmandu Flying Labs to ensure the long-term sustainability of our capacity building efforts. Kathmandu Flying Labs is kindly hosted by our lead partner, Kathmandu University (KU). This is already well known. What is hardly known, however, is what happened after we left the country.

Our Flying Labs are local innovation labs used to transfer both relevant skills and appropriate robotics solutions sustainably to outstanding local partners who need these the most. The co-creation of these Flying Labs includes both joint training and applied projects customized to meet the specific needs & priorities of our local partners. In Nepal, we provided both KU and Kathmandu Living Labs (KLL) with the professional hands-on training they requested. What’s more, thanks to our Technology Partner DJI, we were able to transfer 10 DJI Phantoms (aerial robotics solutions) to our Nepali partners (6 to KU and 4 to KLL). In addition, thanks to another Technology Partner, Pix4D, we provided both KU and KLL with free licenses of the Pix4D software and relevant training so they could easily process and analyze the imagery they captured using their DJI platforms. Finally, we carried out joint aerial surveys of Panga, one of the towns hardest-hit by the 2015 Earthquake. Joint projects are an integral element of our capacity building efforts. These projects serve to reinforce the training and enable our local partners to create immediate added value using aerial robotics. This important phase of Kathmandu Flying Labs is already well documented.

What is less known, however, is what KU did with the technology and software after we left Nepal. Indeed, the results of this next phase of the Flying Labs process (during which we provide remote support as needed) have not been shared widely, until now. KU’s first order of business was to finish the joint project we had started with them in Panga. It turns out that our original aerial surveys there were actually incomplete, as denoted by the red circle below.

Map_Before

But because we had taken the time to train our partners and transfer both our skills and the robotics technologies, the outstanding team at KU’s School of Engineering returned to Panga to get the job done without needing any further assistance from us at WeRobotics. They filled the gap:

Map_After

The KU team didn’t stop there. They carried out a detailed aerial survey of a nearby hospital to create the 3D model below (at the hospital’s request). They also created detailed 3D models of the university and a nearby temple that had been partially damaged by the 2015 earthquakes. Furthermore, they carried out additional disaster damage assessments in Manekharka and Sindhupalchowk, again entirely on their own.

Yesterday, KU kindly told us about their collaboration with the World Wildlife Fund (WWF). Together, they are conducting a study to determine the ecological flow of the Kaligandaki River, one of the largest rivers in Nepal. According to KU, the river’s ecosystem is particularly “complex as it includes aquatic invertebrates, flora, vertebrates, hydrology, geo-morphology, hydraulics, sociology-cultural and livelihood aspects.” The Associate Dean at KU’s School of Engineering wrote “We are deploying both traditional and modern technology to get the information from ground including UAVs. In this case we are using the DJI Phantoms,” which “reduced largely our field investigation time. The results are interesting and promising.” I look forward to sharing these results in a future blog post.

kali-gandaki-river

Lastly, KU’s Engineering Department has integrated the use of the robotics platforms directly into their courses, enabling Geomatics Engineering students to use the robots as part of their end-of-semester projects. In sum, KU has done truly outstanding work following our capacity building efforts and deserves extensive praise. (Alas, it seems that KLL has made little to no use of the aerial technologies or the software since our training 10 months ago).

Several months after the training in Nepal, we were approached by a British company that needed aerial surveys of specific areas for a project that the Nepal Government had contracted them to carry out. So they wanted to hire us for this project. We proposed instead that they hire our partners at Kathmandu Flying Labs since the latter are more than capable of carrying out the surveys themselves. In other words, we actively drive business opportunities to Flying Labs partners. Helping to create local jobs and local businesses around robotics as a service is one of our key goals and the final phase of the Flying Labs framework.

So when we heard last week that USAID’s Global Development Lab was looking to hire a foreign company to carry out aerial surveys for a food security project in Nepal, we jumped on a call with USAID to let them know about the good work carried out by Kathmandu Flying Labs. We clearly communicated to our USAID colleagues that there are perfectly qualified Nepali pilots who can carry out the same aerial surveys. USAID’s Development Lab will be meeting with Kathmandu Flying Labs during their next visit in September.

On a related note, one of the participants who we trained in September was hired soon after by Build Change to support the organization’s shelter programs by producing Digital Surface Models (DSMs) from aerial images captured using DJI platforms. More recently, we heard from another student who emailed us with the following: “I had an opportunity to participate in the Humanitarian UAV Training mission in Nepal. It’s because of this training I was able learn how to fly drones and now I can conduct aerial Survey on my own with any hardware.  I would like to thank you and your team for the knowledge transfer sessions.”

This same student (who graduated from KU) added: “The workshop that your team did last time gave us the opportunity to learn how to fly and now we are handling some professional works along with major research. My question to you is ‘How can young graduates from developing countries like ours strengthen their capacity and keep up with their passion on working with technology like UAVs […]? The immediate concern for a graduate in Nepal is a simple job where he can make some money for him and prove to his family that he has done something in return for all the investments they have been doing upon him […]’.

KU campus sign

This is one of several reasons why our approach at WeRobotics is not limited to scaling the positive impact of local humanitarian, development, environmental and public health projects. Our demand-driven Flying Labs model goes the extra (aeronautical) mile to deliberately create local jobs and businesses. Our Flying Labs partners want to make money off the skills and technologies they gain from WeRobotics. They want to take advantage of the new career opportunities afforded by these new AI-powered robotics solutions. And they want their efforts to be sustainable.

In Nepal, we are now interviewing the KU graduate who posed the question above because we’re looking to hire an outstanding and passionate Coordinator for Kathmandu Flying Labs. Indeed, there is much work to be done as we are returning to Nepal in coming months for three reasons: 1) Our local partners have asked us to provide them with the technology and training they need to carry out large scale mapping efforts using long-distance fixed-wing platforms; 2) A new local partner needs to create very high-resolution topographical maps of large priority areas for disaster risk reduction and planning efforts, which requires the use of a fixed-wing platform; 3) We need to meet with KU’s Business Incubation Center to explore partnership opportunities since we are keen to help incubate local businesses that offer robotics as a service in Nepal.

How to Democratize Humanitarian Robotics

Our world is experiencing an unprecedented shift from manually controlled technologies to increasingly intelligent and autonomous systems powered by artificial intelligence (AI). I believe that this radical shift in both efficiency and productivity can have significant positive social impact when it is channeled responsibly, locally and sustainably.

This is why my team and I founded WeRobotics, the only organization fully dedicated to accelerating and scaling the positive impact of humanitarian, development and environmental projects through the appropriate use of AI-powered robotics solutions. I’m thrilled to announce that the prestigious Rockefeller Foundation shares our vision—indeed, the Foundation has just awarded WeRobotics a start-up grant to take Humanitarian Robotics to the next level. We’re excited to leverage the positive power of robotics to help build a more resilient world in line with Rockefeller’s important vision.

Aerial Robotics (drones/UAVs) represent the first wave of robotics to impact humanitarian sectors by disrupting traditional modes of data collection and cargo delivery. Both timely data and the capacity to act on this data are integral to aid, development and environmental projects. This is why we are co-creating and co-hosting a global network of “Flying Labs”: to transfer appropriate aerial robotics solutions and relevant skills to outstanding local partners in developing countries who need these the most.

Our local innovation labs also present unique opportunities for our Technology Partners—robotics companies and institutes. Indeed, our growing network of Flying Labs offers a multitude of geographical, environmental and social conditions for ethical social good projects and responsible field-testing; from high-altitude glaciers and remote archipelagos experiencing rapid climate change to dense urban environments in the tropics subject to intense flooding and endangered ecosystems facing cascading environmental risks.

The Labs also provide our Technology Partners with direct access to local knowledge, talent and markets, and in turn provide local companies and entrepreneurs with facilitated access to novel robotics solutions. In the process, our local partners become experts in different aspects of robotics, enabling them to become service providers and drive new growth through local start-ups and companies. The Labs thus seek to offer robotics-as-a-service across multiple local sectors. As such, the Labs follow a demand-driven social entrepreneurship model designed to catalyze local businesses while nurturing learning and innovation.

Of course, there’s more to robotics than just aerial robotics. This is why we’re also exploring the use of AI-powered terrestrial and maritime robotics for data collection and cargo delivery. We’ll add these solutions to our portfolio as they become more accessible in the future. In the meantime, sincerest thanks to the Rockefeller Foundation for their trust and invaluable support. Big thanks also to our outstanding Board of Directors and to key colleagues for their essential feedback and guidance.

Humanitarian Cargo Delivery via Aerial Robotics is Not Science Fiction (Updated)

I had the opportunity to visit Zipline’s field-testing site in San Francisco last year after the company participated in an Experts Meeting on Humanitarian UAVs (Aerial Robotics) that I co-organized at MIT. The company has finally just gone public about their good work in Rwanda, so I’m at last able to blog about it on iRevolutions. When I write “finally”, this is not meant to be a complaint; in fact, one aspect that really drew me to Zipline in the first place is the team’s genuine down-to-earth, no-hype mantra. So, I use the word finally since I now finally have public evidence to back up many conversations I’ve had with humanitarian partners on the topic of cargo delivery via aerial robotics.

Zip Delivery

As I had signed an NDA, I was (and still am) only allowed to discuss information that is public, which was basically nothing until today. So below is a summary of what is at last publicly known about Zipline’s pioneering aerial robotics efforts in Rwanda. I’ve also added videos at the end.

  • Zipline’s Mission: to deliver critical medical products to health centers and hospitals that are either difficult or impossible to reach via traditional modes of transportation
  • Zipline Fleet: 15 aerial robotics platforms (UAVs) in Rwanda.
  • Aerial Robotics platform: Fixed-wing.
  • Weight of each platform: 10 kg.
  • Power: Battery-operated twin electric motors.
  • Payload capacity: up to 1.5 kg.
  • Cargo: Blood and essential medicines (small vials) to begin with. Eventually cargo will extend to lifesaving vaccines, treatments for HIV/AIDS, malaria, tuberculosis, etc.
  • Range: Up to 120 km.
  • Flight Plans: Pre-programmed and monitored on the ground via tablets. Individual plans are stored on SIM cards.

  • Flight Navigation: GPS using the country’s cellular network.
  • Launch Mechanism: Via catapult.
  • Maximum Speed: Around 100 km/hour.
  • Landings: Zipline’s aerial robot does not require a runway.
  • Delivery Mechanism: Fully autonomous, low altitude drop via simple paper parachute. Onboard computers determine appropriate parameters (taking into account winds, etc.) to ensure that the cargo accurately lands on its dedicated delivery site called a “mailbox”.
  • Delivery Sites: Dedicated drop sites at 21 health facilities that can carry out blood transfusions. These cover more than half of Rwanda.

  • Takeoff Sites: Modified shipping containers located next to existing medical warehouses.
  • Delivery Time: Each cargo is delivered within 1 hour. The aerial robot takes about half an hour to reach a delivery site.
  • Flight Frequency: Eventually up to 150 flights per day.
  • Weather: Fixed-wings can operate in ~50 km/hour winds.
  • Regulatory Approval: Direct agreements already secured with the Government of Rwanda and the country’s Civil Aviation Authority.
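The delivery-time figures above hang together with the platform’s specs. As a rough illustration (this is my own sketch, not Zipline’s flight-planning software), a one-way flight time can be estimated from distance and cruise speed:

```python
# Illustrative estimate of one-way flight time from the specs above.
# Assumes a constant cruise speed; real flights depend on winds, routing, etc.

def flight_minutes(distance_km: float, cruise_kmh: float = 100.0) -> float:
    """One-way flight time in minutes at an assumed constant cruise speed."""
    return distance_km / cruise_kmh * 60

# A delivery site ~50 km out takes about 30 minutes to reach, matching the
# "about half an hour" figure, and stays well within the 120 km range.
print(f"{flight_minutes(50):.0f} minutes")  # 30 minutes
```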

Sources: