Combining Radio, SMS and Advanced Computing for Disaster Response

I’m headed to the Philippines this week to collaborate with the UN Office for the Coordination of Humanitarian Affairs (OCHA) on humanitarian crowdsourcing and technology projects. I’ll be based in the OCHA Offices in Manila, working directly with colleagues Andrej Verity and Luis Hernando to support their efforts in response to Typhoon Yolanda. One project I’m exploring in this respect is a novel radio-SMS-computing initiative that my colleague Anahi Ayala (Internews) and I began drafting during ICCM 2013 in Nairobi last week. I’m sharing the approach here to solicit feedback before I land in Manila.


The “Radio + SMS + Computing” project is firmly grounded in GSMA’s official Code of Conduct for the use of SMS in Disaster Response. I have also drawn on the Bellagio Big Data Principles when writing up the ins and outs of this initiative with Anahi. The project is first and foremost a radio-based initiative that seeks to answer the information needs of disaster-affected communities.

The project: Local radio stations in the Philippines would create and broadcast radio programs inviting local communities to serve as “community journalists” describing how the Typhoon has impacted their communities. The radio stations would provide a free SMS short code and invite said communities to text in their observations. Each radio station would include in its broadcast a unique 2-letter identifier and would ask those texting in to start their SMS with that identifier. They would also emphasize that text messages should not include any Personally Identifiable Information (PII) or any location information. Messages that do include PII would be deleted.

Text messages sent to the SMS short code would be automatically triaged by radio station (using the 2-letter identifier) and forwarded to the respective radio stations via SMS. (At this point, few local radio stations have web access in the disaster-affected areas). These radio stations would be funded to create radio programs based on the SMS’s received. These programs would conclude by asking local communities to text in their information needs—again using the unique radio identifier as a prefix in the text messages. Radio stations would create follow-up programs to address the information needs texted in by local communities (“news you can use”). This could be replicated on a weekly basis and extended to the post-disaster reconstruction phase.
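The triage step described above can be sketched in a few lines of Python. The station identifiers, forwarding numbers, and message format below are illustrative assumptions, not details from any actual deployment:

```python
# Route each incoming SMS to its radio station using the 2-letter
# identifier prefix. Station codes and numbers here are hypothetical.

STATIONS = {
    "TA": "+63-900-000-0001",  # hypothetical station A forwarding number
    "OR": "+63-900-000-0002",  # hypothetical station B forwarding number
}

def triage(message):
    """Return (station_number, body), or None if the prefix is unknown."""
    prefix, _, body = message.strip().partition(" ")
    number = STATIONS.get(prefix.upper())
    if number is None:
        return None  # unrecognized identifier: hold for manual review
    return number, body

# e.g. triage("TA roads flooded near the market")
# -> ("+63-900-000-0001", "roads flooded near the market")
```

Messages whose prefix matches no known station would be held for manual review rather than silently dropped.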


In parallel, the text messages documenting the impact of the Typhoon at the community level would be categorized by Cluster—such as shelter, health, education, etc. Each classified SMS would then be forwarded to the appropriate Cluster Leads. This is where advanced computing comes in: the application of microtasking and machine learning. Trusted Filipino volunteers would be invited to tag each SMS by Cluster-category (and also translate relevant text messages into English). Once enough text messages have been tagged per category, the use of machine learning classifiers would enable the automatic classification of incoming SMS’s. As explained above, these classified SMS’s would then be automatically forwarded to a designated point of contact at each Cluster Agency.
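To make the microtasking-to-classifier handoff concrete, here is a toy bag-of-words sketch in Python. It stands in for AIDR’s real machine-learning classifiers, and all of the tagged example messages and word counts are invented:

```python
# Toy sketch of the pipeline: volunteers tag a few SMSs by Cluster,
# and a simple word-overlap model labels new messages. This is a
# stand-in for a real classifier; the examples are invented.
from collections import Counter, defaultdict

tagged = [
    ("need tarpaulins roof gone", "Shelter"),
    ("house destroyed no shelter", "Shelter"),
    ("children have fever no medicine", "Health"),
    ("clinic flooded need doctors", "Health"),
]

# Count how often each word appears per Cluster category.
word_counts = defaultdict(Counter)
for text, label in tagged:
    word_counts[label].update(text.lower().split())

def classify(sms):
    """Pick the Cluster whose tagged examples best overlap the SMS."""
    words = sms.lower().split()
    return max(word_counts,
               key=lambda label: sum(word_counts[label][w] for w in words))

# e.g. classify("no medicine for the fever") -> "Health"
```

Once volunteers have tagged enough real messages per Cluster category, a proper machine-learning classifier (as in AIDR) would replace this word-overlap heuristic.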

This process would be repeated for SMS’s documenting the information needs of local communities. In other words, information needs would be classified by Cluster category and forwarded to Cluster Leads. The latter would share their responses to stated information needs with the radio stations who in turn would complement their broadcasts with the information provided by the humanitarian community, thus closing the feedback loop.

The radio-SMS project would be strictly opt-in. Radio programs would clearly state that the data sent in via SMS would be fully owned by local communities, who could call or text in at any time to have their SMS deleted. Phone numbers would only be shared with humanitarian organizations if the individuals texting the radio stations consented (via SMS) to their numbers being shared. Inviting communities to act as “community journalists” rather than asking them to report their needs may help manage expectations. Radio stations can further manage these expectations during their programs by taking questions from listeners calling in. In addition, the project seeks to limit the number of SMS’s that communities have to send. The greater the amount of information solicited from disaster-affected communities, the more challenging managing expectations may be. The project also makes a point of focusing on local information needs as the primary entry point. Finally, the data collection limits the geographical resolution to the village level for the purposes of data privacy and protection.


It remains to be seen whether this project gets funded, but in any event I’d welcome any feedback iRevolution readers may have, since this approach could also be used in future disasters. In the meantime, my QCRI colleagues and I are looking to modify AIDR to automatically classify SMS’s (in addition to tweets). My UNICEF colleagues have already expressed their need to automatically classify millions of text messages for their U-Report project, so I believe that many other humanitarian and development organizations would benefit from a free and open source platform for automatic SMS classification. At the technical level, this means adding “batch processing” to AIDR’s current “streaming” feature. We hope to have an update on this in the coming weeks. Note that a batch-processing feature will also allow users to upload their own datasets of tweets for automatic classification.
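The streaming-versus-batch distinction can be illustrated with a placeholder classifier. The function names and the keyword rule below are invented for illustration and are not AIDR’s actual API:

```python
# Sketch of "streaming" (label each item as it arrives) versus "batch"
# (label an uploaded dataset in one pass). The classifier is a dummy.

def classify(text):
    # Placeholder: a trained model would return a Cluster label here.
    return "Health" if "medicine" in text.lower() else "Shelter"

def classify_stream(message):
    """Streaming mode: one message at a time, as AIDR currently works."""
    return classify(message)

def classify_batch(messages):
    """Batch mode: a whole uploaded dataset, the proposed addition."""
    return [classify(m) for m in messages]

# e.g. classify_batch(["no medicine left", "roof gone"])
# -> ["Health", "Shelter"]
```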


11 responses to “Combining Radio, SMS and Advanced Computing for Disaster Response”

  1. Hi Patrick…looking forward to meeting you in person. About the use of SMS, I think that IOM Philippines wanted to introduce SMS-based displacement tracking for the Bohol earthquake, but I am not sure if it worked out. For WASH, we used text blasts to affected communities, with support from SMART, to share key WASH messages during Bopha. I think the CDAC group will be happy about this. Good luck with this new initiative.

  2. Pingback: Links I liked – Development Hack

  3. Best wishes and regards.

  4. Interesting approach. It’s been done for years with Amber Alert networks all over North America. I would try something easier on the radio stations that works with most broadcast networks today: datacasting.

    Datacasting has its roots in another piggyback radio platform invented in 2004, which had the ability, through the AM/FM radio carrier wave, to send data to receivers in 2006. SPOT (Smart Personal Objects Technology), essentially a device with a radio (AM/FM/dual-band) receiver inside—usually a watch or handheld device or, in some parts of the U.S., coffee makers!—went into production in 2004, with Microsoft (via MSN Direct) designing devices; broadcasting commenced a year later. Production was discontinued in 2008.

    But that didn’t kill the technology. Soon afterwards, digital broadcast radio began with datacasting as a side feature (e.g. data sent to your car radio showing the title of a song or the radio call sign), and that spurred companies like Garmin and TomTom to create traffic-management signal networks and build receivers into their GPS navigation units: the network sends data to GPS/datacasting receivers indicating traffic conditions on major highways and popular roads in large cities.

    The next step is GPRS, the next generation of the General Packet Radio Service, a full two-way network that carries IP packets over digital radio networks. South Africa has begun deploying it in hybrid form: a satellite and terrestrial broadcast digital radio signal.

    Look for similar networks to be deployed throughout Asia over the coming years. Malaysia began implementation in 2005 and continues to upgrade its network.

    Look for small form pocket radios to have a display screen and (virtual) keyboard with two way connectivity delivered through your local radio station over the coming years.

    It will likely become the backbone for Emergency Broadcast Community alerting and replace traditional civil protection warning systems.

  5. Reblogged this on newspeak and commented:
    This is worth reading; but perhaps not just disaster or crisis should motivate such good hacks (and of course they don’t alone).

  6. Hey Patrick, great idea. My only concern is the one you’ve identified around expectation management: how you invite participation, what people think they (or their community) will get out of it, and indeed how and what the international or responding community is able to do with the information will be critical. Good luck with it!

  7. Hi Patrick, I work on a number of humanitarian logistics issues (from an academic perspective) and just wanted to note that having locations associated with these texts would provide so much more useful information to the relief agencies. I understand the privacy concerns, but it is something to keep in mind.

    • Thanks Irina, the idea is to collect location info *only* if said info is actionable and relevant to relief agencies. So said agencies could ultimately text back to invite more specific location info for messages that are directly informative regarding their mandate. I’m hesitant to collect location info opportunistically.

  8. Pingback: Links I liked – November 25 – Gabriel Krieshok Blog

  9. Pingback: Community radio case studies | Libraries and Crisis Informatics
