The second day of the Innovation Track at Web4Dev focused on monitoring and evaluation. Robert Kirkpatrick from InSTEDD, Erik Hersman from Ushahidi and Christopher Strebel from UNESCO each gave a presentation.
Robert introduced InSTEDD’s Mesh4X and GeoChat, which I’ve already blogged about here, so I won’t expand on them. But Robert also introduced a new project I was not aware of called Evolve. This tool helps synthesize data into actionable information, enabling teams to collaborate around diverse data streams to detect, analyze, triage and track critical events as they unfold.
Erik introduced Ushahidi and described our increasing capacity to crowdsource eyewitness crisis data. The challenge, however, is increasingly how to consume and make sense of the incoming data stream; there were thousands of tweets per minute during the Mumbai attacks. Ushahidi is working on Swift River to explore ways of using crowdsourcing as a filter for data validation.
Christopher Strebel introduced GigaPan, a robotic camera mount that captures gigapixel images. The technology was developed for the Mars Rover program to take very high-resolution images of Mars, and UNESCO is now introducing it for education purposes. I’m not entirely convinced by this project, not only because the camera costs $300-$400 but also because I don’t see what such a sophisticated tool adds over regular cameras in terms of education and participation.
In any case, while I found all three presentations interesting, none of them actually addressed the second topic of today’s workshop: evaluation. I spent most of December and January working with a team of monitoring and evaluation (M&E) experts to develop a framework for a multi-year project in Liberia. My conclusion from that experience is that those of us without expertise in M&E have a huge amount to learn. Developing serious M&E frameworks is a rigorous process.