
We Should Break Monitoring Apart From Evaluation

One of the sad truths that emerged at the Technology Salon on ICTs and M&E was that failure in development is rarely judged on project performance, but on winning the next contract. This means that monitoring and evaluation is less about tracking and improving progress towards social change and more about weaving an advertising pitch.

This is not for lack of frameworks, tools, or measurements mapped against a theory of change, or even for lack of more real-time data in development. It is about incentives. What is incentivized at the macro level is getting big numbers on the board and nice, clean, upwardly-trending graph lines. At the micro level, incentives for the monitoring side of things reduce to report filing as a requirement for salary payments, or other basic carrot-and-stick models. Neither of these actually encourages accurate, honest data, yet only with accurate data can we remotely hope to tweak models and make improvements.

So, let’s break monitoring apart from evaluation.

[Image: the "then a miracle happens" cartoon]

Monitoring can be real-time and deeply embedded into the activities of a project, reducing the need to waste program staff time on reporting (and removing the need to figure out incentive programs). Any project with an ICT4D component should be light-years ahead on this, building detailed logging into its work by default. These logs should themselves be as open as possible, but open at least to the funding and/or parent organizations and relevant government agencies. Remove the fudging of numbers and reduce the reporting lag from weeks or months to as often as there is Internet connectivity (which, admittedly, still might be weeks or months in some situations).
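
To make that concrete, here is a minimal sketch (mine, not from the Salon) of what building logging in by default could look like: every project activity emits one structured record that can be synced and shared whenever connectivity allows. The file name and fields below are illustrative assumptions, not any particular project's schema.

```python
# A minimal sketch (not from the post) of logging-as-default-monitoring:
# each project activity appends one structured JSON record to a local file
# that can be synced upstream whenever connectivity allows.
import json
import time
from pathlib import Path

LOG_FILE = Path("monitoring_events.jsonl")  # hypothetical local log store

def log_event(activity: str, **details) -> None:
    """Append one monitoring record as a JSON line (easy to sync and parse)."""
    record = {"timestamp": time.time(), "activity": activity, **details}
    with LOG_FILE.open("a", encoding="utf-8") as fh:
        fh.write(json.dumps(record) + "\n")

# Example: a health-worker training app records each completed session.
log_event("training_session_completed", site="clinic_12", trainees=8)
```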

More complex monitoring situations may require additional work outside of logging: qualitative interviews, metrics that don't pass through the technology components of the system, and so on. But I would argue that the body of data that is or could be tracked would, on its own, provide powerful proxy indicators of usage, impact, trends, and anomalies. Projects like InSTEDD, the UN Global Pulse, and even Google Flu Trends find ways to take raw data and compile it into actionable knowledge.
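
As a rough illustration of the proxy-indicator idea, assuming event records like those in the sketch above, even a crude pass over the raw log yields a usage trend and a naive anomaly flag:

```python
# A hedged sketch of turning the raw event log above into proxy indicators:
# weekly activity counts, plus a naive flag for weeks that fall well below
# the running average (a possible anomaly worth a closer look).
import json
from collections import Counter
from datetime import datetime, timezone

def weekly_counts(log_path: str) -> Counter:
    """Bucket logged events by ISO year-week."""
    counts: Counter = Counter()
    with open(log_path, encoding="utf-8") as fh:
        for line in fh:
            record = json.loads(line)
            ts = datetime.fromtimestamp(record["timestamp"], tz=timezone.utc)
            year, week, _ = ts.isocalendar()
            counts[f"{year}-W{week:02d}"] += 1
    return counts

def flag_low_weeks(counts: Counter, threshold: float = 0.5) -> list:
    """Flag weeks whose count drops below `threshold` x the mean of prior weeks."""
    flagged, prior = [], []
    for week in sorted(counts):
        if prior and counts[week] < threshold * (sum(prior) / len(prior)):
            flagged.append(week)
        prior.append(counts[week])
    return flagged

print(flag_low_weeks(weekly_counts("monitoring_events.jsonl")))
```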

Evaluation then becomes two different things. Part of evaluation is a constant, ongoing process, not something tacked on at the end: constant attention to the real-time monitoring data allows ongoing adjustments to test methods and improve the project, and that attention is itself incentivized by the ongoing monitoring being more visible.

The holistic evaluation of the project is then no longer a last-minute task to frame the project in the best light. Rather, it is a synthesis of the trends, adjustments, and real-time evaluations that have already taken place. It becomes a document discussing what was learned from the project, one that can celebrate failures and successes together, and it is freed from being an endless set of tables to instead highlight qualitative impact stories. Evaluation reports might actually be read.

All of this, of course, should be as open as is responsibly possible. Obviously the monitoring data may need extensive cleansing for privacy, but imagine if development, as a sector, could learn from itself in a rapid, evolutionary process instead of in slow, arduous cycles of every organization learning on its own what works in the current trendy topics.
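
What that cleansing might involve, in the simplest case, is sketched below. The field names and the split between "drop" and "pseudonymize" are my assumptions; a real project would need a proper de-identification review before publishing anything.

```python
# A minimal sketch, assuming the JSON-line records above, of cleansing
# monitoring data before opening it: drop direct identifiers outright and
# replace linkable fields with salted hashes.
import hashlib
import json

DROP_FIELDS = {"name", "phone", "gps"}   # assumed direct identifiers
PSEUDONYMIZE_FIELDS = {"site"}           # keep linkable, not readable
SALT = b"project-specific-secret"        # hypothetical per-project salt

def cleanse(record: dict) -> dict:
    out = {}
    for key, value in record.items():
        if key in DROP_FIELDS:
            continue  # remove outright
        elif key in PSEUDONYMIZE_FIELDS:
            digest = hashlib.sha256(SALT + str(value).encode()).hexdigest()
            out[key] = digest[:12]  # short, stable pseudonym
        else:
            out[key] = value
    return out

with open("monitoring_events.jsonl", encoding="utf-8") as src, \
     open("monitoring_events_open.jsonl", "w", encoding="utf-8") as dst:
    for line in src:
        dst.write(json.dumps(cleanse(json.loads(line))) + "\n")
```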

So, how do we start breaking this apart?

This post was originally published as "Monitoring and Evaluation is broken. Let's really break it."

One Response

  1. Jindra says:

    Absolutely excellent!
    Completely agree. The only addition I have is that the whole field of M&E is often seen as punitive, as a judgement on the program, rather than as a helpful approach where data, once analyzed, leads to actionable results towards program excellence. The more we also look at what's successful and how to do more of it, the better!
    Thanks, Jindra