In a blogger-focused forum yesterday, Rodney Bent, acting CEO of the Millennium Challenge Corporation (MCC), spoke to the importance of “risk-taking” in aid and development. Risk-taking isn’t new in development (“Let’s build dams! With loans!”), but it’s refreshing to see the risks being borne by the people making the decisions. The MCC wants transparency, and lots of it.
New transparency tools…
Bent and his staff proudly unveiled MCC’s new online Monitoring and Evaluation (M&E) analysis tool to track the progress of MCC projects in countries around the world. Through this new public portal, Excel spreadsheets showing progress against standardized MCC benchmarks will be available for download. While the information is still being compiled and uploaded, the goal is for the site to allow cross-country comparisons along sector lines: hypothetically, how are agriculture benchmarks being met in Cape Verde compared to Honduras? Bent further emphasized results-based monitoring in a blog post based on yesterday’s discussion.
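To make the Cape Verde–versus–Honduras idea concrete, here is a minimal sketch of that kind of cross-country, sector-level comparison. The file layout, column names, and numbers below are invented sample data for illustration, not MCC’s actual schema or results:

```python
# Hypothetical sketch: comparing agriculture benchmarks across two
# compact countries from downloaded spreadsheets (saved as CSV).
# All file names, columns, and figures here are made up.
import csv

SAMPLE = {
    "cape_verde.csv": [
        ("agriculture", "irrigated hectares", "62"),
        ("agriculture", "farmers trained", "80"),
        ("transport", "road km rehabilitated", "45"),
    ],
    "honduras.csv": [
        ("agriculture", "irrigated hectares", "51"),
        ("agriculture", "farmers trained", "73"),
    ],
}

# Write the sample files so the sketch is self-contained.
for path, rows in SAMPLE.items():
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["sector", "benchmark", "percent_complete"])
        writer.writerows(rows)

def load_benchmarks(path, sector):
    """Return {benchmark: percent_complete} for one sector of one country."""
    with open(path, newline="") as f:
        return {
            row["benchmark"]: float(row["percent_complete"])
            for row in csv.DictReader(f)
            if row["sector"] == sector
        }

cape_verde = load_benchmarks("cape_verde.csv", "agriculture")
honduras = load_benchmarks("honduras.csv", "agriculture")

# Only compare benchmarks that both countries actually report.
for name in sorted(cape_verde.keys() & honduras.keys()):
    print(f"{name}: Cape Verde {cape_verde[name]:.0f}% vs Honduras {honduras[name]:.0f}%")
```

Even this toy version surfaces the core problem: the comparison is only meaningful where both countries report the same benchmark, measured the same way.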
Props to MCC for the emphasis this places on transparency and open monitoring of project success.
But… in wanting to provide a tool for cross-country comparisons, MCC is opening itself up to the same criticism around indicator standards that it already receives for the data it uses in its country eligibility and selection process. Global Integrity has kept an eye on the MCC approach of mashing up outside indicators to create its set of front-end country selection criteria, as part of our broader skepticism toward metrics in our field.
There’s a widespread acceptance of fast and loose use of highly abstracted data to describe big, complex systems in simple terms. Too often, the source data and the distilled “findings” are worlds apart. We hope that’s not what’s happening here.
But we’ve got spreadsheets!
The excitement around the table was palpable: the MCC team is clearly thrilled to be engaging with new technologies and forms of M&E.
There is still much more development to come before this tool is truly cutting-edge. The group suggested XML, mapping, and RSS capabilities as essential components to prepare the platform for the widespread public use that the MCC is hoping for.
But apart from the technology concerns, there are some more fundamental issues to look at: this tool is only as good as the data, which may not be appropriate for comparing country-specific and locally detailed projects. For this to work, MCC needs well-defined, reliable, comparable data underpinning all the nifty aggregation tools at its disposal. We talk about these problems all the time; MCC isn’t the only one wrestling with them.
- Clearly define the unit of analysis; know (and describe) exactly what you’re measuring.
- Don’t aggregate. Keep data split out into discrete, tightly defined chunks. If those individual units of analysis don’t match, don’t try to compare them.
- Make efforts to capture short narrative explanations. It’s the difference between being aware of a failure, and knowing what to do differently next time.
- Be transparent with the methodology, not just the results.
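The first three points above can be sketched in data terms: every data point carries its unit of analysis, its methodology, and a short narrative, and comparison is refused outright when the units don’t match. The record structure here is our invention, not MCC’s:

```python
# Sketch of the checklist above (our own illustrative structure):
# each indicator records exactly what was measured, how, and why it
# moved, and compare() rejects mismatched units of analysis.
from dataclasses import dataclass

@dataclass
class Indicator:
    name: str
    value: float
    unit_of_analysis: str   # exactly what is being measured
    methodology: str        # how it was measured
    narrative: str          # short explanation of the result

def compare(a: Indicator, b: Indicator) -> float:
    """Return a.value - b.value, but only for like-for-like units."""
    if a.unit_of_analysis != b.unit_of_analysis:
        raise ValueError(
            f"Not comparable: {a.unit_of_analysis!r} vs {b.unit_of_analysis!r}"
        )
    return a.value - b.value

hectares = Indicator("irrigation", 6200.0, "hectares under drip irrigation",
                     "quarterly field survey", "delayed by late equipment delivery")
households = Indicator("irrigation", 900.0, "households reached",
                       "implementer self-report", "ahead of schedule")

try:
    compare(hectares, households)
except ValueError as err:
    print(err)  # the two "irrigation" numbers measure different things
```

The point of the narrative and methodology fields is the third and fourth bullets: a number alone tells you that something slipped, not what to do differently next time, or how the figure was produced.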
Some of this may be planned or happening already. If so, good job, MCC.
As a transparency concept, this new online portal could be a model for aid agencies around the world. It’s ambitious, yet very appropriate. Maybe even overdue.
— Norah Mallaney & Global Integrity