How to make monitoring, evaluation, and learning work for complex change

Michael Moses, Director of Learning and Programs | March 28, 2018

***This post was originally published by New Philanthropy Capital (NPC) on March 26, 2018***

To tackle complex, systemic problems you need a plan: enter theory of change (ToC). But like any plan, a theory of change must be reviewed and adapted as new information comes to light, the situation changes, or assumptions are questioned.

The challenges that the development sector tries to address are complex and systemic. And governance challenges – like the ones we tackle at Global Integrity – are particularly confounding. The challenges faced by the communities we work with are unique, so generalised solutions developed outside those communities, in other contexts, will be of limited use to those trying to solve local problems. Instead, solutions that effectively address the issues which local people care about tend to emerge over time, in particular places, led by local stakeholders. Success is contingent on these people engaging with, learning about, and shaping the dynamics of the complex, political systems in their specific contexts.

Developing a theory of change is an essential first step. A ToC helps make assumptions explicit, lays out an evidence-based hypothesis about how change is expected to occur, and provides a frame for reflection and course corrections throughout a project or program. This latter point is key. As NPC’s recent report on systemic use of theories of change states: ‘The word “theory” in its name is no coincidence. Theories are tested and updated as new knowledge emerges.’

Effective use of a ToC requires collecting, and reflecting on, the data needed for learning and adaptation. The specific design of any approach to monitoring, evaluation and learning (MEL) will vary according to the theory of change it serves, and the context in which it’s used.
But three general principles are worth keeping in mind when pursuing change in complex systems.

Emphasize (local) action. Monitoring data and evidence must, above all, be useful to local stakeholders – whether it helps them interrogate their assumptions, track progress, or consider how to adapt in response to emergent information. Understanding opportunities for impact is the key; reporting to donors is a secondary priority. For example, in Global Integrity’s recent Learning to Make All Voices Count project, we worked with civil society organizations in five countries to explore whether the progress anticipated in their theories of change was unfolding as expected. When it wasn’t, we helped our partners use evidence to adapt their strategies. In one case, our partners in Tanzania had initially thought that, if they provided community members with information about a national open government policy, citizens would respond by trying to hold local officials to account for honouring those policy commitments. Initial monitoring data, perhaps unsurprisingly, did not bear out this assumption. Instead, local power dynamics – rather than access to information about government policy – turned out to be the issue. So our partners adapted their theory of change, and set to work helping young people and women mobilise into “people’s committees” to reshape those dynamics. See this case story for more.

Support participation. A good monitoring, evaluation and learning framework is not developed solely by MEL staff, or by people sitting in offices that are geographically or culturally far away. Rather, it is co-created and used in partnership with local stakeholders and beneficiaries, with their perspectives, priorities, and interests baked into its design and application. For example, our colleagues in the Philippines co-created their MEL framework with regional universities and regional civil society organisations.
In doing so, they learned that local partners were most interested in improving their capacity to understand and use district-level budget data. So determining whether and how the project was helping with this became a priority. As a result, our partners realized that some of the training tools they initially used weren’t having the intended effects, and made course corrections to the assumptions and activities underpinning that aspect of their theory of change. More details are available in this case study.

Embrace iteration. This means gathering data regularly, in as close to real time as is feasible, rather than only at the beginning, middle, and end of a project. Regular data collection becomes the fuel for regular reflection and adaptation. In Kenya, our partners recognized that their work in two counties was subject to a number of potential risks, from drought to ethnic conflict. So they made sure to regularly revisit and analyze those risks. And when conflict did break out, our partners were ready: they quickly identified the danger, and worked with local stakeholders to temporarily replace planned in-person community meetings with virtual discussions over WhatsApp. This change not only kept project participants safe until the violence died down, but also helped keep the project on track amid challenging circumstances. More information is available here.

Ingraining these principles of action, participation, and iteration into your monitoring, evaluation and learning isn’t easy. It requires time, resources, and buy-in from donors, implementers, and partners. But it can pay off! It helps organisations unlock the usefulness of theories of change and, over time, learn and adapt their way to successfully addressing the complex, systemic challenges that matter to citizens in countries and communities across the world.