By Matthew Wilson, A360 Project Director, PSI
This piece originally ran on the HCDExchange.
We’re living in an era of complexity, change, and big and bold ideas. There is a strategic imperative to adopt human-centered and adaptive approaches to the way in which we design and implement solutions to some of the hard problems we confront. But as we heard during the recent HCDExchange webinar ‘Measuring a moving target’, such approaches can create challenges for evaluating programmes. The designs are not known at the outset, and they continuously evolve over the life of the project, in response to emerging insights, challenges and opportunities.
During the webinar I posed the question: what if development interventions were also evaluated on their ability to learn and adapt? It’s not the first time this has been raised and, as some participants highlighted, metrics do exist. But how many project results frameworks have metrics measuring a project’s ability to learn and adapt? I will hazard a guess: very few.
Evaluating development interventions on their ability to learn and adapt seems particularly pertinent as the development sector shifts towards a new paradigm, one which places greater emphasis on systems thinking, capacity strengthening, shifting the power and locally led development.
But if we were to evaluate a project’s learning agility and adaptive capacity, implementers would need to commit to it in a concerted and systematic way. If it were a self-evaluation, I would give A360 – the project I lead – a 6/10.
- I believe we’ve done a good job of leveraging our human-centered design processes to build a culture of curiosity, and an acceptance that diagnosing problems and developing solutions takes a collective effort that meaningfully engages young people, providers, community members, officials and technical experts from different disciplines.
- We’ve built a strong sense of psychological safety, but know it has its limitations; wherever power dynamics are most pronounced – notably between the donors and the implementers, and between the prime and subs – fostering this psychological safety is more challenging.
- We’ve moved away from the ‘plan and execute’ way of working and embraced a more iterative and adaptive approach. It’s an approach that turns the typical ‘command and control’ hierarchy on its head and empowers those on the front line to adapt within agreed parameters. But we have struggled to simplify the complexity of the adaptive implementation framework and to routinely document adaptations – what emerging insight or contextual change prompted each one, what it was, and what difference it made.
- We were fortunate to have a process evaluation during the first phase which helped elicit a lot of learning, but none of our results framework indicators relate to learning agility and adaptive capacity. And we often struggle to slow down enough to have thorough pause and reflect moments. Speed is most definitely the enemy of quality, but time is a luxury we are rarely if ever afforded.
- We have elevated the profile of learning in our reporting and been courageous enough to highlight missteps and failures. But reporting is still predominantly backward looking and focused on upwards accountability.
- Conducting research and learning and generating evidence and insights is one of the project’s strategic pillars, and we are part funded by a donor’s implementation research budget. However, balancing donor agendas – one focused on research and learning, the other on service delivery results – can create a tension that is hard to resolve.
I wonder too if such a shift would essentially mean focusing on the process more than results. This is a popular mantra in many other fields of work, from professional sport to business. Not any process, but the right process. And perhaps unsurprisingly, there’s a lot of evidence to suggest the right process is one that focuses on collaboration, learning, adaptation, and continuous improvement.
The results may follow. Or they may confound us. But either way, we will have generated learning, strengthened capacity, and built resilience.
Beguiling Events – Kenneth Boulding, Economist
A system is a black box
Of which we can’t unlock the locks,
And all we can find out about
Is what goes in and what comes out.
Perceiving input-output pairs,
Related by parameters,
Permits us, sometimes, to relate
An input, output and a state.
If this relation’s good and stable
Then to predict we may be able,
But if this fails us – heaven forbid!
We’ll be compelled to force the lid!