I have been involved in evaluation since the early 2000s, and it is a discipline that has always understood the complexity of the world in which programs are designed and implemented. Even so, evaluation (and here I am talking about the discipline of evaluation) has not always been able to respond to the complexity of so-called “wicked problems” as well as it might. To be fair, that was not always a shortcoming of ‘evaluation’ itself; it was often the result of constraints placed on an evaluation by the client, time, resources, or all of the above. But over the past decade or so I have seen a real shift: new approaches are being developed to come to grips with the complexity of wicked problems, and evaluative thinking is turning towards ‘complexity awareness’.

Recently, some of the Clear Horizon team took time out to attend Mark Cabaj’s Complex Community Change Masterclass. Mark is a Canadian thought-leader in complexity, social innovation, and strategic learning and evaluation. I wanted to hear what our team thought of the Masterclass, so I sat down with Jess, Tom and Carina to find out.

Vicki: What is it about his approach that excites you and what do you think it offers you in your role?

Jess: Mark works with change makers and is involved in creating change at the system level. When he talks about evaluation he weaves in elements of working at a systems level and on complex, long-term change – which are exactly the challenges I face in my own evaluation work. I find his approach really refreshing. It’s the combination of systems/collective impact and evaluation that makes it so exciting.

Carina: What I found most exciting was that people actually work in this complexity space, where things are dynamic and unpredictable and there are many different players and perspectives. Tackling homelessness is an example of a wicked problem that will take new ways of thinking and more collaborative ways of working together. I could see a really important role for evaluative thinking, which needs to be embedded in that work.

Tom: What excites me is that I now see a line of sight between what is happening in evaluation practice and the theory for solving complex problems. When I was at university I was taught that single disciplines like engineering were not enough on their own to solve complex challenges like adapting to climate change. It’s great to know that as evaluators we can, and do, bring our rigour and methods to support solving complex problems!

Vicki: What do you think it means for the practice of evaluation?

Carina: There is a huge place for the practice of evaluation in influencing complex change, but not in the traditional sense. For evaluation to be embraced and useful in complexity-aware solutions, it needs to focus on producing evidence that generates learning. It needs to be flexible and allow for the development of new measures ‘on the run’ to keep up with emerging and evolving goals. Accountability is still important, but more in the sense of being accountable to societal values and less in the sense of accountability to funders. Maybe more importantly, evaluation needs to support a hunger for learning and not be constrained by the fear of failure.

Tom: Evaluators need to adapt to supporting 21st-century problem solving if we want to remain relevant and contribute to better futures. As Carina says, this means embracing uncertainty and being ready to capture and respond to emergent outcomes in a way that supports continuous learning and improvement. To do this we might need to focus more on developing monitoring systems that give innovators real-time information when they need it, and in a format that is useful to them, with less emphasis on ‘door stop’ evaluations.

Carina: ‘Door stop’ in the sense that so often evaluation reports are not utilised, just used as door stops? Like I said, I think the fundamental change is that funders need to focus less on accounting for their own attribution and recognise that the only way these large, seemingly intractable problems can be solved is collaboratively – and that accountability is to the community as a whole.

Tom: We’d do well to shift from mapping and influencing individual actors and behaviours to mapping and influencing system actors and behaviours. In our work, the People Centred Logic approach we use helps us to be ‘complexity aware’. The challenge is to scale this approach from largely project- and program-level evaluation up to system-wide interventions.

Jess: Mark’s suggestions for evaluating systems change were Most Significant Change (MSC), outcomes harvesting and contribution analysis, which we already use. We don’t always use them in the context he describes – he is working on bigger change processes than we often work with – but his really practical ideas around evaluating social innovation, in particular evaluating prototypes, offer us some useful guidance.

Jess, Carina and Tom were really excited about what they learnt from Mark at the Masterclass and are already putting some of this new knowledge into practice. I am also seeing more appreciation from our clients for novel approaches to evaluating complexity. The practice of evaluation and evaluative thinking is responding to the challenge. Personally, I’m looking forward to the new horizons that are opening up in this complex and wicked world.