Liz Bloom, Consultant (Clear Horizon)

As a new staff member at Clear Horizon, it was very helpful to attend Clear Horizon’s Integrated Design, Evaluation and Engagement with Purpose (InDEEP) training recently here in Melbourne. The training was facilitated by the ‘evaluators’ here at Clear Horizon as well as the ‘designers’ from The Australian Centre for Social Innovation (TACSI). Having both perspectives in the room led to some great learnings around the integration of evaluation and Human Centred Design, and the ways in which evaluation can support the design process rather than slowing it down or hindering it. This is still a new and emerging space, so the conversations held over the three days between Clear Horizon and TACSI led to many learnings for the facilitators as well as the attendees.

After the workshop I had a chat with some of my colleagues and our friends at TACSI to hear some of their learnings and thoughts:



Eunice Sotelo, Research Assistant (Clear Horizon)

“The InDEEP training was full-on! I learned about the evolving role and 'agnostic' standpoint of the evaluator (coming in with a diverse toolkit versus preferred methodologies), and what service design even looks like, let alone ‘collaboration’. What stood out the most was the symbiosis between evaluation and design. As Jess says, ‘evaluation de-risks design, and design de-risks innovation.’ Both disciplines interweave evaluative and creative thinking in their work and, I think more importantly, evaluation protects the design space so creativity can thrive.

My reflections at the moment revolve around our ability to empathise with the client and end users’ worldviews, and balancing potentially conflicting perspectives. As an evaluator, having an agile mindset is the first step. For designers, agility is their bread and butter. Yet both disciplines continue to grapple with complexity; there really is no shortcut, not if we truly want systems change. Michelle, in her attempt to describe what it feels like to be in the design space, shares a personal mantra that I think will also be mine as I journey into developmental evaluation: trust in the process”.


Ione Ardaizosacar, Senior Social Innovator (TACSI)

“It was interesting to reflect on the common language shared by evaluation and design. We covered this especially around assumptions. Designers frame assumptions around three design lenses: desirability, feasibility and viability. Evaluators, on the other hand, frame assumptions around criteria (a term that corresponds to the design lenses but covers a wider range of areas), with a particular focus on effectiveness.

I also found it very valuable to reflect on how the Theory of Change can be a tool to bring evaluation and design together. It is a way to map the service design ideas and assumptions, a way to keep track of the testing, and an approach that supports defining the evaluation approach”.



Jill Campbell, Senior Consultant (Clear Horizon)

“Over the past few months I have been working on two projects providing M&E: one with a substantial design component, and the other producing the M&E Framework and providing support for a project focused on strengthening a system. So I was able to contextualise each project within the InDEEP framework and identify useful tools to use right now and as the projects progress. There is nothing like being able to apply what you’ve learnt straightaway!

One of the main reflections I had was that, as evaluators, we need to be very clear about where a project is (in the design, development or implementation stage) and draw on appropriate methods and tools for each stage. This means we also need to educate our clients, who have become familiar with M&E for program implementation but not for projects still at the design and development stages”.



Edgar Daly, Research Assistant (Clear Horizon)

“It was interesting to think about what the disciplines of evaluation and design can offer each other. A lot of the work I’ve been involved in at Clear Horizon has focused on helping organisations clarify their theory of change at the program or organisational level. This could be for evaluation, capacity building or other strategic purposes. The ‘starter canvas’ TACSI shared with us seems a pretty handy framework to use when a program’s theory of change is unclear. It might help us to purposefully check back in on scope as the understanding of the problem area and potential solution evolves.

I also really benefitted from hearing the TACSI designers unpack the design process. It helps us evaluators think about how to approach evaluation, and the InDEEP framework has been a useful way to think about this relationship”.


Michelle Miller, Senior Social Innovator (TACSI)

“It was great to bring the two disciplines together and see how evaluation can really support the design process throughout the entire journey — even in the famously ‘fuzzy front end’. As Jess says, ‘design de-risks innovation’, and ‘evaluation de-risks design’. Very handy for anyone sponsoring design work. This applies to any form: social innovation, human-centred design, co-design, design thinking”.


Jess Dart, Founding Director (Clear Horizon)

“I got excited about the different ways evaluators and designers deal with assumptions. We (evaluators) showed how we map out the theory of change and then look in particular for any assumptions that are not named. I often call these ‘dependency’ assumptions – they are ‘killer’ assumptions in that, if we get them wrong, our outcomes won’t be achieved. They are things that we are banking on, but not actually doing anything about. Of course all the boxes and all the arrows are also assumptions, but we like to name a small set of additional assumptions – so it’s a very specific use of the term. Our designers, on the other hand, use the term assumption in a much broader, all-encompassing kind of way. They are testing assumptions at every step of the design process: assumptions about what we think is the core problem, what people think, what might work, and what the opportunity is. In fact, the whole thing is one big fat quest to bust assumptions!”


Thomas Hannon, Senior Consultant (Clear Horizon)

“One great insight I got from the training was that social innovators continually iterate on the scope of their designs, negotiating and prioritising the boundaries as they go. I found this really powerful because it showed how decisions about what is ‘in’ and what is ‘out’ can change as more is learnt about the challenges and possible solutions. For example, in a project to raise school engagement by working with teachers, our scope might expand if we learn that what goes on outside of school also matters for raising student engagement. In this case it would lead to a conversation with implementing partners about where best to focus resourcing effort and how flexible they really are. Having an iterative scope makes it easier to prioritise innovation where it’s needed. Bringing it back to evaluation, I saw a strong link between an iterative scope and an iterative Theory of Change that could also be adjusted to communicate and prioritise the impact of designs”.