Evaluating complex social change programs

posted on 18 May 2017


I have been involved in evaluation since the early 2000s, and it is a discipline that has always understood the complexity of the world in which programs are designed an...

Piloting a ‘light touch’ Episode Study

posted on 16 November 2016


Recently we have noticed a groundswell of interest in how to evaluate policy influence, a notoriously slippery thing to catch hold of. This is in part due to the multitude of forces that drive po...

Evaluator interrupted

posted on 25 October 2016


Last week I presented a keynote at the Australian Association for Environmental Education annual conference. The conference theme was “future making”. As a mega science fiction nerd, this got me ex...

‘Hold your guns, this is a Summit Workshop’

posted on 07 October 2016


In this blog, one of our Senior Consultants, Stuart Raetz, shares his experience of Summit Workshops. A ‘summit workshop’ is a workshop that Clear Horizon often runs at the end of a collaborative ou...

Highlights from day three of the AES 2016 in Perth, by Dr Jess Dart

posted on 22 September 2016


Evidence-based mantra. The day kicked off with Kathryn Newcomer, President-Elect of the American Evaluation Association (AEA), who described the “evidence-based mantra” that has American decision-ma...

Key themes from day two of AES 16: Design and evaluation, by Dr Jess Dart

posted on 21 September 2016


A gap in policy evaluation. On day two of the conference, John Owen, our keynote speaker, gave us a comprehensive overview of the territory of evaluation thinking and practice in Australasia....

The most significant change (MSC) technique is a form of participatory monitoring and evaluation. It is participatory because many project stakeholders are involved both in deciding the sorts of change to be recorded and in analysing the data. It is a form of monitoring because it occurs throughout the program cycle and provides information to help people manage the program. It contributes to evaluation because it provides data on impact and outcomes that can be used to help assess the performance of the program as a whole.

MSC Publications

MSC User Guide

posted on 11 July 2016


In 2015 Rick Davies and Jess Dart published the User Guide for the Most Significant Change Technique in English. It is aimed at organisations, community groups, students and academics wh...

A Dialogical Story-Based Evaluation Tool

posted on 11 July 2016


Dart, J.J. and Davies, R.J. (2003) A dialogical story-based evaluation tool: the most significant change technique, American Journal of Evaluation, 24, 137-155. This article provides an introduction t...

A Self-Help Guide for Implementing MSC

posted on 11 July 2016


Dart, J.J. (2003) A Self-Help Guide for Implementing the Most Significant Change Technique (MSC). The aim of this guide is to help groups design an MSC system for their program or project. The gu...

MSC Resources

Target 10 Evaluation Stories

Dart, J.J. (2000) Target 10 Evaluation Stories, Department of Natural Resources and Environment, Victorian State Government, Melbourne. Between May 1998 and May 1999, the Target 10 dairy extension pr...

Lotterywest MSC

Lotterywest has been a vital part of Western Australia’s community life since 1933, providing direct support to thousands of community organisations through grants, and to hospitals, cultural and s...

A Story Approach for Monitoring

Dart, J.J. (1999) A Story Approach for monitoring change in an agricultural extension project, Proceedings of the Association for Qualitative Research (AQR) International Conference, Melbourne, A...

Stories for Change: A New Model

Dart, J.J. (2000) Stories for Change: A new model of evaluation for agricultural extension projects in Australia, PhD thesis, Institute of Land and Food Resources, University of Melbourne. A model of evalua...

MSC Video: The seven secrets of good monitoring and evaluation

Most Significant Change for ‘beginners’: its origin and development, the basic process, and the rationale. https://www.youtube.com/watch?v=7mKiESZnmxg

The Most Significant Change approach for monitoring an Australian extension project

This article describes the MSC approach and highlights some experiences gained during a 12-month trial with the Target 10 Dairy Extension Project. It is suggested that this approach constitutes an...

Collaborative Outcomes Reporting (COR) is a participatory approach to impact evaluation. It centres on a performance story that presents evidence of how a program has contributed to outcomes and impacts. Developed by Jess Dart of Clear Horizon, COR combines contribution analysis and Multiple Lines and Levels of Evidence (MLLE), mapping existing and additional data against the program logic to produce a performance story. This performance story is then reviewed by both technical experts and program stakeholders, which may include community members. The aim is to tell the ‘story’ of a program’s performance using multiple lines of evidence.

COR Publications

What is Collaborative Outcomes Reporting?

posted on 16 September 2016


Collaborative Outcomes Reporting (COR) is a participatory approach to impact evaluation. It centres on a performance story that presents evidence of how a program has contributed to outcomes and ...

Step by step guide to Collaborative Outcomes Reporting

posted on 11 July 2016


A guide to the use of Collaborative Outcomes Reporting, developed for the Better Evaluation website.

COR Resources

Stronger, Smarter Realities

In Australia there is a large disparity between educational outcomes for Indigenous children and non-Indigenous children, and in the last 8 years educational outcomes have been either stable ...

AG Performance Story Report

Australian Government, Clear Horizon, O’Connor NRM (2008) Mount Lofty Ranges Southern Emu-wren and Fleurieu Peninsula Swamps Recovery Program Performance Story – MLRSEW and FPS RP. The aim of the Mou...

Castlemaine 500 Project Outcomes

In 2006, the Central Victorian Greenhouse Alliance (CVGA) secured the Victorian Government’s support to fund a behaviour change program that would test – by engaging a significant proportion of a t...

"People-centred logic" refers to a specific technique for developing program logic where the focus is on the key people that the project aims to influence. A limitation to traditional logic models has been their tendency to focus predominantly on activities and effects without considering who is affected and where the activity is taking place. One of the key reasons Clear Horizon favours a people-centred logic model approach is because social change is all about people. This approach helps distinguish between the different levels of impact experienced by different participant groups. The approach to program logic was first published in a paper written by Jess Dart named people-centred evaluation (2006).



PCL Publications

The national Lifetimewool project: A journey in evaluation

posted on 16 September 2016


This paper, co-authored by Dr Jess Dart, discusses the evaluation of the national Lifetimewool project. The objective of this project was to develop practical grazing management guidelines that wou...

People-Centred Evaluation

posted on 16 September 2016


This paper outlines the emerging ‘People-Centred Evaluation’ (PCE) approach that guides the development of practical internal monitoring, evaluation and learning (MEL) frameworks for projects and p...

Blending Most Significant Change (MSC) with Outcomes Harvesting, the new ‘Significant Policy Improvement’ (SPI) tool offers a structured way to capture policy influence, and a process for verifying and ranking the importance of each instance. It can even be used to develop measurable targets! SPI was developed by Jess Dart and piloted with the Australian aid program in Indonesia (DFAT). A paper is being written; meanwhile, please see the slides from the AES 16 conference.