Developing Theories of Action through behaviour change knowledge and practice

posted on 15 September 2017


It was a privilege to present in front of a full room at AES 2017! Who knew so many people were interested in behaviour change! Slides from the presentation are here. Behaviour change intervention...

Technology in M&E: Four questions to help you find your technology solution

posted on 13 September 2017


Ask M&E professionals about technology in M&E and you will get a mix of excitement and uncertainty. There is excitement about the possibilities that technology can offer, but uncertainty abou...

Interrogating the logic of evaluators’ influence

posted on 08 September 2017


The themes I noticed at #AES17 included quite a few disheartening ones: the powerlessness of evidence and knowledge (our fine evaluation products!) in the current socio-political environment; th...

AES conference 2017 Canberra Reflections

posted on 08 September 2017


Tired but encouraged after 5 days at the #AES17CBR conference. The keynote speakers were outstanding this year. Three main themes stood out for me. In different ways, all the keynote speakers cha...

Of sustainability and reconciliation - AES conference 2017 Canberra

posted on 07 September 2017


There were two stand-out presentations for me at the #aes17CBR, which concludes in Canberra today. On reflection I’ve realised a large part of why these resonated with me has to do with the value...

Understanding Failure to Manage Conflict

posted on 07 September 2017


“Should evaluators avoid conflict?” Anthea posed this incisive question while presenting her case study, only for Scott to retort that there is a “difference in managing conflict as opposed ...

The most significant change (MSC) technique is a form of participatory monitoring and evaluation. It is participatory because many project stakeholders are involved both in deciding the sorts of change to be recorded and in analysing the data. It is a form of monitoring because it occurs throughout the program cycle and provides information to help people manage the program. It contributes to evaluation because it provides data on impact and outcomes that can be used to help assess the performance of the program as a whole.

MSC Publications

MSC User Guide

posted on 11 July 2016


In 2015 Rick Davies and Jess Dart published the User Guide for the Most Significant Change Technique in English. It is aimed at organisations, community groups, students and academics wh...

A Dialogical Story-Based Evaluation Tool

posted on 11 July 2016


Dart, J.J. and Davies, R.J. (2003) A dialogical story-based evaluation tool: the most significant change technique, American Journal of Evaluation, 24, 137-155. This article provides an introduction t...

A Self-Help Guide for Implementing MSC

posted on 11 July 2016


Dart, J.J. (2003) A Self-Help Guide for Implementing the Most Significant Change Technique (MSC). The aim of this guide is to help groups design an MSC system for their program or project. The gu...

MSC Resources

Target 10 Evaluation Stories

Dart, J.J. (2000) Target 10 Evaluation Stories, Department of Natural Resources and Environment, Victorian State Government, Melbourne. Between May 1998 and May 1999, the Target 10 dairy extension pr...

Lotterywest MSC

Lotterywest has been a vital part of Western Australia’s community life since 1933, providing direct support to thousands of community organisations through grants, and to hospitals, cultural and s...

A Story Approach for Monitoring

Dart, J.J. (1999) “A Story Approach for monitoring change in an agricultural extension project”, proceedings of the Association for Qualitative Research (AQR) International Conference, Melbourne, A...

Stories for Change: A New Model

Dart, J.J. (2000) Stories for Change: A new model of evaluation for agricultural extension projects in Australia, PhD, Institute of Land and Food Resources, University of Melbourne. A model of evalua...

MSC Video: The seven secrets of good monitoring and evaluation

Most Significant Change for ‘beginners’: its origin and development, the basic process, and the rationale. https://www.youtube.com/watch?v=7mKiESZnmxg

The Most Significant Change approach for monitoring an Australian extension project

This article describes the MSC approach and highlights some experiences gained during a 12-month trial with the Target 10 Dairy Extension Project. It is suggested that this approach constitutes an...

Collaborative Outcomes Reporting (COR) is a participatory approach to impact evaluation. It centres on a performance story that presents evidence of how a program has contributed to outcomes and impacts. Developed by Jess Dart of Clear Horizon, COR combines contribution analysis and Multiple Lines and Levels of Evidence (MLLE), mapping existing and additional data against the program logic to produce a performance story. This performance story is then reviewed by both technical experts and program stakeholders, which may include community members. The aim is to tell the ‘story’ of a program’s performance using multiple lines of evidence.

COR Publications

What is Collaborative Outcomes Reporting?

posted on 16 September 2016


Collaborative Outcomes Reporting (COR) is a participatory approach to impact evaluation. It centres on a performance story that presents evidence of how a program has contributed to outcomes and ...

Step by step guide to Collaborative Outcomes Reporting

posted on 11 July 2016


Guide to the use of Collaborative Outcomes Reporting developed for the Better Evaluation website.

COR Resources

Stronger, Smarter Realities

In Australia there is a large disparity between educational outcomes for Indigenous and non-Indigenous children, and in the last 8 years educational outcomes have been either stable ...

AG Performance Story Report

Australian Government, Clear Horizon, O’Connor NRM (2008), Mount Lofty Ranges Southern Emu-wren and Fleurieu Peninsula Swamps Recovery Program Performance Story – MLRSEW and FPS RP. The aim of the Mou...

Castlemaine 500 Project Outcomes

In 2006, the Central Victorian Greenhouse Alliance (CVGA) secured the Victorian Government’s support to fund a behaviour change program that would test – by engaging a significant proportion of a t...

"People-centred logic" refers to a specific technique for developing program logic where the focus is on the key people that the project aims to influence. A limitation to traditional logic models has been their tendency to focus predominantly on activities and effects without considering who is affected and where the activity is taking place. One of the key reasons Clear Horizon favours a people-centred logic model approach is because social change is all about people. This approach helps distinguish between the different levels of impact experienced by different participant groups. The approach to program logic was first published in a paper written by Jess Dart named people-centred evaluation (2006).



PCL Publications

The national Lifetimewool project: A journey in evaluation

posted on 16 September 2016


This paper, co-authored by Dr Jess Dart, discusses the evaluation of the national Lifetimewool project. The objective of this project was to develop practical grazing management guidelines that wou...

People-Centred Evaluation

posted on 16 September 2016


This paper outlines the emerging ‘People-Centred Evaluation’ (PCE) approach that guides the development of practical internal monitoring, evaluation and learning (MEL) frameworks for projects and p...

Blending Most Significant Change (MSC) with Outcomes Harvesting, the new Significant Policy Improvement (SPI) tool offers a structured way to capture policy influence, along with a process for verifying and ranking the importance of each instance. It can even be used to develop measurable targets! SPI was developed by Jess Dart and piloted with the Australian aid program in Indonesia (DFAT). A paper is being written; in the meantime, please see the slides from the AES 2016 conference.

SPI Resources

Slide show of the SIPSI methodology

This slide show provides an overview of the SIPSI methodology as presented at the AES 2016 conference. It also includes a case study of how it was used in DFAT Indonesia. SIPSI technique.pdf