Recently we have noticed a groundswell of interest in how to evaluate policy influence, a notoriously slippery thing to pin down. This is partly due to the multitude of forces that drive policy change, but also because change usually unfolds over a long period and involves many different actors and interests. In addition, external ‘environmental’ factors such as unforeseen events often shift the agenda, creating opportunities and risks for policy makers. How do we untangle this morass? This is...
Last week I presented a keynote at the Australian Association for Environmental Education annual conference. The conference theme was “future making”. As a mega science fiction nerd, this got me excited and thinking about the future trends and innovations that are going to affect us as evaluators. In the short term I can see that other fields are going to disrupt the space of evaluators in positive and interesting ways. Human-centred Design. Human-centred Design is changing the way services are...
In this blog, one of our Senior Consultants, Stuart Raetz, shares his experience of summit workshops. A ‘summit workshop’ is a workshop that Clear Horizon often runs at the end of a collaborative outcomes reporting (COR) evaluation. The summit is usually the grand finale to an evaluation, which may have run for months or even years. Recently I’ve run a few summits that have coincided with the end of a program. When this happens, summit workshops are a great opportunity for people to look back o...
Evidence-based mantra. The day kicked off with Kathryn Newcomer, President-Elect of the American Evaluation Association (AEA), who described the “evidence-based mantra” that has American decision-makers in its grip. Kathryn gave us a fascinating glimpse into how evidence does, and doesn’t, influence government policy in America. She painted a picture of myriad actors demanding and collecting evidence on impact - in particular ‘DEBIS’ – “demonstrated evidence-based intervention solutions”....
A gap in policy evaluation. On day 2 of the conference, our keynote speaker John Owen gave us a comprehensive overview of the territory of evaluation thinking and practice in Australasia. He analysed articles published in the AES journal over the last few years and found some interesting patterns. What stood out for me was how little had been written on policy evaluation – most articles dealt with program evaluation and setting up evaluation systems. Fascinating given the impo...
Great first day at the AES conference in Perth, and I am proud that Clear Horizon is sponsoring such a great event. A strong start with a focus on Culture. Our very own Director, Carina Calzoni, opened the conference and handed over to Shaun Nannup for the Welcome to Country. Shaun described the landscapes of WA vividly and invited us to centre ourselves, then link arms and focus on the 10% of all the inputs from our senses that we actually take in. Following Shaun was a great keynote ...
Design and Evaluation Support Services (DESS) for the DFAT Indonesian aid program – 2014-2016. In 2014-15, the Australian Government Department of Foreign Affairs and Trade (DFAT) invested $627.7 million in the Indonesian aid program. Each year DFAT produces a public document – the Indonesia Aid Program Performance Report. All of the investments need to be quality assured to the highest standards of design, monitoring and evaluation. However, designing and evaluating complex aid prog...
This paper, co-authored by Dr Jess Dart, discusses the evaluation of the national Lifetimewool project. The objective of this project was to develop practical grazing management guidelines that would enable wool growers throughout Australia to increase lifetime production of wool per hectare from ewes. The paper focuses specifically on the evaluation work that was conducted on the project between 2003 and 2008. The Lifetimewool project used ‘people-centred evaluation’ to help guide the creati...
This paper outlines the emerging ‘People-Centred Evaluation’ (PCE) approach that guides the development of practical internal monitoring, evaluation and learning (MEL) frameworks for projects and programs. This approach to developing MEL frameworks is conducted in a participatory manner, and centres on creating a shared understanding of who the program can realistically influence, and what outcomes – or practice changes – are expected from these people. PCE is both a peopl...
Collaborative Outcomes Reporting (COR) is a participatory approach to impact evaluation . It centres on a performance story that presents evidence of how a program has contributed to outcomes and impacts. This performance story is then reviewed by both technical experts and program stakeholders, which may include community members. Developed by Jess Dart of Clear Horizon, COR combines contribution analysis and Multiple Lines and Levels of Evidence (MLLE), mapping existing and additional data...