Since 2017, Clear Horizon has been providing Monitoring and Evaluation (M&E) support to the Partnership for Knowledge-based Poverty Reduction (PKPR) and the Local Solutions to Poverty (LSP) facility, implemented by the World Bank in Indonesia. Both programs are funded by a Multi-Donor Trust Fund (MDTF). PKPR forms one part of the World Bank’s Indonesia program, which works with the Government of Indonesia to strengthen evidence-based policy and...
Clear Horizon and GHD Pty Ltd (GHD) are partners in Monitoring and Evaluation (M&E) House (Buka Hatene) in Timor-Leste. M&E House is a four-year investment funded by Australia’s development program. It provides technical and advisory M&E services to the Australian Embassy in Timor-Leste and its implementing partners, helping them collect and use credible information to improve Australia’s development program. The overall goal of M&E House is to ensure that Australia’s aid program is high performing...
Between 2016 and 2017, Clear Horizon was engaged to facilitate and support a partner-led evaluation of KNOWFOR, a forestry research and knowledge program funded by the UK Government. KNOWFOR was a partnership between the Department for International Development (DFID) and three implementing partners: the Center for International Forestry Research (CIFOR), the International Union for Conservation of Nature (IUCN) and the World Bank’s Program on Forests (PROFOR). A partner-led evaluation...
Recently, a few members of the Social Innovation Team at Clear Horizon were lucky enough to attend the Complexity and Evaluation conference, led by Collaboration for Impact. It was a busy couple of days and a fantastic chance to learn more about how to work in social innovation spaces in ways that don’t stifle big, messy and exciting systems change initiatives. Critical systems: Kate McKegg’s presentation on critical systems challenged us to critically reflect on our role within a...
As part of our internal staff capacity building at Clear Horizon, we organise fortnightly learning and development sessions. Last week we discussed adult learning principles and styles, and how these guide the way we facilitate the training activities and workshops we deliver. In the 1970s, Malcolm Knowles popularised the term “andragogy”, which refers to the methods and principles used in adult education. Later, in 1984, he identified six adult learning principles, including: The need to know: ...
I recently attended a five-day course on Qualitative Comparative Analysis (QCA) run by the Australian Consortium for Social and Political Research at the Australian National University. Apart from wanting to be a university student again, if only for a week, I wanted to better understand QCA and its use as an evaluation method. QCA is a case-based method that attempts to bridge qualitative and quantitative analysis by capturing the richness and complexity of individual cases while at the same...
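For readers who have not come across QCA before, here is a minimal sketch of its first analytical step: arranging cases into a truth table of condition configurations and checking how consistently each configuration is associated with the outcome of interest. The case names, conditions and scores below are invented purely for illustration; they are not drawn from the course or from any Clear Horizon project.

```python
# Toy illustration of the crisp-set QCA truth-table step:
# group hypothetical cases by their configuration of binary conditions,
# then report how consistently each configuration shows the outcome.
from collections import defaultdict

# Each case records the presence (1) or absence (0) of two conditions
# and whether the outcome of interest was observed. All values are invented.
cases = [
    {"name": "Case A", "strong_partnership": 1, "local_funding": 1, "outcome": 1},
    {"name": "Case B", "strong_partnership": 1, "local_funding": 0, "outcome": 1},
    {"name": "Case C", "strong_partnership": 0, "local_funding": 1, "outcome": 0},
    {"name": "Case D", "strong_partnership": 1, "local_funding": 0, "outcome": 0},
]

# Group cases by their configuration of conditions (one row of the truth table).
rows = defaultdict(list)
for case in cases:
    config = (case["strong_partnership"], case["local_funding"])
    rows[config].append(case)

# For each configuration, report the share of member cases with the outcome.
for config, members in sorted(rows.items()):
    consistency = sum(c["outcome"] for c in members) / len(members)
    names = ", ".join(c["name"] for c in members)
    print(f"partnership={config[0]} funding={config[1]} "
          f"-> consistency {consistency:.2f} ({names})")
```

In a real analysis this table would then be minimised to identify the combinations of conditions most consistently linked to the outcome; the sketch only shows how individual cases are formalised into comparable configurations.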
Designing Rubrics for Evaluation
Last week, I attended an Australasian Evaluation Society (AES) workshop on “Foundations of Rubric Design”. It was a thought-provoking workshop. Krystin Martens, the presenter, explained and challenged our understanding of rubric design and shared practical tips for developing and using rubrics well in evaluation. Here are my key takeaway points from the workshop: Why do we use rubrics in evaluation? A rubric is a tool or matrix...
Our lead facilitators Dr Jess Dart (Founding Director) and Carina Calzoni (Director) have made an exciting change to the format of Clear Horizon’s 5-day Complexity-Aware Monitoring, Evaluation, Learning and Adaptation (CAMELA) training, giving course participants an easier way to manage their professional development. While the content has not changed, the training is now structured in two parts: Part A (days 1 and 2) covers Program Logic and Theory of Change for complex programs. Check the...
Join Clear Horizon and TACSI for a new 3-day training on bringing together the worlds of Design and Evaluation! For more information and to sign up, check out the “Evaluation and Design” public training course. You can also see our reflections from the first time we ran the training on our blog. AES members, students, NGO staff and group bookings may be eligible for a discount.
Byron Pakula and I presented yesterday on ‘Why do well-designed M&E systems seldom inform decision making?’ at the 2018 Australasian Aid Conference. The presentation considered the lack of learning and decision-making in programs as a result of weaknesses in monitoring systems. Monitoring and evaluation (M&E) is often geared more towards accountability, where monitoring data feeds into formal evaluations and reporting. But this approach does not support the rapid feedback loops required ...