Designing Rubrics for Evaluation Last week, I attended an Australasian Evaluation Society (AES) workshop on “Foundations of Rubric Design”. It was a thought-provoking workshop. Kystin Martens, the presenter, explained and challenged our understanding of rubric design, and shared some practical tips for developing and using rubrics properly in evaluation. Here are my key takeaways from the workshop: Why do we use rubrics in evaluation? A rubric is a tool or matrix o...
Our lead facilitators Dr Jess Dart (Founding Director) and Carina Calzoni (Director) have made an exciting change to the format of Clear Horizon’s 5-day Complexity-Aware Monitoring, Evaluation, Learning and Adaptation (CAMELA) training to give course participants an easier way to manage their professional development. While the content has not changed, the training is now structured in two parts: Part A (days 1 and 2) - Program Logic and Theory of Change for complex programs. Check t...
Join Clear Horizon and TACSI for a new 3-day training on bringing together the worlds of Design and Evaluation! For more information and to sign up, check out the “Evaluation and Design” public training course. You can also see our reflections from the first time we ran the training on our blog. AES members, students, NGO staff and group bookings may be eligible for a discount.
Byron Pakula and I presented yesterday on ‘Why do well-designed M&E systems seldom inform decision making?’ at the 2018 Australasian Aid Conference. The presentation considered the lack of learning and decision-making in programs that results from weaknesses in monitoring systems. Monitoring and evaluation (M&E) is often geared more towards accountability, where monitoring data feeds into formal evaluations and reporting. But this approach does not support the rapid feedback loops required ...
Damien Sweeney and I were fortunate enough to participate in the Australasian Aid Conference 2018, hosted by the Crawford School at ANU.  The conversations in the halls between panels were just as good as the presentations themselves, sparking many nuggets of thought to take us through 2018.  So, here are the three golden nuggets that we will take away from the conference:  Firstly, complexity is everywhere and we need better ways to deal with it, particularly in terms of M&E.  The presentation from Da...
It’s that time of year again. The team at Clear Horizon have returned from their respective holidays—hiking the Andes nourished only by water and thoughts of the year to come; making prides of lions more efficient in the African savannah; lying on a desert island accompanied by the wisdom of the latest publication by a celebrity evaluator—and are excited to jump into 2018 with gusto. 2017, wow. What a year, right? Wrong. If 2017 was a confusing poster of a dolphin with an inspirational quote...
Last month Clear Horizon and The Australian Centre for Social Innovation (TACSI) had a great time delivering our new, sold-out course on reconciling the worlds of Human Centred Design (HCD) and evaluation. Hot off the press, we’re proud to introduce the Integrated Design, Evaluation and Engagement with Purpose (InDEEP) Framework (Figure 1), which underpinned the course. (Figure 1) The InDEEP Framework The InDEEP Framework (Figure 1) has been dev...
Day 3 started for me at 8am with a session on evaluation failures. It was a very honest session (it felt a wee bit like giving confession), but a great way to open up the field differently. Stephanie Evergreen shared her failure to get people to use effective data visualisation, and Patricia Rogers got up there and bravely spoke about persisting when an evaluation became symbolic and bombed. Then we all felt better by singing along with Michael Patton to Leonard Cohen tunes with evaluation lyrics... "You never really care...
Wednesday afternoon at the #Eval17 AEA was quite extraordinary - I sat there with over 4,000 evaluators in the plenary as Kathy Newcomer, president of the AEA, presented her crisp 7 challenges and 7 solutions for learning and evaluation. The last of her 7 challenges was about evaluation being blind to bias and racism – and that these are baked into policies and programs. We are not immune. She urged us to incorporate equity into our evaluations. Then this morning our keynote was Melvin Hall,...
I had the privilege of attending Michael Quinn Patton's workshop on principles-focused developmental evaluation. To be honest, it wasn't what I was expecting; I thought I would learn more about developmental evaluation. While I did learn about that, the workshop was also a different and fascinating journey into re-thinking not just how we evaluate programs, but also how we might design more learning-focused (emergent) and holistic programs, or even organisations. This was aimed at complex settings. It makes ...