Posts filed under ‘Evaluation reporting’
An interesting post on the Learning Portal for Design, Monitoring and Evaluation for Peacebuilding focuses on communicating evaluation findings and offers three tips for those who hope to influence decisions with evaluation data:
- Answer the right questions;
- Speak their language;
- Be humble.
The notion of listening to the voices of affected populations is nothing new in humanitarian evaluation. However, in the past there has been a lot of talk with little action. The Listening Project is one of the first structured, global initiatives to look at this issue – not only from the evaluation perspective but more broadly – and has recently produced a summary study, Time to Listen: Hearing People on the Receiving End of International Aid (pdf), based on discussions with almost 6,000 people in 20 countries. You can also read a news report about this issue on IRIN news.
As part of stakeholder consultations I’ve been involved with for the Joint Standards Initiative, we’ve also been listening to affected populations – from Senegal to Pakistan to Mexico. The video below provides short excerpts of interviews with affected populations, as well as with humanitarian workers, from these consultations.
Blogging and other social media are often used to communicate evaluation results – that is, once the evaluation is finished. However, blogging can also be useful for communicating the evaluation process – that is, while the evaluation is collecting data. I’ve recently been involved in a stakeholder consultation for the Joint Standards Initiative, where, as part of communicating the progress of the consultation, the other team members and I have been blogging “snapshots from the consultation” – from locations as diverse as Beirut, Juba and Richard Toll (Senegal).
We found this useful for providing stakeholders with updates on our work and for offering some insights into our initial findings.
(The image above is taken from a discussion in Cairo by team member Inji El Abd)
The US-based Innovation Network has published a very interesting study on the State of Evaluation in US non-profit organisations.
The study, based on a survey of some 550 non-profits in the US, produced some interesting findings, including the headline above, which is admittedly the most pessimistic of the following:
- 90% of organizations report evaluating their work (up from 85% in 2010)
- 100% (!) of organizations reported using and communicating their evaluation findings
- Budgeting for evaluation is still low. More than 70% of organizations are spending less than 5% of organizational budgets on evaluation
- On average, evaluation and its close relation, research, continue to be the lowest priorities (compared to fundraising, financial management, communications, etc.)
I find it incredible that 100% report using and communicating their evaluations – if only this were “significant” usage, we would all be happy…
Further to my earlier post about my presentation on “seven new ways to present evaluation findings” at the recent EES conference, a video was made of the presentation, which you can view below.
Thanks to Denis Bours of SEA Change Community of Practice for filming me!
As regular readers will know, I am very interested in how findings of evaluations are presented and used, as I’ve written about before. I’ve put together a brief presentation on this subject (see below) entitled “Seven new ways to present evaluation findings” that I’m presenting today at the European Evaluation Society Conference in Helsinki, Finland. Comments and other ideas welcome!
Often I don’t get to share the findings of the evaluations I undertake, but in this case of an advocacy evaluation, an area that I’ve written about before, the findings are public and can be shared.
I was part of a team that evaluated phase 1 of an advocacy/research project – the Africa Climate Change Resilience Alliance (ACCRA). ACCRA aims to increase governments’ and development actors’ use of evidence in designing and implementing interventions that increase communities’ capacity to adapt to climate hazards, variability and change. Advocacy plays a large role in trying to influence governments and development actors in this project. You can read more in the Executive_Summary (pdf) of the evaluation findings.
The evaluation also produced five case studies highlighting successful advocacy strategies:
- Capacity building and district planning
- Secondment to a government ministry
- Reaching out to government and civil society in Uganda
- Disaster risk profiling in Ethiopia
- Exchanging views and know-how between ACCRA countries
The case studies can be viewed on the ACCRA Eldis community blog (n.b. you have to join the Eldis community to view the case studies; it’s free of charge).
To disseminate the evaluation findings widely we also produced a multimedia clip, as featured below.
Together with Raj Rana, I will be running a workshop on communications and evaluation this coming November in Bern, Switzerland. Further information:
Integrating communications in evaluation
Date and place: 10-11 November 2011, Bern
Organisers: University of Fribourg & Swiss Evaluation Society
An often-overlooked step of evaluation is ensuring that findings are communicated, understood and acted upon. Communicating throughout the evaluation process equally poses many challenges. Communicating effectively implies using different means, messages and methods to reach different stakeholder groups, with very different needs and expectations.
A mix of presentations, case studies and practical exercises will be used to promote new approaches to communicating results, including social media, interactive presentations and data visualization. The workshop will draw on a mix of facilitation techniques to introduce effective means of engaging stakeholders in the evaluation process (World Café methodology, buzz groups, visualization techniques, developing consensus, etc.). Participants are encouraged to bring examples of evaluations they have commissioned or implemented, to be used as case studies during the workshop.