Slide 1: Innovations in Evaluation
IPDET Workshop, Ottawa, June 14, 2013
Simon Roy & Louise Mailloux

Slide 2: Outline
- Definitions
- Innovations in Canada
- Innovations on the International Scene
- Discussion: What's your experience?

Slide 3: A Definition
- Innovations can be defined as alternative and new ways of conducting evaluations (methods, analyses, governance, etc.)
- They have many drivers:
  - Methodological challenges affecting data quality or availability
  - Opportunities stemming from new technologies
  - Influence from other disciplines and professions
  - HR or governance challenges

Slide 4: Contextual Factors
- Innovations are region-specific: what is innovative in one place may not be in another
- Some innovations may work in one country but not in another

Slide 5: Recent Innovations in Canada

Slide 6: Three Notable Innovations of the Last Decade in Canada
- Multi-mode approaches in surveys
- Focus on cost analyses
- Professionalization of evaluation: certification of evaluators

Slide 7: Multi-Mode Surveys
- Surveys were traditionally administered in a single mode: mail, phone, or fax
- Low response rates are now a major problem
- Evaluators have moved to surveys administered in multiple modes, offering respondents the choice of completing them online, by phone, or by mail
- Advantages: higher response rates and less sampling bias
- Disadvantage: each mode introduces its own bias
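The response-rate arithmetic behind the multi-mode argument can be sketched in a few lines. The invitation and completion counts below are purely hypothetical, chosen only to show how pooling across modes lifts the overall rate above the weakest single mode.

```python
# Pooling responses across survey modes (all counts are hypothetical).

def response_rate(completed, invited):
    """Completed questionnaires as a share of invitations for one mode."""
    return completed / invited

modes = {
    "online": {"invited": 400, "completed": 120},
    "phone":  {"invited": 300, "completed": 105},
    "mail":   {"invited": 300, "completed": 45},
}

# Per-mode rates, and the overall rate across all invitations.
per_mode = {m: response_rate(v["completed"], v["invited"]) for m, v in modes.items()}
overall = (sum(v["completed"] for v in modes.values())
           / sum(v["invited"] for v in modes.values()))

print(per_mode)           # online 0.30, phone 0.35, mail 0.15
print(round(overall, 3))  # 0.27, well above what mail alone would achieve
```

The same breakdown also makes the disadvantage visible: if the per-mode rates differ this much, the pooled sample over-represents whichever mode responds most readily, so mode effects still need to be checked during analysis.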

Slide 8: Cost Analyses
- Many governments are moving toward "value for money" analyses, which examine inputs, outputs, and outcomes in view of the costs involved
- The innovation lies in the refinement of the approaches used to conduct such analyses

Slide 9: Perspectives on Assessing Resource Utilization and the Results Chain
[Diagram: a results chain running from inputs and activities through outputs to immediate, intermediate, and ultimate outcomes, annotated with economy, operational efficiency, and allocative efficiency. Some elements are the primary focus of analysis; others inform it.]
The analysis of economy, operational efficiency, and allocative efficiency occurs along the results chain.
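One simple way to read the three lenses is as cost ratios taken at different points along the chain. The figures and the ratio definitions below are a hypothetical illustration of that reading, not the workshop's formal definitions.

```python
# Three lenses on the results chain, read as ratios (all figures hypothetical).
# Economy: were inputs acquired at or below the planned cost?
# Operational efficiency: cost per unit of output delivered.
# Allocative efficiency: cost per unit of outcome achieved.

budgeted_input_cost = 1_000_000  # planned spending on inputs
actual_input_cost = 950_000      # actual spending
outputs_delivered = 4_750        # e.g. training sessions delivered
outcomes_achieved = 3_800        # e.g. participants employed afterwards

economy = actual_input_cost / budgeted_input_cost         # below 1 means under budget
cost_per_output = actual_input_cost / outputs_delivered   # operational efficiency
cost_per_outcome = actual_input_cost / outcomes_achieved  # allocative efficiency

print(economy)           # 0.95
print(cost_per_output)   # 200.0
print(cost_per_outcome)  # 250.0
```

The refinement the slide refers to is largely about what goes into these denominators: the further right along the chain the measure sits, the harder it is to attribute the result to the program's spending alone.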

Slide 10: Credentialing
- Canadian evaluators have a professional association, the Canadian Evaluation Society (CES) (http://www.evaluationcanada.ca/)
- The CES implemented an evaluation credentialing program in 2010; evaluators can become "Credentialed Evaluators"
- This is an association-led initiative; the governments of Canada have no direct control over the credential
- The credential is not a requirement to conduct evaluations

Slide 11: Credentialing (continued)
- Canadian evaluators can receive the credential if they meet criteria demonstrating competency, including two years of evaluation experience and competencies in five areas (see appendix)
- Expected benefits: evaluators gain recognition, and credentials help evaluation users select providers
- About 200 evaluators have been credentialed to date

Slide 12: Our Overall Lessons to Date
- Evaluation is evolving and becoming more and more complex
- Before discounting new approaches, look at their advantages, especially how they can compensate for the limitations of traditional approaches (traditional methods have gaps too!)
- Weigh the advantages against the disadvantages, and manage them to reduce the latter; have a backup plan

Slide 13: Innovations in the International Development Context

Slide 14:
- Real-Time Evaluations (RTE)
- Digital Data

Slide 15: A Definition of RTE
- A real-time evaluation (RTE) is an evaluation in which the primary objective is to provide feedback in a participatory way in real time (i.e. during the evaluation fieldwork) to those executing and managing a humanitarian response
Source: Real-Time Evaluations of Humanitarian Action: An ALNAP Guide (Pilot Version), John Cosgrave, Ben Ramalingam and Tony Beck, 2009

Slide 16: Origins of RTEs
- In the humanitarian sector, UNHCR's Evaluation and Policy Analysis Unit (EPAU) was for several years the chief proponent of RTE
- WFP, UNICEF, the Humanitarian Accountability Project, CARE, World Vision, Oxfam GB, the IFRC, FAO and others have all taken up the practice to some degree
Source: "Real-Time Evaluation: Where Does Its Value Lie?", Maurice Herson and John Mitchell, ALNAP, Humanitarian Exchange Magazine, Issue 32, December 2005

Slide 17: RTE vs Other Types of Evaluation
- RTEs look at today to influence this week's or this month's programming
- Mid-term evaluations look at the first phase to influence programming in the second phase
- Ex-post evaluations are retrospective: they look at the past to learn from it

Slide 18: Key Features and Methods
- Semi-structured interviews
- Purposeful sampling, complemented by snowball sampling in the field
- Interviews with beneficiary groups are important
- Observation

Slide 19: Methodological Constraints of RTE
- Limited use of statistical sampling (sample frame)
- Limited use of surveys
- Lack of pre-planned coordination between humanitarian actors
- Baseline studies are usually non-existent
- Attribution (cause and effect) is difficult given the multiplicity of actors
Source: Brusset, E., Cosgrave, J., & MacDonald, W. (2010). Real-time evaluation in humanitarian emergencies. In L. A. Ritchie & W. MacDonald (Eds.), Enhancing disaster and emergency preparedness, response, and recovery through evaluation. New Directions for Evaluation, 126, 9–20.

Slide 20: Lessons - Advantages
- Timeliness: RTEs bring an external perspective, analytical capacity, and knowledge at a key point in a response
- Perspective: RTEs reduce the risk that early operational choices bring about critical problems in the longer term
- Interactivity: RTEs enable programming to be influenced as it happens, allowing agencies to make key changes at an intermediate point in programming

Slide 21: Lessons - Challenges
- Utilisation: weak follow-up on recommendations
- Ownership: workers, managers, beneficiaries?
- Focus: what are the key questions?
- Meeting each partner's needs for accountability and learning
- Few RTEs in complex emergencies
Source: Lessons from Recent Inter-Agency Real-Time Evaluations (IA RTEs), Riccardo Polastro

Slide 22: Digital Data and Tools

Slide 23: Rationale
- There has been an explosion in the quantity and diversity of high-frequency digital data, e.g. mobile-banking transactions, online user-generated content such as blog posts and tweets, online searches, satellite images, and computerized data analysis
- Digital data hold the potential, as yet largely untapped, to allow decision makers to track development progress, improve social protection, and understand where existing policies and programmes require adjustment
Source: Global Pulse, Big Data for Development: Challenges & Opportunities, May 2012, www.unglobalpulse.org


Slide 25: Big Data - a UN Initiative
1. Early warning: early detection of anomalies in how populations use digital devices and services can enable faster response in times of crisis
2. Real-time awareness: big data can paint a fine-grained and current representation of reality, which can inform the design and targeting of programs and policies
3. Real-time feedback: big data makes it possible to understand human well-being and emerging vulnerabilities, in order to better protect populations from shocks

Slide 26: Potential Uses and Focus
- ILO, UNICEF and WFP are researching changes in social welfare, especially with regard to food and fuel prices and employment issues
- The number of tweets discussing the price of rice in Indonesia in 2011 followed a curve similar to the official inflation statistics for the food basket
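The kind of comparison behind the Indonesia example, checking whether a tweet-count series tracks an official statistic, boils down to correlating two monthly series. The sketch below uses invented numbers for both series; it is not the Global Pulse data or method, just the basic arithmetic.

```python
import math

# Hypothetical monthly series: tweets mentioning rice prices, and
# official food-basket inflation (%). Both are invented for illustration.
tweets = [120, 150, 180, 260, 340, 310]
inflation = [3.1, 3.4, 3.9, 5.0, 6.2, 5.8]

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson(tweets, inflation)
print(round(r, 2))  # close to 1 when the two series move together
```

A high correlation like this is only suggestive: tweet volume is a proxy shaped by who tweets and what goes viral, which is exactly the selection-bias caveat raised later in the deck.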

Slide 27: What Is Big Data?
- "Big data" is a popular phrase for volumes of structured and unstructured data so large that they are difficult to process with traditional database and software techniques
- Types of digital data sources:
  1. Data exhaust
  2. Online information
  3. Physical sensors
  4. Citizen reporting or crowd-sourced data

Slide 28: Lessons Learned to Date - Privacy
- Privacy is an overarching concern with a wide range of implications for data acquisition, storage, retention, use, and presentation
  - People routinely consent to the collection and use of web-generated data by simply ticking a box, without fully realising how their data might be used or misused
  - Do bloggers consent to having their content analyzed simply by publishing on the web?

Slide 29: Lessons Learned to Date - Access and Sharing
- While much of the publicly available online data (data from the "open web") has potential value for development, a great deal more valuable data is closely held by corporations and is not accessible
- "The next movement in charitable giving and corporate citizenship may be for corporations and governments to donate data, which could be used to help track diseases, avert economic crises, relieve traffic congestion, and aid development."
Source: "Data Philanthropy: Where Are We Now?", Andreas Pawelke and Anoush Rima Tatevossian, May 8, 2013

Slide 30: Lessons Learned to Date - Analysis
- "Conceptualisation": defining categories and clusters
- Selection bias: is the data representative of the general population?
- "Measurement": assigning categories and clusters to unstructured data, or vice versa
- "Verification": assessing how well the previous steps fare in extracting relevant information
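The conceptualisation, measurement, and verification steps can be mimicked with a toy keyword classifier. The categories, keyword lists, and sample texts below are all invented for illustration; real big-data pipelines use far richer models, but the three steps are the same.

```python
# Step 1, conceptualisation: define the categories and what signals them.
CATEGORIES = {
    "food": {"rice", "bread", "price", "food"},
    "fuel": {"petrol", "fuel", "gas"},
}

# Step 2, measurement: assign a category to a piece of unstructured text.
def classify(text):
    words = set(text.lower().split())
    for label, keywords in CATEGORIES.items():
        if words & keywords:
            return label
    return "other"

# Step 3, verification: score the classifier against a hand-labelled sample.
sample = [
    ("rice price is rising again", "food"),
    ("petrol queues downtown", "fuel"),
    ("great match last night", "other"),
]
accuracy = sum(classify(text) == gold for text, gold in sample) / len(sample)
print(accuracy)  # 1.0 on this tiny invented sample
```

The verification step is the one that catches both conceptualisation errors (categories that overlap or miss cases) and measurement errors (keywords that fire on the wrong texts), which is why the slide treats it as a distinct stage.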

Slide 31: Discussion
- What's happening in your organization or country in terms of innovation in evaluation?
- What lessons can you share about what works and what does not?

Slide 32: Thank You!
Louise Mailloux – lmailloux@ggi.ca
Simon Roy – sroy@ggi.ca

Slide 33: Appendix - Competency Domains in Evaluation
1.0 Reflective Practice: competencies focus on the fundamental norms and values underlying evaluation practice, and on awareness of one's own evaluation expertise and needs for growth.
2.0 Technical Practice: competencies focus on the specialized aspects of evaluation, such as design, data collection, analysis, interpretation, and reporting.
3.0 Situational Practice: competencies focus on applying evaluative thinking to the unique interests, issues, and contextual circumstances in which evaluation skills are being applied.
4.0 Management Practice: competencies focus on managing a project or evaluation, such as budgeting, coordinating resources, and supervising.
5.0 Interpersonal Practice: competencies focus on people skills, such as communication, negotiation, conflict resolution, collaboration, and diversity.

Slide 34: Appendix - RTE Distinguishing Features

Need
- Real-time evaluations: in-the-moment feedback at critical decision points
- Traditional evaluations: in-depth analysis in a detailed report, with the clarity of hindsight

Types of deliverables
- Real-time evaluations: frequent in-person meetings and data summaries
- Traditional evaluations: a full report at a defined end point, and potentially at mid-point

End goal
- Real-time evaluations: getting the program to work as efficiently as possible, as soon as possible
- Traditional evaluations: learning what worked and what didn't, and using that information to inform the next iteration of the program

Cost
- Real-time evaluations: may be more costly due to multiple rounds of data analysis and meetings; since evaluation activities may evolve to meet changing information needs, costs are not always predictable
- Traditional evaluations: costs are generally more predictable because the activities are known at the outset

Trade-offs
- Real-time evaluations: the analysis will not be as rigorous, because in-the-moment feedback cannot achieve the same clarity as hindsight
- Traditional evaluations: the analysis is not available until midway through or after a program's end; however, with the additional time available, a higher degree of rigor is possible

Source: "Getting Real About Real-Time Evaluation", Clare Nolan and Fontane Lo, Non-Profit Magazine, March 29, 2012

Slide 35: Appendix - Types of Digital Data Sources (1)
1. Data exhaust: passively collected transactional data from people's use of digital services such as mobile phones, purchases, and web searches, and/or operational metrics and other real-time data collected by UN agencies, NGOs, and other aid organisations to monitor their projects and programmes (e.g. stock levels, school attendance). These digital services create networked sensors of human behaviour.
2. Online information: web content such as news media and social media interactions (e.g. blogs, Twitter), news articles, e-commerce, and job postings. This approach treats web usage and content as a sensor of human intent, sentiment, perceptions, and wants.
Source: www.unglobalpulse.org

Slide 36: Appendix - Types of Digital Data Sources (2)
3. Physical sensors: satellite or infrared imagery of changing landscapes, traffic patterns, light emissions, urban development, topographic changes, etc. This approach focuses on remote sensing of changes in human activity.
4. Citizen reporting or crowd-sourced data: information actively produced or submitted by citizens through mobile phone-based surveys, hotlines, user-generated maps, etc. While not passively produced, this is a key source of information for verification and feedback.
Source: www.unglobalpulse.org

