Innovations in Evaluation
IPDET Workshop, Ottawa, June 14, 2013
Simon Roy & Louise Mailloux


Outline
• Definitions
• Innovations in Canada
• Innovations on the International Scene
• Discussion: What's your experience?

A Definition
• Innovations can be defined as alternative and new ways of conducting evaluations (methods, analyses, governance, etc.)
• Many drivers:
− Methodological challenges affecting data quality or availability
− Opportunities stemming from new technologies
− Influence from other disciplines/professions
− HR or governance challenges

Contextual Factors
• Innovations are region-specific: what is innovative in one place may not be innovative in another
• Some innovations may work in one country, but not in another

Recent Innovations in Canada

Three Notable Innovations in Canada in the Last Decade
• Multi-mode approaches in surveys
• Focus on cost analyses
• Professionalization of evaluation: certification of evaluators

Multi-Mode Surveys
• Surveys were traditionally done in a single mode: mail, phone or fax
• Low response rates are now a major problem
• Evaluators have moved to surveys administered in multiple modes: respondents are offered the choice of completing the survey online, by phone or by mail
• Advantages: higher response rates, less sampling bias
• Disadvantage: each mode introduces its own response bias
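The following is a minimal, hypothetical sketch (not from the presentation) of how an evaluator might check for such a mode effect: cross-tabulate responses by survey mode and test whether the response distributions differ. The inline dataset, column names and the use of pandas/scipy are illustrative assumptions.

```python
# Hypothetical sketch: checking for a mode effect in a multi-mode survey.
# The tiny inline dataset and column names are invented for illustration.
import pandas as pd
from scipy.stats import chi2_contingency

df = pd.DataFrame({
    "mode":     ["online"] * 4 + ["phone"] * 4 + ["mail"] * 4,
    "response": ["agree", "agree", "neutral", "disagree",
                 "agree", "neutral", "neutral", "disagree",
                 "agree", "agree", "agree", "neutral"],
})

# Cross-tabulate responses by mode and test whether the response
# distribution differs across modes (a sign of mode bias).
table = pd.crosstab(df["mode"], df["response"])
chi2, p_value, dof, _ = chi2_contingency(table)

print(table)
print(f"chi2 = {chi2:.2f}, p = {p_value:.3f}")  # a small p suggests a mode effect
```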

Cost Analyses
• Many governments are moving towards "value for money" analyses, including analysis of inputs, outputs and outcomes in view of the costs involved
• The innovation lies in the refinement of the approaches used to conduct such analyses

Perspectives on Assessing Resource Utilization and the Results Chain
[Figure: the results chain runs from Inputs and Activities through Outputs to Immediate, Intermediate and Ultimate Outcomes. Economy, operational efficiency and allocative efficiency are mapped along the chain, each with a primary focus of analysis and elements that inform the analysis.]
The analysis for economy, operational efficiency and allocative efficiency occurs along the results chain.
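To make the efficiency concepts concrete, here is a minimal sketch of cost-per-output and cost-per-outcome ratios. All figures and labels are hypothetical; real value-for-money analyses involve far more careful attribution of costs along the results chain.

```python
# Hypothetical sketch: simple "value for money" ratios along a results chain.
# All figures are invented; real analyses must attribute costs carefully.
total_cost = 1_200_000.0   # program spending for the period, in dollars
outputs = 4_800            # e.g. training sessions delivered
outcomes = 1_900           # e.g. participants employed six months later

cost_per_output = total_cost / outputs    # speaks to operational efficiency
cost_per_outcome = total_cost / outcomes  # a crude proxy for allocative efficiency

print(f"Cost per output:  ${cost_per_output:,.2f}")
print(f"Cost per outcome: ${cost_per_outcome:,.2f}")
```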

Credentialing
• Canadian evaluators have a professional association: the Canadian Evaluation Society (CES)
• The CES has implemented an Evaluation Credentialing Program under which evaluators can become "Credentialed Evaluators"
• This is an association-led initiative; governments in Canada have no direct control over this credential
• It is not a requirement to conduct evaluations

Credentialing (continued)
• Canadian evaluators can receive the credential if they meet the criteria (demonstration of competency), including two years of evaluation experience and competencies in five areas (see appendix)
• Expected benefits: evaluators gain recognition, and credentials help evaluation users select an evaluation provider
• About 200 credentialed evaluators to date

Our Overall Lessons to Date
• Evaluation is evolving and becoming more and more complex
• Before discounting new ways, look at their advantages, especially how they can compensate for the limitations of traditional approaches (traditional methods have gaps too!)
• Weigh the advantages against the disadvantages, and manage the disadvantages to reduce their impact. Have a backup plan.

Innovations in the International Development Context

Real-Time Evaluations (RTE)
Digital Data

A Definition of RTE
• A real-time evaluation (RTE) is an evaluation in which the primary objective is to provide feedback in a participatory way in real time (i.e. during the evaluation fieldwork) to those executing and managing a humanitarian response.
Source: Cosgrave, J., Ramalingam, B. and Beck, T. (2009). Real-time Evaluations of Humanitarian Action: An ALNAP Guide (pilot version).

Origins of RTEs
• In the humanitarian sector, UNHCR's Evaluation and Policy Analysis Unit (EPAU) was for several years the chief proponent of RTE
• WFP, UNICEF, the Humanitarian Accountability Project, CARE, World Vision, Oxfam GB, the IFRC, FAO and others have all taken up the practice to some degree
Source: Herson, M. and Mitchell, J. (2005). "Real-Time Evaluation: where does its value lie?" Humanitarian Exchange Magazine, Issue 32, December 2005, ALNAP.

RTE vs. Other Types of Evaluations
• RTEs look at today to influence this week's/month's programming
• Mid-term evaluations look at the first phase to influence programming in the second phase
• Ex-post evaluations are retrospective: they look at the past to learn from it

Key Features/Methods
• Semi-structured interviews
• Purposeful sampling, complemented by snowball sampling in the field (see the sketch below)
• Interviews with beneficiary groups are important
• Observation
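As an aside, snowball sampling can be pictured as a breadth-first expansion through informants' referrals. The sketch below is a toy illustration; the referral graph, names and interview cap are all invented.

```python
# Toy sketch of snowball sampling: start from purposively chosen "seed"
# informants and expand the sample through their referrals, up to a cap.
# The referral graph and the cap are invented for illustration.
from collections import deque

referrals = {
    "camp_manager": ["water_engineer", "community_leader"],
    "community_leader": ["teacher", "health_worker"],
    "water_engineer": [],
    "teacher": ["health_worker"],
    "health_worker": [],
}

def snowball(seeds, max_interviews=4):
    """Breadth-first expansion of the sample through referrals."""
    sampled, queue, seen = [], deque(seeds), set(seeds)
    while queue and len(sampled) < max_interviews:
        person = queue.popleft()
        sampled.append(person)
        for contact in referrals.get(person, []):
            if contact not in seen:
                seen.add(contact)
                queue.append(contact)
    return sampled

print(snowball(["camp_manager"]))
# ['camp_manager', 'water_engineer', 'community_leader', 'teacher']
```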

Methodological Constraints of RTE
• Limited use of statistical sampling (often no sample frame)
• Limited use of surveys
• Lack of pre-planned coordination between humanitarian actors
• Baseline studies are usually nonexistent
• Attribution (cause and effect) is difficult given the multiplicity of actors
Source: Brusset, E., Cosgrave, J. and MacDonald, W. (2010). "Real-time evaluation in humanitarian emergencies." In L. A. Ritchie and W. MacDonald (Eds.), Enhancing Disaster and Emergency Preparedness, Response, and Recovery Through Evaluation. New Directions for Evaluation, 126, 9–20.

Lessons - Advantages
• Timeliness: RTEs bring an external perspective, analytical capacity and knowledge at a key point in a response
• Perspective: RTEs reduce the risk that early operational choices will create critical problems in the longer term
• Interactivity: RTEs enable programming to be influenced as it happens, allowing agencies to make key changes at an intermediate point in programming

Lessons - Challenges
• Utilisation: weakness in the follow-up on recommendations
• Ownership: workers, managers, beneficiaries?
• Focus: what are the key questions?
• Meeting each partner's needs for accountability and learning
• Few RTEs in complex emergencies
Source: Polastro, R. Lessons from Recent Inter-Agency Real-Time Evaluations (IA RTEs).

Digital data and tools

Rationale
• Explosion in the quantity and diversity of high-frequency digital data, e.g. mobile-banking transactions, online user-generated content such as blog posts and tweets, online searches, satellite images, and computerized data analysis
• Digital data hold the potential, as yet largely untapped, to allow decision makers to track development progress, improve social protection, and understand where existing policies and programmes require adjustment
Source: UN Global Pulse (2012). Big Data for Development: Challenges & Opportunities, May 2012.

Big Data – UN Initiative
1) Early warning: early detection of anomalies in how populations use digital devices and services can enable faster response in times of crisis (see the sketch below)
2) Real-time awareness: Big Data can paint a fine-grained and current representation of reality, which can inform the design and targeting of programs and policies
3) Real-time feedback: Big Data makes it possible to understand human well-being and emerging vulnerabilities, in order to better protect populations from shocks
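As a toy illustration of the early-warning idea, the sketch below flags days on which a daily usage series deviates sharply from its recent history, using a rolling z-score. The series, window length and threshold are invented; real detectors are considerably more sophisticated.

```python
# Toy sketch: flagging anomalies in a daily usage series with a rolling
# z-score, in the spirit of "early warning" from digital-service data.
# The data, window and threshold are invented for illustration.
import pandas as pd

counts = pd.Series(
    [100, 98, 103, 101, 99, 102, 100, 97, 160, 155, 101, 100],
    index=pd.date_range("2013-01-01", periods=12, freq="D"),
)

# Baseline statistics from the preceding 7 days (excluding the current day).
baseline = counts.shift(1).rolling(window=7).mean()
spread = counts.shift(1).rolling(window=7).std()
z = (counts - baseline) / spread

anomalies = counts[z.abs() > 3]  # days far outside recent behaviour
print(anomalies)  # flags the jump on 2013-01-09
```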

Potential Uses and Focus
• ILO, UNICEF and WFP are researching changes in social welfare, especially with regard to food and fuel prices and employment issues
• Example: the number of tweets discussing the price of rice in Indonesia in 2011 followed a pattern similar to the official inflation statistics for the food basket (see the sketch below)
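A sketch of the kind of comparison behind that example: correlating a monthly count of relevant tweets with an official price index. Both series below are invented placeholders; a real analysis would use harvested tweet counts and published inflation statistics.

```python
# Hypothetical sketch: does a social-media signal track an official
# statistic? Both series are invented; only the method is illustrated.
import pandas as pd

months = pd.period_range("2011-01", periods=6, freq="M")
tweets_about_rice = pd.Series([120, 135, 180, 240, 310, 295], index=months)
food_inflation = pd.Series([1.1, 1.3, 1.9, 2.6, 3.4, 3.2], index=months)

# A high correlation suggests the signal tracks the statistic; it does
# not, by itself, establish that one causes the other.
corr = tweets_about_rice.corr(food_inflation)
print(f"Pearson correlation: {corr:.2f}")
```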

What is Big Data?
• "Big Data" is a popular phrase describing volumes of structured and unstructured data so large that they are difficult to process with traditional database and software techniques.
• Types of digital data sources:
1. Data Exhaust
2. Online Information
3. Physical Sensors
4. Citizen Reporting or Crowd-sourced Data

Lessons Learned to Date: Privacy
• Privacy is an overarching concern with a wide range of implications for data acquisition, storage, retention, use and presentation
− People routinely consent to the collection and use of web-generated data by simply ticking a box, without fully realising how their data might be used or misused
− Do bloggers consent to having their content analyzed by publishing it on the web?

Lessons Learned to Date: Access and Sharing
• While much of the publicly available online data (data from the "open web") has potential value for development, a great deal more valuable data is closely held by corporations and is not accessible
• "The next movement in charitable giving and corporate citizenship may be for corporations and governments to donate data, which could be used to help track diseases, avert economic crises, relieve traffic congestion, and aid development."
Source: Pawelke, A. and Tatevossian, A. R. (2013). Data Philanthropy: Where Are We Now? May 8, 2013.

Lessons Learned to Date: Analysis
• "Conceptualisation": defining categories and clusters
• Selection bias: is the data representative of the general population?
• "Measurement": assigning categories and clusters to unstructured data, or vice versa
• "Verification": assessing how well conceptualisation and measurement fare in extracting relevant information (see the sketch below)
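To illustrate the measurement and verification steps, here is a toy sketch: a keyword rule assigns a category to each short text, and agreement is then measured against a small hand-labelled sample. The keywords, texts and labels are purely illustrative.

```python
# Toy sketch of "measurement" (assigning categories to unstructured text)
# and "verification" (checking against hand-labelled examples).
# Keyword rules, texts and labels are invented for illustration.
hand_labelled = [
    ("harga beras naik lagi", "food_price"),    # "rice price up again"
    ("cari kerja susah sekali", "employment"),  # "finding work is very hard"
    ("bensin makin mahal", "fuel_price"),       # "fuel getting more expensive"
    ("harga beras stabil", "food_price"),       # "rice price stable"
]

def classify(text: str) -> str:
    """Crude keyword classifier standing in for the measurement step."""
    if "beras" in text:
        return "food_price"
    if "bensin" in text:
        return "fuel_price"
    if "kerja" in text:
        return "employment"
    return "other"

# Verification: how often does the automatic label match the hand label?
correct = sum(classify(text) == label for text, label in hand_labelled)
print(f"Agreement with hand labels: {correct}/{len(hand_labelled)}")
```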

Discussion
• What's happening in your organization/country in terms of innovation in evaluation?
• What lessons can you share about what works and what does not work?

Thank You!
Louise Mailloux
Simon Roy

Appendix: Competency Domains in Evaluation
1.0 Reflective Practice: competencies focus on the fundamental norms and values underlying evaluation practice and awareness of one's evaluation expertise and needs for growth.
2.0 Technical Practice: competencies focus on the specialized aspects of evaluation, such as design, data collection, analysis, interpretation and reporting.
3.0 Situational Practice: competencies focus on the application of evaluative thinking in analyzing and attending to the unique interests, issues, and contextual circumstances in which evaluation skills are being applied.
4.0 Management Practice: competencies focus on the process of managing a project/evaluation, such as budgeting, coordinating resources and supervising.
5.0 Interpersonal Practice: competencies focus on people skills, such as communication, negotiation, conflict resolution, collaboration, and diversity.

Appendix: RTE Distinguishing Features

Need
− Real-time evaluations: in-the-moment feedback at critical decision points.
− Traditional evaluations: in-depth analysis in a detailed report, with the clarity of hindsight.

Types of deliverables
− Real-time evaluations: frequent in-person meetings and data summaries.
− Traditional evaluations: full report at a defined end point, and potentially at mid-point.

End goal
− Real-time evaluations: getting the program to work as efficiently as possible, as soon as possible.
− Traditional evaluations: learning what worked and what didn't, and using that information to inform the next iteration of the program.

Cost
− Real-time evaluations: may be more costly due to multiple rounds of data analysis and meetings; since evaluation activities may evolve to meet changing information needs, costs are not always as predictable.
− Traditional evaluations: costs are generally more predictable because the activities are known at the evaluation outset.

Trade-offs
− Real-time evaluations: the analysis will not be as rigorous, because in-the-moment feedback cannot achieve the same clarity as hindsight.
− Traditional evaluations: the analysis will not be available until midway through or after a program's end; however, with the additional time available, a higher degree of rigor is possible.

Source: Nolan, C. and Lo, F. (2012). "Getting Real About Real-Time Evaluation." Non-Profit Magazine, March 29, 2012.

Appendix: Types of Digital Data Sources
(1) Data Exhaust: passively collected transactional data from people's use of digital services such as mobile phones, purchases and web searches, and/or operational metrics and other real-time data collected by UN agencies, NGOs and other aid organisations to monitor their projects and programmes (e.g. stock levels, school attendance). These digital services create networked sensors of human behaviour.
(2) Online Information: web content such as news media and social media interactions (e.g. blogs, Twitter), news articles, e-commerce and job postings. This approach considers web usage and content as a sensor of human intent, sentiments, perceptions and wants.

Appendix: Types of Digital Data Sources (continued)
(3) Physical Sensors: satellite or infrared imagery of changing landscapes, traffic patterns, light emissions, urban development, topographic changes, etc. This approach focuses on remote sensing of changes in human activity.
(4) Citizen Reporting or Crowd-sourced Data: information actively produced or submitted by citizens through mobile phone-based surveys, hotlines, user-generated maps, etc. While not passively produced, this is a key information source for verification and feedback.