International engagement: M & E meeting
Monitoring & Evaluation: an introduction for practitioners
Liz Allen

Monitoring & evaluation can play a vital role in the management & improvement of an activity, organisation or process. (Source: Subsetum)

My talk:
- Monitoring & evaluation: why do it?
- Theory & practice
- M & E: challenges
- Keeping it real: practicality & proportionality

Monitoring & evaluation: rationale
- accountability & validation ("my funder requires it…")
- management of funding initiatives ("I want my programme to work well…")
- strategy & planning ("I want to do more in the future…")
- policy & advocacy ("I want to tell people about what we do… & show the benefits…")
- learning ("I want to find out what works…")

Theory

The M & E toolkit (Source: Subsetum):
- M & E framework
- Monitoring
- Set-up review
- Process review
- Formative review
- Summative review

Things to do & things to 'measure/track':
- inputs/activities
- outputs
- outcomes
- … impact? (longer-term impact)

Theory & formal frameworks:
- Logical framework
- Outcome mapping
- Payback framework
- Theories of change
- Results-based management

[Diagram] M & E in the project cycle, with monitoring & review points:
- 'M & E framework' design (project start)
- set-up/process review (inputs)
- taking stock (inputs & outputs)
- learning & lessons for next time (outputs/outcomes)

Frameworks are good for:
- keeping outcomes in mind
- 'forcing' definition of: project objectives; indicators; appropriate data collection methods & timings
- organisation
- simplification

Frameworks are less good at:
- accommodating externalities & context
- dealing with the unexpected
- understanding process

Practice: doing M & E

(My) basic principles of doing M & E:
- involvement of stakeholders
- definition of objectives/outcomes & associated indicators & methods
- integrated into the project plan from the start
- properly resourced
- practical (usable) & proportionate

Introduction to LFA: defining indicators. An indicator is a "quantitative or qualitative factor or variable that provides a simple and reliable means to measure achievement"; indicators are "something that helps us to understand where we are, where we are going and how far we are from the goal… They are bits of information that summarize the characteristics of systems or highlight what is happening in a system."

SMART & SPICED indicators

SMART:
- Specific
- Measurable
- Achievable
- Relevant
- Timebound

SPICED:
- Subjective
- Participatory
- Interpreted/communicable
- Cross-checked
- Empowering
- Diverse & disaggregated
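As a rough sketch of how the SMART criteria might be carried into practice, an indicator can be held as a structured record and progress read off against its baseline and target. Everything below (the field names, the `progress` helper, the workshop figures) is invented for illustration, not a prescribed format:

```python
from dataclasses import dataclass

@dataclass
class Indicator:
    """One project indicator, with fields loosely mirroring SMART."""
    description: str  # Specific: exactly what is being measured
    unit: str         # Measurable: the unit of measurement
    baseline: float   # value at project set-up
    target: float     # Achievable: the agreed target value
    objective: str    # Relevant: the project objective it tracks
    deadline: str     # Timebound: date by which the target applies

def progress(ind: Indicator, current: float) -> float:
    """Fraction of the way from baseline to target, clamped to [0, 1]."""
    span = ind.target - ind.baseline
    if span == 0:
        return 1.0
    return max(0.0, min(1.0, (current - ind.baseline) / span))

# Hypothetical example indicator.
workshops = Indicator(
    description="practitioner workshops delivered",
    unit="workshops", baseline=0, target=12,
    objective="build local M & E capacity", deadline="2015-12-31",
)
print(progress(workshops, 9))  # 0.75
```

Writing an indicator down in this structured form makes the 'forced definition' benefit of frameworks tangible: a record with a missing target or deadline is visibly incomplete.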

"Everything that can be counted does not necessarily count; everything that counts cannot necessarily be counted." (attributed to Albert Einstein, theoretical physicist, philosopher & author)

Quantitative methods & measures:
- access & participation data ('penetration')
- attendance & visitor data ('reach')
- 'audience' characteristics (e.g. demographics)
- consumer characteristics
- scientific output assessment (e.g. publications)
- 'altmetrics'
- amount raised / follow-on funding
- structured opinion & feedback
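To make the 'reach' idea concrete, here is a minimal sketch computed from invented attendance data (the session sets, attendee IDs and target population are all hypothetical):

```python
def reach(unique_attendees: set, target_population: int) -> float:
    """'Reach': share of the target population attending at least once."""
    return len(unique_attendees) / target_population

# Attendance logs for three hypothetical sessions (attendee IDs invented).
sessions = [
    {"a01", "a02", "a03"},
    {"a02", "a03", "a04", "a05"},
    {"a01", "a05"},
]

everyone = set().union(*sessions)             # unique attendees overall
total_visits = sum(len(s) for s in sessions)  # raw attendance ('footfall')

print(reach(everyone, target_population=20))  # 5 of 20 -> 0.25
print(total_visits)                           # 9
```

The distinction between the two numbers matters for proportionate monitoring: total attendance can grow while the set of people reached stays the same.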

Qualitative methods:
- interviews
- open questions (in questionnaires)
- focus groups
- ethnography
- participatory research
- observation
- comment/bulletin boards
- visitor books

M & E: challenges

The elusive 'impact':
- the time frame involved
- serendipity
- attribution & contribution
- 'ripple effects'
- the counterfactual
- the value of 'negative' findings
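One standard (though by no means the only) way to approximate the counterfactual is a difference-in-differences comparison against a group that did not receive the programme. The figures below are invented purely for illustration:

```python
def diff_in_diff(treat_pre: float, treat_post: float,
                 control_pre: float, control_post: float) -> float:
    """Estimated programme effect: the change in the treated group,
    net of the background change seen in a comparable untreated group."""
    return (treat_post - treat_pre) - (control_post - control_pre)

# Hypothetical average scores before and after an intervention.
effect = diff_in_diff(treat_pre=50, treat_post=65,
                      control_pre=48, control_post=55)
print(effect)  # (65 - 50) - (55 - 48) = 8
```

The sketch also shows why attribution is hard: the estimate is only credible if the comparison group would have followed the same trend as the treated group in the programme's absence.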

And, closer to home: getting agreement on goals, objectives, indicators & methods.

And resourcing M & E properly: adequately, systematically, proportionately…

Keeping it real: practicality & proportionality

Summary: keeping it real
- Understand stakeholder & audience requirements.
- Be prospective: build M & E in from the start.
- Choose the right methods & tailor them.
- Resourcing: ensure access to key data & information, and someone to manage this.
- Consider options for trends & benchmarks.
- Keep it real: be proportionate & practical; measures can evolve.
- Be flexible: learning is iterative & should be part of the process.
- Beware over-monitoring & evaluation!

Wellcome Trust's Indicators of Progress (outcomes & key indicators of progress):

Discoveries
1. significant advances in the generation of new knowledge
2. contribute to discoveries with tangible impacts on health

Applications
3. contribute to the development of enabling technologies, products and devices
4. uptake of research into policy and practice

Engagement
5. enhanced level of informed debate in biomedicine
6. significant engagement of key audiences & increased reach

Research leaders
7. develop a cadre of research leaders
8. evidence of significant career progression among those we support

Research environment
9. key contributions to the creation, development and maintenance of major research resources
10. contributions to the growth of centres of excellence

Influence
11. significant impact on science funding & policy developments
12. significant impact on global research priorities and processes

Questions

Suggested resources
- Ahmed, S. & Palermo, A-G. (2010) Community Engagement in Research: Frameworks for Education and Peer Review. American Journal of Public Health, 100(8).
- Matthews, B. & Ross, L. (2010) Research Methods: a practical guide for the social sciences. Pearson Education Limited.
- NORAD (2008) Results Management in Norwegian Development Cooperation: a practical guide.
- Stein, D. & Valters, C. (2012) Understanding 'Theory of Change' in International Development: a review of existing knowledge.
- Vogel, I. (2012) Review of the use of 'Theory of Change' in international development. For DFID UK.
- Web Center for Social Research Methods