Quality Evaluations in Education Interventions 1 March 2016 Dr Fatima Adam Zenex Foundation

A donor perspective on evaluations

Presentation: Introductory comments; Evaluation design; Brief analysis of one design

Introductory Remarks

Increasing commitment and support for education evaluations

Education change as a complex endeavour. Education interventions are complex and solutions are equally complex: multiple factors interact in multiple ways. There is no magic bullet. We must continue to engage with new ideas within a frame of limited resources. Change is often slow and incremental over the medium to long term: success is not always visible.

It is no longer a question of whether we should do evaluations, but of how we do quality evaluations.

Evaluation Design

There are many ways of improving the quality of evaluations; evaluation design is key.

Why is design important? It provides a systematic approach to programme evaluation; makes explicit what is to be evaluated, how, and why; aligns the purpose of the intervention with the purpose of the evaluation; and informs the evaluation questions, methodology, and data collection.

Evaluation design: 6 Steps

Step 1: Design evaluations side by side with programmes.

Step 2: Understand the intended purpose of evaluating: accountability versus knowledge/learning. Develop clarity about what you want to evaluate (processes, outcomes, or impact) and work with experts to establish whether this is feasible: measurable and realistic given the available resources.

Step 3: Explain the intervention logic: an explicit theory of change, outcomes, and indicators (Zenex Learner Programme example). This is a clarificatory process and an opportunity to tighten the programme logic and ensure alignment between the intent, the programme, and the outcomes.

Step 4: Develop appropriate evaluation questions, aligned to the intention of the intervention and to the purpose of your evaluation.

Step 5: Appropriate methods and required data. Deliberate about methodological issues, including what data will be collected at various levels. Don't collect data for its own sake: it's better to collect key strategic data than too much.

Types of evaluations: qualitative/quantitative; experimental (X causes Y: laboratory studies or RCTs); quasi-experimental (without randomisation, e.g. comparing levels of violence between genders); non-experimental (correlational studies and qualitative case studies). Selecting a design depends on what you are trying to evaluate and why; all designs have strengths and weaknesses.

Common practice: programme followed by post-tests only; pre-test and post-test; pre- and post-test with a comparison group; more comprehensive approaches (mixed methods). Make clear what the strengths and weaknesses of the different approaches are.
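
A toy worked example, in Python with invented numbers, of the pre- and post-test-with-comparison design: the programme effect is estimated as the treatment group's gain minus the comparison group's gain, so that general improvement over time is netted out.

# Hypothetical mean test scores (%); not data from any actual evaluation
treatment_pre, treatment_post = 42.0, 55.0
comparison_pre, comparison_post = 41.0, 49.0

treatment_gain = treatment_post - treatment_pre        # 13 points
comparison_gain = comparison_post - comparison_pre     # 8 points
estimated_effect = treatment_gain - comparison_gain    # 5 points tentatively attributed to the programme
print(estimated_effect)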

Step 6: Define the scope and scale: period, coverage, geography, budget.

Case Study

Case 1: Reading Catch Up. The Reading Catch Up Programme was aimed at assisting Grade 4 learners to develop the appropriate competence in English in order to transition effectively into Grade 4. It was implemented in underperforming schools in Gauteng that were unable to absorb the GPLMS.

Case 1: Reading Catch Up. Step 1: The evaluation was designed side by side with the programme.

Case 1: Reading Catch Up. Step 2: The evaluation foregrounded a knowledge focus. Preliminary findings suggested that the programme was successful, so there was a need to evaluate it systematically and establish what works. Catch-up programmes are of increasing strategic importance in our context: if they work, we need to implement them on a wider scale.

Case 1: Reading Catch Up. Step 3: Logic of the intervention. If learners suspend their Grade 4 English (FAL) curriculum for a short period and undergo an English catch-up programme, they will acquire the level of competence required to succeed in Grade 4, given that English is the medium of instruction. The programme offers structured lesson plans, reading resources, and coaching to teachers, who deliver the catch-up programme to their learners over 11 weeks.

Case 1: Reading Catch Up. Step 4: Research question: What is the impact of the Catch Up Programme on the performance of Grade 4 learners in English?

Case 1: Reading Catch Up. Step 5: Methods: RCT. The essence of the method: education performance is affected by a large number of variables, especially poverty. If we have a large enough sample, chosen at random from the population, and if we randomly assign schools to Treatment and Control groups, then any difference between Treatment and Control can be attributed to the effect of the intervention.
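
A minimal sketch, in Python, of how schools can be randomly assigned to Treatment and Control groups. The school identifiers are hypothetical; this is not the evaluation's actual procedure.

import random

school_ids = [f"SCH{n:03d}" for n in range(1, 101)]   # 100 hypothetical eligible schools

random.seed(2016)                      # fixed seed so the assignment can be reproduced and audited
random.shuffle(school_ids)

treatment_schools = school_ids[:50]    # receive the Reading Catch Up Programme
control_schools = school_ids[50:]      # continue with business as usual
print(len(treatment_schools), len(control_schools))   # 50 50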

Evaluation design. Step 6: Scope and scale: pre- and post-testing; Treatment and Control groups; a sample large enough to detect statistically significant effects (50 intervention and 50 control schools); uncontaminated schools, i.e. outside Gauteng.
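
For illustration only, a simple two-sample power calculation using statsmodels, treating schools as independent units and ignoring the clustering of learners within schools; the effect size, significance level, and power target are assumptions, not figures taken from the actual evaluation.

from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
n_per_arm = analysis.solve_power(effect_size=0.5,   # assumed effect of 0.5 standard deviations
                                 alpha=0.05,        # 5% significance level
                                 power=0.8,         # 80% chance of detecting the assumed effect
                                 ratio=1.0)         # equal-sized Treatment and Control arms
print(round(n_per_arm))                             # about 64 units per arm under these assumptions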

Findings: Analysis of pre- and post-test data showed that both control and intervention schools improved their performance. The difference in performance between the two groups was not significant. When the data were disaggregated by topic, the intervention group did better in spelling and grammar. The intervention worked better for learners with higher pre-test scores.
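
A minimal sketch of the kind of test behind the "not significant" finding: comparing gain scores (post-test minus pre-test) between intervention and control learners with a two-sample t-test. The scores below are simulated placeholders, not the study's data.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
control_gain = rng.normal(loc=5.0, scale=10.0, size=200)        # simulated gains, control learners
intervention_gain = rng.normal(loc=6.0, scale=10.0, size=200)   # simulated gains, intervention learners

t_stat, p_value = stats.ttest_ind(intervention_gain, control_gain, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")   # compare p against the 0.05 significance threshold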

Strengths/weaknesses of the RCT design: If we had not had the control group, we would have concluded that the intervention was hugely successful. However, the RCT only explained what happened; it was unable to help us understand why and how: Why spelling and grammar? Why did the control group improve? How did training translate into classroom practices? Was there increased curriculum coverage? Did teachers teach more reading and writing? Did learners do more reading and writing? What was the quality of that reading and writing?

Conclusion: Purpose: should have focused on processes, outcomes, and impact. Logic: fine. Questions: add why and how. Methods: mixed methods.

What is evaluation? Evaluation is a process that critically examines a program. It involves collecting and analyzing information about a program's activities, characteristics, and outcomes. Its purpose is to make judgments about a program, to improve its effectiveness, and/or to inform programming decisions (Patton, 1987).