
4th Biennial SAMEA Conference
Meaningful Evaluation: Improving Use and Results
Evaluation of Teacher-Directed ICT Initiatives
Presenter: Mokete Mokone, M&E Associate, FeedbackRA
20 September 2013

About the Presentation
- Purpose of the Evaluation
- Design & Questions
- Benefits and Findings
- Changes to the Evaluation Design
- Lessons for Evaluations
- Conclusion

Purpose of the Evaluation
To provide baseline, formative and summative information about the project, in order to:
- Improve the project implementation strategy
- Improve project delivery mechanisms
- Establish the role of ICT solutions in improving teacher content knowledge

Evaluation Design

Evaluation Type: Design Review
Component: Explicate the theory of change underlying the project and investigate implementation arrangements that may affect the project's implementation, by reviewing project plans and the concept document and by conducting key informant interviews.
Evaluation Questions:
- What is the postulated theory of change?
- Was the targeting of teachers and other stakeholders appropriate?
- How were teachers motivated to participate in the project?

Evaluation Design

Evaluation Type: Implementation Evaluation
Component: Aimed to comment on implementation fidelity, drawing on sources such as system usage statistics and interviews with teachers and other key informants in the project.
Evaluation Questions:
- To what extent was the pilot project implemented as planned?
- Service delivery: drop-off rates, frequency rates and dosage levels
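To make the fidelity indicators concrete, here is a minimal sketch (not from the original slides) of how drop-off, frequency and dosage could be computed from platform usage logs. The log format, the 60-minute weekly target and the 12-week pilot length are illustrative assumptions, not the project's actual data.

```python
# Hypothetical sketch: computing fidelity indicators from platform usage logs.
# The log format (teacher_id, week, minutes_online) and the targets below are
# assumptions for illustration only.
from collections import defaultdict

REQUIRED_MINUTES_PER_WEEK = 60  # assumed dosage target per teacher
PILOT_WEEKS = 12                # assumed pilot length

usage_log = [  # (teacher_id, week, minutes_online) -- toy data
    ("T01", 1, 75), ("T01", 2, 40), ("T02", 1, 20),
]

# Aggregate minutes per teacher per week.
minutes = defaultdict(lambda: defaultdict(int))
for teacher, week, mins in usage_log:
    minutes[teacher][week] += mins

for teacher, weeks in minutes.items():
    active_weeks = len(weeks)                          # weeks with any activity
    total = sum(weeks.values())                        # total minutes on platform
    frequency = active_weeks / PILOT_WEEKS             # share of weeks active
    dosage = total / (REQUIRED_MINUTES_PER_WEEK * PILOT_WEEKS)  # share of target dose
    dropped_off = max(weeks) < PILOT_WEEKS             # no activity in the final week
    print(f"{teacher}: frequency={frequency:.0%}, dosage={dosage:.0%}, "
          f"dropped off={dropped_off}")
```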

Evaluation Design (Cont.)

Evaluation Type: Outcome Evaluation
Component: Used pre-post testing to determine teacher knowledge gains. The testing of teachers was based on a scholastic assessment representative of the content a teacher would focus on in that year; the test was scheduled at project start and at project completion.
Evaluation Questions:
- Did the project result in the desired/expected process outcomes?
- What is the level of teachers' content knowledge of Maths, Science and English at baseline and at the end of the project?
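As a sketch of how the planned pre-post comparison might have been analysed (the post-test ultimately did not take place), the example below runs a paired comparison of baseline and end-of-project scores. The scores are invented, and the choice of a paired t-test via scipy is an assumption; the slides do not specify an analysis method.

```python
# Hypothetical sketch of the planned pre-post outcome analysis: a paired
# comparison of each teacher's baseline and end-of-project test scores.
# Scores are invented; a real analysis would also report an effect size.
from statistics import mean
from scipy import stats  # paired t-test; assumes scipy is available

pre_scores  = [42, 55, 38, 61, 47]   # baseline content-knowledge scores (toy data)
post_scores = [50, 58, 45, 66, 49]   # end-of-project scores for the same teachers

gains = [post - pre for pre, post in zip(pre_scores, post_scores)]
t_stat, p_value = stats.ttest_rel(post_scores, pre_scores)

print(f"mean gain = {mean(gains):.1f} points")
print(f"paired t = {t_stat:.2f}, p = {p_value:.3f}")
```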

Benefits and Findings
- The design review (assessing the theory of change) revealed flaws in the project design: insufficient training, limited knowledge of ICT, no support structure, and curriculum challenges.
- The implementation review (examining fidelity) showed that participants were not spending the required amount of time on the ICT platform, with low dosage levels and weak participation.
- Outcome findings from pre- and post-intervention baseline tests were planned to determine knowledge before and after project implementation; the post-test could not take place due to low levels of participation.

Evaluation Scope Change
- Move away from determining programme effectiveness towards drawing lessons from failures
- Lessons for future teacher-directed ICT projects
- Key informant and participant interviews asked:
  1. What has been learnt from the implementation of teacher ICT projects which may be useful to keep in mind when designing similar future projects?
  2. What enabling and hindering factors determine the success of teacher ICT projects?
  3. What appropriate systems and infrastructure should be in place before initiating a teacher ICT project?

Conclusion
- Evaluation design and data collection are often fixed up front, but they should be responsive to changes in context and implementation.
- A high chance of implementation failure requires the evaluation to spend more time establishing whether the circumstances necessary for implementation exist.
- It is critical to respond to changes affecting an evaluation plan while developing an appropriate solution that still allows critical information to be obtained.

Conclusion (cont.)
- Although the findings originally sought were not achieved, usable data was still acquired and helped inform organisational strategy for future teacher ICT projects.
- Evaluative findings must also be firmly embedded in the development context and take a long-term view of empowering communities.
- The successes and failures of innovative ICT pilot projects offer lessons that can be applied in developing new and improved interventions.