1 Elements of Compelling Evaluation Reporting
Jon K. Price, K-12 Evaluation Research Manager

2 IIE K-12 Evaluation Goals
To learn how to improve the effectiveness of the program
To collect data on, and observe, the extent and quality of teacher implementation of new techniques in the classroom
To determine the effectiveness and impact of K-12 programs on teachers' classroom performance
To communicate effectiveness, encouraging participating teachers to continue learning and implementing new techniques and nonparticipating teachers to take part
To provide evidence for effective curriculum, pedagogy, and processes of classroom interaction that directly influence learning

What was said at the Education Summit: Evaluation
Provides data for continuous improvement
Supports government adoption, endorsement, and funding
Provides data that makes our programs competitive during annual budget allocations
Provides consistency through an aligned corporate evaluation, resulting in a unique global perspective on programs
Adds to the body of research on effective use of technology in education

We need:
Evaluation plans as part of core program strategies, aligned to our corporate goals and protocols
Local evaluation to supplement the global core surveys
Management of the local evaluation process to ensure data is collected, compiled, distributed, and shared in a timely manner
Review of and reflection on evaluation results for continuous program improvement
Use of evaluation data and reports to lobby the MOE for educational reform

3 Basic Report Design
Executive Summary
I. Methodology and Data Sources
II. Impact of Essentials Course
Teacher Use of Technology
Student Use of Technology
Variations
Analysis and Synthesis of Data and Its Meaning
III. Conclusion
Appendices:
Intel® Teach End of Training or Impact Survey
Reported Class Size (as a range)
References

4 Making Your Evaluation Data Work for You
What do you need from the data?
Quality control
Ministry support
Policy and/or education reform

Background:
Some details of the program being evaluated
Some details of the evaluation methods: what, to whom, when, where, how, and for how long?
Does it correlate to existing data, country standards, or efforts?

Findings:
What the practice was that needed to change, and what the evaluation said
Did you do what you set out to do? Did you do it well? Did you reach those you intended to reach?
Did they learn what you expected them to learn?
Did they change the way they do things as a result of what you did?
Were there any unintended outcomes?
What barriers to implementation did you find?

5 Making Your Evaluation Data Work for You
Intel Teach: Use the End of Training evaluation as your initial quality measurement. We will no longer collect EoT data for a global dataset and report; how will you manage this quality measurement internally?
Impact evaluations start with measuring the integration of technology in the classroom and develop into a tool for reform.

Examples (on the SharePoint site):
Core surveys, which can still be used to benchmark against 5 years of data
Geo Evaluation Catalog, containing proposals, instruments, and reports
Optional qualitative research modules
Case Study folder

Actions:
Highlights (capture key points)
Issues (capture key points)
Plans to address issues/next steps
What did you do to change the program? (Shifted, reduced, changed, or added program support, money, people, policy, etc.)
Are there ongoing evaluation efforts?

Recommendations: Marketing tips?

6 Terms/Definitions
Evaluation: A detailed study for the purpose of program review and continuous improvement.
Assessment: A detailed study of the impact/outcome of student-centered interventions.
Survey: Ad hoc interviews with users, in which a set of questions is asked and the users' responses recorded.
Questionnaire: Written lists of questions distributed to a target response audience.
Interview: A method of formal/structured field observation that allows the interviewer to interact directly with individual respondents to investigate their opinions, experiences, and preferences regarding the product.
Focus Group: A method of formal/structured field observation that allows the interviewer to interact directly with a group of respondents to investigate their opinions, experiences, and preferences regarding the product. The interaction among multiple participants may raise additional issues or identify common ones. Often, the issues identified by interviews and focus groups work well to construct surveys and questionnaires.
Journals/Self-Report Logs: Online or paper-and-pencil journals in which users are asked to note their actions and observations while interacting with a product. This technique allows an evaluator to perform user evaluation at a distance.
Formative: A continuous-improvement study to assist in the formation or development of a program.
Summative: A study of outcomes to determine the effects of an intervention (causal relationships between the intervention and the outcome measures).
Quantitative Analysis: Procedures for analyzing numeric data using inferential statistical techniques.
Qualitative Analysis: Procedures for deriving meaning from non-quantified narrative information, often involving an inductive, interactive, and iterative process.
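
To make the Quantitative Analysis entry concrete, here is a minimal sketch of one common inferential technique, a two-proportion z-test, applied to entirely hypothetical survey counts; the function name and all figures are illustrative, not part of any Intel toolkit.

```python
# A two-proportion z-test: one example of the inferential techniques
# "Quantitative Analysis" refers to. All counts are hypothetical.
from math import erf, sqrt

def two_proportion_z_test(x1, n1, x2, n2):
    """Test whether two response proportions differ significantly."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical example: 178 of 200 trained teachers vs. 120 of 180
# untrained teachers report regular classroom technology use.
z, p = two_proportion_z_test(178, 200, 120, 180)
print(f"z = {z:.2f}, p = {p:.4f}")  # small p suggests a real difference
```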

7 Evaluation Methods
Evaluation Plan: A living document that identifies the evaluation project's evaluators, stakeholders, scope of work, budget, participants, methodology, localization plans, timeline forecast, and deliverables.
End of Training Survey: A set of questions asked, and the users' responses recorded, immediately following an Intel Teach to the Future training session. The data collected should provide feedback on the training context, content, and process.
Impact Survey: A set of questions asked, and the users' responses recorded, no earlier than 6 months after an Intel Teach to the Future training session. The data collected should provide feedback on participant (Master) Teacher use and application of the material in the classroom.
Additional Evaluation Efforts: Field observations such as interviews, focus groups, case studies, or journals that provide qualitative data regarding opinions, experiences, and/or application of Intel Teach to the Future pedagogy.

8 Suggested Themes for Exploration
Suggested themes for program implementation:
Did teachers like the training?
Do teachers ask more essential questions?
Did teachers use unit plans?
Do teachers use technology more?
How many teachers were trained?
What are the barriers to implementation?
What makes teachers successful?

Suggested themes for teacher studies:
Teachers' perceptions of technology
Teachers' experience with technology
Teachers' professional development using technology
Teachers' involvement in innovative curriculum development/reform
Teachers' perceptions of administrative support for technology use
Teachers' perceptions of student application of knowledge using technology
Teachers' perceptions of application to students' general life skills and attitudes
Teachers' perceptions of application to subject skills
Teachers' perceptions of "21st Century Thinking Skills"

Suggested themes for student studies:
Students' experience with technology
Students' attitudes toward technology
Students' views of subjects taught using technology
Students with special needs
Gender issues
'Disaffected' students

9 Elements of Compelling Evaluation
Compelling numbers (high or low, depending on context):
Demonstrate progress on government objectives or initiatives
Show the success of the program with participants
EXAMPLE: Learner completion rates averaging 97% for an informal education program

Testimonials that evoke strong emotion in the audience:
Display program success in terms of the individual, on a personal level
Convey importance and impact in people's lives
EXAMPLE: "Intel® Teach to the Future is amazing. It's changed my teaching practice. Now, I am utilizing technology in my curriculum and seeing a difference in my students' critical thinking skills." – PT, 9/2005, Chiapas, Mexico

Change, improvement, and milestones:
Reveal the chain reaction of change that results in education reform, economic growth, technology adoption, increases in technology literacy and 21st-century skills, or improvement of public services
Show areas for improvement and give constructive feedback on how to improve the program

Local:
Increases preference among potential program participants and decision makers
Increases loyalty among existing program audiences

Global:
The exponential impact of global preference and loyalty contributes to the goal of 10 million teachers trained

10 Elements of Compelling Evaluation Reporting
BAD EXAMPLE: "Question 5: Since your training, have you implemented some or all of the unit plans you developed in your Intel® Teach to the Future training? 44.21% of the teachers answered: Yes, more than once; …% answered: Yes, once; 19.28% answered: Not yet, but I plan to use the lesson before the end of this school year; and 15.48% of the teachers answered: No, never."

GOOD EXAMPLE: "The majority of respondents reported positive changes in their teaching practices, such as making greater use of essential questions to structure lessons, computer technology to present information to students and create handouts, and rubrics to evaluate students. Several MT and PT respondents claimed positive effects of the ITTF program on their students, such as greater concept understanding, development of higher-level thinking skills, increased motivation and involvement in class, and more students working together. In-depth evaluation validated positive effects of the program on the development of ICT skills of MTs and PTs. In cases where the MTs and PTs implemented their unit plans, the students demonstrated improved ICT skills, motivation, teamwork, class participation, and multiple intelligences in their outputs."
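
Much of the shift from the bad example to the good one is aggregation: rolling raw per-option tallies up into statements a reader can act on. A minimal sketch of that roll-up, using hypothetical response counts for the Question 5 options above:

```python
# Rolling raw answer tallies up into a reportable statement rather
# than dumping per-option percentages. Counts are hypothetical.
responses = {
    "Yes, more than once": 442,
    "Yes, once": 210,
    "Not yet, but I plan to this school year": 193,
    "No, never": 155,
}

total = sum(responses.values())
implemented = responses["Yes, more than once"] + responses["Yes, once"]
planned = responses["Not yet, but I plan to this school year"]

print(
    f"{implemented / total:.0%} of respondents have already implemented "
    f"a unit plan from their training, and a further {planned / total:.0%} "
    f"plan to do so this school year."
)
```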

11 What Matters Most?
How to read results:
1. End of Training: Look for teacher reactions and for indications of teacher learning.
2. Impact: Look for organizational support and for classroom implementation.
Impact+: Use qualitative methods; look for impact on the school ecosystem and policies; look for evidence of classroom interaction that directly influences learning.

Questions to ask:
1. What is the approach to the program, within individual schools and across the schools involved?
2. What are the key issues, such as lack of time or the need for better resourcing?
3. In what ways are they working to overcome these difficulties?
4. What are the key takeaways? Pre-service: vital to bringing about sustained change is teachers' willingness to embed the program within the curriculum. In-service: the role of the principals within the institutions.
5. What are the next steps?

Handouts: "Evaluation Report Checklist" and "Making Evaluation Meaningful…"

12 Benchmark Key Objectives
Global benchmark objective: To identify Intel Teach Essentials End of Training and Impact Evaluation benchmarks that enable immediate measurement of local evaluation data against established indicators.

End of Training benchmarks:
Resulting from the analysis of existing longitudinal End of Training evaluation data.
Benchmarks were identified from 3 questions that look at training effectiveness. Question 2: To what extent do the following statements describe the Intel® Teach to the Future training in which you participated? (Great and Moderate Extent)
Benchmarks were identified from 3 questions that look at teachers' reported readiness to implement technology in their classrooms. Question 3: Having completed your training, how well prepared do you feel to do the following activities with your students? (Very Well and Moderately Prepared)

A review of new program data for the first three quarters in which data was submitted indicates no significant deviation from the sustaining benchmarks. However, the data indicates that most countries receive relatively high scores initially, followed by a dip the next quarter, then an increase and stabilization in subsequent quarters. In addition, overall scores are higher for the training-description items than for the teacher-preparedness items.

13 Impact Benchmarks
Resulting from the analysis of existing longitudinal Impact evaluation data. Benchmarks were identified from 4 questions whose responses indicate the level of classroom implementation of key program components.
Question 7: Have you used technology with your students in new ways since you participated in the training? (Yes)
Question 14: Since completing your Intel® Teach to the Future training, has there been a change in how frequently you do the following? (Do listed activities a-f more; do listed activities g-k more)
Question 5: Since your training, have you implemented some or all of the unit plans you developed in your Intel® Teach to the Future training? (Yes, more than once; and Yes, once)

14 Global Benchmarks
End of Training benchmarks:
89% of teacher respondents indicate the training focused on integration of technology into their curriculum.
81% of teacher respondents indicate the training provided teaching strategies to apply with their students.
86% of teacher respondents indicate the training illustrated effective uses of technology with students.
80% of teacher respondents indicate they are prepared to implement teaching that emphasizes independent work by students.
85% of teacher respondents indicate they are prepared to integrate educational technology into the grade or subject they teach.
82% of teacher respondents indicate they are prepared to support their students in using technology in their schoolwork.

Impact benchmarks:
75% of teacher respondents indicate increased use of technology activities with their students.
80% of teachers increased their use of technology for lesson planning and preparation.
60% of teachers increased their use of project-based approaches in their teaching.
75% of teachers used the unit/lesson they developed in training back in their schools.
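
Since the success criteria on the next slide hinge on local scores being comparable to these benchmarks, a minimal comparison sketch may help. The local scores and the 5-point tolerance band below are assumptions for illustration, not established Intel thresholds.

```python
# Comparing hypothetical local evaluation scores against the global
# End of Training benchmarks listed above. A tolerance band stands in
# for "comparable"; the real review happens at the country level.
BENCHMARKS = {
    "training focused on curriculum integration": 89,
    "training provided teaching strategies": 81,
    "training illustrated effective technology use": 86,
    "prepared to emphasize independent student work": 80,
    "prepared to integrate technology into subject": 85,
    "prepared to support student technology use": 82,
}

local_scores = {  # hypothetical country data, percent agreeing
    "training focused on curriculum integration": 91,
    "training provided teaching strategies": 74,
    "training illustrated effective technology use": 85,
    "prepared to emphasize independent student work": 82,
    "prepared to integrate technology into subject": 80,
    "prepared to support student technology use": 83,
}

TOLERANCE = 5  # percentage points; an assumed threshold

for item, benchmark in BENCHMARKS.items():
    gap = local_scores[item] - benchmark
    status = "comparable" if abs(gap) <= TOLERANCE else "review"
    print(f"{item}: local {local_scores[item]}% vs benchmark {benchmark}% ({status})")
```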

15 Success Criteria
Deliver Essentials at quality levels comparable to the established benchmarks.
End of Training evaluation data indicates scores comparable to the established benchmarks, to be reviewed individually at the country level.
Impact evaluation data indicates scores comparable to the established benchmarks.
We no longer require country evaluation data to be submitted for a global roll-up and report, but it is vital that countries continue evaluation efforts, complete reports, and submit them in order to maintain visibility into the quality of our program.
Key stakeholders accept and support Essentials data.

16 Marketing Considerations for Intel Teach Essentials Benchmarks
For the teacher audience:
Ensure teachers understand Intel's involvement.
Communicate course design and desired outcomes to teachers.
Have a means to track usage and results, which will help us tell the story with proof points and data.
Establish a long-term relationship with teachers.
Achieve a better understanding of teacher usage and results for impact stories and continuous improvement.

For the MOE audience:
Communicate Intel messaging consistently throughout the program (pre, during, and post).
Establish a user-friendly, easy-to-navigate resource for communicating training and impact evaluation results: Evidence of Impact web pages (evaluation web resources).
Enable co-marketing opportunities with Ministries of Education.


