Presentation on theme: "Elements of Compelling Evaluation Reporting"— Presentation transcript:
1 Elements of Compelling Evaluation Reporting
Jon K. Price, K-12 Evaluation Research Manager
2 IIE K-12 Evaluation Goals
- To learn how to improve the effectiveness of the program
- To collect data on, and observe, the extent and quality of teacher implementation of new techniques in the classroom
- To determine the effectiveness and impact of K-12 programs on teachers' classroom performance
- To communicate effectiveness, thus encouraging participating teachers to continue learning and implementing new techniques, and encouraging nonparticipating teachers to participate
- To provide evidence for effective curriculum, pedagogy and processes of classroom interaction that directly influence learning

What was said at the Education Summit. Evaluation:
- Provides data for continuous improvement
- Supports government adoption, endorsement and funding
- Provides data that makes our programs competitive during annual budget allocations
- Provides consistency through an aligned corporate evaluation, resulting in a unique global perspective on programs
- Adds to the body of research on effective use of technology in education

We need:
- Evaluation plans as part of core program strategies that align to our corporate goals and protocols
- Local evaluation to supplement the global core surveys
- To manage the local evaluation process so that data is collected, compiled, distributed and shared in a timely manner
- To review and reflect on evaluation results for continuous program improvement
- To utilize evaluation data and reports to lobby the MOE for educational reform
3 Basic Report Design
Executive Summary
I. Methodology and Data Sources
II. Impact of Essentials Course
- Teacher Use of Technology
- Student Use of Technology
- Variations
- Analysis and Synthesis of Data and Its Meaning
III. Conclusion
Appendices:
- Intel® Teach End of Training or Impact Survey
- Reported Class Size (as a range)
- References
4 Making Your Evaluation Data Work for You
What do you need from the data?
- Quality control
- Ministry support
- Policy and/or education reform

Background:
- Some details of the program being evaluated
- Some details of the evaluation methods: what, to whom, when, where, how, for how long?
- Does it correlate to existing data, country standards or efforts?

Findings:
- What the practice was that needed to be changed…
- What the evaluation said…
  - Did you do what you set out to do?
  - Did you do it well?
  - Did you do it to whom you intended?
  - Did they learn what you expected them to learn?
  - Did they change the way they do things as a result of what you did?
  - Were there any unintended outcomes?
- What barriers to implementation did you find?
5 Making Your Evaluation Data Work for You
Intel Teach:
- Use the End of Training evaluation as your initial quality measurement.
  - We will no longer collect EoT data for a global dataset and report.
  - How will you manage this quality measurement internally?
- Impact evaluations start with measuring integration of technology in the classroom and develop into a tool for reform.

Examples (SharePoint site):
- Core surveys: can still be used to benchmark against 5 years of data
- Geo Evaluation Catalog: contains proposals, instruments and reports
- Optional qualitative research modules
- SharePoint case study folder

Actions:
- Highlights (capture key points)
- Issues (capture key points)
- Plans to address issues / next steps
- What did you do to change the program? (Shifted/reduced/changed/added program support, money, people, policy, etc.? Other?)
- Are there ongoing evaluation efforts?

Recommendations:
- Marketing tips?
6 Terms/Definitions

Evaluation: A detailed study for the purpose of program review and continuous improvement.
Assessment: A detailed study of the impact/outcome of student-centered interventions.
Survey: Ad hoc interviews with users, where a set of questions is asked and the users' responses are recorded.
Questionnaire: Written lists of questions that are distributed to a target response audience.
Interview: A method of formal/structured field observation that allows the interviewer to directly interact with individual respondents to investigate their opinions, experiences and preferences regarding the product.
Focus Group: A method of formal/structured field observation that allows the interviewer to directly interact with a group of respondents to investigate their opinions, experiences and preferences regarding the product. The interaction among multiple participants may raise additional issues, or identify common issues. (Often, the issues identified by interviews and focus groups work well to construct surveys and questionnaires.)
Journals/Self-Report Logs: Online or paper-and-pencil journals in which users are requested to note their actions and observations while interacting with a product. This technique allows an evaluator to perform user evaluation at a distance.
Formative: A continuous improvement study to assist in the formation or development of a program.
Summative: A study of outcomes to determine the effects of an intervention (causal relationships between the intervention and the outcome measures).
Quantitative Analysis: Procedures taken to analyze numeric data using inferential statistical techniques.
Qualitative Analysis: Procedures for deriving meaning from non-quantified narrative information, often involving an inductive, interactive, and iterative process.
7 Evaluation Methods

Evaluation Plan: A living document that identifies the evaluation project's evaluators, stakeholders, scope of work, budget, participants, methodology, localization plans, timeline forecast and deliverables.
End of Training Survey: A set of questions asked, with the users' responses recorded, immediately following an Intel Teach to the Future training session. Data collected should provide feedback on the training context, content and process.
Impact Survey: A set of questions asked, with the users' responses recorded, no earlier than 6 months after an Intel Teach to the Future training session. Data collected should provide feedback on participant (Master) Teacher use and application of material in the classroom.
Additional Evaluation Efforts: Field observations such as interviews, focus groups, case studies or journals that provide qualitative data regarding opinions, experiences and/or application of Intel Teach to the Future pedagogy.
8 Suggested Themes for Exploration

Suggested themes for program implementation:
- Did teachers like the training?
- Do teachers ask more essential questions?
- Did teachers use unit plans?
- Do teachers use technology more?
- How many teachers were trained?
- What are the barriers to implementation?
- What makes teachers successful?

Suggested themes for teacher studies:
- Teachers' perceptions of technology
- Teachers' experience with technology
- Teachers' professional development using technology
- Teachers' involvement in innovative curriculum development/reform
- Teachers' perceptions of administrative support for technology use
- Teachers' perceptions of student application of knowledge using technology
- Teachers' perceptions of application to students' general life skills and attitudes
- Teachers' perceptions of application to subject skills
- Teachers' perceptions of "21st Century Thinking Skills"

Suggested themes for student studies:
- Students' experience with technology
- Students' attitudes toward technology
- Students' views of subjects taught using technology
- Students with special needs
- Gender issues
- "Disaffected" students
9 Elements of Compelling Evaluation

Compelling numbers (high or low, depending on context):
- Demonstrate progress toward government objectives or initiatives
- Show the success of the program with the participants
- EXAMPLE: Learner completion rates averaging 97% for an informal education program

Testimonials that evoke strong emotion in the audience:
- Display program success in terms of the individual, on a personal level
- Convey importance and impact to people's lives
- EXAMPLE: "Intel® Teach to the Future is amazing. It's changed my teaching practice. Now, I am utilizing technology in my curriculum and seeing a difference in my students' critical thinking skills." – PT 9/2005, Chiapas, Mexico

Change, improvement, and milestones:
- Reveal the chain reaction of change that results in education reform, economic growth, technology adoption, increases in technology literacy and 21st century skills, or improvement of public services
- Show areas for improvement and constructive feedback on how to improve the program

Local:
- Increases preference with potential program participants and decision makers
- Increases loyalty among existing program audiences
Global:
- The exponential impact of global preference and loyalty contributes to 10MM teachers trained
10 Elements of Compelling Evaluation Reporting

BAD EXAMPLE: "Question 5: Since your training, have you implemented some or all of the unit plans you developed in your Intel® Teach to the Future training? 44.21% of the teachers answered: Yes, more than once; % answered: Yes, once; 19.28% answered: Not yet, but I plan to use the lesson before the end of this school year; and 15.48% of the teachers answered: No, never."

GOOD EXAMPLE: "The majority of respondents reported positive changes in their teaching practices, such as making greater use of: essential questions to structure lessons, computer technology to present information to students and create handouts, and rubrics to evaluate students. Several MT and PT respondents claimed positive effects of the ITTF program on their students, such as greater concept understanding, development of higher-level thinking skills, increased motivation and involvement in class, and more students working together. In-depth evaluation validated positive effects of the program on the development of ICT skills of MTs and PTs. In cases where the MTs and PTs implemented their unit plans, the students demonstrated improved ICT skills, motivation, teamwork, class participation, and multiple intelligences in their outputs."
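The shift from the bad example to the good one is essentially an aggregation step: collapsing per-option percentages into a few reader-friendly categories before writing the narrative. A minimal sketch of that step, with illustrative round numbers and hypothetical category labels (none of these figures are from an actual survey):

```python
# Hypothetical sketch: grouping raw per-option survey percentages
# (the "bad example" style) into broader implementation categories
# that support a narrative summary (the "good example" style).

def summarize_implementation(responses):
    """Collapse answer-option percentages into three reporting categories."""
    implemented = (responses.get("Yes, more than once", 0.0)
                   + responses.get("Yes, once", 0.0))
    planned = responses.get("Not yet, but I plan to use the lesson", 0.0)
    never = responses.get("No, never", 0.0)
    return {
        "implemented at least once": round(implemented, 2),
        "plan to implement this year": round(planned, 2),
        "did not implement": round(never, 2),
    }

# Illustrative values only (not the slide's actual survey results).
q5 = {
    "Yes, more than once": 40.0,
    "Yes, once": 25.0,
    "Not yet, but I plan to use the lesson": 20.0,
    "No, never": 15.0,
}
summary = summarize_implementation(q5)
# With these illustrative inputs, 65% fall into "implemented at least once",
# which supports a sentence like "the majority of respondents implemented
# their unit plans at least once".
```

The point of the grouping is that a reader retains "two-thirds implemented their unit plans" far more readily than four separate decimal percentages.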
11 What Matters Most?

How to read results:
1. End of Training
- Look for teacher reactions
- Look for indications of teacher learning
2. Impact
- Look for organizational support
- Look for classroom implementation
Impact+
- Use qualitative methods
- Look for impact on the school ecosystem and policies
- Look for evidence of classroom interaction that directly influences learning

Discussion points:
1. The approach to the program
- Within the individual schools
- Across the schools involved
2. What are the key issues (such as lack of time, or the need for better resourcing)?
3. In what ways are they working to improve or overcome these difficulties?
4. What are the key takeaways?
- Pre-service: their willingness to embed the program within the curriculum is vital in bringing about sustained change
- In-service: the role of the principals within the institutions
5. What are the next steps?

* Handouts: "Evaluation Report Checklist" & "Making Evaluation Meaningful…"
12 Benchmark Key Objectives

Global benchmark objective: To identify Intel Teach Essentials End of Training and Impact evaluation benchmarks that will enable immediate measurement of local evaluation data when compared to established indicators.

End of Training benchmarks:
- Resulting from the analysis of existing longitudinal End of Training evaluation data.
- Benchmarks identified based on 3 questions that look at training effectiveness:
  Question 2. "To what extent do the following statements describe the Intel® Teach to the Future training in which you participated?" (Great and Moderate Extent)
- Benchmarks identified based on 3 questions that look at teachers' reported readiness to implement technology in their classrooms:
  Question 3. "Having completed your training, how well prepared do you feel to do the following activities with your students?" (Very Well and Moderately Prepared)
- A review of new program data for the first three quarters in which data was submitted indicates no significant deviation from the sustaining benchmarks.
- However, the data indicates that most countries receive relatively high scores initially, followed by a dip the next quarter, then an increase in scores and stabilization in subsequent quarters.
- In addition, overall scores are higher for the training description items than for the teacher preparedness items.
13 Impact Benchmark
- Resulting from the analysis of existing longitudinal Impact evaluation data.
- Benchmarks identified based on 4 questions that look at responses indicating the level of classroom implementation of key program components:
  Question 7. "Have you used technology with your students in new ways since you participated in the training?" (Yes)
  Question 14. "Since completing your Intel® Teach to the Future training, has there been a change in how frequently you do the following?" (Do listed activities a–f more; do listed activities g–k more)
  Question 5. "Since your training, have you implemented some or all of the unit plans you developed in your Intel® Teach to the Future training?" (Yes, more than once; Yes, once)
14 Global Benchmarks

End of Training benchmarks:
- 89% of teacher respondents indicate the training focused on integration of technology into their curriculum.
- 81% of teacher respondents indicate the training provided teaching strategies to apply with their students.
- 86% of teacher respondents indicate the training illustrated effective uses of technology with students.
- 80% of teacher respondents indicate they are prepared to implement teachings that emphasize independent work by students.
- 85% of teacher respondents indicate they are prepared to integrate educational technology into the grade or subject they teach.
- 82% of teacher respondents indicate they are prepared to support their students in using technology in their schoolwork.

Impact benchmarks:
- 75% of teacher respondents indicate increased use of technology activities with their students.
- 80% of teachers increased use of technology for lesson planning and prep.
- 60% of teachers increased use of project-based approaches in their teaching.
- 75% of teachers use the unit/lesson they developed in training back in their schools.
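Since the stated purpose of these benchmarks is "immediate measurement of local evaluation data when compared to established indicators", a country team's comparison step can be sketched as below. The benchmark figures are the Impact numbers from this slide; the local scores, item labels, and 5-point tolerance are hypothetical illustrations, not part of any actual reporting protocol:

```python
# Sketch of comparing local Impact survey results against the global
# Impact benchmarks listed above. Item labels, local scores, and the
# tolerance threshold are illustrative assumptions.

IMPACT_BENCHMARKS = {  # percentages from the Global Benchmarks slide
    "new technology activities with students": 75,
    "technology for lesson planning and prep": 80,
    "project-based approaches": 60,
    "unit/lesson implemented in school": 75,
}

def flag_below_benchmark(local_scores, benchmarks, tolerance=5):
    """Return {item: (local, benchmark)} for items whose local score
    falls more than `tolerance` percentage points below the benchmark."""
    return {
        item: (score, benchmarks[item])
        for item, score in local_scores.items()
        if item in benchmarks and benchmarks[item] - score > tolerance
    }

local = {  # hypothetical country-level results
    "new technology activities with students": 72,
    "technology for lesson planning and prep": 68,
    "project-based approaches": 64,
    "unit/lesson implemented in school": 77,
}
flags = flag_below_benchmark(local, IMPACT_BENCHMARKS)
# Only "technology for lesson planning and prep" (68 vs. the 80 benchmark)
# exceeds the 5-point tolerance, so it is the single flagged item.
```

Flagged items would then feed the "Issues" and "Plans to address issues" actions described earlier, rather than being reported as raw deviations.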
15 Success Criteria
- Deliver Essentials at quality levels comparable to established benchmarks.
- End of Training evaluation data indicates scores comparable to established benchmarks (to be reviewed individually at the country level).
- Impact evaluation data indicates scores comparable to established benchmarks.
- We no longer require country evaluation data to be submitted for a global roll-up and report; however, it is vital that countries continue evaluation efforts, complete reports and submit them in order to maintain visibility into the quality of our program.
- Key stakeholders accept and support Essentials data.
16 Marketing Considerations for Intel Teach Essentials Benchmarks

For the teacher audience:
- Ensure teachers understand Intel's involvement.
- Communicate course design and desired outcomes to the teachers.
- Have a means to track usage and results, which will help us tell the story with proof points/data.
- Establish a long-term relationship with teachers.
- Achieve a better understanding of teacher usage and results for impact stories and continuous improvement.

For the MOE audience:
- Consistent communication of Intel messaging throughout the program (pre, during and post).
- Establish a user-friendly, easy-to-navigate resource for communicating training and impact evaluation results: Evidence of Impact web pages (Evaluation Web Resources).
- Enable co-marketing opportunities with Ministries of Education.