1 CECV Intervention Framework Module 6 Evaluation
FACILITATOR NOTES This module comprises two sessions:
Session 1 - reflection on participants’ values, purposes, objects and methods pertaining to the overall concept of educational evaluation.
Session 2 - a focus on the use of Effect Size to calculate student progress (Part A) and the use/value of Self Reflection in evaluation processes (Part B).
Each session is likely to require 60 minutes; however, these times are only a guide. The size of the group, its make-up (one or multiple schools) and participants’ previous contact with the topic will determine the time required.

2 Purpose of this Module As a result of participating in this module, you will: Evaluate the effectiveness of the Intervention Framework in guiding your school through the process of: Identifying students with additional learning needs; Assessing students with additional learning needs; Analysing & interpreting the data collected; Designing & carrying out the teaching; and Evaluating & monitoring the student’s progress and the effectiveness of the teaching. 2 Slide 2 Introduce participants to the objectives of this module, and explain its ‘sessional’ structure, namely: Session 1 will enable an opportunity to reflect on values, purposes, objects and methods pertaining to the overall concept of educational evaluation Session 2 will have a focus on the use of Effect Size to calculate student progress (Part A) and will also concentrate on the use/value of Self Reflection in evaluation processes (Part B).

3 Foundations of The Framework
Slide 3 Briefly revisit the philosophical and foundational features of the Intervention Framework. 3

4 Core Principles 1. All students can succeed
2. Effective schools promote a culture of learning 3. Effective teachers are critical to student learning success 4. Teaching and learning is inclusive of all 5. Inclusive schools actively engage and work in partnership with the wider community 6. Fairness is not sameness 7. Effective teaching practices are evidence-based 4 Slide 4 Briefly recap on the Core Principles that underpin the Intervention Framework.

5 “…research seeks to prove, evaluation seeks to improve…”
M Q Patton (2003) 5 Slide 5 Display this quote but at this point do not engage participants in discussion.

6 Focus Questions How do you define evaluation? 1. Why do you evaluate?
2. What do you evaluate? 3. For whom do you evaluate? 4. How do you evaluate? Educational Evaluation Around the World, Danish Evaluation Institute, 2003 6 Slide 6 Invite participants to explore and discuss the overall question by using the four guiding questions to keep the discussion focused. Why do you evaluate? Here the focus is on values and purposes of evaluative activities What do you evaluate? Here the object/s of evaluations (objects being evaluated: the teaching & the student outcomes) are central. For whom do you evaluate? This also concerns values and control & parent / school / teacher relationships. How do you evaluate? This concerns the methods applied. Time to discuss these questions should be approximately 15 minutes (3 minutes per question). A scribe for each group is to record a succinct answer to each question, to be kept for future reference.

7 How do you define Evaluation?
Slide 7 This is a title slide only. Explain to participants that you and they will now be reviewing a number of statements about evaluation (Slides 8-13) in order to unpack and discuss its various components. 7

8 Defining Evaluation In implementing any change it is necessary to evaluate the effect. In considering implementation of the Intervention Framework it is necessary to evaluate the effect on individual student outcomes and more broadly on teacher practice, teacher knowledge, school policies and processes. 8 Slide 8 Discuss as required. Discussion suggestion How do we evaluate the effect on: individual outcomes? teacher practice? teacher knowledge? school policies & processes?

9 Defining Evaluation
Evaluation involves assessing the strengths and weaknesses of programs, policies, personnel, products, and organizations to improve their effectiveness. The American Evaluation Association
Slide 9 Discuss as required. Discussion suggestion: How do we assess the strengths & weaknesses…?

10 Defining Evaluation Evaluation is the systematic collection and analysis of data needed to make decisions, a process in which most well-run programs engage from the outset. The American Evaluation Association 10 Slide 10 Discuss as required. Discussion suggestions Is evaluation more than data collection? What constitutes data?

11 Defining Evaluation
Evaluation is about finding answers to questions such as, “are we doing the right thing?” and “are we doing things right?” The American Evaluation Association
Slide 11 Discuss as required. Discussion suggestions: How do we evaluate whether or not we are doing the right things and doing things right? What ‘data’ do we collect to prove/disprove? What constitutes ‘data’?

12 Defining Evaluation Rossi and Freeman (1993) define evaluation as "the systematic application of social research procedures for assessing the conceptualization, design, implementation, and utility of ... programs." 12 Slide 12 Discuss as required. Explain that there are many other similar definitions and explanations in the literature of what evaluation is. One view is that, although each definition, and in fact each evaluation, is slightly different, there are several different steps that are usually followed in any evaluation. It is these ‘steps’ which guide the evaluation process. An overview of the steps of a ‘typical’ evaluation follows.

13 Defining Evaluation
appraise, assess, critique, judge, justify, predict, prioritise, choose, monitor, select, rate, rank, prove, decide, conclude, argue
Source: Anderson & Krathwohl, 2001, as cited in Biggs & Tang, 2007
Slide 13 ACTIVITY. Invite participants to use these verbs to create a series of statements that define ‘evaluation’. Try to cover the different aspects, such as: 1. Why do you evaluate? 2. What do you evaluate? 3. For whom do you evaluate? 4. How do you evaluate? - individual outcomes - teacher practice - teacher knowledge - school policies & processes

14 Steps in Evaluation Step 1: Define what you hope to achieve
Step 2: Collect data (pre & post) Step 3: Analyse the data Step 4: Formulate conclusions Step 5: Modify the program 14 Slide 14 Source: Rossi and Freeman (1993) Discuss as required. Discussion suggestions How might these steps apply to evaluating: The Intervention Framework process? A PLP (student outcomes)? A single teaching outcome? Teacher practice? Teacher knowledge? School policies & processes?

15 Types of Evaluation
Process Evaluation: Process Evaluations describe and assess the actual program materials and activities.
Outcome Evaluation: Outcome Evaluations study the immediate or direct effects of the program on participants.
Impact Evaluation: Impact Evaluations look beyond the immediate results of policies, instruction, or services to identify longer-term as well as unintended program effects.
Slide 15 Discuss with participants: For the purpose of evaluating the Intervention Framework processes, which types of evaluation will we focus on: Process Evaluation? Outcome Evaluation? Impact Evaluation? Or all three?

16 Process Evaluation to Inform School Improvement
Phases of the process of improvement (R Bollen, 1997) alongside the stages of the Intervention Framework:
0. Preparation phase / 1. Identification
Diagnostic phase / 2. Assessment
Strategic planning phase / 3. Analysis and Interpretation
Developmental phase / 4. Teaching and Learning
Evaluation phase / 5. Evaluation
Slide 16 The School Improvement Process is an example of the CEOM using Process Evaluation. This slide shows the parallels. Both follow the same steps. Discussion suggestion: Discuss the parallels between the School Improvement Process and the Intervention Framework.

17 Outcome Evaluation The ultimate goal of the Intervention Framework process is to improve student outcomes. How do you know whether it did? One commonly used way to find out whether the process (i.e. the T&L cycle) improved student outcomes is to ask whether the process caused the expected outcome. If the process caused the outcome, one could argue that the process improved student outcomes. If it did not, one would argue that the process did not improve student outcomes. 17 Slide 17 Discuss as required; however, given the complexity of these statements, move to Slide 18 for further elaboration before sustained discussion.

18 Outcome Evaluation How to figure this out
Determining whether a process caused the outcome is one of the most difficult problems in evaluation, and not everyone agrees on how to do it. The approach you take depends on how the evaluation will be used, who it is for, what the evaluation users will accept as credible evidence of causality, what resources are available for the evaluation, and how important it is to establish causality with considerable confidence. Michael Quinn Patton One way could be to evaluate the teaching programs implemented. 18 Slide 18 Discuss as required, in particular the final statement, ‘One way could be to evaluate the teaching programs implemented’. What are other ways?

19 Impact Evaluation Impact Evaluations look beyond the immediate results of policies, instruction, or services to identify longer-term as well as unintended program effects. 19 Slide 19 This type of Evaluation is to be noted only - no discussion required.

20 1. Why Evaluate?
Slide 20 This is for display purposes only, to introduce perspectives on the ‘why’ of evaluation covered in Slides 21-22, where the focus is on values and purposes of evaluative activities.
Note: Here are just some of the evaluation activities that are already likely to be incorporated into many programs or that can be added easily:
Pinpointing the services needed: for example, finding out what knowledge, skills, attitudes, or behaviours a program should address for that student/group.
Establishing program objectives and deciding the particular evidence (such as the specific knowledge, attitudes, or behaviour) that will demonstrate that the objectives have been met. A key to successful evaluation is a set of clear, measurable, and realistic program objectives. If objectives are unrealistically optimistic or are not measurable, the program may not be able to demonstrate that it has been successful even if it has done a good job (i.e. student achievement).
Developing or selecting from among alternative program approaches: for example, trying different curricula or policies and determining which ones best achieve the goals for that student/group.
Tracking program objectives: for example, setting up a system that shows who gets services, how much service is delivered, how teachers rate the program, and which approaches are most readily adopted by staff.
Trying out and assessing new program designs: determining the extent to which a particular approach is being implemented faithfully by the school or individual teachers, or the extent to which it achieves the goals for that student/group.
Taken from Dept Ed, US

21 Why Evaluate? It is important to evaluate programs/the teaching for many reasons: to ensure that the program is not creating any unintended harm; to determine if the program is making a positive contribution (improved student outcomes); and to improve and learn (i.e. to learn what were the positive elements, how it can be replicated, how challenges can be overcome in the future and how to make the process sustainable). 21 Slide 21 Discuss these and other reasons to conduct evaluations, inviting participants to add to these statements. You may choose to cite the following summary statement: In other words, evaluations help to foster accountability, determine whether programs "make a difference," and give staff the information they need to improve service delivery. (Taken from Dept Ed, US)

22 Why Evaluate? The four main reasons evaluation is conducted:
accountability; learning; program management and development; ethical obligation. Green and South, 2006. 22 Slide 22 Explain to participants that there is a high level of agreement across countries about these reasons (as reported in Educational Evaluation Around the World, Danish Evaluation Institute, 2003). However, New Zealand frames its goals differently, giving priority to the protection of the interests of learners, rather than the schools or education institutions themselves: ‘The purposes of evaluation in New Zealand, for both quality assurance and quality development, are: to protect the interests of learners; to ensure learners have access to opportunities for life-long learning; to ensure learning goals are meaningful and credible; to assure learners that courses and programmes are well taught; to ensure qualifications are obtained in safe environments using appropriate teaching and assessment systems; to contribute to the enhancement of quality systems and processes that improve the quality of research, teaching, learning and community service.’ Though there is consensus that evaluation is done primarily to safeguard and stimulate (high) quality education and improvement, the focus in most countries is on the educational system and its organisations, whereas New Zealand gives priority to protecting the interests of learners rather than the schools / institutions themselves. Discuss.

23 2. What do you Evaluate? Slide 23
In this section of the session, try to direct the discussion to cover the object/s of evaluations (objects being evaluated: the teaching & the student outcomes). Discuss as required. 23

24 3. For whom do you Evaluate?
Slide 24 In this section of the session, try to direct the discussion to cover concerns about values and control, and parent / school / teacher relationships. Discuss as required. 24

25 4. How do you Evaluate? Slide 25
In this section of the session, try to direct the discussion to cover the methods applied. Discuss as required. 25

26 What & How? How does your school evaluate its current programs?
How would you evaluate whether the child/children progressed as a result of participation in this intervention process? 26 Slide 26 Encourage participants to use these questions to guide their future planning and their use of the Intervention Framework.

27 What & How? Is the student progressing satisfactorily against the set goals? How will you monitor and interpret the student’s progress against the set goals? How will you evaluate the effectiveness of the program/approach? 27 Slide 27 Encourage participants to use these questions to guide their future planning and their use of the Intervention Framework.

28 What & How? Making the results useful (student outcomes)
How will you use the results to inform future program development for students? How will the results be reported so that they can be used by the school to make improvements? 28 Slide 28 Encourage participants to use these questions to guide their future planning and their use of the Intervention Framework.

29 “…evaluation seeks to improve…”
Slide 29 Use this slide to make any final summative comments. Activity. Ask participants to write some statements about evaluation using a think / pair / share dynamic. Display statements and provide each participating school with the set of statements to take back to their school. 29

30 Next Session - 2 30 Slide 30 Use this slide to briefly foreshadow what Session 2 will comprise, namely a focus on the use of Effect Size to calculate student progress (Part A) and the use/value of Self Reflection in evaluation processes (Part B).

31 Evaluation Effect Sizes
Slide 31 Explain that the first part of this session will assist participants to use Effect Sizes to evaluate / calculate the student’s progress and thereby evaluate the effectiveness of the teaching. 31

32 “evaluation seeks to improve” Slide 32
Briefly display this slide (introduced in the first session) then move on with the presentation. 32

33 Effect Sizes 1. What is an effect size? 2. Why use effect sizes?
3. How can schools use effect sizes to evaluate the effectiveness of the intervention? 33 Slide 33 Source: Visible Learning: A synthesis of over 800 meta-analyses relating to achievement, John Hattie 2009 Explain that effect sizes are presented as one method of evaluating the effect of the teaching (using a quantitative method).

34 Effect Sizes (d) 1a. What is an effect size?
An effect size provides a common expression of the magnitude of study outcomes, across variables, such as improving reading levels in accuracy and comprehension. An effect size of 1.0 indicates an increase of one standard deviation (1SD) on the outcome. One SD increase is typically associated with advancing students’ reading levels by two to three years, improving the rate of learning by 50%. 34 Slide 34 Work through the definition/explanation with participants.

35 Effect Sizes (d) 1b. What is a reasonable effect size?
Cohen (1988) suggests that: d = 0.2 is small, d = 0.5 is medium, d = 0.8 is large Whereas the results from Hattie’s meta-analyses could suggest when judging educational outcomes: d = 0.2 is small, d = 0.4 is medium, d = 0.6 is large Reference: Cohen, J. (1988). Statistical power analysis for the behavioural sciences (2nd ed.). Hillsdale, NJ: L. Erlbaum Assoc. 35 Slide 35 Present the differing ranges of effect sizes and invite comment, questions as required.
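To make the two sets of benchmarks easier to compare side by side, here is a minimal Python sketch. It is illustrative only and not part of the module materials: the function name describe_effect_size and the verbal labels are invented; the cut-off values are the ones quoted on this slide.

```python
# Illustrative sketch: label an effect size d against either Cohen's (1988)
# benchmarks or the hinge points suggested by Hattie's meta-analyses.
# The function name and labels are hypothetical, not from the module.
def describe_effect_size(d: float, benchmarks: str = "hattie") -> str:
    """Return a rough verbal label for an effect size d."""
    cutoffs = {
        "cohen":  [(0.8, "large"), (0.5, "medium"), (0.2, "small")],
        "hattie": [(0.6, "large"), (0.4, "medium"), (0.2, "small")],
    }[benchmarks]
    for threshold, label in cutoffs:
        if d >= threshold:
            return label
    return "below small"


print(describe_effect_size(0.45, "hattie"))  # 'medium' (at or above the 0.4 hinge point)
print(describe_effect_size(0.45, "cohen"))   # 'small' (below Cohen's 0.5 cut-off)
```

The same d of 0.45 is labelled differently under the two schemes, which is the point of the comparison: the benchmark chosen shapes how a result is judged.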

36 John Hattie - Visible Learning
What is John Hattie on about, in a nutshell? 15 years of research 800+ meta-analyses 50,000 studies 200+ million students Outcome: What are the major influences on student learning? 36 Slide 36 Briefly introduce the work of John Hattie, whose research is informing our recent focus on the use of effect sizes for evaluating the effect on student learning.

37 Hattie’s Effect Sizes (d)
Slide 37 Present and explain the graph depicting the factors John Hattie explored that affect student learning. 37

38 Effect Sizes (d): The Formula
Effect size (d) = (Average post-test score - Average pre-test score) / Average standard deviation (the spread)
Slide 38 Present and explain the formula used.
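As a concrete illustration of the formula on Slide 38, the sketch below computes d for a single group assessed before and after teaching. It is a minimal, hypothetical example: the function name, the use of Python's statistics module and the eight invented scores are assumptions for demonstration, not module material; schools would substitute their own assessment data.

```python
# Minimal sketch of the slide's formula:
# d = (average post-test score - average pre-test score) / average standard deviation.
from statistics import mean, stdev


def effect_size(pre_scores, post_scores):
    """Effect size d using the average of the pre- and post-test standard deviations."""
    average_sd = (stdev(pre_scores) + stdev(post_scores)) / 2  # the 'spread'
    return (mean(post_scores) - mean(pre_scores)) / average_sd


# Hypothetical reading scores for the same eight students, before and after the teaching cycle.
pre = [42, 45, 51, 48, 39, 44, 50, 47]
post = [44, 46, 53, 50, 41, 46, 52, 48]

print(round(effect_size(pre, post), 2))  # about 0.43, just above Hattie's 0.40 hinge point
```

With real class data the same calculation can be repeated per student group or per assessment, supporting the comparisons listed on the next slide.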

39 Effect Sizes (d) 2. Why use effect sizes?
• To compare progress over time on the same test. • To compare results measured on different tests. • To compare different groups doing the same test. 39 Slide 39 Explain/discuss these different uses of effect sizes.

40 Effect Sizes (d) 3. How can schools use effect sizes?
Discussion in school groups 40 Slide 40 Invite participants to discuss this in school groups, then take feedback from the whole group. Review real data to demonstrate the use of effect sizes.

41 Evaluation Self Reflection
Slide 41 41

42 What is Self Reflection? Slide 42
Use this slide to introduce the final focus of this session. 42

43 "Test all things; hold fast what is good"
Self Reflection "Test all things; hold fast what is good" I Thessalonians 5:21 43 Slide 43 Use this slide to offer the following reflection: I Thessalonians 5:21 instructs us to ‘test all things’, which would include our old notions, and then "hold fast" to the good ones—the ones that pass the test. A mistake many make is to follow tenaciously the instruction of Revelation 3:11 to ‘hold fast to what we have’ while completely ignoring the additional instructions of I Thessalonians 5:21 to ‘test first’.

44 What is Self Reflection?
Self Reflection is simply a form of self evaluation undertaken by participants in social situations in order to improve the rationality and justice of their own practices, their understanding of these practices, and the situations in which the practices are carried out. Adapted from Carr and Kemmis, 1986 44 Slide 44 Introduce this description and discuss as required.

45 Self Reflection in Schools
Self reflection is a process in which teachers examine their own educational practice systematically and carefully, using the techniques of action research. 45 Slide 45 Invite participants to offer comment on the following question: In your experience, to what extent do teachers (your colleagues) self reflect?

46 Self Reflection Leads to Improvement?
Slide 46 Discuss the concept that action must follow reflection if there are to be any improvements / changes … or the water will continue to rise! 46

47 Follow effective action with quiet reflection.
Out of the quiet reflection will come even more effective action. Peter F. Drucker 47 Slide 47 Use for display only and participants’ reflection.

48 Teachers account for about 30% of the variance in student achievement
“It is what teachers know, do, and care about, which is very powerful in this learning equation, not just what they know” (p. 2). They must put this knowledge into practice if they are to produce gains in student learning outcomes. Hattie (2003) 48 Slide 48 Use for display only and participants’ reflection.

49 Where to from here? Slides 49-52
Work towards concluding the session by engaging participants in discussion, in school groups, of the questions raised in the following slides. Ask groups to report back to the whole group any main points (only if different to points already raised). 49

50 Where to from here? Each school to reflect on:
Existing evaluation processes for: intervention programs currently in use in the school; teacher performance; student performance. 50 Slide 50

51 In your Context Has this process highlighted the need to review your school’s policies and/or processes? 51 Slide 51

52 In your Context Has this process made a difference to your students’ performances? Evaluation of change is fundamental to the process. 52 Slide 52 If time permits, invite participants to add to the previous group statements about evaluation pertaining to effect sizes and self reflection (refer session 1, slide 29 activity). Display statements and provide each school with the set of statements to take back to their school.

53 “Where ignorance is bliss, ’tis folly to be wise”
Thomas Gray, from ‘Ode on a Distant Prospect of Eton College’. 53 Slide 53 For display and reflection only.

54 “self reflection seeks to improve” Slide 54
For display and reflection only.

55 References
Educational Evaluation Around the World: An International Anthology, Danish Evaluation Institute, 2003
An Education Primer: An overview of education evaluation
Evaluation definition
Evaluation Toolkit
Introduction to Program Evaluation
What is Program Evaluation? A beginner’s guide
Patton, M. Q. (2002). Qualitative research & evaluation methods (3rd ed.). Thousand Oaks, CA: Sage Publications.
Slide 55 Recommend these resources to participants and encourage further professional reading. Conclude the session.
NOTE: Following the completion of all six modules (Term 3, 2012): Teaching staff will re-do the ‘Essential Components Rubric’; LSOs will re-do the LSO questionnaire and complete the LSO self-reflection; case studies will be submitted.

