CECV Intervention Framework Module 6 Evaluation

Presentation transcript:

CECV Intervention Framework Module 6 Evaluation FACILITATOR NOTES This module comprises two sessions: Session 1 - reflection on participants’ values, purposes, objects and methods pertaining to the overall concept of educational evaluation; Session 2 - a focus on the use of Effect Size to calculate student progress (Part A) and the use/value of Self Reflection in evaluation processes (Part B). Each session is likely to require 60 minutes; however, these times are only a guide. The size of the group, the make-up of the group (one or multiple schools) and the group’s previous contact with the topic will determine the time required.

Purpose of this Module As a result of participating in this module, you will: Evaluate the effectiveness of the Intervention Framework in guiding your school through the process of: Identifying students with additional learning needs; Assessing students with additional learning needs; Analysing & interpreting the data collected; Designing & carrying out the teaching; and Evaluating & monitoring the student’s progress and the effectiveness of the teaching. 2 Slide 2 Introduce participants to the objectives of this module, and explain its ‘sessional’ structure, namely: Session 1 will provide an opportunity to reflect on values, purposes, objects and methods pertaining to the overall concept of educational evaluation Session 2 will have a focus on the use of Effect Size to calculate student progress (Part A) and will also concentrate on the use/value of Self Reflection in evaluation processes (Part B).

Foundations of The Framework Slide 3 Briefly revisit the philosophical and foundational features of the Intervention Framework. 3

Core Principles 1. All students can succeed 2. Effective schools promote a culture of learning 3. Effective teachers are critical to student learning success 4. Teaching and learning is inclusive of all 5. Inclusive schools actively engage and work in partnership with the wider community 6. Fairness is not sameness 7. Effective teaching practices are evidence-based 4 Slide 4 Briefly recap on the Core Principles that underpin the Intervention Framework.

“…research seeks to prove, evaluation seeks to improve…” M Q Patton (2003) 5 Slide 5 Display this quote but at this point do not engage participants in discussion.

Focus Questions How do you define evaluation? 1. Why do you evaluate? 2. What do you evaluate? 3. For whom do you evaluate? 4. How do you evaluate? Educational Evaluation Around the World, Danish Evaluation Institute, 2003 6 Slide 6 Invite participants to explore and discuss the overall question by using the four guiding questions to keep the discussion focused. Why do you evaluate? Here the focus is on values and purposes of evaluative activities What do you evaluate? Here the object/s of evaluations (objects being evaluated: the teaching & the student outcomes) are central. For whom do you evaluate? This also concerns values and control & parent / school / teacher relationships. How do you evaluate? This concerns the methods applied. Time to discuss these questions should be approximately 15 minutes (3 minutes per question). A scribe for each group is to record a succinct answer to each question, to be kept for future reference.

How do you define Evaluation? Slide 7 This is a title slide only. Explain to participants that you and they will now be reviewing a number of statements about evaluation (Slides 8-13) in order to unpack and discuss its various components. 7

Defining Evaluation In implementing any change it is necessary to evaluate the effect. In considering implementation of the Intervention Framework it is necessary to evaluate the effect on individual student outcomes and more broadly on teacher practice, teacher knowledge, school policies and processes. 8 Slide 8 Discuss as required. Discussion suggestion How do we evaluate the effect on: individual outcomes? teacher practice? teacher knowledge? school policies & processes?

Defining Evaluation Evaluation involves assessing the strengths and weaknesses of programs, policies, personnel, products, and organizations to improve their effectiveness. The American Evaluation Association 9 Slide 9 Discuss as required. Discussion suggestion How do we assess the strengths & weaknesses…?

Defining Evaluation Evaluation is the systematic collection and analysis of data needed to make decisions, a process in which most well-run programs engage from the outset. The American Evaluation Association 10 Slide 10 Discuss as required. Discussion suggestions Is evaluation more than data collection? What constitutes data?

Defining Evaluation Evaluation is about finding answers to questions such as, “are we doing the right thing” and “are we doing things right?” The American Evaluation Association 11 Slide 11 Discuss as required. Discussion suggestions How? How do we evaluate whether or not we are doing the right things and doing things right? What ‘data’ do we collect to prove/disprove? What constitutes ‘data’?

Defining Evaluation Rossi and Freeman (1993) define evaluation as "the systematic application of social research procedures for assessing the conceptualization, design, implementation, and utility of ... programs." 12 Slide 12 Discuss as required. Explain that there are many other similar definitions and explanations in the literature of what evaluation is. One view is that, although each definition, and in fact each evaluation, is slightly different, there are several different steps that are usually followed in any evaluation. It is these ‘steps’ which guide the evaluation process. An overview of the steps of a ‘typical’ evaluation follows.

Defining Evaluation appraise, assess, critique, judge, justify, predict, prioritise, choose, monitor, select, rate, rank, prove, decide, conclude, argue. Source: Anderson & Krathwohl, 2001, as cited in Biggs & Tang, 2007. Slide 13 ACTIVITY. Invite participants to use these verbs to create a series of statements that define ‘evaluation’. Try to cover the different aspects, such as: 1. Why do you evaluate? 2. What do you evaluate? 3. For whom do you evaluate? 4. How do you evaluate? - individual outcomes - teacher practice - teacher knowledge - school policies & processes 13

Steps in Evaluation Step 1: Define what you hope to achieve Step 2: Collect data (pre & post) Step 3: Analyse the data Step 4: Formulate conclusions Step 5: Modify the program 14 Slide 14 Source: Rossi and Freeman (1993) Discuss as required. Discussion suggestions How might these steps apply to evaluating: The Intervention Framework process? A PLP (student outcomes)? A single teaching outcome? Teacher practice? Teacher knowledge? School policies & processes?

Types of Evaluation Process Evaluation: Process Evaluations describe and assess the actual program materials and activities. Outcome Evaluation: Outcome Evaluations study the immediate or direct effects of the program on participants. Impact Evaluation: Impact Evaluations look beyond the immediate results of policies, instruction, or services to identify longer-term as well as unintended program effects. 15 Slide 15 Discuss with participants: For the purpose of evaluating the Intervention Framework processes, which types of evaluation will we focus on: Process Evaluation? Outcome Evaluation? Impact Evaluation? or all three?

Process Evaluation to Inform School Improvement Phases of the process of improvement mapped to the Intervention Framework: 0. Preparation / 1. Identification; Diagnostic phase / 2. Assessment; Strategic planning phase / 3. Analysis and Interpretation; Developmental phase / 4. Teaching and Learning; Evaluation phase / 5. Evaluation. R Bollen 1997 16 Slide 16 The School Improvement Process is an example of the CEOM using Process Evaluation. This slide shows the parallels. Both follow the same steps. Discussion suggestion Discuss the parallels between the School Improvement Process and the Intervention Framework.

Outcome Evaluation The ultimate goal of the Intervention Framework process is to improve student outcomes. How do you know whether it did? One commonly used way to find out whether the process (i.e. the T&L cycle) improved student outcomes is to ask whether the process caused the expected outcome. If the process caused the outcome, then one could argue that the process improved student outcomes. On the other hand, if the process did not cause the outcome, then one could not argue that the process improved student outcomes. 17 Slide 17 Discuss as required; however, given the complexity of these statements, move to Slide 18 for further elaboration before sustained discussion.

Outcome Evaluation How to figure this out Determining whether a process caused the outcome is one of the most difficult problems in evaluation, and not everyone agrees on how to do it. The approach you take depends on how the evaluation will be used, who it is for, what the evaluation users will accept as credible evidence of causality, what resources are available for the evaluation, and how important it is to establish causality with considerable confidence. Michael Quinn Patton One way could be to evaluate the teaching programs implemented. 18 Slide 18 Discuss as required, in particular the final statement, ‘One way could be to evaluate the teaching programs implemented’. What are other ways?

Impact Evaluation Impact Evaluations look beyond the immediate results of policies, instruction, or services to identify longer-term as well as unintended program effects. 19 Slide 19 This type of Evaluation is to be noted only - no discussion required.

1. Why Evaluate? Slide 20 This is for display purposes only, to introduce perspectives on the ‘why’ of evaluation covered in Slides 21-22, where the focus is on values and purposes of evaluative activities. Note: Here are just some of the evaluation activities that are already likely to be incorporated into many programs or that can be added easily: Pinpointing the services needed, for example finding out what knowledge, skills, attitudes, or behaviours a program should address for that student/group. Establishing program objectives and deciding the particular evidence (such as the specific knowledge, attitudes, or behaviour) that will demonstrate that the objectives have been met; a key to successful evaluation is a set of clear, measurable, and realistic program objectives (i.e. student achievement), and if objectives are unrealistically optimistic or are not measurable, the program may not be able to demonstrate that it has been successful even if it has done a good job. Developing or selecting from among alternative program approaches, for example trying different curricula or policies and determining which ones best achieve the goals for that student/group. Tracking program objectives, for example setting up a system that shows who gets services, how much service is delivered, how teachers rate the program, and which approaches are most readily adopted by staff. Trying out and assessing new program designs, determining the extent to which a particular approach is being implemented faithfully by schools or individual teachers or the extent to which it achieves the goals for that student/group. Taken from Dept Ed, US 20

Why Evaluate? It is important to evaluate programs/the teaching for many reasons: to ensure that the program is not creating any unintended harm; to determine if the program is making a positive contribution (improved student outcomes); and to improve and learn (i.e. to learn what the positive elements were, how they can be replicated, how challenges can be overcome in the future and how to make the process sustainable). 21 Slide 21 Discuss these and other reasons to conduct evaluations, inviting participants to add to these statements. You may choose to cite the following summary statement: In other words, evaluations help to foster accountability, determine whether programs "make a difference," and give staff the information they need to improve service delivery. (Taken from Dept Ed, US)

Why Evaluate? The four main reasons evaluation is conducted: accountability; learning; program management and development; ethical obligation. Green and South, 2006. 22 Slide 22 Explain to participants that there is a high level of agreement across countries about these reasons (as reported in Educational Evaluation Around the World, Danish Evaluation Institute, 2003). However, New Zealand frames its goals differently, giving priority to the protection of the interests of learners, and not the schools or education institutions themselves: ‘The purposes of evaluation in New Zealand, for both quality assurance and quality development, are: to protect the interests of learners; to ensure learners have access to opportunities for life-long learning; to ensure learning goals are meaningful and credible; to assure learners that courses and programmes are well taught; to ensure qualifications are obtained in safe environments using appropriate teaching and assessment systems; to contribute to the enhancement of quality systems and processes that improve the quality of research, teaching, learning and community service.’ Though there is consensus that evaluation is done primarily to safeguard and stimulate (high) quality education and improvement, the focus in most of the countries is on the actual educational system and its organisations, whereas in New Zealand the protection of the interests of learners is given priority, and not the schools / institutions themselves. Discuss.

2. What do you Evaluate? Slide 23 In this section of the session, try to direct the discussion to cover the object/s of evaluations (objects being evaluated: the teaching & the student outcomes). Discuss as required. 23

3. For whom do you Evaluate? Slide 24 In this section of the session, try to direct the discussion to cover values and control, and parent / school / teacher relationships. Discuss as required. 24

4. How do you Evaluate? Slide 25 In this section of the session, try to direct the discussion to cover the methods applied. Discuss as required. 25

What & How? How does your school evaluate its current programs? How would you evaluate whether the child/children progressed as a result of participation in this intervention process? 26 Slide 26 Encourage participants to use these questions to guide their future planning and their use of the Intervention Framework.

What & How? Is the student progressing satisfactorily against the set goals? How will you monitor and interpret the student’s progress against the set goals? How will you evaluate the effectiveness of the program/approach? 27 Slide 27 Encourage participants to use these questions to guide their future planning and their use of the Intervention Framework.

What & How? Making the results useful (student outcomes) How will you use the results to inform future program development for students? How will the results be reported so that they can be used by the school to make improvements? 28 Slide 28 Encourage participants to use these questions to guide their future planning and their use of the Intervention Framework.

“…evaluation seeks to improve…” Slide 29 Use this slide to make any final summative comments. Activity. Ask participants to write some statements about evaluation using a think / pair / share dynamic. Display statements and provide each participating school with the set of statements to take back to their school. 29

Next Session - 2 30 Slide 30 Use this slide to briefly foreshadow what Session 2 will comprise, namely a focus on the use of Effect Size to calculate student progress (Part A) and the use/value of Self Reflection in evaluation processes (Part B).

Evaluation Effect Sizes Slide 31 Explain that the first part of this session will assist participants to use Effect Sizes to evaluate / calculate the student’s progress and thereby evaluate the effectiveness of the teaching. 31

“evaluation seeks to improve” Slide 32 Briefly display this slide (introduced in the first session) then move on with the presentation. 32

Effect Sizes 1. What is an effect size? 2. Why use effect sizes? 3. How can schools use effect sizes to evaluate the effectiveness of the intervention? 33 Slide 33 Source: Visible Learning: A synthesis of over 800 meta-analyses relating to achievement, John Hattie 2009 Explain that effect sizes are presented as one method of evaluating the effect of the teaching (using a quantitative method).

Effect Sizes (d) 1a. What is an effect size? An effect size provides a common expression of the magnitude of study outcomes across variables, such as improvements in reading accuracy and comprehension. An effect size of 1.0 indicates an increase of one standard deviation (1SD) on the outcome. A one SD increase is typically associated with advancing students’ reading levels by two to three years, or improving the rate of learning by 50%. 34 Slide 34 Work through the definition/explanation with participants.

Effect Sizes (d) 1b. What is a reasonable effect size? Cohen (1988) suggests that: d = 0.2 is small, d = 0.5 is medium, d = 0.8 is large. The results from Hattie’s meta-analyses, however, could suggest that when judging educational outcomes: d = 0.2 is small, d = 0.4 is medium, d = 0.6 is large. Reference: Cohen, J. (1988). Statistical power analysis for the behavioural sciences (2nd ed.). Hillsdale, NJ: L. Erlbaum Assoc. 35 Slide 35 Present the differing ranges of effect sizes and invite comment and questions as required.
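
To make the contrast between the two benchmark sets concrete for facilitators, here is a minimal sketch, assuming Python and using the thresholds exactly as quoted above; the helper name describe() and the example value 0.45 are hypothetical, chosen only to show the difference.

    # Illustrative only: thresholds copied from the slide (Cohen 1988; Hattie 2009).
    COHEN = {"small": 0.2, "medium": 0.5, "large": 0.8}
    HATTIE = {"small": 0.2, "medium": 0.4, "large": 0.6}

    def describe(d, benchmarks):
        """Return the label of the largest benchmark that d meets or exceeds."""
        label = "below small"
        for name, threshold in benchmarks.items():  # insertion order: small -> large
            if d >= threshold:
                label = name
        return label

    print(describe(0.45, COHEN))   # "small"  - 0.45 falls short of Cohen's 0.5 medium
    print(describe(0.45, HATTIE))  # "medium" - 0.45 clears Hattie's 0.4 medium

The same observed growth can therefore read as modest against Cohen’s general benchmarks yet educationally meaningful against Hattie’s, which is the point of presenting the two ranges side by side.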

John Hattie - Visible Learning What is John Hattie on about, in a nutshell? 15 years of research 800+ meta-analyses 50,000 studies 200+ million students Outcome: What are the major influences on student learning? 36 Slide 36 Briefly introduce the work of John Hattie, whose research is informing our recent focus on the use of effect sizes for evaluating the effect on student learning.  

Hattie’s Effect Sizes (d) Slide 37 Present and explain the graph depicting the factors John Hattie explored that affect student learning. 37

Effect Sizes (d) The Formula: Effect size (d) = (Average (post) - Average (pre)) / Average standard deviation (the spread) 38 Slide 38 Present and explain the formula used.
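
A minimal worked sketch of this formula in Python follows, assuming the “average standard deviation” is taken as the mean of the pre- and post-test standard deviations (some schools use a pooled SD instead); the file name effect_size.py, the function name and the scores are all hypothetical.

    # effect_size.py - illustrative sketch only.
    # Assumes "average standard deviation" = mean of the pre- and post-test SDs.
    from statistics import mean, stdev

    def effect_size(pre_scores, post_scores):
        """d = (average post - average pre) / average standard deviation."""
        avg_sd = (stdev(pre_scores) + stdev(post_scores)) / 2  # the "spread"
        return (mean(post_scores) - mean(pre_scores)) / avg_sd

    # Hypothetical reading scores for one class, before and after the intervention.
    pre = [35, 42, 48, 39, 51, 44, 37, 46]
    post = [40, 45, 52, 41, 55, 49, 42, 50]
    print(round(effect_size(pre, post), 2))  # about 0.72 for these made-up scores

Against the benchmarks on the previous slide, a value of about 0.72 for these made-up scores would count as a large effect on Hattie’s scale and a medium one on Cohen’s.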

Effect Sizes (d) 2. Why use effect sizes? • To compare progress over time on the same test. • To compare results measured on different tests. • To compare different groups doing the same test. 39 Slide 39 Explain/discuss these different uses of effect sizes.
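
As one concrete illustration of the third use (comparing different groups doing the same test), the hypothetical effect_size() helper sketched after the formula slide could be applied to each group separately; the class names and scores below are invented.

    # Illustrative comparison of two classes on the same pre/post test.
    from effect_size import effect_size  # the hypothetical helper sketched earlier

    class_a_pre, class_a_post = [38, 44, 41, 47, 35, 43], [44, 49, 45, 53, 40, 48]
    class_b_pre, class_b_post = [39, 45, 42, 46, 36, 44], [41, 46, 44, 48, 38, 45]

    print("Class A:", round(effect_size(class_a_pre, class_a_post), 2))
    print("Class B:", round(effect_size(class_b_pre, class_b_post), 2))

Because both results are expressed in standard-deviation units rather than raw marks, the two classes’ progress can be compared directly, which is what makes effect sizes useful across tests and groups.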

Effect Sizes (d) 3. How can schools use effect sizes? Discussion in school groups 40 Slide 40 Invite participants to discuss this in school groups, then take feedback from the whole group. Review real data to demonstrate the use of effect sizes.

Evaluation Self Reflection Slide 41 41

What is Self Reflection? Slide 42 Use this slide to introduce the final focus of this session. 42

"Test all things; hold fast what is good" Self Reflection "Test all things; hold fast what is good" I Thessalonians 5:21 43 Slide 43 Use this slide to offer the following reflection: I Thessalonians 5:21 instructs us to ‘test all things’, which would include our old notions, and then "hold fast" to the good ones—the ones that pass the test. A mistake many make is to follow tenaciously the instruction of Revelation 3:11 to ‘hold fast to what we have’ while completely ignoring the additional instructions of I Thessalonians 5:21 to ‘test first’.

What is Self Reflection? Self Reflection is simply a form of self evaluation undertaken by participants in social situations in order to improve the rationality and justice of their own practices, their understanding of these practices, and the situations in which the practices are carried out. Adapted from Carr and Kemmis, 1986 44 Slide 44 Introduce this description and discuss as required.

Self Reflection in Schools Self reflection is a process in which teachers examine their own educational practice systematically and carefully, using the techniques of action research. 45 Slide 45 Invite participants to offer comment on the following question: In your experience, to what extent do teachers (your colleagues) self reflect?

Self Reflection Leads to Improvement? Slide 46 Discuss the concept that action must follow reflection if there are to be any improvements / changes … or the water will continue to rise! 46

Follow effective action with quiet reflection. Out of the quiet reflection will come even more effective action. Peter F. Drucker 47 Slide 47 Use for display only and participants’ reflection.

Teachers account for about 30% of the variance in student achievement “It is what teachers know, do, and care about, which is very powerful in this learning equation, not just what they know” (p. 2). They must put this knowledge into practice if they are to produce gains in student learning outcomes. Hattie (2003) 48 Slide 48 Use for display only and participants’ reflection.

Where to from here? Slides 49-52 Work towards concluding the session by engaging participants in discussion, in school groups, of the questions raised in Slides 50-52. Ask groups to report back to the whole group any main points (only if different to points already raised). 49

Where to from here? Each school to reflect on existing evaluation processes of: existing intervention programs currently in use in the school; teacher performance; student performance. 50 Slide 50

In your Context Has this process highlighted the need to review your school’s policies and/or processes? 51 Slide 51

In your Context Has this process made a difference to your students’ performances? Evaluation of change is fundamental to the process. 52 Slide 52 If time permits, invite participants to add to the previous group statements about evaluation pertaining to effect sizes and self reflection (refer session 1, slide 29 activity). Display statements and provide each school with the set of statements to take back to their school.  

“Where ignorance is bliss, ’tis folly to be wise” Thomas Gray, from ‘Ode on a Distant Prospect of Eton College’. 53 Slide 53 For display and reflection only.

“self reflection seeks to improve” Slide 54 For display and reflection only.

References Educational Evaluation Around the World: An International Anthology http://english.eva.dk/publications/educational-evaluation-around-the-world-an-international-anthology/download An Education Primer: An overview of education evaluation http://www2.ed.gov/offices/OUS/PES/primer1.html Evaluation definition: http://www.evaluationwiki.org/index.php/Evaluation_Definition Evaluation Toolkit: http://mypeer.org.au/monitoring-evaluation Introduction to Program Evaluation: http://www.cdc.gov/tb/programs/Evaluation/Guide/Webinar/Eval_101_1_AP22.ppt What is Program Evaluation? A beginner’s guide: http://gsociology.icaap.org/methods/evaluationbeginnersguide.pdf Patton, M Q 2002, Qualitative research & evaluation methods (3rd ed.), Thousand Oaks, CA: Sage Publications. 55 Slide 55 Recommend these resources to participants and encourage further professional reading. Conclude the session. NOTE Following the completion of all six modules (Term 3, 2012): Teaching staff will re-do the ‘Essential Components Rubric’; LSOs will re-do the LSO questionnaire and complete the LSO self-reflection; Case studies will be submitted.