
1 Assessment & Reflective Practice Our Cornerstone for Change 25 Industrial Park Road, Middletown, CT 06457-1520 · (860) 632-1485 Connecticut State Department of Education · Division of Educational Programs and Services

2 2 The Layout of Professional Development for EIP: Day 1 - Collaborative Strategic Decision-Making: developing a process and framework. Day 2 - Assessment and Reflective Practice: examining the use of assessment; identifying how reflective practice works. Day 3 - Instructional Repertoire: building new ways to develop strategies focused on improved student outcomes.

3 3 Central Themes Building a Collaborative Learning Community Using Strategic Decision-Making Building Capacity to Develop, Implement and Sustain an Effective Process

4 4 Objectives for Today To examine the use of protocols for analyzing student work in order to define a focus area for improvement; To develop effective monitoring systems that chart student progress from baseline to a specified target; and To define reflective practice and identify how it will improve implementation integrity, as well as enhance instructional practice.

5 5 Components of EIP Leadership Collegial & Family Partnerships Strategic Decision-Making Assessment & Reflective Practice Instructional Repertoire Accountability & Documentation

6 6 Lessons Learned Using assessment and reflection should result in a change in instructional practice. Assessments focus on environment, curriculum, and instruction, not just the student. Reflection is a process that focuses on how teachers can enhance their practice.

7 7 Indicators of a Quality Decision-Making Process Identify the focus area or concern Determine the desired outcome Generate alternative strategies Examine strategies for feasibility Develop a plan of action, including a monitoring system Implement & monitor student progress & the plan Evaluate student progress & the plan

8 8 Which Indicators Relate to Assessment & Reflective Practice? Identify the focus area of improvement Determine the desired outcome Generate alternative strategies Examine strategies for feasibility Develop a plan of action, including a monitoring system Implement & monitor student progress & the plan Evaluate student progress & the plan

9 9 Indicators That Will Be Covered Today Identify the focus area of improvement Determine the desired outcome Generate alternative strategies Examine strategies for feasibility Develop a plan of action, including a monitoring system Implement & monitor student progress & the plan Evaluate student progress & the plan

10 What is Assessment?

11 11 The Purpose of Assessment “Assessment is a process of collecting data for the purpose of making decisions about individuals or groups and this decision-making role is the reason that assessment touches so many people’s lives.” Salvia & Ysseldyke (2001)

12 12 What is the Purpose for Assessment? To make instructional decisions

13 13 What Makes Decision-Making Strategic? From: a perception of an issue leading directly to action. To: a perception of an issue, followed by data to verify it, leading to data-driven action (e.g., action based on SWIS).

14 14

15 15 Characteristics of Assessment Functional (Effective, Useful) Relevant Direct Multidimensional Formative Frequent, Repeated Individually Focused Technically Adequate

16 16 When You Think “Assessment” What is the question that needs to be answered? What information do you intend to obtain from your assessment? What will you do to get the information? How will you use the information you got?

17 17 Phases of Collaborative Inquiry Collecting Data Analyzing Data Organizing Data-Driven Dialogue Framing the Question Drawing Conclusions, Taking Action Monitoring Results Love, N., 2002

18 18 What Data Do We Use? Looking at Numbers Quantitative data (Numbers) Defining the gap between expectations and current performance Monitoring the progress and growth Move Beyond Numbers Qualitative data (Descriptions) Developing a focus area or the cause of a concern Defining the context Examining the implications of decisions

19 19 Testing vs. Assessment

20 Grading Practices How Do Grades Support or Hinder Assessment?

21 21 What Grade Would You Give?

22 22 What Grade Would You Give Now?

23 23 Let’s Reflect What does this exercise tell us about grading? How reliable are grades in terms of assessing student progress?

24 24 Test, Review, Observation, Interview, Examining Student Work → Decision-Making

25 25 Test, Review, Observation, Interview, Examining Student Work → Decision-Making

26 26

27 27 Domains of Assessment: Student(s), Instruction, Environment, Curriculum (the context of learning, what we teach, how we teach, and the outcomes of learning). Adapted from Heartland Area Education Agency

28 28 Domains of Assessment by Method (RIOT+E: R = Review, I = Interview, O = Observe, T = Test, E = Examine Student Work)
Curriculum: Review - permanent products, district standards, lesson plans; Interview - teachers, curriculum specialists, administrators; Observe - implementation of standards, decisions on selection of content; Test - readability of texts; Examine Student Work - Standards in Practice, SLICE, Tuning Protocol
Environment: Review - school rules, handbooks, policies; Interview - teachers, administrators, parents, students; Observe - interaction patterns, environmental analysis; Test - observation-based assessments, classroom environment scales, checklists, etc.; Examine Student Work - Initial Line of Inquiry, Standards in Practice
Instruction: Review - permanent products; Interview - teachers, administrators, parents, students; Observe - implementation of CCT, teacher expectations, antecedents, conditions, consequences; Test - classroom environment scales, checklists, etc.; Examine Student Work - Initial Line of Inquiry, Descriptive Review, Lesson Study, Tuning Protocol
Student: Review - student records; Interview - teachers, administrators, parents, students; Observe - target area, dimensions & nature of the problem; Test - student performance, discrepancy between setting demands & performance; Examine Student Work - Initial Line of Inquiry, Descriptive Review

29 29 Figure 1. The Richness and Complexity of Student Assessment Data (North Central Regional Educational Laboratory, Policy Issues, Issue 6, Nov. 2000; Allison Cromley, "Using Student Assessment Data: What Can We Learn from Schools?"). The figure arrays assessments by specificity of information and rate of feedback, from annual large-scale measures to daily classroom checks: National/International Assessments (Are students performing optimally?); Large-Scale Assessments (Are students meeting the state standards?); Diagnostic Assessments (What are students' cognitive strengths and needs?); Student Report Cards (How are students performing in general?); Performance Assessments (Can students apply and generalize what they've learned?); Classroom Curriculum Unit Tests and Quizzes (Did students learn it?); Formative Assessments (Are students learning it?). Frequency ranges from annually (selected grades) and as needed/usually once a year, to once per curriculum unit, weekly, and daily.

30 30 Assessment & Reflective Practice (adapted from Ortiz, 1987; Horner, 1998; Sugai, 2001): a model that moves from universal assessment of all students in school to focused assessments offering in-depth analysis and increased objectivity, drawing on lesson study, observational-based and curriculum-based assessment, observation and feedback on instruction, reflective practice, examining student work, problem validation, and formal & informal monitoring of student progress.

31 31 A Key Factor for Assessment In 2000, a Harvard study examined disproportionality in special education. Connecticut was among the states identified as needing improvement in this area.

32 Using Assessment to Identify the Focus Area for Improvement

33 33 Using Your Homework Select a “case” to use for the next session Single student e.g., a gifted student A specific group of students e.g., ELL A classroom or grade level e.g., improving math instruction A whole school e.g., lunchroom behavior A whole district e.g., increasing time with non-disabled peers or a new science curriculum

34 34 Identify the Focus Area for Improvement What is happening? Frame a question in terms of the impact on student learning Examine the context by collecting and analyzing data Develop a hypothesis to define a central area of focus

35 35 Remember… We Need to Develop a Question Frame a question in terms of the impact on student learning Frames our thinking in terms of inquiry vs. judging Aligns our thinking to student learning

36 36 Use Your Case Examine the information you have about your case. What is the question you want to answer? Write your question on your worksheet. #1

37 37 Examine the Context Examine the context by collecting and analyzing data Determine when, where, how long, with whom, and under what conditions Develop a rationale for the occurrence using data Use evidence to explain what we see as the reason for performance gaps

38 38 Domains of Assessment: Student(s), Instruction, Environment, Curriculum (the context of learning, what we teach, how we teach, and the outcomes of learning). Adapted from Heartland Area Education Agency

39 39 Essential Questions to Analyze Curriculum What content standards does this address? What are the performance standards? What is the essential content? What is the level of expectation? How are the curricula standards and materials adapted to meet instructional level?

40 40 Essential Questions to Analyze Environment How are expectations clearly communicated? What are the task directions? What are the opportunities for student choice? What are the physical influences on the learning? What are the social/interpersonal influences on the learning? How do the student and teacher collaborate in the learning process?

41 41 Essential Questions to Analyze Instruction What is the amount of student engagement and relevant practice? Is there appropriate pacing? What teaching strategies are used? How are tasks organized for students? Is there an instructional match? How does the feedback support student learning?

42 42 Essential Questions to Analyze Student Performance What does the student know? What can the student do? What are the student’s strengths? What are the student’s interests? What is the instructional level? What learning strategies does the student use? How does the student organize information and approach new learning? How does the student self-monitor? What are the patterns in errors?

43 43 Essential Questions to Ask About Behavior When is the behavior most/least likely to occur? Where is the behavior most/least likely to occur? With whom is the behavior most/least likely to occur? What happens immediately before/after the behavior? What do others do when the behavior occurs? What other environmental conditions may contribute to the behavior? Pennsylvania Department of Education, Initial Line of Inquiry Gary LaVigna (2000) Behavioral Assessment and Advanced Support Strategies

44 44 For Example… Chad 3 4 2 1 Gickling Dog Cat Apple Ball

45 45 How Does Chad Approach Alphabetizing? Chad 3 4 2 1 Gickling Dog Cat Apple Ball

46 46 How Does Chad Approach Alphabetizing? Chad 3 4 2 1 Gickling Dog Cat Apple Ball

47 47 What Does This Tell Us About… Curriculum How effective is the curriculum for Chad? Environment What are the environmental influences on Chad’s learning? Instruction What instructional methodology strengthens Chad’s learning?

48 48 Use Your Case Examine the assessments you currently have about your case. What assessment data could tell you about…? Curriculum Environment Instruction Student(s) #2

49 Using Protocols to Define the Focus Area of Improvement A Means to Collaboratively Analyze Assessments

50 50 What are Protocols? Tools for analysis that are characterized by: Structured dialogue Collaborative inquiry More than one perspective Reflective practice

51 51 The Purpose of Protocols Provide a safe environment to share and reflect with colleagues Give and receive feedback on our practices and the relationship to student learning Focus on student work/performance Make the most efficient use of our time

52 52 Centers Using the premise of your “case,” select the most appropriate center: Descriptive Review, Initial Line of Inquiry (Behavior), or Initial Line of Inquiry (Academic)

53 A Sample Protocol for Examining Student Work Descriptive Review

54 54 Descriptive Review What does it look like? Examination of a student product (e.g., writing sample, math assignment, etc.) Round-robin responses to selected questions (e.g., “Describe what you see”) When would we use it? C Determining next curriculum area E Connecting the context & student work I Determining next steps for instruction S Having a deeper analysis of student learning

55 55 Descriptive Review What do you need? Facilitator to run the process Presenting teacher to provide the context of the student work & a focus for reflection A student work sample (a hard copy of the student work) How does it work? Follow articulated steps Select key questions to ask for each round (one question per round) Each member of the group provides one response to the question (round-robin fashion; the group can go around more than once for more responses)

56 56 Descriptive Review Sample Timetable: Review of Process, 5 minutes; Setting the Tone, 15 minutes; Work is Presented with Context, 5 minutes; Descriptive Rounds, 30 minutes; Hearing from the Teacher, 10 minutes; Reflecting, 5 minutes

57 57 Descriptive Review Review the Process The facilitator provides the directions and timelines for the process. Setting the Tone The group reviews the intention of the process. The group agrees to the reflective process.

58 58 Descriptive Review Work is Presented/Context Teacher puts the work out for the team to see and provides a brief introduction to the work. Descriptive Rounds Selection of rounds is based on type of work and focus of reflection. Each round builds on the previous one, seeking to deepen an appreciation for the instruction, task, and student learning.

59 59 Descriptive Review Hearing from the Teacher Presenter has time to say what was heard. Reflecting The group reflects on the process. Each member highlights what was learned.

60 60 Descriptions vs. Judgments Descriptions See, Hear, Touch Evidence based Specific language Judgments Inferences Feelings Assumptions Perceptions

61 A Sample Protocol for Examining Behavior Initial Line of Inquiry

62 62 Initial Line of Inquiry What does it look like? Facilitated dialogue focused on behavior and the context around behavior Structured responses to key questions using anecdotal and assessment data Develops a hypothesis for the focus area of improvement When would we use it? C Determining curriculum effects on behavior E Connecting environmental conditions to behavior I Determining instructional effects on behavior S Having a deeper analysis of student behavior

63 63 Initial Line of Inquiry What do you need? Facilitator to run the process Team of people who Know the student Know functional analysis General observations Observational Based Assessments Overhead or chart paper How does it work? Follow articulated steps and key questions Record information on the format provided by protocol Facilitate a collaborative dialogue about the meaning of the observations Develop a hypothesis

64 64 Behaviors Exist in Context Behaviors are context related Challenging behaviors result from unmet needs Effective supports come from an understanding of why a behavior occurs

65 65 The ABCs of Behavior A ntecedents B ehavior C onsequences

66 66 ABC Chart
Time 9:05 - Antecedent: Teacher gives class an independent writing assignment; Behavior: X looks out window; Consequence: Teacher prompts X to begin writing
Time 9:10 - Antecedent: Teacher prompts X to begin writing; Behavior: X picks up pen and scribbles on page; Consequence: Teacher walks away
Time 9:17 - Antecedent: Teacher prompts X to stop scribbling and begin writing; Behavior: X rips paper up and throws it on the floor; Consequence: Teacher tells X to go to office
Time 9:18 - Antecedent: Teacher tells X to go to office; Behavior: X stands up and goes to office; Consequence: X stays in office until next period

67 67 The Format for Initial Line of Inquiry: Strengths of Student; Slow Triggers (Setting Events); Fast Triggers (Antecedents); Problem Behavior; Perceived Function; Actual Consequence. Pennsylvania Department of Education, Initial Line of Inquiry

68 68 Consequences A consequence is the immediate, natural response to a behavior. An undesirable outcome makes the behavior less likely to occur again; a desirable outcome makes it more likely to occur again. Imposed consequences do not always yield the results we want.

69 69 What is the Function of Behavior? Avoidance What is avoided with the behavior? Gains What is gained or achieved with the behavior?

70 70 Make a Statement About the Behavior Three parts include: When {antecedent/trigger} occurs, The {student(s)} do/does {behavior of concern}, In order to {perceived function}. Pennsylvania Department of Education, Initial Line of Inquiry

71 71 Hypothesis Statement: When Jeff is given an independent writing assignment, he rips his paper up and throws it on the floor, in order to escape the writing task.

72 A Sample Protocol for Examining Academic Performance Initial Line of Inquiry

73 73 Initial Line of Inquiry What does it look like? Facilitated dialogue focused on the context around academic achievement Structured responses to key questions using assessment data Develops a hypothesis for the focus area of improvement When would we use it? C Determining curriculum effects on achievement E Connecting environmental conditions to achievement I Determining instructional effects on achievement S Having a deeper analysis of student learning

74 74 Initial Line of Inquiry What do you need? Facilitator to run the process Team of people who Know the student Know the curriculum & instruction General observations Curriculum Based Assessments Overhead or chart paper How does it work? Follow articulated steps and key questions Record information on the format provided by protocol Facilitate a collaborative dialogue about the meaning of the observations & assessments Develop a hypothesis

75 75 Learning Variables This protocol focuses on four learning variables: Curricular Instructional Student Performance Environmental

76 76 The Format for Initial Line of Inquiry: Curriculum; Instruction; Student Performance; Environment. Pennsylvania Department of Education, Initial Line of Inquiry

77 77 Three Part Hypothesis What variables (factors) block learning? How does the student learn? What strategies would support how the student learns? Pennsylvania Department of Education, Initial Line of Inquiry

78 78 Hypothesis Statement When Jeff is given an independent writing assignment that requires at least five paragraphs in response to a prompt, he writes simple detail sentences that lack a main idea or central theme; therefore, Jeff needs to organize his main-idea and detail sentences under a central theme by using a structured graphic organizer, such as TOWER.

79 79 Use Your Case Reflect on what you learned using this protocol. What can you say about…? Curriculum Environment Instruction Student(s) #3

80 80 Other Protocols to Consider Action Reflection Protocol (Education Development Center, Newton, MA.) Case Story (Coalition for Essential Schools) Collaborative Analysis of Student Learning (CAStle) ASCD Consultancy (CES/Annenberg Institute National School Reform Faculty) Final Word Protocol (Coalition for Essential Schools) Lesson Study (Japan) Primary Language Record (Centre for Language in Primary Education, London) Slice (Joseph McDonald) Tuning Protocol

81 81 Reflection Question What methods does your school or district currently use to examine student work? What are the advantages of using a structured protocol?

82 Using Assessment to Develop a Hypothesis

83 83 Develop a Hypothesis Develop a hypothesis to define a central focus Examines the relationship among the context variables Determines why this is occurring

84 84 Symptoms vs. Causes Symptoms Observable Details A list of separate concerns Causes Inferred from behaviors Underlying reason/function Determined by grouping and analyzing objective, observable evidence

85 85 Symptoms vs. Causes Symptoms Lack of fluency Frequent word recognition errors Errors tend to be visual Mispronounces words Frequent spelling errors Cause

86 86 Symptoms vs. Causes Symptoms Does not complete work Frequently moves around the room during academic tasks Acts out during teacher directed lessons Cause

87 87 Making a Statement About the Focus Area of Improvement When {condition or trigger} occurs, {the student, class, school, etc.} does {focus area}, in order to {perceived function}. When there is an indoor recess, the students in grade 4 talk loudly and get out of their seats during lunch, in order to release energy.

88 88 Use Your Case Use your analysis to develop a hypothesis When {condition or trigger} occurs, {the student, class, school, etc.} does {focus area}, in order to {perceived function}. #4

89 89 So What Do We Want to Happen? The desired outcome is developed from changing the current reality to a new one. Take a look at your hypothesis. What is it that you want to happen instead?

90 Establishing Baseline and Developing Monitoring Systems Measuring Progress

91 Establishing Baseline and Developing Monitoring Systems Measuring Progress

92 92 What Do These Words Mean? Always Occasionally Rarely Often Sometimes Frequently Usually A lot Never Once in a while Independently mark a percentage next to each word.

93 93 What Do These Words Mean? Compare what you wrote with your table group. Record the range of percentages.

94 94 What Do These Words Mean? Always Occasionally Rarely Often Sometimes Frequently Usually A lot Never Once in a while

95 95 What Do These Words Mean? What do these ranges tell us about the way we generally describe what we see?

96 96 Types of Vague Language Nouns/Pronouns and Verbs “My students don’t listen.” Comparators “I want my students to do better on their quizzes.” Rule Words “I have to give C’s to students who have modified work.” Universal Qualifiers “All of the parents are upset about the report card.” L. Lipton & B. Wellman, 2003

97 Baseline

98 98 Establish Baseline Establish a baseline of the current level of performance Determine a starting point before anything is implemented Determine what the student(s) currently know(s) and can do

99 99 Baseline Data Baseline data needs to align with the focus area. Clearly define the focus Observable (can be seen or heard) Measurable (can be counted) Specific (clear terms, no room for a judgment call) Baseline is always expressed in numbers.

100 100 Which Ones Are Observable, Measurable, & Specific? Paying attention Aggressive behavior Out of seat Off task Throwing objects Homework completion Comprehension Spelling errors Phonemic awareness Math facts known Writing narrative Correct words per minute How would you change vague and non-measurable terms to be observable, measurable, and specific?

101 101 Baseline Data A general rule of thumb is to collect at least three data points. Choose measures that are sensitive to small changes over time.
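The rule of thumb above means gathering at least three data points before drawing a baseline. As a minimal, hedged sketch only (the correct-words-per-minute measure, the probe values, and the use of the median are illustrative assumptions, not prescribed by this presentation), the arithmetic might look like this:

```python
# Hypothetical example: establishing a baseline from three repeated probes.
# The scores, the measure (correct words per minute), and the choice of the
# median are illustrative assumptions, not part of the source material.

probes_cwpm = [42, 38, 45]  # three oral-reading probes, correct words per minute

def baseline(scores):
    """Return the median score as the baseline level, one common convention
    when only a handful of data points are available."""
    ordered = sorted(scores)
    return ordered[len(ordered) // 2]  # middle value of an odd-length list

print(f"Baseline: {baseline(probes_cwpm)} correct words per minute")  # prints 42
```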

102 102 Which Assessments Provide Quality Baseline? Holistic writing score Duration of a behavior or task Rubrics Grades Communication journal Frequency count of behavior or task Running record or DRA Anecdotal record Error analysis of student work ABC Chart

103 103 Use Your Case Using the question and hypothesis you developed, develop a plan to establish baseline. What will be assessed? How? By whom? When? #5

104 Setting Targets

105 105 Determine the Gap Determine the specific gap between current and desired performance Determine what needs to specifically change Establish what the student needs to learn Establish what conditions are needed to accelerate the learning

106 106 The Achievement Gap (KU-CRL): a graph of demands/skills against years in school, showing the gap between the student's baseline and expected performance.

107 107 Set a Target Set a target for expected outcome and timeframe for accomplishment Determine the grade level performance standard Determine the rate of learning for most students in this area Use the gap analysis to determine a reasonable target and a specific timeframe for this target to be achieved

108 108 Using Benchmarks Break down the time to meet a given goal into shorter increments Set a performance mark for each benchmark Build each benchmark on the previous one (interval monitoring) Use benchmarks to articulate the rate of progress

109 109 The Goal Line: a graph of demands/skills against time, showing expectations for all students, the baseline/current level of performance, the goal, and the student's projected line of growth, with intermediate benchmarks marked along the way toward the 16-week goal.
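To make the gap analysis and goal-line arithmetic above concrete, here is a small sketch; every number in it (baseline, target, timeframe, benchmark weeks) is an illustrative assumption, not data from the presentation:

```python
# Hypothetical sketch of gap analysis, goal-line slope, and benchmark targets.
# All values below are illustrative assumptions.

baseline = 42       # current level of performance (e.g., correct words per minute)
target = 82         # expected performance at the end of the timeframe
weeks = 16          # timeframe for reaching the target

gap = target - baseline              # the specific gap to close
weekly_growth = gap / weeks          # slope of the goal line (expected rate of progress)
print(f"Gap: {gap}; expected growth: {weekly_growth:.1f} per week")

# Benchmarks break the timeframe into shorter increments, each building on the last.
for week in (4, 8, 12, 16):
    print(f"Week {week:2d} benchmark: {baseline + weekly_growth * week:.0f}")
```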

110 110 Use Your Case Using your current information, discuss what is needed for you to develop a target goal and a set of benchmarks. Do you have baseline? Can you define the expected performance for all students? Can you assess the gap? How will you get all this information? #6

111 111 Writing a Desired Outcome Clearly define the outcome Observable (can be seen) Measurable (can be counted) Specific (clear terms, no room for a judgment call) May sometimes require smaller benchmarks When {condition} occurs, {the student} will {desired outcome} from {baseline} to {target} by {timeline}.

112 112 Use Your Case Using your current information, develop a desired outcome. When {condition} occurs, {the student} will {desired outcome} from {baseline} to {target} by {timeline}. What are you missing to complete this sentence? When will you obtain this? #7

113 Monitoring Systems

114 114 Develop a Monitoring System Develop a monitoring system that aligns with the baseline data and a criterion for measuring the progress

115 115 Monitoring vs. Evaluating Monitoring On-going and frequent Part of the implementation process Provide information for adjustments in plan Evaluating A specific point in time A review of the implementation process Provide information for decisions on next steps

116 116 How Will We Monitor? Determine who will monitor the progress Determine the assessment process to use and connect it to the baseline Predetermine intervals for monitoring (e.g., daily or weekly) Determine a timeline for evaluation

117 117 Monitor the Progress Monitor the level and rate of progress of student learning Monitor on a frequent basis (daily or weekly) Student progress Implementation Integrity Check for rate of progress as it relates to the target goal line
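One way to check the rate of progress against the target goal line is sketched below with made-up weekly scores and a simple point-by-point comparison; the actual decision rule a team uses (and the numbers) would be its own choice, not something this presentation specifies:

```python
# Hypothetical sketch: comparing weekly monitoring data with the goal (aim) line.
# Baseline, goal, timeframe, and observed scores are illustrative assumptions.

baseline, goal, total_weeks = 42, 82, 16
aim_slope = (goal - baseline) / total_weeks   # expected growth per week

observations = [(1, 45), (2, 48), (3, 48), (4, 53), (5, 55)]  # (week, observed score)

for week, score in observations:
    expected = baseline + aim_slope * week    # value of the aim line at this week
    status = "on track" if score >= expected else "below the aim line"
    print(f"Week {week}: observed {score}, aim {expected:.1f} -> {status}")
```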

118 118 Charting Progress: a graph of demands/skills against time, showing expectations for all students, the baseline/current level of performance, the goal, and the student's current progress.

119 119 Charting Progress: a graph of demands/skills against time, showing expectations for all students, the baseline/current level of performance, the goal, and the student's progress.

120 120 Documenting Student Progress Quantitative Information Graphing progress (e.g., attendance, homework completion, correct words per minute, etc.) Noting scores/levels and assessments used Stating student growth in terms of numbers Qualitative Information Narratives written in objective, observable language Noting the analysis of scores and the context (curriculum, instruction, and environment)
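As a hedged illustration of graphing progress as quantitative documentation, the snippet below plots made-up weekly scores against a goal line; it assumes the matplotlib library is available, and none of the values come from the presentation:

```python
# Hypothetical sketch: graphing monitored progress against the goal (aim) line.
# Requires matplotlib; all data points are illustrative assumptions.
import matplotlib.pyplot as plt

weeks = [0, 1, 2, 3, 4, 5]
scores = [42, 45, 48, 48, 53, 55]        # observed correct words per minute

plt.plot(weeks, scores, marker="o", label="Student progress")
plt.plot([0, 16], [42, 82], linestyle="--", label="Goal (aim) line")
plt.xlabel("Week")
plt.ylabel("Correct words per minute")
plt.title("Documenting Student Progress")
plt.legend()
plt.show()
```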

121 121 Tips for Documenting Student Progress Use the same assessment process and tools for baseline and monitoring; choose measures sensitive to small changes over time. Report the information in the same format (e.g., graphing). Align the assessment with the intervention (e.g., DRA, OBA). Monitor student progress on a frequent and regular basis in order to make quality judgments about the progress.

122 122 Use Your Case Using your potential desired outcome, discuss a possible monitoring plan. What will be assessed? How? By whom? When? How frequently? How does it relate to the baseline? #8

123 Reflective Practice Our Cornerstone for Change

124 124 Why Reflect? “If teachers are to become skilled at independently identifying and addressing idiosyncratic learning problems of their students, they must learn to reflect critically on student work as well as on their own teaching practices.” “Lifelines to the classroom: Designing support for beginning teachers”, by Kendyll Stansbury and Joy Zimmerman. Knowledge Brief, WestEd, 2000.

125 125 Evaluate the Student Progress and Plan What changes occurred? Evaluate and analyze the overall progress by comparing the baseline data to the outcome data Examine the degree of implementation integrity of the plan Determine what changes occurred Use a decision guide to make adjustments and/or revisions to the plan

126 126 What Do Reflective Educators Do? Commit to continuous improvement Assume responsibility for learning Demonstrate thinking skills for inquiry Take action that aligns with new understanding Reflective Practice to Improve Schools, J. York-Barr et al.

127 127 Reflection Cycle: collect data from a variety of sources; analyze data; evaluate student learning; modify practice; draw conclusions about the impact of teaching on student learning. BEST Training 2001

128 128 What Do We Change? Student(s), Instruction, Environment, Curriculum (the context of learning, what we teach, how we teach, and the outcomes of learning). Adapted from Heartland Area Education Agency

129 129 Integrity Did we do what we said we would do? Reasons why we tend not to follow through: Lack of a defined or appropriate focus The plan was not clearly defined or comprehensive enough to include appropriate strategies The skill levels needed to implement the plan were not adequate The right resources (time, money, personnel) were not supplied

130 130 Measuring the Effectiveness of Implementation Did we achieve our goal for student outcomes? Did we do what we said we were going to do to promote student success? How do we know this? Did we set a predetermined goal line? Did we monitor student progress towards this goal line? Did we examine why the goal was met or not met?

131 131 Self-Reflection Discuss how the protocol you used today will serve as reflective practice and as a means to ensure implementation integrity. Descriptive Review Initial Line of Inquiry

132 132 With Your Technical Assistant Reflect how today’s information influences the process you have developed thus far. Review the previous dialogue about your school’s /district’s use of collegial support and family partnerships. Examine the various ways of teaming and determine how collegial support and family partnerships could potentially look for your school/district.

133 133 On Your Own… 1. Select a protocol and try it with a small group. 2. Review today’s content and add any additional assessment information needed to your case study. Collect baseline data. Revise the hypothesis and desired outcome, if necessary. Utilize technical assistance support to complete the assessment worksheet for your case study.

134 134 Bring with You Next Time Bring the same case study and supplemental materials with you, including updated assessment information. Bring the curriculum and sample lesson plans for your case that relate to the identified focus area in need of improvement.

