1
USING FORMATIVE AND BENCHMARK ASSESSMENTS IN AN RTI SYSTEM
Abby Potter, Education Consultant, Title I School Support Team
John Humphries, School Psychologist, Student Services/Prevention & Wellness
2
Agenda: 60 minutes
- Abby & John: Introduction and Background
- John: Benchmark Assessment—Screening
- Abby: Formative (Ongoing) Assessment
- John: Benchmark Assessment—Progress Monitoring
- Abby & John: Q & A, Discussion as time allows
3
Response to Intervention (RtI)
RtI is a process for achieving higher levels of academic and behavioral success for all students through:
- High-quality instructional practice
- Continuous review of student progress (multiple measures)
- Collaboration
4
Response to Intervention (RtI): A Systemic Approach for Constant Inquiry
Continuous review of student progress to assess:
- How all students are performing (screening)
- How they are responding to differentiated core instruction (ongoing assessment)
- How they are responding to intervention/additional supports (monitoring progress)
5
Topics for Assessment Types
- Definitions
- Purposes/Rationale
- Strengths & Limitations
- Common features
- Research
- Resources for getting started
6
BALANCED ASSESSMENT: Continuous Review of Student Progress
7
Balanced Assessment System
Key components:
- A continuum of assessments
- Multiple users
- Multiple information sources, used to create a complete picture of student progress
- Each assessment type has a primary purpose, as well as strengths and limitations
8
Balanced Assessment System
- Formative: daily, ongoing evaluation strategies; immediate feedback; student-centered
- Benchmark: periodic diagnostic/progress assessments; multiple data points across time; classroom/school-centered
- Summative: large-scale standardized assessments; annual snapshot; school/district/state-centered
10
Summative/Large-Scale Assessment
Purpose:
- To determine how students in schools, districts, and states are progressing
- To inform curriculum and instruction
- To determine Adequate Yearly Progress (AYP)
11
Benchmark Assessment
Purpose:
- To determine to what extent all students are progressing (screening)
- To determine how well additional supports or services are working before too much time passes (monitoring progress)
12
Formative Assessment
Purpose:
- To consider what learning comes next for students
- To improve learning while there is still time to act, before the graded event
13
Current Practices
- What are you doing now to assess your students? How is it working?
- WKCE and AYP
- WKCE definition of proficiency
- Differences between WKCE and NAEP
- Achievement gaps
15
RtI: From SPED to Gen Ed to Every Ed
Perspective: "The Art and The Science of Teaching"
Why RtI? Why now? One difference is the major advances in assessment and intervention technologies that allow us to make better decisions and intervene more appropriately. We are moving from "true" CBM to more standardized measures.
16
Benchmark Assessment: Screening
- Definitions
- Purposes/Rationale
- Strengths and Limitations
- Common features
- Research
- Resources for getting started: Academics & Behavior
17
Screening: Definition
Screening is characterized by fast, inexpensive, repeatable data collection about critical skills, beliefs, or behaviors. Screening usually identifies students who need further assessment or provides information for future planning activities.
18
Screening: Purposes/Rationale
The purpose of screening is to identify students who are "at risk" of a poor outcome.
Rationale:
- Use a screener with strong statistical properties, along with other data, to identify students you want to learn more about
- Don't wait until it's too late. The WKCE is a poor screener for this reason.
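The screening logic described above can be made concrete with a small sketch: apply a cut score to benchmark data and flag students who fall below it for a closer look. Everything here is invented for illustration; the student names, scores, and the 52-wcpm cut score are hypothetical, not taken from this presentation or any published benchmark.

```python
# Illustrative sketch of universal screening with a cut score (invented data).

def flag_at_risk(scores, cut_score):
    """Return, sorted by name, the students whose benchmark score
    falls below the cut score and who warrant further assessment."""
    return sorted(name for name, score in scores.items() if score < cut_score)

# Hypothetical winter oral-reading-fluency benchmark, in words correct per minute.
winter_orf = {
    "Student A": 72,
    "Student B": 41,
    "Student C": 88,
    "Student D": 35,
}

# Suppose the winter benchmark expectation for this grade is 52 wcpm.
at_risk = flag_at_risk(winter_orf, cut_score=52)
print(at_risk)  # ['Student B', 'Student D']
```

In practice the flagged list is a starting point, not a diagnosis: as the slide notes, the screener's result is combined with other data before any decision is made.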
19
Screening: Strengths & Limitations
Strengths:
- By definition, easy, quick, repeatable
- Immediate results
- Guides programming
- Predictive validity
Limitations / how misused:
- Used diagnostically to guide instruction
- Relied on by administrators and teachers absent good progress monitoring and formative assessment
- Statistical limitations
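The "predictive validity" and "statistical limitations" mentioned above are often summarized with two numbers: sensitivity (the share of students with a later poor outcome whom the screener caught) and specificity (the share of students who turned out fine whom the screener correctly passed). The sketch below uses entirely invented data; the roster, the fall flags, and the spring outcomes are hypothetical.

```python
# Illustrative sketch (invented data): judging a screener's accuracy
# against a later outcome measure.

def sensitivity_specificity(flagged, poor_outcome, roster):
    """Return (sensitivity, specificity) of a screener given sets of
    flagged students, students with a later poor outcome, and the roster."""
    tp = len(flagged & poor_outcome)           # at risk, and caught
    fn = len(poor_outcome - flagged)           # at risk, but missed
    fp = len(flagged - poor_outcome)           # false alarms
    tn = len(roster - flagged - poor_outcome)  # fine, and correctly passed
    return tp / (tp + fn), tn / (tn + fp)

roster = {f"S{i}" for i in range(1, 11)}       # ten students (invented)
flagged_in_fall = {"S1", "S2", "S3"}           # below the fall screening cut
poor_on_spring_test = {"S1", "S2", "S4"}       # poor outcome on a spring test

sens, spec = sensitivity_specificity(flagged_in_fall, poor_on_spring_test, roster)
print(round(sens, 2), round(spec, 2))  # 0.67 0.86
```

A screener with low sensitivity misses students who needed help, which is exactly the "don't wait until it's too late" risk the previous slide warns about.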
20
Selected Research on Screening
- Jenkins, J. R., Hudson, R. F., & Johnson, E. S. (2007). Screening for service delivery in an RTI framework: Candidate measures. School Psychology Review, 36, 560–582.
- Messick, S. (1989). Validity. In R. L. Linn (Ed.), Educational measurement (3rd ed., pp. 13–103). New York: Macmillan.
- Riedel, B. W. (2007). The relationship between DIBELS, reading comprehension, and vocabulary in urban first-grade students. Reading Research Quarterly, 42, 546–567.
- Ritchie, K. D., & Speece, D. L. (2004). Early identification of reading disabilities: Current status and new directions. Assessment for Effective Intervention, 29(4), 13–24.
- Snellen Eye Chart (1862).
21
World Health Organization: Principles of Screening (1968)
1. The condition should be an important health problem.
2. There should be a treatment for the condition.
3. Facilities for diagnosis and treatment should be available.
4. There should be a latent stage of the disease.
5. There should be a test or examination for the condition.
6. The test should be acceptable to the population.
7. The natural history of the disease should be adequately understood.
8. There should be an agreed policy on who to treat.
9. The total cost of finding a case should be economically balanced in relation to medical expenditure as a whole.
10. Case-finding should be a continuous process, not just a one-time project.
Wilson, J. M. G., & Jungner, G. (1968). Principles and practice of screening for disease. WHO Chronicle, 22(11), 473.
22
Resources for Screening
Behavioral screening:
- BASC
- CBCL
- Office referrals
- Teacher nomination
- TeenScreen
- GAIN-SS
- Online SOS
Academic screening:
- Go to the National Center on Student Progress Monitoring
- Also see The ABCs of CBM by Hosp et al.
- MAP?
23
Formative (Ongoing) Assessment
- Definitions
- Purposes/Rationale
- Strengths and Limitations
- Common features
- Research
- Resources for getting started: Academics & Behavior
24
Formative (Ongoing) Assessment
Definition: "Formative assessment is an intentional and systematic process used by teachers and students during instruction that provides feedback to adjust on-going teaching and learning to improve students' achievement of the intended instructional outcomes." (CCSSO, 2007)
25
Formative (Ongoing) Assessment: key terms
- intentional
- systematic process
- feedback
- adjust on-going
- intended instructional outcomes
26
Formative (Ongoing) Assessment
Purpose:
- To consider what learning comes next for the student
- To improve learning while there is still time to act, before the graded event
27
Examples:
- Teacher observations
- Teacher questioning & class discussions
- Analysis of student work (graded & non-graded)
- Exit questions
- Teacher feedback
- Student self-assessment
- KWLs
- Student journals
28
Formative (Ongoing) Assessment
Strengths:
- Informs day-to-day instruction
- Informs intervention
- Instant information
- Student self-assessment
- Provides information about on-going student progress
- Designed and evaluated by those who know the students best
- Provides a huge volume of qualitative, descriptive data
29
Limitations:
- Time
- Informal/not standardized
- Overabundance of information
- May be challenging to 'grade'
- When used to the exclusion of other types of assessment
30
Formative (Ongoing) Assessment
Essential components of effective formative assessment:
- Learning progressions: clearly articulate the sub-goals of the ultimate learning goal
- Learning goals and criteria for success: clearly identified and communicated to students
- Descriptive feedback: evidence-based feedback provided to students, linked to the intended instructional outcomes and criteria for success
(CCSSO, 2008)
31
Formative (Ongoing) Assessment
Essential components of effective formative assessment (continued):
- Self- and peer-assessment: important for providing students an opportunity to think metacognitively about their learning
- Collaboration: a classroom culture in which teachers and students are partners in learning should be established
(CCSSO, 2008)
32
Formative (Ongoing) Assessment: Research
- Inside the Black Box: Raising Standards Through Classroom Assessment, by Paul Black and Dylan Wiliam (1998)
- New Assessment Beliefs for a New School Mission, by Rick Stiggins (2004)
- Implementing Formative Assessment at the District Level: An Annotated Bibliography (New England Comprehensive Center)
33
Formative (Ongoing) Assessment
Resources for getting started (academics & behavior):
- Set learning goals and criteria for success
- Select assessment techniques (teacher and students)
- Determine how feedback is provided
- Organize information from formative assessment (teacher and students)
34
Formative (Ongoing) Assessment “Assessment FOR learning turns the classroom assessment process and its results into an instructional intervention designed to increase, not merely monitor, student learning.” Richard Stiggins
35
Benchmarks: Progress Monitoring
- Definitions
- Purposes/Rationale
- Strengths and Limitations
- Common features
- Research
- Resources for getting started: Academics & Behavior
36
Progress Monitoring: Definition
Progress monitoring (PM) is a scientifically based practice used to assess student performance and evaluate the effectiveness of instruction.
37
PM: Purposes/Rationale
PM has two purposes:
- Determine whether students are progressing appropriately with additional supports and intervention
- Build more effective supports and interventions
Rationale: Use PM to closely monitor whether what we're doing is effective!
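The "is it working?" question above is often answered by comparing a student's observed rate of improvement to an aim line drawn from baseline to goal. The sketch below assumes weekly curriculum-based measurement scores; every number (baseline, goal, weekly scores, timeline) is invented for illustration.

```python
# Illustrative sketch (invented data): is a student's growth rate
# keeping pace with the aim line drawn from baseline to goal?

def slope(xs, ys):
    """Ordinary least-squares slope of ys regressed on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

weeks = [1, 2, 3, 4, 5, 6]
wcpm  = [30, 33, 33, 36, 38, 41]  # weekly CBM scores, words correct per minute

# Hypothetical goal: grow from 30 to 60 wcpm over 20 weeks.
baseline, goal, weeks_to_goal = 30, 60, 20
aim_rate = (goal - baseline) / weeks_to_goal  # 1.5 wcpm per week

actual_rate = slope(weeks, wcpm)
print(round(actual_rate, 2))    # observed growth per week
print(actual_rate >= aim_rate)  # on track relative to the aim line?
```

If the observed slope falls below the aim rate for several consecutive data points, the intervention is adjusted; if it meets or exceeds it, the support is working and continues.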
38
PM: Strengths & Limitations
Strengths:
- High frequency
- Sensitive to change
- Guides programming more than screening does
Limitations / how misused:
- May have to make your own PM tools
- Improper tools give invalid, unreliable results
- Statistical limitations
- Used in isolation
39
Research on Progress Monitoring
- A substantial research literature supports a wide range of educational decisions
- Beginning in 1977 as Data-Based Program Modification (Deno & Mirkin, CEC)
- Deno, S. L. (2003). Developments in curriculum-based measurement. The Journal of Special Education, 37(3), 184–192.
40
Progress Monitoring and SLD
Federal regulations currently require information demonstrating that the student received repeated assessments of achievement reflecting student progress (§300.309(b)(2)).
Data-based documentation is "…an objective and systematic process of documenting a child's progress."
"…data documenting a child's progress are systematically collected and analyzed…" (Comments, page 46657)
41
National Perspective President Obama on Monday: “…far too few districts are emulating the example of Houston and Long Beach, and using data to track how much progress a student is making and where that student is struggling - a resource that can help us improve student achievement.”
42
Resources for PM
Behavioral PM:
- Frequency of difficulties in school
- Self-rating
- Parent/teacher rating
- Determined by treatment providers
Academic PM:
- Go to the National Center on Student Progress Monitoring
- Also see The ABCs of CBM by Hosp et al.
43
Summary
- Everyone has an important role in selecting assessments for RtI schools
- Reading specialists have expertise in teaching and assessing reading skills
- Teachers have expertise in aligning assessment with curriculum
- School psychologists have expertise in tests and measurement for academics and behavior
44
How Does It Fit Together?
- Step 1: Universal screening of all students at a grade level (fall, winter, spring).
- Step 2: Students on track receive no additional diagnostic assessment; they continue with core instruction guided by formative assessment (roughly 80-90% of students).
- Step 3: Group diagnostic assessment leads to small-group instruction differentiated by skill, with progress monitored 2 times/month (supplemental services, roughly 5-10% of students).
- Step 4: Individual diagnostic assessment leads to individualized intensive instruction, with progress monitored weekly (intensive support, roughly 1-5% of students).
- Summative measures sit alongside the steps: grades, discipline, AYP measures.
Courtesy of Dave Tilley, Heartland AEA
45
Questions in Test Selection
- How does this map to our data system?
- Does this test have adequate technical properties for our intended use? (reliability and validity, frequency, scale, alignment with our curriculum)
- How will we use the collected data?
- Don't use a test outside of its intended purpose. If using it for SLD, federal regulations (§300.304) require use "for the purposes for which the assessments or measures are valid and reliable."
46
Achieving Balance
Thinking about your assessment system…
- Is your system balanced? If not, are you okay with imbalance?
- How do the assessments support and inform one another?
- Do all users know the purpose, strengths, and limitations of the assessments?
- What do you do with the results of the different assessments?
- Does everyone play an important role?
- How do you deal with disagreement? Collaborate!
47
Myths and Misperceptions
- Myth: "Running Records have no role in RtI"
- Myth: "DIBELS does not work—no research"
- Myth: "PM is SPED, no need in Gen Ed"
- Myth: "WKCE is worthless"
- Myth: "We already assess too much, and now RtI is going to make us assess more"
- Myth: "School Psychologists will rule the world and take over reading programming"
- Myth: "Reading Specialists will rule the world and take over school psychology programming"
48
Additional Research Sources
- Response to Intervention: Research for Practice (NASDSE, 2007)
- www.rti4success.org
- www.rtinetwork.org (NASP and IRA)
- Hughes, C., & Dexter, D. Response to Intervention: A Research Review.
49
Contact Information
Abby Potter, aubree.potter@dpi.wi.gov, (608) 267-7338
John Humphries, john.humphries@dpi.wi.gov, (608) 266-7189
50
USING FORMATIVE AND BENCHMARK ASSESSMENTS IN AN RTI SYSTEM
Abby Potter, aubree.potter@dpi.wi.gov
John Humphries, john.humphries@dpi.wi.gov
(800) 441-4563, dial 6