STAR Training of Trainers

What is a computer adaptive test? [Diagram: student responses steer item difficulty, too high or too low, toward appropriate content.] Before this slide, play a quick game: "I'm thinking of a number between 1 and 100; guess the number."
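The guessing game is a useful mental model for adaptive item selection: each response narrows the difficulty range. The sketch below is only an illustration of that idea under a simple bisection rule; it is not Renaissance Learning's actual item-selection algorithm, and the function names and numbers are invented.

```python
def adaptive_difficulty_search(answer_correctly, low=1, high=100, max_items=20):
    """Illustrative adaptive-test loop modeled on the 1-to-100 guessing game.

    answer_correctly(difficulty) -> bool stands in for a student responding to an
    item of the given difficulty. The range moves up after a correct response and
    down after an incorrect one, narrowing in on "appropriate content."
    """
    for _ in range(max_items):
        if low >= high:
            break
        difficulty = (low + high) // 2      # present an item in the middle of the range
        if answer_correctly(difficulty):
            low = difficulty + 1            # too easy for the student: move the range up
        else:
            high = difficulty - 1           # too hard: move the range down
    return (low + high) // 2                # final estimate of the student's level


# Example: a student who reliably answers items up to difficulty 62
print(adaptive_difficulty_search(lambda d: d <= 62))
```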

Traditional Assessments Are Imprecise Above and Below Grade Level (sixth grade reading assessment) Another benefit of computer adaptive testing is that it tests students who are far above or below grade level more accurately than other assessment types can. On a traditional assessment, most test items are written for the grade level the assessment was designed for, and only a few items test material above or below that grade. For example, on this traditional sixth grade test you can see a large number of questions written at the sixth grade level and relatively few written above or below it. (click) Students achieving at grade level have a large pool of questions to help determine their abilities. (click) Students performing outside their grade level have far fewer questions to answer, which decreases the accuracy with which you can determine their abilities. The result is an incomplete picture of these students' achievement. [Chart: number of test items below sixth grade, at sixth grade, and above sixth grade.]

CAT: Precise at All Levels (sixth grade reading assessment) With computer adaptive technology, however, students can be tested on items from other grade levels, allowing a more accurate categorization of their abilities. This is why computer adaptive testing assesses students who are far above or below grade level more accurately than other assessment types. Why is this important? (To measure the growth of low-performing students, you need accurate data.) [Chart: test items below sixth grade, at sixth grade, and above sixth grade.]

STAR Early Literacy assesses: general readiness, graphophonemic awareness, phonics, vocabulary, comprehension, and structural analysis. STAR score definitions (Emergent Reader, Probable Reader, etc.): see HANDOUT.

STAR Reading assesses vocabulary in context and comprehension, and reports instructional reading level and independent reading range. STAR Early Literacy is typically used up to Grade 1; transition to STAR Reading when a student scores as a "Probable Reader" and has a sight vocabulary of about 100 words. STAR Reading covers grades 1 through 12. Vocabulary serves as a proxy for comprehension (just as fluency is a proxy): because you can't measure comprehension directly, you measure its manifestations, and vocabulary is one of them.

STAR Reading tests comprehension. [Diagram: background knowledge and vocabulary knowledge combine to construct meaning from text, producing comprehension.] The items used in STAR Reading are designed to test students' reading comprehension skills. The assessment items require students to call upon background knowledge combined with vocabulary knowledge in order to construct meaning from text. These cognitive tasks are consistent with what researchers and educational experts describe as reading comprehension. In other words, background knowledge and vocabulary intersect to create meaning from text. STAR Reading assesses comprehension directly with robust, holistic questions that tell you a lot about how all the reading skills relate to each other and work in concert.

STAR Math tests competency in eight mathematical strands: numeration, computation, estimation, algebra, geometry, word problems, measurement, and data analysis & statistics. STAR Math is a well-rounded math assessment. It combines computation and numeration items with word problems, estimation, statistics, charts and graphs, geometry, measurement, and algebra to help you pinpoint students' math levels more accurately and efficiently.

Scaled Score (SS) [Number line from 0 SS to 1400 SS, with 900 SS and a sample score of 615 SS marked.] The scaled score is based on question difficulty and the number of correct responses. It is the best score to use for measuring growth because it can be used to compare student performance over time and across grade levels. It is useful in setting goals, measuring progress within a school year, and measuring growth from year to year. The scaled score is used to derive the other STAR scores.
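The slide says the scaled score combines question difficulty with the number of correct responses. One common way to combine those two things is an item response theory model. The sketch below is purely illustrative, not the actual STAR scaling: it estimates ability under a simple one-parameter logistic (Rasch) model and then applies an invented linear mapping to a score scale.

```python
import math

def rasch_ability_estimate(responses, iters=50):
    """Illustrative Rasch ability estimate from (item_difficulty, correct) pairs.

    Uses Newton-Raphson to find the ability theta that maximizes the likelihood
    of the observed right/wrong pattern. A sketch of the general idea only.
    """
    theta = 0.0
    for _ in range(iters):
        grad, info = 0.0, 0.0
        for difficulty, correct in responses:
            p = 1.0 / (1.0 + math.exp(-(theta - difficulty)))  # P(correct | theta)
            grad += (1.0 if correct else 0.0) - p
            info += p * (1.0 - p)
        if info == 0:
            break
        theta += grad / info
    return theta

# Example: easier items answered correctly, harder items missed
pattern = [(-1.0, True), (-0.5, True), (0.0, True), (0.5, False), (1.0, False)]
theta = rasch_ability_estimate(pattern)
print(round(theta, 2))            # ability on the logit scale
print(round(500 + 100 * theta))   # hypothetical linear mapping to a scaled score
```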

Percentile Rank (PR) [Scale from 1 PR to 99 PR, with a sample score of 52 PR marked.] To understand growth, we need to understand the scores used to measure that growth. Quick review: Percentile Rank is a norm-referenced score. It gives the best measure of the student's reading ability relative to peers and ranges from 1 to 99. (click) Example: a PR of 52 means the student's reading skills are greater than those of 52% of same-grade students. PR is useful when comparing students at the same grade level. It is also helpful when looking at student growth relative to peers: if students are growing in PR, they are surpassing average growth rates and experiencing an acceleration in learning.
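As a quick illustration of what "greater than 52% of same-grade students" means, here is a sketch that derives a percentile rank from a norming sample. The function and sample scores are invented for illustration; actual STAR PRs come from Renaissance's national norms, not from your own class data.

```python
def percentile_rank(score, norm_sample):
    """Percent of scores in the norming sample below the given score,
    clamped to the 1-99 range used for PR. Illustrative only."""
    below = sum(1 for s in norm_sample if s < score)
    pr = round(100 * below / len(norm_sample))
    return max(1, min(99, pr))

# Hypothetical same-grade norming sample of scaled scores
norms = [310, 355, 372, 401, 418, 440, 452, 479, 483, 508,
         522, 547, 560, 590, 612, 634, 655, 671, 702, 745]
print(percentile_rank(508, norms))  # share of peers this student outscores
```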

Grade Level Equivalent Score (GE) 4.8

Two types of growth: Scaled Score measures absolute growth; Percentile Rank measures relative growth. When using STAR Reading to monitor progress, you'll need to be familiar with both the scaled score and the percentile rank. The scaled score will tell you about the absolute growth of the student: the raw amount of growth they have made. This is similar to a doctor measuring the height of a child in inches. (click) The percentile rank will tell you about the relative growth of a student: how his growth compares to other students in the same grade. A doctor looks at relative growth by using a growth chart to compare the child's height with other children the same age.

Estimated Oral Reading Fluency A New Measure in STAR-EL and STAR Reading

STAR and Estimated ORF: the measure resulted from a correlational study of STAR (Early Literacy and Reading) scores and DIBELS Oral Reading Fluency, with a statistically huge sample (N = 12,220).

What a correlation!

What a correlation!

Notice What Happens: the correlation grows stronger in the earlier grades. Grade 4, r = 0.71; Grade 3, r = 0.78; Grade 2, r = 0.84; Grade 1, r = 0.87.
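For reference, r values like these are presumably Pearson correlations between paired scores. The sketch below shows how such a coefficient is computed; the paired data are invented for illustration and are not the study's actual N = 12,220 sample.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two paired score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical pairs: STAR scaled score vs. DIBELS oral reading fluency (words correct per minute)
star_ss = [250, 310, 360, 420, 470, 530, 590, 640]
dibels_orf = [30, 45, 52, 70, 78, 95, 104, 118]
print(round(pearson_r(star_ss, dibels_orf), 2))
```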

Diagnostic Information

Reasons for score fluctuation: standard error of measurement (chance), fluctuation in student performance (good day, bad day), and regression to the mean (a statistical phenomenon). There are three main reasons for this normal fluctuation. First is the standard error of measurement, the extent to which scores would be expected to fluctuate due to chance. The standard error of measurement can be calculated, and a chart listing the SEM can be found in the Understanding Reliability and Validity document in the software. (click) Next, there are fluctuations in student performance. A student may perform at her best during one administration of a test and somewhat less than her best on another occasion. These fluctuations may be related to such things as illness, distractions, anxiety, or motivation. (click) A final reason for score fluctuation is a statistical phenomenon called regression to the mean: students with the highest and lowest scores on a first test tend to score closer to the average on the next test. More information can be found in the Understanding Reliability and Validity guide under the Resources tab in the software.
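As context for the first reason, the classical test theory formula for the standard error of measurement is SEM = SD * sqrt(1 - reliability). The sketch below uses that textbook formula with made-up numbers to show how an SEM translates into an expected score band; the SEM tables published in the software are the authoritative values for STAR.

```python
import math

def standard_error_of_measurement(sd, reliability):
    """Classical test theory: SEM = SD * sqrt(1 - reliability)."""
    return sd * math.sqrt(1 - reliability)

def score_band(observed_score, sem, z=1.96):
    """Approximate 95% band around an observed score (plus or minus 1.96 SEM)."""
    return observed_score - z * sem, observed_score + z * sem

# Hypothetical values: scaled-score SD of 100 points, test reliability of 0.95
sem = standard_error_of_measurement(sd=100, reliability=0.95)
low, high = score_band(observed_score=615, sem=sem)
print(round(sem, 1), round(low), round(high))
```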

Fidelity of Administration Pre-Test Instructions

Item Time Limits
STAR Reading: grades K-2, 60 seconds per item; grades 3 and up, 45 seconds per short comprehension item and 90 seconds per extended comprehension item.
STAR Math: all items have a 3-minute time limit.
STAR Early Literacy: all items have a 90-second time limit.
All assessments: testing time does not affect scores, and there is no overall test time limit; the item time limits are in place to maintain test security and keep the test moving. A warning clock appears in the upper-right corner during the last 15 seconds. When the time limit expires, the test continues to the next item and counts the timed-out item as incorrect unless the student selected the correct answer before time ran out.

Using STAR in an RtI Setting

[Flow diagram, page 2 of the handouts: Core Instructional Program (Tier 1), Screening with STAR, Data Review, Intervention A (Tier 2), Progress Monitoring with STAR, Data Review, progress shown or lack of progress, Intervention B.] The process schools follow when implementing RtI can vary. The core instructional program is the foundation. This is where learning takes place every day for all students, so it is important that students are engaged and best practices are followed. No matter what difficulties individual students may have, creating and maintaining a strong core instructional program is essential. (click) Periodically, all students in the core instructional program are assessed to determine their degree of success within that program. This is called universal screening, and STAR can be used to accurately and efficiently screen all students. (click) The information from that screening is reviewed, generally by a team made up of teachers, administrators, and specialists. The team determines which students need additional opportunities to learn the material. (click) An intervention begins for students who need extra help. This is done in addition to the core instructional program, not in place of it. (click) The progress of the students receiving the intervention is monitored on a regular basis, usually between weekly and monthly. The STAR assessment is an appropriate tool for progress monitoring. (click) The data are reviewed to determine whether the students are experiencing success with that intervention or are still having difficulties. (click) Students showing progress may continue with the intervention or (click) stop the intervention and return to receiving just core instruction. (click) If students are not showing progress, check that the intervention is being administered with fidelity. Also consider intensifying the intervention, for instance by increasing the amount of time the student receives intervention services. (click) Or try a different intervention, perhaps a more intensive one, such as individualized instruction.

Screening Report. The report image is a hyperlink to a full PDF. Talk through the report format: the black bar represents the benchmark (40th percentile, which can be adjusted). Each bar represents a student: green, at or above benchmark (40th percentile or higher); blue, on watch (between the 40th and 25th percentiles); yellow, intervention (between the 25th and 10th percentiles); red, urgent intervention (10th percentile or below). Briefly discuss the data on this specific report sample: fall screening, Grade 6, Reading. Time permitting, ask the audience, "If this report were for a school you supervised, would you be pleased with its performance? Why or why not?" Answer: yes. 72% of students meet or exceed the benchmark (the target is 80%, so they are close), and the percentages in yellow and red fall close to the RtI rules of thumb for Tier 2 and Tier 3 (15% and 3-5%, respectively).
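Here is a small sketch of the cut-point logic described above, assigning students to the Screening Report categories from their percentile ranks. The 40/25/10 cuts come from the slide; the function and sample data are illustrative, and the benchmark is adjustable in the software.

```python
def screening_category(pr, benchmark=40, on_watch=25, urgent=10):
    """Map a percentile rank to the Screening Report color categories."""
    if pr >= benchmark:
        return "green: at/above benchmark"
    if pr >= on_watch:
        return "blue: on watch"
    if pr > urgent:
        return "yellow: intervention"
    return "red: urgent intervention"

class_prs = {"Ana": 72, "Ben": 38, "Cal": 18, "Dee": 9}
for student, pr in class_prs.items():
    print(student, screening_category(pr))
```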

A Math Report

No hyperlink.

Are you satisfied with the number/percent of students in your class/grade who are at or above proficiency? Is core instruction effective? Some students are in the "red": based on your opinion of the student(s) and other information you have, is this student in need of "urgent intervention"? Some students (blue) are "on watch," or "almost" at benchmark: which of these are you "worried about," and which do you think will progress with continued core instruction? What is being done, or what do you think needs to be done, for those students in need of "intervention" (yellow)? These questions are from George Batsche (University of South Florida), a noted expert in RtI, who wrote them to help people interact with the data presented on the Screening Report.

Screening: "The testing needs to be brief, easy to administer, reliable, and valid" (Applebaum, 2009, p. 4). "Screening is a type of assessment that is characterized by providing quick, low-cost, repeatable testing of age-appropriate critical skills or behaviors" (NRCLD, 2006, p. 1.2). "For a screening measure to be useful, it should . . . be practical" (NRCLD, 2006, p. 1.2). More kids don't need screening than do.

Clean the Fish Tank! If the water in the fish tank is dirty, you don't start taking out individual fish and diagnosing their needs. You clean the fish tank! (Heather Diamond, FL Department of Education)

Progress Monitoring

What's wrong? Nothing! From a statistical and systems standpoint, things are fine with this report. The teacher, however, felt "something must be wrong" because there are so many test scores under the line. This is a clear case where we need to ignore the blue marks and trust the line. Statistically, the individual blue dots are irrelevant; they mean nothing on their own. It's only when they combine to form a line that meaning begins to emerge, and the formula for that line is far more complex than "eyeballing" it.
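The "formula more complex than eyeballing" refers to the trend line computed through the progress-monitoring scores. As an illustration of the general idea, the sketch below fits a plain ordinary least squares line through invented weekly scores; the STAR software's actual trend-line calculation may differ.

```python
def trend_line(weeks, scores):
    """Ordinary least squares fit: returns (slope in SS per week, intercept)."""
    n = len(weeks)
    mean_w = sum(weeks) / n
    mean_s = sum(scores) / n
    slope = (sum((w - mean_w) * (s - mean_s) for w, s in zip(weeks, scores))
             / sum((w - mean_w) ** 2 for w in weeks))
    intercept = mean_s - slope * mean_w
    return slope, intercept

# Hypothetical weekly STAR scaled scores during an intervention
weeks = [1, 2, 3, 4, 5, 6, 7, 8]
scores = [412, 405, 431, 422, 447, 439, 462, 458]
slope, intercept = trend_line(weeks, scores)
print(f"growth: {slope:.1f} SS per week; projected score in week 12: {intercept + slope * 12:.0f}")
```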

Student Growth in PR and SS (these are STAR Reading equivalents). Why do you need to look at both? It is possible, even common, for the SS to change while the PR does not, because each PR at each grade level corresponds to more than one scaled score. Example: (click) a fifth-grade student has a PR of 22 with a scaled score of 479. If she scores 484 SS on the next STAR assessment, (click) she will have made absolute growth as reflected by the SS, but her PR is still 22; she did not make growth relative to her peers.

Meta-skill assessments & single-skill probes. The STAR assessments are how you determine whether a student's reading or math skills have improved overall. Teachers may also choose to monitor progress in individual skills, but ultimately, because you want better readers or math students, you'll want to monitor improvement in reading or math as a whole. STAR assessments measure skills as a whole; individual probes measure skill on isolated tasks.

Progress Monitoring. It is a common misconception that accurate measurement of isolated skills requires an explicit, separate test for each skill. On the contrary, because the subskills in a given domain are highly interrelated, subskill scores can be derived more accurately, and more efficiently, from a student's overall test performance, which provides far more data from more items than a short probe.

Goal Setting

Star Hosted Data

The Statistics Behind the Goal-Setting Wizard [Chart spanning grades K through 12.]

Goal-Setting [Sample goal-setting screen for a student at the 11th PR, showing growth rates: Rate to Maintain 11 PR = 2.0; Rate to Meet Benchmark = 7.0; Moderate Goal = 3.3; Ambitious Goal = 5.5.]
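To make those numbers concrete, the sketch below turns a weekly growth rate into a projected goal score, assuming the rates shown are scaled-score points per week (consistent with the growth-rate discussion later in this training). The starting score and intervention length are invented for illustration.

```python
def goal_score(start_score, weekly_rate, weeks):
    """Projected scaled score if the student grows at the given rate per week."""
    return start_score + weekly_rate * weeks

start = 350      # hypothetical baseline scaled score for a student at the 11th PR
duration = 18    # hypothetical length of the goal period, in weeks
for label, rate in [("maintain 11 PR", 2.0), ("moderate goal", 3.3),
                    ("ambitious goal", 5.5), ("meet benchmark", 7.0)]:
    print(f"{label:15s} {rate:.1f} SS/week -> {goal_score(start, rate, duration):.0f} SS")
```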

Not responding to the intervention: check the fidelity of implementation, give the intervention more time to work, increase the intensity of the intervention, or try a new intervention. We've determined that Y is not responding to the intervention. What could we do next? (Give participants time to discuss and record ideas.) Here are a few options. (click) First, if you stick with Intervention A, you want to make sure the intervention was implemented with fidelity. Was it done right? Was the student engaged and motivated? (click) If you determine it was implemented correctly, you might want to give it more time to work if you notice positive changes in student learning that have not yet been reflected in testing. Is the student more on task? Completing homework that wasn't done before? Seeking help when needed? This may be an indication that learning growth is on its way. (click) You might decide to increase the intensity of the intervention. If the student was previously receiving extra instruction three times a week for 20 minutes, perhaps boosting it to five times a week or to 30-minute sessions would result in significant growth. (click) If you do determine this intervention is just not the right fit for the student, you could try another intervention. This could be another Tier 2 intervention or a Tier 3 intervention, depending on how intense, individualized, or diagnostic the intervention is.

Editing a goal (give the intervention more time to work). Making a change to an intervention you're trying with a student may mean that you want to edit the previous goal or start a new goal. For example, if you decide to give the intervention more time to work, you may need to edit the goal end date in the software to reflect this change.

Starting a new intervention (try a new intervention). If you decide to try a new intervention, you will want to start a new intervention in the software and use the Goal-Setting Wizard to set a new goal. This results in a vertical red line being drawn to indicate the change in intervention, which allows you to see how the student's response to the new intervention differs from the previous intervention. (Screenshot to show this?)

How is Y responding to the new intervention? (New mock-ups will be created for this screen. The new sample would be similar to this, with two interventions shown, or perhaps with just one successful intervention.) For now, let's assume that you started a new intervention with the student. A new goal was set in the software, resulting in a vertical red line being drawn and a new goal and goal line appearing on the Student Progress Monitoring report. After seven subsequent tests, a new picture emerges. What do you notice about how the student is responding to the new intervention? Is the student on track to meet her goal? How do you know? Note the growth rate on page two: here we can see student growth in scaled scores per week and compare it to the expected growth rate above.
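The page-two comparison amounts to dividing the observed scaled-score gain by the number of weeks elapsed and checking it against the goal's expected rate. A small sketch with invented dates, scores, and an assumed expected rate:

```python
from datetime import date

def observed_growth_rate(first_score, last_score, start, end):
    """Observed growth in scaled-score points per week between two test dates."""
    weeks = (end - start).days / 7
    return (last_score - first_score) / weeks

# Hypothetical progress-monitoring data for the new intervention
rate = observed_growth_rate(first_score=418, last_score=475,
                            start=date(2012, 1, 9), end=date(2012, 3, 19))
expected_rate = 5.5  # e.g., the ambitious goal chosen in the Goal-Setting Wizard
print(f"observed: {rate:.1f} SS/week vs. expected: {expected_rate:.1f} SS/week")
print("on track" if rate >= expected_rate else "not on track")
```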

STAR Early Literacy, STAR Math, and STAR Reading: very highly rated for both screening and progress monitoring by the National Center on Response to Intervention (www.rti4success.org).

High rating from National Center on RTI Screening Tools Chart (partial) www.rti4success.org

High rating from National Center on RTI Progress Monitoring Tools Chart (partial) STAR Reading, STAR Early Literacy, and STAR Math received the highest ratings in eight of the nine categories, each receiving significantly higher ratings than several other tools. www.rti4success.org

Resources: help in the software (software and technical manuals, Live Chat); technical support by phone ((800) 338-4204), email (answers@renlearn.com), and knowledge base; Renaissance Training Center with on-demand sessions (www.renlearn.com/profdevel); Getting the Most out of STAR Reading guide (STAR Math and STAR Early Literacy guides to come).

RtI Experts: Dr. George Batsche, University of South Florida; Dr. Matt Burns, University of Minnesota; Dr. Ted Christ, University of Minnesota; Dr. Joe Kovaleski, Indiana University of Pennsylvania; Dr. Jim Ysseldyke, University of Minnesota; Dr. Ed Shapiro, Lehigh University; Dr. Amanda VanDerHeyden, consultant.