
1 Reading CBM Data Measures Cathy Claes Anna Harms Jennifer Rollenhagen MiBLSi State Implementer’s Conference, 2012

2

3 http://miblsi.cenmi.org

4 Agenda Updates from 2012 DIBELS Summit AIMSweb updates Universal Screening Flowchart Advanced Report Generation Reading Data Coordinator Listserv 2 minutes: Share with your elbow partner which agenda item you are most interested in and why.

5 Updates from 2012 DIBELS Summit and DMG What’s next after Next DIBELSnet Webinars with Wireless Generation

6 What’s next after Next IDAPEL – French version IDEL – revision of Spanish version will be released soon CRTIEC – Center on Response to Intervention in Early Childhood: www.crtiec.org –Descriptive study of Tier 1 –Tier 2 & 3 Interventions –Progress Monitoring –Goal to disseminate findings and provide national leadership

7 What’s next after Next CFOL – Comprehension Fluency Oral Language –Is in development WUF-R –Changes were made Modified administration – all words administered, total time recorded Modified word pool Words were assigned theoretical categories Words randomly stratified –After initial research, changes were made again. Next study will include modified scoring rules to include partial credit

8 What’s next after Next DIBELS Deep –Brief diagnostic assessments –Linked to DIBELS Next –For students who are not yet at benchmark or for those who are very inaccurate –Fits in the plan/support phase of the Outcomes-Driven Model –Aligned with the Common Core State Standards –Assessments include Phonemic Awareness and Word Reading and Decoding (WRD) –Not all assessments are timed –Teaching suggestions are included to help move a student toward the concept –Routing form

9 What’s next after Next DIBELS Survey –Based on DIBELS Next Benchmark Assessment scores –Administered to students who have not reached benchmark goals Takes 5-20 minutes per student to administer Includes four DIBELS Next Measures: –First Sound Fluency (FSF) –Phoneme Segmentation Fluency (PSF) –Nonsense Word Fluency (NWF) –DIBELS Oral Reading Fluency (DORF) Includes a manual, a flipbook with assessor directions and student materials, and scoring booklets –Supports off grade-level progress monitoring –Accurately identifies the level of materials needed for progress monitoring –Supports teachers in selecting intervention materials at the right level

10 What’s next after Next DIBELS Math –This will be different than other CBM measures currently in the field –Aligned with Common Core State Standards –Still determining correct timing and amount of growth –Early Numeracy and Computation measures: K – Beginning Quantity Discrimination; K-1 – Number Identification Fluency (1-99); K-1 – Next Number Fluency; 1 – Advanced Quantity Discrimination; 1 – Missing Number Fluency; 1-5 – Computation (addition, subtraction, multiplication, division, & fractions)

11 What’s next after Next DIBELS 7-9 –The goal is to assess a broad range of critical reading comprehension skills –Currently 190 science, social studies, and prose passages are being written. Passages must flow well, have strong content, and be factually accurate. Passages are currently going through a rigorous process of review and revision –Comprehension pieces Daze Recall (passage-specific questions for vocabulary, events, and summarizing) Multiple choice (passage-specific questions for vocabulary, events, and summarizing) –Goal Readability studies in 2012-2013, Comprehension studies in 2012-2013, Benchmark studies in 2013-2014

12 What’s next with Next PELI – Preschool Early Literacy Indicators –Screening and progress monitoring for ages 3-6 –Book format –5-7 minutes to administer –Skills assessed Comprehension – literal questions, predictions, inferences Alphabet Knowledge – Name upper case letters Phonemic Awareness Vocabulary/Oral Language Story Retell –Current & Future work Development of new books – up to 10 Benchmark goal study Sensitivity to intervention study Version study

13 Research Opportunities with DMG Contact: –DIBELS Deep Kelly Powell-Smith kpowellsmith@dibels.org –Math and WUF-R Courtney Wheeler cwheeler@dibels.org –PELI and 7-9 Mary Abbott mabbott@dibels.org

14 DIBELSnet Sample Reports

15 Objectives Introduce a new data service from the authors of DIBELS Next, called DIBELSnet Present features of the system Review sample reports Show data entry fields Discuss frequently asked questions

16 Features of DIBELSnet Created and managed by Dynamic Measurement Group, the authors of DIBELS Next Supports data entry of: DIBELS Next, DIBELS 6th Edition, IDEL (Spanish), IDAPEL (French), PELI (Preschool), DIBELS Math Built to support the use of DIBELS within an Outcomes-Driven Model

17 Log In Screen

18 School or District Overview Report Can be generated by school or district Identifies School, Grade and Year Shows the % in each benchmark status category

19 School or District Overview Report

20 Status by Grade Identifies District Shows the % in each benchmark status category

21 Status by Grade

22 Status by Measure Identifies District and Grade

23 Histogram and Box Plot Identifies District and Grade Shows the # of students at each score level

24 Histogram and Box Plot Shows the range of scores relative to the benchmark goal at the beginning, middle and end of the year for one measure

25 Effectiveness of Instructional Levels By School

26 Classroom Report

27 Grouping Report

28 Progress Monitoring Report

29 Student Benchmark Assessment History Report

30 Frequently Asked Questions What is the cost? $1 per student per year. Can historical data be imported? Yes. Will historical data for DIBELS 6th Edition be displayed with DIBELS Next? Yes, a line will delineate when the transition occurred. Can I determine who has access to what data? Yes, local personnel assign passwords and determine level of access.

31 Frequently Asked Questions Do I have to sign an agreement? Yes, it can be downloaded from dibels.net. What support is available? A manual can be downloaded from dibels.net. Who do I contact? Josh Wallin jwallin@dibels.org or Customer Support info@dibels.org

32 DMG Monthly Webinars www.dibels.org March 27, 3-4 p.m. EST: Data Driven Decision Making April 24, 3-4 p.m. EST: DIBELS and the Common Core May 22, 3-4 p.m. EST: Role of Retell in DIBELS

33 AIMSweb Updates Browser-Based Scoring Review of AIMSweb National Norms and Default Criteria Webinar dates/Resources

34 Browser-Based Scoring Online scoring Eliminates need to print student scoring booklets, hand-score, and then enter scores online. Does require that each screener has access to a computer or other electronic device with internet access. Used to be an additional cost per student, but is now included as part of the Reading Pro package. Can be used for the following reading subtests: Letter Naming Fluency Letter Sound Fluency Nonsense Word Fluency Phoneme Segmentation Fluency Reading CBM

35

36

37 Browser-Based Scoring Tutorial Videos Developed by: Lisa Langell, M.A., S.Psy.S. National Manager of AIMSweb Professional Development Assessment & Information lisa.langell@pearson.com BBS Public Overview: http://dl.dropbox.com/u/26148540/BBS-publicdemo.mp4 R-CBM Training BBS: http://dl.dropbox.com/u/26148540/bbs-rcbm-12-8-2011.mp4 TEL Training BBS: http://dl.dropbox.com/u/26148540/BBS-TEL.mp4

38 AIMSweb National Norms

39 7 Sources of Info on the National Norms and Default Cut Scores www.aimsweb.com

40 What are the National Norms? A set of Raw Scores and Percentile Ranks that was designed to be representative of the national student population. Data were selected from schools with an AIMSweb account. Most data were from the 2009-10 school year.

41 What are the National Norms? Only schools conducting universal screening in the Fall, Winter and Spring were used to develop the National Norms (schools using AIMSweb with only a subpopulation were excluded). A school and grade level was included if at least 95% of the enrolled population (based on NCES numbers) was screened in Fall, Winter and Spring.
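
To make the inclusion rule concrete, here is a minimal Python sketch assuming hypothetical record layouts (the `include_in_norms` helper and its inputs are illustrative, not the procedure AIMSweb actually ran):

```python
# Minimal sketch of the inclusion rule described above, with assumed inputs;
# NOT the procedure AIMSweb actually ran. Keep a school/grade only if at least
# 95% of NCES enrollment was screened in Fall, Winter, and Spring.
SEASONS = ("fall", "winter", "spring")

def include_in_norms(screened_counts, nces_enrollment, threshold=0.95):
    """screened_counts maps each season to the number of students screened."""
    if nces_enrollment <= 0:
        return False
    return all(screened_counts.get(season, 0) / nces_enrollment >= threshold
               for season in SEASONS)

# Hypothetical grade-level counts for one school (NCES enrollment of 100)
print(include_in_norms({"fall": 96, "winter": 97, "spring": 95}, 100))  # True
print(include_in_norms({"fall": 96, "winter": 80, "spring": 95}, 100))  # False
```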

42 What are the National Norms? Available for TEL, R-CBM, and Maze (among others) and replace the AIMSweb Aggregate Norms for these measures. Not available for DIBELS 6th Ed., DIBELS Next, or R-SPAN. The AIMSweb Aggregate Norms represented all data in the AIMSweb system from year to year. In most cases, the raw scores for the National Norms are slightly higher than the Aggregate Norms from previous years.

43 AIMSweb National Norms

44 AIMSweb Defaults 2011-12

45 How were the Default Criteria Developed? Using data in AIMSweb accounts from 20 states (Michigan not included). The cut scores represent averages across the states providing data. Tier 1 cut scores for R-CBM: 80% probability of passing the “typical” state test. Tier 2 cut scores for R-CBM: 50% probability of passing the “typical” state test. Because the cuts for R-CBM and M-CAP consistently fell around the 15th and 45th percentiles of the National Norms, the cut scores for Maze and the other math measures were set at the same percentiles of the National Norms. The higher cut score (35th percentile) for TEL and TEN is based on a success-probability study of TEL done by Silberglitt.
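
As a rough illustration of reading default cut scores off the 15th and 45th percentile ranks of a norm distribution, here is a minimal Python sketch with an invented norm sample; it is not the AIMSweb procedure itself:

```python
# Minimal sketch with an invented norm sample; NOT the AIMSweb procedure.
# It only illustrates taking default cut scores at the 15th and 45th
# percentile ranks of a national norm distribution, as described above.
import numpy as np

def default_cuts(norm_scores, tier2_pct=15, tier1_pct=45):
    """Return (tier2_cut, tier1_cut): raw scores at the 15th and 45th percentile ranks."""
    scores = np.asarray(norm_scores, dtype=float)
    return (float(np.percentile(scores, tier2_pct)),
            float(np.percentile(scores, tier1_pct)))

# Hypothetical winter R-CBM norm sample (words read correctly per minute)
norm_sample = [31, 44, 52, 60, 68, 75, 82, 90, 97, 104, 112, 120, 131, 142, 158]
tier2_cut, tier1_cut = default_cuts(norm_sample)
print(f"Below {tier2_cut:.0f}: well below target; {tier2_cut:.0f} up to {tier1_cut:.0f}: "
      f"below target; at/above {tier1_cut:.0f}: on target")
```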

46 AIMSweb Defaults 2011-12

47

48

49

50 How to Run Reports using the AIMSweb Defaults 2011-12

51 Tier Transition Report

52 Select AIMSweb Defaults 2011-12 as the Criteria

53 Tier Transition Report

54 Scores and Percentile (Rainbow)

55 Select AIMSweb Defaults 2011-12 as the Criteria Select Criterion as the Report Method

56 Scores and Percentile (Rainbow)

57 Norm Chart/ Comparison Report

58 Select AIMSweb Defaults 2011-12 as the Target Sets

59 Norm Chart/ Comparison Report Targets based on AIMSweb Defaults 2011-12

60 The AIMSweb Defaults 2011-12 CAN be applied to previous data:

61

62

63

64 Universal Screening & Fidelity Checks Why do we want to do fidelity checks? –Check the accuracy of administration of universal screeners for Benchmark and Progress Monitoring –To ensure Reading CBM data is accurate for decision-making Universal Screening Flowchart Examples of AIMSweb AIRS and DIBELS Next Accuracy Checks

65

66 Examples of Accuracy Checks

67 Early Warning Signs (EWS) Tool Middle & High School Universal Screener www.betterhighschools.org

68 EWS for Middle & High Schools A Universal Tool: Enables schools/districts to identify students who may be at risk for academic failure and monitor students’ responses to interventions Relies on student-level data available at the school or district, including indicators for attendance, course failures, and behavior, to calculate potential risk for eventually dropping out Purpose: To support students with an increased risk of academic failure, in order to get them back on track for academic success and eventual graduation
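
To show how the indicator types named above could feed a simple risk flag, here is a minimal Python sketch; the `Student` fields and thresholds are illustrative assumptions, not the EWS tool’s published cut points:

```python
# Minimal sketch of a risk flag built from the indicator types named above
# (attendance, course failures, behavior). The fields and thresholds are
# illustrative assumptions, NOT the EWS tool's published cut points.
from dataclasses import dataclass

@dataclass
class Student:
    name: str
    attendance_rate: float   # proportion of days attended, 0.0-1.0
    course_failures: int     # courses failed this term
    behavior_incidents: int  # office referrals / suspensions this term

def ews_flags(s: Student) -> list:
    """Return the indicators on which the student shows potential risk."""
    flags = []
    if s.attendance_rate < 0.90:      # assumed threshold
        flags.append("attendance")
    if s.course_failures >= 1:        # assumed threshold
        flags.append("course failure")
    if s.behavior_incidents >= 2:     # assumed threshold
        flags.append("behavior")
    return flags

print(ews_flags(Student("Example Student", attendance_rate=0.86,
                        course_failures=1, behavior_incidents=0)))
# -> ['attendance', 'course failure']
```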

69 EWS Indicators

70 Early Warning Signs

71 Advanced Report Generation and Data Analysis Why conduct a subgroup analysis with CBM measures? –Distribution Report (DIBELS Data System) –Tier Transition Report (AIMSweb) Class List Reports (DIBELS Data System)

72 Reading Data Summary Pink Assessment Binder

73 DIBELS Distribution Reports Disaggregates results by school, class, or demographics Example shows disaggregation by school
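
As a rough illustration of that kind of disaggregation, here is a minimal Python sketch using pandas with invented records; it is not the DIBELS Data System report itself, only an example of breaking benchmark status down by school:

```python
# Minimal sketch of school-level disaggregation with invented records; NOT the
# DIBELS Data System itself, only an illustration of the kind of breakdown the
# Distribution Report provides.
import pandas as pd

records = pd.DataFrame({
    "school": ["Adams", "Adams", "Adams", "Baker", "Baker", "Baker"],
    "status": ["At/Above Benchmark", "Below Benchmark", "At/Above Benchmark",
               "Well Below Benchmark", "At/Above Benchmark", "Below Benchmark"],
})

# Percent of students in each benchmark status category, per school
distribution = (records.groupby("school")["status"]
                .value_counts(normalize=True)
                .mul(100).round(1)
                .unstack(fill_value=0.0))
print(distribution)
```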

74 AIMSweb Subgroup Analysis Tier Transition Report: Grade Level Choose Reports and Grade tabs

75 AIMSweb Analysis Tier Transition Report: Grade Level Click on Expand

76 AIMSweb Subgroup Analysis Tier Transition Report: Grade Level Select: Ethnicity or Meal Status Report Criteria should be at AIMSweb Defaults 2011-2012 Grade: Click Display

77 AIMSweb Subgroup Analysis Tier Transition: Grade Level Demographic information has been filtered based on selection criteria

78 Data Coordinator Listserv Purpose of listserv: Share information related to DIBELS and AIMSweb measures, including: –Measurement/system updates –Tools and resources for data collection and analysis –Networking with other Reading Data Coordinators If you want to be added, contact either: –Nikki Matthews nmatthew@oaisd.org –Jennifer Rollenhagen jrollenhagen@mloisd.org

79 3-2-1 Processing From today’s session: What 3 big ideas are you going to take away? What 2 questions do you have? What 1 action are you going to implement when you return to your district?

