
1 Strategies for Maintaining Data Quality Using Commercial Assessment Systems Nick Ortiz Colorado Department of Education Barb Jackson University of Nebraska Medical Center-Munroe Meyer Institute Jan Thelen Nebraska Department of Education

2 Agenda
1. Review of online assessment systems in Colorado and Nebraska
2. Using automatically generated reports
3. Pattern checking with raw data
4. Helping local districts analyze data quality

3 How do we use online assessment systems?
Federal reporting | State reporting | CC | DC and/or GOLD | COR
Colorado: uses all five
Nebraska: uses three
Hierarchy: State → Program → Teacher → Child
No COSF rating
Data output options:
1. Automated user reports
2. Raw data

4 General Reports: CreativeCurriculum.net

5 Raw Data & Mgmt Reports: CreativeCurriculum.net

6 General Reports: Online COR

7 Strategies Using Automated Reports
–Random checks by state
–Targeted checks by state
–State provides guidance to local school districts on running checks (administrator trainings, etc.)
–State prompts local districts to run their own checks

8 Pattern Checking with Automated Reports
1) Have observations been entered? (Observation Notes report, Classroom Status report)
2) Are observations of good quality?
3) Is the corresponding score accurate? Check observation notes

9 Pattern Checking with Automated Reports
Make sure all children are included in basic reports (Snapshot or Gains report)
Verify totals match the expected numbers of children in your system
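Comparing report totals against expected enrollment can be scripted once raw data is exported. A minimal sketch (the `program` and `child_id` column names and the enrollment figures are illustrative assumptions, not the systems' actual export schema):

```python
# Sketch: verify that per-program child counts in a report export match
# the enrollment numbers the district expects. Column names are hypothetical.
import pandas as pd

def check_totals(report: pd.DataFrame, enrollment: dict) -> pd.DataFrame:
    """Compare unique child counts per program against expected enrollment."""
    counts = report.groupby("program")["child_id"].nunique()
    result = counts.to_frame("in_report")
    result["expected"] = pd.Series(enrollment)
    result["mismatch"] = result["in_report"] != result["expected"]
    return result

# Made-up example export: program B is missing a child
report = pd.DataFrame({
    "program": ["A", "A", "A", "B", "B"],
    "child_id": [1, 2, 3, 4, 5],
})
print(check_totals(report, {"A": 3, "B": 3}))
```

Any row flagged `mismatch` points to children missing from (or duplicated in) the assessment system.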

10 Pattern Checking with Automated Reports
Verify assessments are complete (Snapshot, Gains, Assessment Status reports)
Use of "not observed"/"missing data" is discouraged
Frequent missing data could mean:
–Lack of training
–Insufficient time for observation
–Invalid data
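A quick way to surface frequent missing data is to compute the share of unscored items per teacher from a raw export. A sketch, assuming hypothetical `teacher` and item columns and an arbitrary 10% threshold:

```python
# Sketch: flag teachers whose assessments have a high rate of
# "not observed"/missing item scores. Column names and the 10%
# threshold are illustrative assumptions.
import pandas as pd

def missing_rate_by_teacher(scores: pd.DataFrame, item_cols, threshold=0.10):
    """Return teachers whose average share of missing item scores exceeds threshold."""
    per_child_missing = scores[item_cols].isna().mean(axis=1)
    by_teacher = per_child_missing.groupby(scores["teacher"]).mean()
    return by_teacher[by_teacher > threshold]

# Made-up example: teacher T2 left many items unscored
scores = pd.DataFrame({
    "teacher": ["T1", "T1", "T2", "T2"],
    "item1": [3, 4, None, None],
    "item2": [2, 5, 4, None],
})
print(missing_rate_by_teacher(scores, ["item1", "item2"]))
```

Teachers flagged this way are candidates for follow-up: more training, more observation time, or a data-validity review.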

11 Pattern Checking with Automated Reports
OSEP Entry Status report, Child List report
Verify children with IEPs/IFSPs have correct entry/exit dates, etc.
Verify which children will be included in OSEP reporting

12 Pattern Checking with Automated Reports
Snapshot report
Disaggregate by program, class, child, and/or domain
Look for specific trouble areas that could skew data
Analyze completion patterns at different levels

13 Strategies Using Raw Data
(Screenshot of a raw data export showing item-level scores)

14 Pattern Checking with Raw Data: Correlation between Age and Raw Score by Outcome
Established finding: developmental abilities increase with age
This relationship should lead to high correlations between:
–chronological age and raw score
–chronological age and developmental age
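This check reduces to computing a Pearson correlation on the exported data. A minimal sketch (column names and values are made up):

```python
# Sketch: correlation between chronological age and raw score.
# A weak or negative correlation would warrant a closer look at
# scoring practices. Column names and data are hypothetical.
import pandas as pd

data = pd.DataFrame({
    "age_months": [30, 36, 42, 48, 54, 60],
    "raw_score":  [14, 18, 17, 25, 28, 33],
})
r = data["age_months"].corr(data["raw_score"])  # Pearson r by default
print(f"r = {r:.3f}")
```

With real exports, the same one-liner can be run per outcome area to reproduce a summary like the Nebraska table on the next slide.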

15 Pattern Checking with Raw Data
Nebraska Summary: Correlations between chronological age and raw score for each Outcome across assessments
Assessment: AEPS
Outcome 1: .570 (n=1704)
Outcome 2: .626 (n=1678)
Outcome 3: .606 (n=1699)

16 Pattern Checking with Raw Data
Examine the distribution across OSEP Summary Statements. What patterns do you see?
Summary Statement 1: Percent that substantially increased their rate of growth by exit
Summary Statement 2: Percent of children who were functioning within age expectations in each outcome at exit
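Both summary statements are derived from counts of children in the five OSEP progress categories (a–e). A sketch of the standard calculation (the counts below are made up):

```python
# Sketch: OSEP summary statements from progress-category counts.
# Formulas follow the federal OSEP reporting definitions; the counts
# passed in are invented for illustration.
def summary_statements(a: int, b: int, c: int, d: int, e: int):
    total = a + b + c + d + e
    # SS1: of children who entered below age expectations (a-d),
    # the share who substantially increased their rate of growth (c, d)
    ss1 = (c + d) / (a + b + c + d)
    # SS2: share of all children functioning within age expectations at exit (d, e)
    ss2 = (d + e) / total
    return ss1, ss2

ss1, ss2 = summary_statements(a=10, b=15, c=40, d=60, e=75)
print(f"SS1 = {ss1:.0%}, SS2 = {ss2:.0%}")
```

Recomputing these from raw category counts and comparing them to the system's automated report is itself a useful consistency check.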

17 Pattern Checking with Raw Data
Degree to which outcomes are related to demographic characteristics:
–Gender
–Primary language in the home
–Ethnicity
–Disability
Run online OSEP Mandated Federal Reports for Part C or B

18

19 Pattern Checking with Raw Data
Are there differences that you were not expecting by gender, primary language in the home, ethnicity, and disability?
What are the programmatic implications?

20 Pattern Checking with Raw Data: Relationship with Disability
Entry and exit scores and OSEP categories should be related to the nature of the child's disability
For many, but not all, children with disabilities, progress in functioning in the three outcomes proceeds together

21 Pattern Checking with Raw Data: Comparison by Disability
| Disability | SS1: Outcome A | SS2: Outcome A | SS1: Outcome B | SS2: Outcome B | SS1: Outcome C | SS2: Outcome C |
| Developmental Delay (n=99) | 47% | 67% | 72% | | 63% | 69% |
| Speech/Language Impairment (n=81) | 51% | 72% | 80% | 75% | 64% | 76% |

22 Pattern Checking with Raw Data: Relationship with Gender, Ethnicity, and Primary Language
Hypothesis: There will not be a relationship between entry and exit scores or OSEP categories and the child's gender, ethnicity, or primary language.

23 Pattern Checking with Raw Data: Comparison by Ethnicity
| Ethnicity | SS1: Outcome A | SS2: Outcome A | SS1: Outcome B | SS2: Outcome B | SS1: Outcome C | SS2: Outcome C |
| White (n=183) | 46% | 68% | 74% | 72% | 50% | 68% |
| All other ethnicities (n=39) | 50% | 52% | 61% | 65% | 50% | 52% |

24 Pattern Checking with Raw Data: Comparison by Primary Language
| Primary Language | SS1: Outcome A | SS2: Outcome A | SS1: Outcome B | SS2: Outcome B | SS1: Outcome C | SS2: Outcome C |
| ELL (n=39) | 56% | 49% | 68% | 50% | 66% | 58% |
| English (n=252) | 64% | 75% | 76% | 78% | 67% | 77% |

25 Work Sampling System ● 2006-09: High frequency in OSEP Progress Category A
Categories: A = exit is 5 or lower, and is less than entry (no progress); B = exit is 5 or lower, and is less than or equal to entry (progress); C = exit is 5 or lower, and is higher than entry; D = exit is 6 or higher, and entry is 5 or lower; E = score of 6 or 7 at both entry and exit.

Outcome 1
| Category | non-IEP N | non-IEP % | IEP N | IEP % |
| A | 195 | 1.6% | 30 | 3.6% |
| B | 143 | 1.1% | 39 | 4.7% |
| C | 453 | 3.6% | 120 | 14.5% |
| D | 4624 | 36.9% | 333 | 40.2% |
| E | 7110 | 56.8% | 306 | 37.0% |
| Total | 12525 | | 828 | |

Outcome 2
| Category | non-IEP N | non-IEP % | IEP N | IEP % |
| A | 107 | 0.8% | 27 | 3.1% |
| B | 192 | 1.5% | 80 | 9.3% |
| C | 427 | 3.3% | 90 | 10.4% |
| D | 5058 | 39.5% | 386 | 44.7% |
| E | 7005 | 54.8% | 281 | 32.5% |
| Total | 12789 | | 864 | |

Outcome 3
| Category | non-IEP N | non-IEP % | IEP N | IEP % |
| A | 170 | 1.5% | 46 | 6.4% |
| B | 111 | 1.0% | 72 | 10.0% |
| C | 338 | 3.0% | 94 | 13.0% |
| D | 4766 | 41.9% | 324 | 44.8% |
| E | 5990 | 52.7% | 187 | 25.9% |
| Total | 11375 | | 723 | |

26 Looking closely at children who made no progress… Pattern Checking with Raw Data
| Outcome Area | Total children who made no progress | Of those, children with identical fall and spring scoring patterns | Percent with identical patterns across checkpoints |
| Outcome 1 | 225 | 149 | 66% |
| Outcome 2 | 134 | 86 | 64% |
| Outcome 3 | 216 | 130 | 60% |
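Finding "no progress" children whose item-level patterns are identical at both checkpoints can be automated against a raw export. A sketch, assuming hypothetical `child_id` and item column names:

```python
# Sketch: among children with no progress (spring total <= fall total),
# find those whose item-level fall and spring scores are identical --
# a possible sign of a copied or mis-administered assessment.
# Column names and data are hypothetical.
import pandas as pd

def identical_pattern_flags(fall: pd.DataFrame, spring: pd.DataFrame, item_cols):
    merged = fall.merge(spring, on="child_id", suffixes=("_f", "_s"))
    f_cols = [f"{c}_f" for c in item_cols]
    s_cols = [f"{c}_s" for c in item_cols]
    no_progress = merged[s_cols].sum(axis=1) <= merged[f_cols].sum(axis=1)
    identical = (merged[f_cols].to_numpy() == merged[s_cols].to_numpy()).all(axis=1)
    return merged.loc[no_progress & identical, "child_id"].tolist()

# Child 1: identical scores fall/spring (flagged); child 2: made progress
fall = pd.DataFrame({"child_id": [1, 2], "i1": [3, 2], "i2": [4, 2]})
spring = pd.DataFrame({"child_id": [1, 2], "i1": [3, 3], "i2": [4, 4]})
print(identical_pattern_flags(fall, spring, ["i1", "i2"]))
```

Run per outcome area, this reproduces the kind of counts shown in the table above.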

27 Pattern Checking – Children Who Make No Progress
Children identified as making "no progress" may reflect mis-administered assessments
Solution → professional development

28 Pattern Checking with Raw Data: Further Analyses
–Anomalies in funding source combinations
–Duplicate child records
–Erroneous birth dates
–Score profiles across the program year
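Two of these checks, duplicate records and erroneous birth dates, are straightforward to script. A sketch (field names, the reference date, and the 0–6 year plausibility range are assumptions):

```python
# Sketch: flag duplicate child records and implausible birth dates in a
# raw export. Field names, reference date, and the preschool age range
# are illustrative assumptions.
import pandas as pd

def data_checks(df: pd.DataFrame, ref_date: str = "2009-10-01"):
    # Rows sharing name and date of birth are likely duplicate records
    dupes = df[df.duplicated(subset=["last_name", "first_name", "dob"], keep=False)]
    # Birth dates implying a negative age or an age past preschool are suspect
    age_years = (pd.Timestamp(ref_date) - pd.to_datetime(df["dob"])).dt.days / 365.25
    bad_dob = df[(age_years < 0) | (age_years > 6)]
    return dupes, bad_dob

df = pd.DataFrame({
    "last_name": ["Lee", "Lee", "Ruiz"],
    "first_name": ["Ana", "Ana", "Bo"],
    "dob": ["2005-03-02", "2005-03-02", "1999-01-15"],
})
dupes, bad_dob = data_checks(df)
print(len(dupes), len(bad_dob))
```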

29 Pattern Checking with Raw Data: Further Analyses
Export the raw data and run statistical analyses to test the significance of patterns
Export the child-level data and complete cross tabs in Excel:
–Disability with ELL
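The same cross tab can be produced outside Excel with pandas. A sketch with made-up column names and categories:

```python
# Sketch: cross tab of disability category by ELL status from
# child-level data. Column names and categories are hypothetical.
import pandas as pd

children = pd.DataFrame({
    "disability": ["DD", "DD", "SLI", "SLI", "SLI"],
    "ell":        ["Yes", "No", "No", "No", "Yes"],
})
ct = pd.crosstab(children["disability"], children["ell"], margins=True)
print(ct)
```

`margins=True` adds row and column totals, which makes it easy to spot disability categories that are over- or under-represented among ELL children.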

30 Specific Strategies for Enabling Local Programs

31 Strategies for Enabling Local Programs: Colorado
–Data "Chats"
–Screen capture video tutorials
–Memo to local Head Start programs about misinterpreting patterns

32 NE: Navigating Reports in Online Systems: Support for Local Administrators
Use of reports to monitor quality
–Webinars
–Lab workshops
–Email alerts with instructions, individualized by assessment
Facilitating communication with partners
–Clarifying roles and responsibilities across partnering agencies

33 NE: Using Data as Part of a Continuous Improvement Process
Annual Results Matter Institute
–National speakers
–Local districts share how they use reports
Individual consultation
SPP/APR
–Meeting targets
–Supporting improvement activities

34 Questions?

35 Thank You!

