
Communicating through Data Displays October 10, 2006 © 2006 Public Consulting Group, Inc.


1 Communicating through Data Displays October 10, 2006 © 2006 Public Consulting Group, Inc.

2 Key Terms
Aggregated Data: Data presented in summary form (as opposed to student-level data).
Alignment: The quality that allows one test to be compared to another (a vertically aligned test shows real gains or losses from one year to the next).
Disaggregation: Summary data split into subgroups (e.g., gender, race, ethnicity, lunch status, special education status).
Error: Affects the validity of an assessment; includes measurement error and sampling error.
Inference: A conclusion drawn from a data set.
Sample: The group of students included in a data set.
Validity: The degree to which a data set supports the inferences drawn from it.

3 The Framework
Measures: Multiple measures allow for a more complete picture of student performance.
Explorations: Explorations enable looking at the data through different lenses to answer essential questions.
Disaggregators: Disaggregators help reveal the various factors that affect educational outcomes.

4 Types of Measures
Measures are the “yardstick” used to gauge student performance. The more measures that are used, the more robust and complete the picture.
State Assessments (MCAS)
- Usually taken in spring and reported the following fall
- Not vertically aligned; tests vary from year to year
National Assessments (Terra Nova, ITBS, etc.)
- Some districts choose to supplement the state assessment with a national assessment
- Many of these are vertically aligned and are aligned from year to year
- National assessments are not aligned to the state curriculum framework
Diagnostic Assessments (DRA, DIBELS)
- Help identify students who need interventions and supports
- May not be vertically aligned

5 Types of Measures (cont’d)
Subject Area and Course Grades
- Grades are subjective and tend to be inflated
- Grades can be compared to performance on state and other assessments to identify disparities
Disciplinary Records
- Discipline data can be used to monitor high-risk students and explore the impact of behavior on performance
- Discipline consequences provide important information for identifying inequities among groups of students (e.g., students with disabilities, ethnic groups)
Attendance Rates
- Attendance data can be used to identify students who are at risk
- Attendance data can be used to explore the relationship between attendance and performance
Graduation Rates
- Graduation rates can be used to evaluate the effectiveness of curriculum and instruction
- Graduation rates can be analyzed to identify inequities based on student characteristics

6 Types of Explorations
The type of exploration you choose depends on the question you want to answer. Types of exploration include:
- Snapshot
- Historical (Cross-Sectional)
- Longitudinal
- Gains
- Item-Level
- Student Listing
- Correlation

7 Snapshot
Question: How did students perform at a certain point in time?
Shows how a group of students performed on a given measure at a single point in time.
Limitations: This analysis presents only one point in time.
(Graph Type: Histogram / Bar)
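The data behind a snapshot display is simply a tally of how many students sit at each performance level at one moment. A minimal sketch in Python, using invented student names and performance levels:

```python
from collections import Counter

# Hypothetical snapshot: each student's performance level on one
# assessment at a single point in time (names and levels invented).
scores = {
    "Ava": "Proficient", "Ben": "Warning", "Cara": "Advanced",
    "Dan": "Proficient", "Eli": "Needs Improvement", "Fay": "Proficient",
}

# A snapshot display is the distribution of levels at that moment,
# typically drawn as a histogram or bar chart.
snapshot = Counter(scores.values())
for level, count in sorted(snapshot.items()):
    print(f"{level}: {count}")
```

The same tally, charted as bars, is the snapshot graph the slide describes; the limitation stands, since nothing here says how these students performed before or after.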

8 Historical
Question: How did students at a certain grade level perform historically?
Looks at how students at a particular grade level performed on a given measure across multiple years. This is the approach NCLB uses to calculate AYP (Adequate Yearly Progress).
Limitations: This analysis does not account for differences in the group of students from year to year.
(Graph Type: Floating Column)

9 Longitudinal
Question: How did a cohort of students perform over time?
Follows a cohort of students over time and shows “real gains.”
Limitations: Comparisons of a group of students from one year to the next are only valid using a vertically aligned test.
(Graph Type: Line)

10 Gains
Question: How did students who performed at each level on a prior assessment perform on subsequent assessments?
Looks at the extent to which students are improving over time, or losing ground, on a particular measure.
Limitations: Use caution when drawing conclusions about a given student based on performance on only two tests.
(Graph Type: Stacked Column)
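The table underlying a gains display is a tally of level-to-level transitions: for students at each level on the prior test, where did they land on the later one? A minimal sketch with invented (prior, subsequent) level pairs:

```python
from collections import Counter

# Hypothetical gains data: each tuple is one student's performance
# level on a prior assessment and on a subsequent one (all invented).
records = [
    ("Warning", "Needs Improvement"),
    ("Warning", "Warning"),
    ("Needs Improvement", "Proficient"),
    ("Proficient", "Proficient"),
    ("Proficient", "Advanced"),
]

# Count each prior -> subsequent transition; grouped by prior level,
# these counts are the segments of a stacked-column chart.
transitions = Counter(records)
for (prior, later), n in sorted(transitions.items()):
    print(f"{prior} -> {later}: {n}")
```

Note that with only two tests per student, a single transition can reflect measurement error as much as real growth, which is exactly the caution the slide raises.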

11 Student Listing
Question: What are the characteristics of specific students?
Allows analysis of the students in a group in relation to one another. Conditional formatting can be added to highlight outliers.
Limitations: Student listings can be difficult to interpret when too many data elements are included.

12 Item Analysis
Question: How did a group of students perform on an item, or on a set of items, on a specific assessment?
Displays how students did on each item, or within a particular standard or strand. Providing reference groups is important for tests that are not aligned from year to year, because reference groups are the only way to determine relative performance.
Limitations: Smaller sample sizes (e.g., classroom-level) limit the inferences that can be made.
(Graph Type: Scatter)

13 Correlation
Question: How is performance on one measure related to performance on another?
Looks at the relationship between performance on one measure and performance on another. Remember: correlation does not imply causation.
Limitations: Correlations should not be computed for small groups of students.
(Graph Type: Scatter)
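The statistic behind a correlation scatter plot is usually the Pearson correlation coefficient r, which ranges from -1 (perfect inverse relationship) to +1 (perfect direct relationship). A minimal sketch computing r by hand, with invented attendance and scale-score data:

```python
import math

# Hypothetical paired measures for the same six students: attendance
# rate and a test scale score (all values invented for illustration).
attendance = [0.98, 0.91, 0.85, 0.99, 0.78, 0.95]
scale_score = [248, 232, 225, 251, 210, 244]

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson(attendance, scale_score)
print(f"r = {r:.2f}")  # close to 1 for this deliberately linear toy data
```

With only six students, r is highly sensitive to any one point, which illustrates why the slide warns against correlating small groups, and a strong r here would still say nothing about whether attendance causes higher scores.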

14 Disaggregators
Question: How does performance differ from one group of students to another?
Disaggregators are used to reveal how performance differs between one group of students and another. Disaggregators include the following:
- Race
- Ethnicity
- Gender
- Special Education Status
- Lunch Status (Income Level)
- English Proficiency
- District, School, Grade
- Teacher and Teacher Qualifications
- Program Information
- Mobility
- Attendance Rates
- Discipline Infractions and Consequences
- Course-taking Patterns
- Years in the School/District
- Retention
- NCLB Subgroups
Limitations: Disaggregating small groups of students can lead to subgroups with only a few students. Use caution when making inferences from disaggregated data.
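Disaggregation amounts to grouping student records by a characteristic and summarizing each group. A minimal sketch with invented subgroup labels and scores; the `MIN_N` threshold is an assumption added to illustrate the slide's caution about tiny subgroups (real reporting rules set their own minimums):

```python
from collections import defaultdict

# Hypothetical student records: (subgroup, scale score). Labels and
# scores are invented; real disaggregators include gender, special
# education status, lunch status, etc.
records = [
    ("GroupA", 240), ("GroupA", 231), ("GroupA", 252), ("GroupA", 228),
    ("GroupB", 219), ("GroupB", 247), ("GroupB", 236),
    ("GroupC", 255),  # a subgroup containing a single student
]

MIN_N = 3  # assumed cutoff: suppress subgroups smaller than this

by_group = defaultdict(list)
for group, score in records:
    by_group[group].append(score)

for group, scores in sorted(by_group.items()):
    if len(scores) < MIN_N:
        # Too few students to support an inference (or to protect privacy)
        print(f"{group}: n={len(scores)} (suppressed: subgroup too small)")
    else:
        mean = sum(scores) / len(scores)
        print(f"{group}: n={len(scores)}, mean={mean:.1f}")
```

Suppressing or flagging small subgroups keeps a single student's score from masquerading as a group trend.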

15 Disaggregation (cont’d) (Graph Type: Floating Column)

16 Disaggregation (cont’d) (Graph Type: Bar of Pie)

17 Disaggregation (cont’d) (Graph Type: Bar of Pie)

18 Data Don’ts
Data can be dangerous! You should avoid:
Comparing performance on tests that have not been aligned; for example:
- Don’t compare 3rd grade scale scores to 5th grade scale scores
- Don’t compare 3rd grade Math scale scores to 3rd grade ELA scale scores
Making large inferences from a few data points; for example:
- Be wary of conclusions about a subject area based on one item on a test
- Be wary of conclusions about a student’s overall level based on performance on one test
- Be wary of conclusions about a student’s strengths or weaknesses based on performance on one item on one test

19 Questions?

