Using Data to Improve Student Achievement Summer 2006 Preschool CSDC.


Using Data to Improve Student Achievement Summer 2006 Preschool CSDC

Outcomes
- Know why we need to look at data
- Identify two types of tests
- Understand three types of scores
- Understand summative and formative assessments
- Be able to interpret summative assessment reports
- Know how to use data in instructional planning for increased student learning

Why Look at Data? The purpose of data is to give educators INSIGHT!

Types of Tests Norm-Referenced Test (NRT) Criterion-Referenced Test (CRT)

What is a Norm-Referenced Test (NRT)? A standardized assessment in which all students perform under the same conditions. It compares the performance of a student or group of students to a national sample of students at the same grade and age, called the norm group.

What is a Criterion-Referenced Test (CRT)? An assessment comparing one student's performance to a specific learning objective or performance standard, not to the performance of other students. It tells us how well students are performing on specific goals or content standards rather than how their performance compares to a national or local norming group.

Summary NRT and CRT

Types of Scores

Raw Score (RS) The number of items a student answers correctly on a test. John took a 10-item Grammar and Usage Content Mastery subtest (each item worth one point) and correctly answered 7 items. His raw score for this assessment is 7.

Scale Score (SS) Raw scores mathematically converted based on the level of difficulty of each question. For FCAT-SSS, a computer program is used to analyze student responses and compute the scale score. Scale scores reflect a more accurate picture of the student's achievement level.

Gain Scores Commonly referred to as "learning gains": the amount of progress a student makes in one school year.

Student Learning Gains: Who Qualifies?
- All students with a pre- and post-test, including all subgroups (ESE, LEP, etc.).
- All students with matched, consecutive-year (i.e., 2005 & 2006) FCAT SSS results, grades
- Students enrolled in the same school during the October and February FTE surveys will count toward school learning gains.

Learning Gains: Which Scores?
- Gains are applied in reading and math, not writing or science.
- The pre-test may be from the same school, the same district, or anywhere in the state.

Learning Gains: What Equals Adequate Yearly Progress (AYP)?
A. Improve FCAT Achievement Level from 2005 to 2006 (e.g., 1-2, 2-3, 3-4, 4-5), OR
B. Maintain a "satisfactory" Achievement Level from 2005 to 2006 (e.g., 3-3, 4-4, 5-5), OR
C. Demonstrate more than one year's growth within Level 1 or Level 2, as determined by DSS cut points (not applicable for retained students).
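
The three rules above can be sketched as a small classifier. This is a minimal sketch for illustration, not official scoring logic: the function name and signature are assumptions, and the actual DSS cut point must be looked up in the gains table for the student's grade-level change and subject.

```python
def made_learning_gain(pre_level, post_level, pre_dss=None, post_dss=None,
                       dss_target=None, retained=False):
    """Return the qualifying reason ('A', 'B', or 'C') or None.

    dss_target is the cut point from the DSS gains table for the
    student's grade-level change (values vary by grade and subject).
    """
    # Rule A: moved up at least one achievement level (e.g., 1-2, 3-4).
    if post_level > pre_level:
        return "A"
    # Rule B: maintained a "satisfactory" level (3, 4, or 5).
    if pre_level == post_level and post_level >= 3:
        return "B"
    # Rule C: within Level 1 or Level 2 both years, and the DSS gain
    # exceeds the table cut point; not applicable to retained students.
    if (not retained and pre_level == post_level and post_level <= 2
            and None not in (pre_dss, post_dss, dss_target)
            and post_dss - pre_dss > dss_target):
        return "C"
    return None
```

For example, a non-retained student who stays within Level 2 but gains 1598 to 1743 DSS points qualifies under rule C whenever the gain exceeds the (here assumed) cut point; the same DSS growth for a retained student within Level 1 does not count.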

Developmental Scale Score Gains Table (DSS Cut Points) Students achieving within Level 1 (or within Level 2) for two consecutive years must gain at least one point more than the value listed in the table in order to satisfy the "making annual learning gains" component of the school accountability system.
[Table: DSS cut points by grade-level change (3 to 4 through 9 to 10), with columns for Reading and Mathematics.]

Learning Gains: Retainees
A retained student can only be counted as making adequate progress if he/she:
- Moves up one level (e.g., 1-2, 2-3, 3-4, 4-5), OR
- Maintains a Level 3, 4, or 5.

Learning Gains: Activity
Use the data in the following table to determine:
- which students made a learning gain
- what percentage of the teacher's students made a learning gain

Data Display for FCAT Reading Results

Student | 04/05 Grade | 05/06 Grade | Pre-test Level | Pre-test DSS | Post-test Level | Post-test DSS | Learning Gain?
--------|-------------|-------------|----------------|--------------|-----------------|---------------|----------------
A       | 7           | 8           | Level 1        |              | Level 2         |               | Yes or No (Reason: A, B, or C)
B       | 7           | 8           | Level 4        |              | Level 4         |               | Yes or No (Reason: A, B, or C)
C       | 7           | 8           | Level 2        | 1598         | Level 2         | 1743          | Yes or No (Reason: A, B, or C)
D       | 8           | 8           | Level 1        |              | Level 2         |               | Yes or No (Reason: A, B, or C)
E       | 8           | 8           | Level 3        |              | Level 3         |               | Yes or No (Reason: A, B, or C)
F       | 8           | 8           | Level 1        | 1486         | Level 1         | 1653          | Yes or No (Reason: A, B, or C)
G       | 7           | 8           | Level 5        |              | Level 4         |               | Yes or No (Reason: A, B, or C)

Data Display for FCAT Reading Results (Answers)

Student | 04/05 Grade | 05/06 Grade | Pre-test Level | Pre-test DSS | Post-test Level | Post-test DSS | Learning Gain?
--------|-------------|-------------|----------------|--------------|-----------------|---------------|----------------
A       | 7           | 8           | Level 1        |              | Level 2         |               | Yes (Reason A)
B       | 7           | 8           | Level 4        |              | Level 4         |               | Yes (Reason B)
C       | 7           | 8           | Level 2        | 1598         | Level 2         | 1743          | Yes (Reason C)
D       | 8           | 8           | Level 1        |              | Level 2         |               | Yes (Reason A)
E       | 8           | 8           | Level 3        |              | Level 3         |               | Yes (Reason B)
F       | 8           | 8           | Level 1        | 1486         | Level 1         | 1653          | No (Reason C)
G       | 7           | 8           | Level 5        |              | Level 4         |               | No (Reason B)

Teacher Learning Gains Based on Data Display
5 out of 7 students made learning gains: 71% of this teacher's students made learning gains and add points toward the school's grade. No points are given to the school for Student F because he was retained and stayed within Level 1, even though he made significant gains in DSS points. No points are given for Student G because he decreased a level.
Total number of students with a pre- and post-test who qualify for learning gain calculations: 7
- Reason A (increased one or more Achievement Levels): 2
- Reason B (maintained a "satisfactory" Level 3, 4, or 5): 2
- Reason C (met the DSS target gain, more than a year's growth): 1
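
The class summary above is a simple percentage calculation over the per-student determinations. A minimal sketch, where the reasons list mirrors the answer table for students A through G (None means no qualifying gain):

```python
# Learning-gain reasons for students A-G from the answer table above.
reasons = ["A", "B", "C", "A", "B", None, None]

# Count students with any qualifying reason, then express as a percent.
gainers = sum(r is not None for r in reasons)
pct = round(100 * gainers / len(reasons))  # 5 of 7 students

print(f"{gainers} of {len(reasons)} students made gains ({pct}%)")
```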

Class Record Sheet for Learning Gains

Types of Data
- Results (Summative): data used to make decisions about student achievement at the end of a period of instruction.
- Process (Formative): data gathered at regular intervals during the instructional period; used to provide feedback about student progress and to provide direction for instructional interventions.

A Closer Look at Results Data Examples:

FCAT Parent Report

A Closer Look at Formative Data FORF Content Mastery Fluency Checks

What tools do we have?
- FCAT Data Inquiry (Summative)
- Teacher Tools for Data Collection (can be Summative or Formative): Histogram, Pareto Chart, Run Chart, Scatter Diagram, Item Analysis

Histogram
- Bar chart representing a frequency distribution of student scores.
- The height of each bar represents the number of students scoring at the same level/score.
- Used to monitor progress.
[Chart: Histogram, "Minutes to Run 1 Mile"; x-axis: Time, y-axis: Frequency.]

Beginning of Year Placement Scores

Mid-year

End of Year Placement Scores

Run Chart
Use to:
- Monitor progress over time
- Display data in its simplest form

Word Fluency Checks

Individual Student Fluency Scores

Class Fluency Averages

Class Goal: By the end of the year, 100% of our class will have an average of at least 140 words correct per minute.
[Chart: Class run chart of the percent of students reading 140 words correct per minute on fluency checks; x-axis: Week, y-axis: Correct Words per Minute.]

Scatter Diagram

[Chart: Scatter diagram, "Hours of Sleep vs. Mistakes on Test"; x-axis: Hours of Sleep, y-axis: Mistakes.]

Item Analysis
Use to:
- Determine mastered content
- Determine the most common mistakes

Spelling Inventory Item Analysis

Classroom Test Analysis
Columns: Benchmark Assessed | Item # | Number Correct | Number Incorrect | Number Partial Credit | Number Choosing Distractor A/1 | Distractor B/2 | Distractor C/3 | Distractor D/4 | Number No Answer
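
The per-item counts in a sheet like the one above can be tallied directly from raw responses. A minimal sketch; the quiz items, answer key, and student names are invented for illustration, and the partial-credit and no-answer columns are omitted:

```python
from collections import Counter

answer_key = {1: "B", 2: "D", 3: "A"}   # hypothetical 3-item quiz
responses = {                            # hypothetical student answers
    "Ana":  {1: "B", 2: "C", 3: "A"},
    "Ben":  {1: "B", 2: "D", 3: "C"},
    "Cara": {1: "A", 2: "C", 3: "A"},
}

for item, key in answer_key.items():
    # Tally how many students chose each answer choice on this item.
    picks = Counter(s[item] for s in responses.values())
    correct = picks[key]
    distractors = {c: n for c, n in picks.items() if c != key}
    print(f"Item {item}: {correct} correct; distractors chosen: {distractors}")
```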

ITEM ANALYSIS ACTIVITY

Pareto Chart
Use to:
- Rank issues in order of occurrence
- Decide which problems need to be addressed first
- Find the issues that have the greatest impact
- Monitor the impact of changes

[Chart: Pareto chart, "Types of Mistakes in Division Problems" (incorrect multiplication, incorrect subtraction, no decimal, other); bars show percent of mistakes, line shows cumulative percentage.]
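
The ranking and cumulative percentages behind a Pareto chart are straightforward to compute. A minimal sketch; the mistake tallies are invented for illustration:

```python
# Hypothetical tallies of mistake types on a set of division problems.
mistakes = {"Incorrect multiplication": 18, "Incorrect subtraction": 9,
            "No decimal": 6, "Other": 3}

total = sum(mistakes.values())
# Rank causes from most to least frequent.
ranked = sorted(mistakes.items(), key=lambda kv: kv[1], reverse=True)

# Walk the ranked list, accumulating the running share of all mistakes.
cumulative = 0
for cause, count in ranked:
    cumulative += count
    print(f"{cause:25s} {100 * count / total:5.1f}%  "
          f"cumulative {100 * cumulative / total:5.1f}%")
```

The cumulative column shows why the first one or two categories deserve attention first: here the top two invented causes account for 75% of all mistakes.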

Conclusion With run charts, scatter diagrams, and histograms, educators can do a far better job of leading the vast majority of students to meet grade-level standards. The key is using these statistical tools to create teams (teachers and students, teachers and principals, site staff, and district office staff) that work together to create all-time bests in every aspect of schooling.

Data analysis provides: Insight and Questions

Questions to Ponder… (adapted from Getting Excited About Data, Edie Holcomb)
- What question are we trying to answer?
- What can we tell from the data?
- What can we NOT tell from the data?
- What else might we want to know?
- What good news is here for us to celebrate?
- What opportunities for improvement are suggested by the data?

Action provides: Answers!

Steps to Improvement
- PLAN: What information have I gained from my data? What interventions can I put in place?
- DO: Implement the plan.
- STUDY: Analyze the results.
- ACT: Make improvements.

Personal Action Plan (P D S A)
- What data can I access?
- What tools can I use to help me monitor progress toward our class goals?
- What/who else do I need to help me?
- What is my start date?
- How will I evaluate the results?