Data Vocabulary
Language Arts Summer Cadre 2006
Migdalia Rosario, Varsity Lakes Middle
Jennifer Miller, Varsity Lakes Middle
Pat Zubal, Dunbar Middle School
Fran Mallory, Dunbar High School

Why a vocabulary lesson? "My teacher said the school has tough new standards and I need to improve my vocabulary. What's 'vocabulary'?"

OBJECTIVES
Establish a common language: clear understanding by all.
Better communication:
– about processes
– about results
– about student achievement/failure
– about instructional practices that yield learning

Common Vocabulary Common Understanding

PDSA
Plan: examine baseline data
Do: conduct the experiment
Study: study the results
Act: if the study results in improvement, make the improvement stick

Two Major Types of Tests
Norm-Referenced Test (NRT)
Criterion-Referenced Test (CRT)

What is a Norm-Referenced Test (NRT)? A standardized assessment in which all students perform under the same conditions. It compares the performance of a student or group of students to a national sample of students at the same grade and age, called the norm group.

What is a Criterion-Referenced Test (CRT)? An assessment where a student's performance is compared to a specific learning objective or performance standard and not to the performance of other students. It tells us how well students are performing on specific goals or content standards rather than just telling how their performance compares to a norm group of students nationally or locally.

Question: In criterion-referenced assessments, is it possible that none, or all, of the students will reach a particular goal or performance standard? Answer: YES!!!
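
The NRT/CRT distinction can be made concrete with a small sketch (the scores, norm group, and cut score below are hypothetical, chosen only for illustration). A norm-referenced interpretation turns a raw score into a percentile rank within the norm group, while a criterion-referenced interpretation checks each score against a fixed performance standard, which is why none or all of the students can meet it.

```python
# Illustration only: hypothetical scores, norm group, and cut score.

def percentile_rank(score, norm_group):
    """Norm-referenced view: percent of the norm group scoring below this score."""
    below = sum(1 for s in norm_group if s < score)
    return 100.0 * below / len(norm_group)

def meets_standard(score, cut_score):
    """Criterion-referenced view: did the student reach the performance standard?"""
    return score >= cut_score

norm_group = [12, 14, 15, 15, 16, 17, 18, 18, 19, 20]  # hypothetical national sample
class_scores = [13, 16, 17, 19]                         # hypothetical classroom
CUT_SCORE = 15                                          # hypothetical standard

for score in class_scores:
    print(score,
          f"percentile rank: {percentile_rank(score, norm_group):.0f}",
          f"meets standard: {meets_standard(score, CUT_SCORE)}")

# With the criterion-referenced check, every student (or no student) could meet
# the standard; the percentile rank, by contrast, always depends on the norm group.
```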

Summary: NRT and CRT

Three Major Types of Scores

Raw Score (RS)
The number of items a student answers correctly on a test.
– John took a 20-item mathematics test (where each item was worth one point) and correctly answered 17 items.
– His raw score for this assessment is 17.

Question: If Mary answered 24 items correctly on a reading test and 40 items correctly on a mathematics test, did she do better on the mathematics test than on the reading measure? (Reading Test: 24 vs. Math Test: 40?)
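
The answer depends on more than the two raw counts. A minimal sketch (with hypothetical test lengths, since the slide does not give them) shows why raw scores cannot be compared across different tests; this is the problem that scale scores address next.

```python
# Hypothetical test lengths, chosen only to illustrate the point.
reading = {"correct": 24, "total": 30}   # 80% correct
math    = {"correct": 40, "total": 60}   # about 67% correct

for name, test in (("Reading", reading), ("Math", math)):
    pct = 100.0 * test["correct"] / test["total"]
    print(f"{name}: {test['correct']} of {test['total']} items = {pct:.0f}% correct")

# Mary's math raw score (40) is higher, but if the math test has more (and possibly
# harder) items, the lower reading raw score can still reflect stronger performance.
# A raw score is only meaningful relative to its own test.
```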

Scale Score (SS)
Mathematically converted raw scores that use a new, arbitrarily chosen scale to represent levels of achievement or ability. They have no inherent or readily apparent meaning on their own. For FCAT-SSS, a computer program analyzes student responses and computes the scale score, which reports results for the student's entire test.

Scale Score (SS)
Higher scale scores indicate higher proficiency. On a continuous, vertical scale that spans grade levels, you can track a student's progress from lower to upper grades on one scale; growth in scale score units indicates growth in proficiency. For FCAT-SSS, the Developmental Scale Score is used to determine a student's annual progress from grade to grade.
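
As an illustration of the idea only, the sketch below maps raw scores onto an invented vertical scale with a simple linear conversion. Actual programs such as FCAT derive scale scores from item response theory models and official conversion tables, so none of these numbers are real FCAT values.

```python
# Illustration only: invented linear conversions, NOT the actual FCAT procedure,
# which uses item response theory and published conversion tables.

# Hypothetical per-grade conversions onto one developmental (vertical) scale:
# scale score = slope * raw score + intercept.
GRADE_CONVERSIONS = {
    3: {"slope": 20, "intercept": 900},
    4: {"slope": 20, "intercept": 1100},
}

def to_scale_score(raw_score, grade):
    conv = GRADE_CONVERSIONS[grade]
    return conv["slope"] * raw_score + conv["intercept"]

# The same student, one year apart, placed on one continuous scale:
grade3_dss = to_scale_score(30, grade=3)   # 1500
grade4_dss = to_scale_score(28, grade=4)   # 1660
print(grade3_dss, grade4_dss)

# The grade 4 raw score is lower (it comes from a different, harder test), yet the
# developmental scale score is higher, showing growth across grade levels.
```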

Gain Scores
Commonly referred to as "learning gains": the amount of progress a student makes in one school year.
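
On a developmental scale, a learning gain can be read as the difference between this year's and last year's scale scores. A minimal sketch, continuing the invented numbers from the scale score example above:

```python
# Hypothetical developmental scale scores (same invented numbers as above).
prior_year_dss = 1500    # grade 3
current_year_dss = 1660  # grade 4

gain = current_year_dss - prior_year_dss
print(f"Learning gain: {gain} developmental scale score points in one school year")
```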

FCAT-SSS Scale Scores

FCAT-SSS Developmental Scale

Pareto Chart: Mistakes by Subtest
[Chart: bars show the percentage of mistakes for each subtest (Reference and Research, Author's Purpose, Compare/Contrast, Cause and Effect, Main Idea), with a cumulative percentage line scaled from 0% to 100%.]
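
A Pareto chart orders categories from most to least frequent and overlays a cumulative percentage line, so the few subtests responsible for most of the mistakes stand out. A sketch with made-up mistake counts for the subtests named on the slide:

```python
# Hypothetical mistake counts per subtest; only the subtest names come from the slide.
mistakes = {
    "Reference and Research": 45,
    "Author's Purpose": 25,
    "Compare / Contrast": 15,
    "Cause and Effect": 10,
    "Main Idea": 5,
}

total = sum(mistakes.values())
ordered = sorted(mistakes.items(), key=lambda kv: kv[1], reverse=True)

cumulative = 0.0
print(f"{'Subtest':<25}{'% of Mistakes':>15}{'Cumulative %':>15}")
for subtest, count in ordered:
    pct = 100.0 * count / total
    cumulative += pct
    print(f"{subtest:<25}{pct:>14.0f}%{cumulative:>14.0f}%")

# In the chart, bars show each subtest's share of mistakes in descending order,
# and the line shows the cumulative percentage climbing toward 100%.
```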

PDSA
Plan: examine baseline data
Do: conduct the experiment
Study: study the results
Act: if the study results in improvement, make the improvement stick

Group Activity