
LCSA - June 8, 2012

– Continue the use of data for dialogue and decision-making
– Support compliance reporting for Comprehensive Needs Assessment and School Improvement planning

Major Question
Data Representations
Dialogue Questions – Observations – Inferences

Predictions – Observations – Inferences
Adapted from Deb Clancy, Washtenaw ISD, 2008, based upon the work of Nancy Love, “Using Data/Getting Results” (2002)

Observations
– What percentage of our students were at levels 1 and 2?
– At which level of performance do we have the most students?
Inferences
– What school processes by adults might explain the students’ achievement?
– What next steps should be taken to address this achievement?

Observations
– Which strands were our strengths on the test?
– Which strands were our weaknesses on the test?
Inferences
– What school processes by adults might explain the students’ achievement?
– What next steps should be taken to address this achievement?

Summary Assessments with scores – pre/post, unit tests, literacy scores
Item Bank Assessments with standards – tests created with DataDirector items
Answer Sheet Assessments with standards – tests created with items outside of DataDirector

Using EXPLORE Scores to Predict Future PLAN Scores
[Table: students (last name, first name) with 2010–11 EXPLORE scores and expected PLAN scores in English, Reading, Mathematics, and Science, grouped into bands from Highest Probability through High, Medium, and Low to Lowest Probability; individual scores are not recoverable from the transcript]
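The slide groups students by their probability of reaching the expected PLAN score. As an illustrative sketch only (the presentation does not state the actual banding rule, and the cut points below are invented), such a grouping could be computed from the gap between a student's EXPLORE score and a benchmark score:

```python
# Hypothetical sketch: assign a probability band from the gap between a
# student's EXPLORE score and a benchmark score. The band cut points are
# invented for illustration; the presentation does not give the real rule.
def probability_band(explore_score, benchmark):
    gap = explore_score - benchmark
    if gap >= 2:
        return "Highest Probability"
    if gap >= 1:
        return "High"
    if gap >= 0:
        return "Medium"
    if gap >= -1:
        return "Low"
    return "Lowest Probability"

print(probability_band(17, 15))  # Highest Probability
print(probability_band(14, 15))  # Low
```

A real implementation would use ACT's published EXPLORE/PLAN College Readiness Benchmark scores per subject rather than a single invented benchmark.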

Student Group: All Students (63 students)
Nonsense Word Fluency – Deficit: 0 (0.00%), Emerging: 2 (3.17%), Established: –
Oral Reading Fluency – At Risk: 2 (3.17%), Some Risk: –, Low Risk: –
Phoneme Segmentation Fluency – Deficit: 0 (0.00%), Emerging: 1 (1.59%), Established: –
[Values shown as – are not recoverable from the transcript]
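The risk-band percentages in this kind of DIBELS summary (e.g., 2 of 63 students is 3.17%) are simple shares of the student group. A minimal sketch, using only the counts that survive in the slide:

```python
# Compute the share of students in each DIBELS risk band.
# Counts are the recoverable values from the slide (All Students, n = 63);
# the band labels follow the DIBELS categories shown there.
def band_percentages(counts, total):
    """Return {band: percent of total, rounded to two decimals}."""
    return {band: round(100 * n / total, 2) for band, n in counts.items()}

oral_reading_fluency = {"At Risk": 2}                 # other bands not recoverable
nonsense_word_fluency = {"Deficit": 0, "Emerging": 2}
phoneme_segmentation = {"Deficit": 0, "Emerging": 1}

print(band_percentages(oral_reading_fluency, 63))   # {'At Risk': 3.17}
print(band_percentages(nonsense_word_fluency, 63))  # {'Deficit': 0.0, 'Emerging': 3.17}
print(band_percentages(phoneme_segmentation, 63))   # {'Deficit': 0.0, 'Emerging': 1.59}
```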

Performance Level – Scaled Score – Domain/Standard Score – Benchmark/GLCE Score
Written Curriculum Alignment – Analysis of Performance Task – Analysis of Student Learning
VALIDITY

“X” represents opportunities (each of Fall 2009, Fall 2010, Fall 2011)
Statewide Assessment (MEAP): X per year
Interim Assessment (NWEA, DIBELS, DRA, STAR): XXX per year
Classroom Assessment (unit tests, common writings with rubrics): XXXXXX XXXXXX XXXXXX per year

Performance Level data with MEAP/MME
– “On Track” designation with PLAN or EXPLORE
– Threshold designation on interim assessments

[Table: Last, First, STAR EOY Grade Equivalency, Fall 2011 MEAP Performance Level]

Scaled Scores data with MEAP/MME
– Scale Scores with PLAN or EXPLORE
– Scale Scores on Interim Assessments

[Table: Last, First, Fall 2011 MEAP Scale Score, Spring Test RIT Score]

Standards data with MEAP/MME
– Subarea scores with PLAN or EXPLORE
– Goal scores on Interim Assessments
– Standards data on local assessments

Fall 2011 PreTest vs. Spring 2012 MME
[Table: for each student (First, Last), percent of pretest questions correct in Math A1, Math A2, and Math L2, paired with Spring 2012 MME scale scores for the same strands; individual rows are garbled in the transcript]
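The percent-correct-by-strand figures in a pretest report like this come from grouping item results by strand. A minimal sketch, assuming per-item right/wrong data; the strand names come from the slide, but the student responses below are invented for illustration:

```python
# Hypothetical sketch: percent correct per strand from per-item results.
def strand_percent(items):
    """items: list of (strand, correct_bool); returns {strand: percent correct}."""
    totals, correct = {}, {}
    for strand, ok in items:
        totals[strand] = totals.get(strand, 0) + 1
        correct[strand] = correct.get(strand, 0) + (1 if ok else 0)
    return {s: round(100 * correct[s] / totals[s]) for s in totals}

# Invented responses for one student
responses = [("Math A1", True), ("Math A1", False), ("Math A1", True),
             ("Math A2", False), ("Math A2", False), ("Math A2", True),
             ("Math L2", True), ("Math L2", True)]
print(strand_percent(responses))  # {'Math A1': 67, 'Math A2': 33, 'Math L2': 100}
```

Pairing these pretest percentages with the students' later MME scale scores, as the slide does, lets staff ask whether strand weaknesses in the fall predicted spring results.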

Expectations data with MEAP –Item Analysis scores with EXPLORE –Item Analysis on local assessments

Classroom Test vs. MEAP item analysis (standards WHG 3.2.3, W1.2.1, W2.1.4, W3.1.9, W3.2.3)
One row per student (names omitted); Classroom Test Q1 Q2 Q3 Q4 | MEAP Q8 Q9 Q11 Q13
Y N N N | Y N N N
Y Y Y Y | Y N Y Y
Y Y Y Y | Y Y Y Y
Y Y Y Y | Y N Y N
Y Y Y Y | N Y N N
N N N Y | Y N Y Y
Y Y Y Y | Y Y Y Y
Y Y Y Y | Y N Y Y
Y Y Y Y | Y Y Y Y
Y Y Y Y | Y Y Y N
Y Y Y Y | Y N N N
N N N N | Y Y Y Y
Y N N N | Y Y N Y
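A standard way to read a Y/N grid like this is to compute each item's difficulty: the percent of students answering it correctly. A sketch over the rows shown in the slide (question labels follow the slide; student names are not in the transcript):

```python
# Illustrative item analysis over the Y/N rows from the slide:
# percent of students answering each question correctly (Y).
rows = [
    "YNNNYNNN", "YYYYYNYY", "YYYYYYYY", "YYYYYNYN", "YYYYNYNN",
    "NNNYYNYY", "YYYYYYYY", "YYYYYNYY", "YYYYYYYY", "YYYYYYYN",
    "YYYYYNNN", "NNNNYYYY", "YNNNYYNY",
]
# First four positions are classroom-test items, last four are MEAP items.
questions = ["Q1", "Q2", "Q3", "Q4", "Q8", "Q9", "Q11", "Q13"]

def item_difficulty(rows, questions):
    """Return {question: percent of students with a Y in that position}."""
    n = len(rows)
    return {q: round(100 * sum(r[i] == "Y" for r in rows) / n)
            for i, q in enumerate(questions)}

print(item_difficulty(rows, questions))
```

Low-difficulty items on both the classroom test and MEAP (here Q9 and Q13) point to the standards where feedback to students and teachers is most needed.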

Source: Presentation by Dr. Victoria Bernhardt, April 2007