Using EVAAS to Improve Student Performance. Donna Albaugh, Rachel McBroom, Heather Stewart. Region 4 PD Leads, NCDPI.


Outcomes
Overview of the EVAAS Growth Model
Interpreting Value-Added Reports
Interpreting the Diagnostic Report
Helping teachers use their individual data
Note: Slides marked with * are from SAS.

Underlying EVAAS Philosophy
All students deserve opportunities to make appropriate academic progress every year.
There is no "one size fits all" way of educating students who enter a class at different levels of academic achievement.
Adjustments to instruction should be based on the academic attainment of students, not on socio-economic factors.
Given reliable information on past effectiveness, educators can make appropriate adjustments to improve student opportunities.
"What teachers know and can do is the most important influence on what students learn." (National Commission on Teaching and America's Future, 1996)
One of the most important things educators can know is with whom they are effective and where they need to develop new skills. * SAS

Achievement and Poverty * SAS

Academic Growth and Poverty No one is doomed to failure. * SAS

Benefits for Principals Gain a consolidated view of student progress and teacher effectiveness, as well as the impact of instruction and performance. Bring clarity to strategic planning and function as a catalyst for conversations that must take place to ensure that all students reach their potential. Understand and leverage the strengths of effective teachers. Use the valuable resource of effective teaching to benefit as many students as possible.

Changes in Reporting
Old label → New label
Above → Exceeds Expected Growth
Not Detectably Different → Meets Expected Growth
Below → Does Not Meet Expected Growth

Value-Added Reporting: Predictions and Projections

North Carolina uses two different models in EVAAS
1. The Univariate Response Model (URM) is used for the grade 5 and grade 8 science EOGs, EOCs in high school, and CTE Post-Assessments; the URM will also be used for the Common Exams (predicts).
2. The Multivariate Response Model (MRM) is used for EOGs in grades 3–8 mathematics and English Language Arts (projects).

Predict/Project
The URM uses a student's prior test scores to predict where the student will be positioned in the statewide distribution of students who take a given assessment.
The MRM uses a student's prior test scores to project where the student will be positioned in the statewide distribution of students who take a given assessment.
In both cases the basic methodology is the same: a student's prior test scores are used to estimate how he or she will perform on an assessment taken at the end of the school year.

Predictive VA Model – Univariate Response Model (Predicts)
Used for reports where testing is not sequential (grade 5 and 8 science, EOCs in high school, CTE post-assessments) and for the Common Exams.
Students must have three previous test scores for a predictive model.
The effect is the difference between predicted and observed scores.

Value-Added Reports – Multivariate Model (Projections)

Value-Added Reporting

The NCE Base is by definition set at 50.0, and it represents the average attainment level of students in the grade and subject, statewide. If the school mean is greater, the average student in the school is performing at a higher achievement level than the average student in the state.
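The claim that the NCE base sits at 50.0 by definition follows from how Normal Curve Equivalents are constructed. As a minimal sketch (the 21.06 scale factor is the standard NCE convention, chosen so that NCEs 1 and 99 line up with the 1st and 99th percentiles; this is not an EVAAS-specific computation):

```python
from statistics import NormalDist

def percentile_to_nce(percentile: float) -> float:
    """Map a percentile rank (0 < p < 100) to a Normal Curve Equivalent.
    NCEs rescale percentiles onto an equal-interval scale with mean 50
    and standard deviation 21.06."""
    z = NormalDist().inv_cdf(percentile / 100)
    return 50 + 21.06 * z

print(round(percentile_to_nce(50), 1))  # 50.0 -- the NCE base
print(round(percentile_to_nce(99), 1))  # 99.0 -- top of the scale
print(round(percentile_to_nce(1), 1))   # 1.0 -- bottom of the scale
```

Because the scale is anchored this way, a school mean NCE above 50 means the school's average student outperforms the average student in the state, exactly as the slide states.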

District Diagnostic Reports Use to identify patterns or trends of progress among students expected to score at different achievement levels

Diagnostic Report

Diagnostic Reports – the whiskers

Diagnostic Reports Looking for Patterns

Student Pattern Report

Student Pattern Report – Key Questions Different experience? Different strategies? Different needs? Number of hours?

Student Pattern Report – Key Questions Different experience? Different strategies? Different needs? Number of hours? Rerun the report with new criteria. YES!

Student Pattern Report – Next Steps
16 students who attended for 40+ hours
All 31 students in the program

Less Informed Conclusion: We need to change the selection criteria for this program. More Informed Conclusion: We need to adjust the recommended hours for participants.

Custom Student Report HANDOUT

Using Data to Drive Crucial Conversations What’s a Crucial Conversation? And Who Cares? A crucial conversation is a discussion between two or more people where stakes are high, opinions vary, and emotions run strong. When we face crucial conversations, we can do one of three things: We can avoid them, we can face them and handle them poorly, or we can face them and handle them well. Ironically, the more crucial the conversation, the less likely we are to handle it well.

Frequently Asked Questions
How is the school composite determined?
In accordance with current State Board of Education policy, the school composite only includes assessments that are part of the State Testing Program.
Included: EOCs and EOGs
Not included: CTE Post-Assessments and Common Exams
Many educators feel strongly that the CTE Post-Assessments and Common Exams should be included; the State Board of Education will consider this recommendation.

Frequently Asked Questions
How will the Common Exams be entered into EVAAS?
The data will be processed and transferred in the same way as data from the State Testing Program: districts scan their answer sheets; DPI does quality checks and audits; DPI moves the data to SAS through a secure FTP.
A major difference for all assessments (EOCs, EOGs, CTE Post-Assessments, and Common Exams) is that student-teacher links will come from the EVAAS roster verification process, not from files pulled from NCWISE.

Frequently Asked Questions
Why is there no growth data for individual students?
All growth models depend on having a large enough sample of student test scores to produce valid and reliable measures of growth.
The ABC Growth Model is not valid or reliable when used to measure growth at the student level; the error around any estimate is larger than the entire scale for an assessment.
EVAAS will only produce growth values in which we can all have confidence; hence, there will be no growth values for individual students.

Frequently Asked Questions
How can we have growth measures for assessments that have never been given before?
Even if an assessment has never been given before, a first-time administration still produces a distribution of scores, with roughly half of students scoring above the mean and roughly half below.
Using percentiles and NCEs, EVAAS always considers a student's position in the statewide distribution, not the scale or raw score on the assessment.
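The "position in the distribution, not the raw score" point is easy to sketch: a percentile rank depends only on how many statewide scores fall below a student's score, so rescaling the test changes nothing. A minimal illustration with invented scores (not an EVAAS computation):

```python
from bisect import bisect_left

def percentile_rank(score: float, statewide_scores: list) -> float:
    """Percent of statewide scores that fall strictly below this score."""
    ordered = sorted(statewide_scores)
    return 100 * bisect_left(ordered, score) / len(ordered)

# The same relative standing emerges whether the test is scored 0-100
# or 0-500: only the student's position in the distribution matters.
raw = [12, 35, 47, 51, 63, 70, 78, 84, 90, 96]
rescaled = [5 * x for x in raw]

print(percentile_rank(63, raw))           # 40.0
print(percentile_rank(5 * 63, rescaled))  # 40.0
```

This is why a brand-new assessment can still feed a growth model: the first administration immediately defines a distribution in which every student has a position.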

Frequently Asked Questions
How can we have growth measures for assessments that have never been given before?
For assessments that are completely new (like the Common Exams), SAS analyzes the first-semester data to determine the prediction model. After the second-semester administration, all of the data are used in the model to produce value-added scores.

Frequently Asked Questions
Why are there large numbers of teachers "not meeting expected growth" in some content areas and not others?
Larger numbers of students take some state assessments than others. Additionally, evidence from across the country shows that ELA teachers are less likely than teachers of other content areas to have either an extremely positive or extremely negative impact.
The distributions currently in EVAAS are based on one year of data; they tend to normalize as more data are added to the system.

Questions?