Archived Information: Effective Performance Measurement Systems (MPR Associates)

Archived Information

MPR Associates 1 Effective Performance Measurement Systems
 Define valid and reliable measures of student performance
 Use appropriate analytic methods
 Promote school improvement

MPR Associates 2 Essential Elements of NCLB
 Adequate Yearly Progress
   Assess state, district, and school achievement over time
   Identify schools in need of improvement
 High School Performance Measures
   Student achievement
   Graduation rates

MPR Associates 3 Essential Elements of NCLB
 State Accountability System
   Single system
   Statistically valid and reliable
 Populations
   All students
   Subgroups of students

MPR Associates 5 High School Graduation Rate
In 2000, the high school graduation rate was 86%.
 How is graduation determined?
 How is student mobility treated?
 When does measurement occur?
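
The three questions above determine which number gets reported. As a sketch of why the conventions matter, here is a hypothetical cohort in which the treatment of transfers alone moves the rate; all counts and both conventions are illustrative, not taken from the presentation:

```python
# Hypothetical final statuses for a 9th-grade entry cohort of 100 students
cohort = ["graduated"] * 86 + ["dropped_out"] * 10 + ["transferred_out"] * 4

# Convention A: students who transfer out leave the cohort entirely
in_cohort = [s for s in cohort if s != "transferred_out"]
rate_a = 100 * in_cohort.count("graduated") / len(in_cohort)

# Convention B: transfers stay in the denominator as non-graduates
rate_b = 100 * cohort.count("graduated") / len(cohort)

print(round(rate_a, 1), round(rate_b, 1))  # 89.6 86.0
```

The same 86 graduates yield two different "graduation rates" depending on how mobility is treated, which is why the measurement rules must be fixed before results are compared.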

MPR Associates 6 Measurement Criteria
 Statistically valid systems
   Single measurement approach
   Accurate over time
   Reliable data
 Measure construction
   Fair
   Useful

MPR Associates 8 Unpacking the Data
Aggregate statistics can conceal useful information. Making the most of your data requires:
 Identifying issues of concern
 Disaggregating variables
 Controlling for factors affecting outcomes
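
The disaggregation step can be sketched in a few lines. The student records and subgroup labels below are invented; the point is only that the aggregate rate and the subgroup rates come from the same underlying data:

```python
from collections import defaultdict

# Hypothetical student-level records: (subgroup, graduated?)
records = [
    ("White", True), ("White", True), ("White", True), ("White", False),
    ("Black", True), ("Black", False),
    ("Hispanic", True), ("Hispanic", False), ("Hispanic", False),
]

def graduation_rates(records):
    """Return the aggregate graduation rate and per-subgroup rates, in percent."""
    counts = defaultdict(lambda: [0, 0])  # subgroup -> [graduates, students]
    for subgroup, graduated in records:
        counts[subgroup][0] += int(graduated)
        counts[subgroup][1] += 1
    total_grads = sum(g for g, _ in counts.values())
    total = sum(n for _, n in counts.values())
    aggregate = 100 * total_grads / total
    by_group = {s: 100 * g / n for s, (g, n) in counts.items()}
    return aggregate, by_group

aggregate, by_group = graduation_rates(records)
```

In this toy data the aggregate rate (about 56%) hides a forty-point spread between subgroups, which is exactly the kind of information the slide says aggregates can conceal.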

MPR Associates 9 Example: NCLB Graduation Rate
State and District Graduation Rate

                 State    District
Grad. Rate       86%      84%
Race-ethnicity
  White
  Black
  Hispanic       78       68

MPR Associates 10 Example: NCLB Graduation Rate (continued)
State and District Graduation Rate

                 State    District    Urban Districts
Grad. Rate       86%      84%         82%
Race-ethnicity
  White
  Black
  Hispanic       78       68          62

MPR Associates 11 Example: NCLB Graduation Rate (continued)
District and School Graduation Rates

                 District    School A    School B
Grad. Rate       84%         92%         78%
Race-ethnicity
  White
  Black
  Hispanic       68          NA          68
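
District figures like the ones above are enrollment-weighted averages of school figures, which is why a district rate can sit well below the simple average of its schools. A quick sketch; the school rates echo the example, but the enrollment counts are invented:

```python
# Hypothetical (school, graduation rate in percent, enrollment)
schools = [("School A", 92.0, 400), ("School B", 78.0, 600)]

# Simple average treats each school equally; weighted average follows students
simple_avg = sum(rate for _, rate, _ in schools) / len(schools)
weighted = sum(rate * n for _, rate, n in schools) / sum(n for _, _, n in schools)

print(simple_avg, weighted)  # 85.0 83.6
```

Because School B enrolls more students, it pulls the district figure toward its own lower rate, so a district summary alone cannot tell you which schools are behind it.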

MPR Associates 12 A General Model for Examining Student Outcomes
School Inputs → School Practices → Student Outcomes
Good evaluation focuses on more than student outcomes: data on school practices and inputs are also essential.

MPR Associates 13 Essential Control Variables
Examples:
 School Practices
   Curricular offerings and rigor
   Class scheduling
   Professional development
 School Inputs
   Student demographics
   Size and urbanicity
   Teacher experience
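
One simple way to "control" for a school input is to regress the outcome on it and compare schools by their residuals, i.e., performance above or below what the input alone predicts. This is a minimal one-variable sketch of that idea, not the presenters' method, and every number in it is invented:

```python
def fit_line(xs, ys):
    """Ordinary least squares for one predictor: returns (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

# Hypothetical input (percent low-income students) and outcome
# (graduation rate) for four schools
pct_low_income = [10, 30, 50, 70]
grad_rate = [95, 88, 80, 73]

slope, intercept = fit_line(pct_low_income, grad_rate)
# Residual = actual minus predicted: positive means the school
# outperforms what its inputs alone would predict
residuals = [y - (intercept + slope * x)
             for x, y in zip(pct_low_income, grad_rate)]
```

Ranking schools by residual rather than raw rate is one way to keep a school serving many low-income students from being judged solely on inputs it does not control.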

MPR Associates 14 Additional Data Sources…
…can provide a broader picture of student performance. Examples:
 Student transcripts
   Attendance
   Subject-area grades
   Curricular rigor
   Mobility & retention

MPR Associates 16 Improving Usability
 User-tested reporting
 Dedicated time for reflection and strategic planning
 Building analytic capacity among teachers and administrators
 Periodic feedback on the overall design of state and local data systems

MPR Associates 18 Effective Performance Measurement Systems
 Track: produce valid and reliable data on school and student outcomes
 Analyze: provide policy-relevant information across a range of variables
 Improve: assist policymakers and educators in making better decisions