Slide 1 Educator Preparation Advisory Council (EPAC) Data Subcommittee February 25, 2014.

Slide 2 Welcome and Introductions
Welcome: Dr. Sarah Barzee
Introductions
Future meetings: schedule 1-2 hour virtual meetings/conference calls with pre-work
EPAC workplan for

Slide 3 Underlying Assumptions of the EPAC Charge
The quality of instruction plays a central role in student learning (academic, behavioral and social).
Educator preparation programs ensure that baseline knowledge, skills and dispositions are demonstrated (CCT, SLS, CCSSO Learner-Ready Definition, etc.) and contribute to the quality of instruction.
The overall goal is to improve programs, not to ensure compliance, and to provide useful information for the improvement of preparation policy and practice.
Adapted from Evaluation of Teacher Preparation Programs, National Academy of Education, 2013

Slide 4 Purposes of Educator Preparation Program Evaluation
Ensuring accountability and monitoring program quality, and providing reliable information to the general public and policy makers
Providing information to consumers to help them choose among preparation programs, and giving future employers information to support hiring decisions
Supporting continuous program improvement with relevant performance data and measures that can identify strengths and weaknesses of existing programs
Adapted from Evaluation of Teacher Preparation Programs, National Academy of Education, 2013

Slide 5 Validity of Program Approval Decisions
Must be based on multiple measures (quantitative and qualitative data):
Construct validity
Content validity
Predictive validity
Consequential validity
The evaluation system must be adaptable to changing educational standards, curricula, assessments and modes of instruction

Slide 6 Program Approval Process
Review of programs based on EPAC Principles 1-5, multiple measures and qualitative criteria, as well as statutory requirements.
Data and Accountability System Performance Categories:
Recruitment and Completion Rates
Employment and Retention Rates
Pre-Service Performance Rates
Educator Effectiveness (surveys, evaluation data)
District Partnership Quality
Together, these will determine EDUCATOR PREPARATION PROGRAM APPROVAL, based on EPAC Principles, with the approval decision made by the STATE BOARD OF EDUCATION at the individual program, not institutional, level.
Assessment Subcommittee: to review and make recommendations on new assessments to be developed as part of the accountability system.

Slide 7 Inter-related Work of EPAC Subcommittees
Program Approval: Develop a new, more rigorous program approval process and regulations to guide approval decisions by the State Board of Education (SBE), based on review of the efficacy of curriculum as well as accountability data on a program's measures of quality.
Data Collection, Analysis and Reporting: Develop a new data collection, analysis and reporting system for institutional reporting and an accountability system for program approval, as well as provide biennial research data on supply and demand, to inform continuous improvement. Data from the accountability system will be linked with program approval decisions.
Assessment: Guide development of new assessment options, including performance assessments, clinical experience evaluations and feedback surveys. Data from new and existing assessments will be used in the data and accountability system.

Slide 8 Data Subcommittee Outcomes
EPAC Principles:
1. Program Entry Standards
2. Staffing & Support of Clinical Experiences
3. Clinical Experience Requirements
4. District-Program Partnerships & Shared Responsibility
5. Program Completion & Candidate Assessment Standards
6. Program Effectiveness & Accountability
Develop:
An institutional reporting system
An accountability system to be used as part of program approval
A biennial data report on supply and demand
Use supports from the CCSSO/NTEP cross-state collaborative, 2013-2015

Slide 9 Creating a Culture of Evidence
Use quantitative and qualitative data to make valid inferences (interpretations and findings) that inform program improvement.
Shift focus from compliance to inquiry and program improvement.
Data must inform collaboration and shared responsibility among IHE faculty and staff as they review and make changes in program structure, practices, policies and teaching.
School-based faculty and administration also have a shared responsibility to collaborate with IHE partners.
Adapted from Evaluation of Teacher Preparation Programs, National Academy of Education, 2013

Slide 10 Attributes and Measures Related to Program Quality
Two tables from Evaluation of Teacher Preparation Programs, National Academy of Education, 2013:
Table 2.1 Attributes Related to TPP Quality and Evidence Used to Measure Them (page 27)
Table 2.2 Main Types of Evidence Used by Different TPP Evaluation Systems (page 60)

Slide 11

Slide 12

Slide 13 Six EPAC Principles & Alignment to Performance Indicators
1. Program Entry Standards
2. Staffing & Support of Clinical Experiences
3. Clinical Experience Requirements
4. District-Program Partnerships & Shared Responsibility
5. Program Completion & Candidate Assessment Standards
6. Program Effectiveness & Accountability

Slide 14 Title II HEA Mandate for Accountability of Teacher Preparation Programs
The expectation of identifying At-Risk or Low-Performing preparation programs has been in place since the Title II HEA mandate; this work will redefine the criteria for effective, at-risk and low-performing programs.
See the current definition of At-Risk or Low-Performing institutions, which has been in place since 2000 and has been the basis for reporting low-performing IHEs in the annual Title II state report.

Slide 15 Designing the Data and Accountability System
Identify accountability categories. Are the previously identified accountability categories sufficient?
Recruitment and Completion Rates
Employment and Retention Rates
Pre-Service Performance Rates
Educator Effectiveness (surveys, evaluation data)
District Partnership Quality
How will we weight each category? Are they all equally weighted?
How do we measure these? Which data points from existing or new assessments do we use?
Are there specific data points or categories that would trigger an immediate program review?

Slide 16 Designing the Data and Accountability System
Data currently available:
Title II Report (refer to the 2013 report)
Employment/Staff File Data
Completer Rates (in the Title II report and ETS Title II system)
Pass Rates (Praxis II, Foundations of Reading, CAT, ACTFL)
Assessments yet to be designed or implemented to provide necessary data points:
Feedback surveys from teachers
Feedback surveys from employers
Pre-service assessments
Statewide student teaching/clinical experience evaluation instrument
Measure of IHE/District Partnership quality

Slide 17

Slide 18 Cautions about Data Use and Reporting
Consider the source (self-report, district, IHE, federal, etc.)
Consider its completeness and any missing data
Consider the quality of the data
Be cautious with comparisons across different N sizes
Note limitations of certain data points as part of the annual data reporting or as part of the accountability system
Plan for evolution and adaptability of the data system and accountability system over time

Slide 19 Case Study: Louisiana
Review Louisiana's accountability system for educator preparation programs:
First adopted in 2003
Revised version adopted in 2013
Review the Annual Report for Teacher Preparation

Slide 20 Designing IHE Data and Performance Reports: Next Steps
Discuss how to report educator preparation data and performance profiles annually or biennially
Build on existing data systems such as Title II, Certification, the CSDE Staff File, etc.
Review and consider a dashboard system for displaying annual profile data on each institution and individual program (to the extent that program-level data are available and meet suppression requirements)
Identify key design features desired for this online reporting system

Slide 21 Next Meeting (virtual)
Survey the subcommittee for recommendations about:
Categories to be used in the accountability system
Weighting of categories
Underlying data points within each category
Overall rating system: identify levels (e.g., low-performing, at-risk, effective, etc.)
Identify whether system measures are calculated annually, biennially or on another cycle
Trigger for off-cycle review: which data points or categories could trigger a program approval review outside of the established cycle?
April meeting: debrief and finalize the above recommendations
Set dates for April, May and June meetings

Slide 22 Next Meetings
May: Presentation from Ed Klonoski, President of COSC, on dashboard system design for IHE profiles and performance reports. Design similar to what we have for school district profiles?
June: Supply and Demand Study preview and summary of data compiled and analyzed