What We Know About Effective Professional Development: Implications for State MSPs Part 2 Iris R. Weiss June 11, 2008.

Comments on selected indicators
A.4. Are goals measurable? There are important goals for which we lack instruments, e.g., pedagogical content knowledge (PCK), but in my view we should continue to focus on them, doing the best we can to monitor teacher progress.

B.5. Are the proposed activities innovative? Innovation is overrated, in my view, whether we are talking about PD or classroom instruction. Competent implementation of existing, well-designed approaches is a better bet.

B.9. Have teachers been integrally involved in the development of the PD plan? Teachers won't necessarily know what they don't know, nor how to design activities that focus appropriately on adult-level content. PD activities have to be both helpful and perceived as helpful, so having a range of teachers review the PD plan is useful.

C.1. PD Provider Knowledge and Skills
STEM faculty bring in-depth content knowledge, but will likely need orientation to the world of K-12 instruction. Teacher leaders bring in-depth knowledge of K-12 teaching, but will likely need encouragement and assistance in focusing on content.

State MSP RFPs
The scoring rubric sends strong signals about what the state is seeking. Avoid giving mixed messages: the scoring rubric needs to be consistent with the RFP text.

Take a few minutes to consider whether/how you would revise the rubric used in your state.

Evaluating the Quality and Impact of State MSPs

Project Evaluations
Each project has a “theory of action” for how the planned activities will lead to the desired outcomes.

Basic Logic Model for PD
Professional Development → Teacher Knowledge → Classroom Practice → Improved Student Achievement

Deciding on Mid-Course Corrections
Both project management and evaluators have responsibility for monitoring project implementation. Having PD providers observe classrooms is particularly powerful.

Project Evaluations
State MSPs may want to measure quality and impact at any of a number of points in the logic model (one way to lay these out as data is sketched below):
- Quality of PD, and consistency among PD providers
- Teacher knowledge
- Classroom practice
- Student achievement
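To make these measurement points concrete, here is a minimal sketch in Python that represents the logic-model chain as data and attaches instruments to each stage. The stage names come from the slides; the instrument names are illustrative placeholders, not an actual state's instrument list.

```python
# The PD logic-model chain, in slide order.
LOGIC_MODEL = [
    "Professional Development",
    "Teacher Knowledge",
    "Classroom Practice",
    "Improved Student Achievement",
]

# Hypothetical mapping of where a state MSP might measure quality/impact.
# Instrument names below are placeholders for illustration only.
MEASURES = {
    "Professional Development": ["PD observation protocol", "provider consistency ratings"],
    "Teacher Knowledge": ["teacher content assessment (pre/post)"],
    "Classroom Practice": ["classroom observation rubric"],
    "Improved Student Achievement": ["state mathematics/science assessment"],
}

for stage in LOGIC_MODEL:
    print(f"{stage}: {', '.join(MEASURES[stage])}")
```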

Mechanisms for Mid-Course Corrections
Professional development projects should include mechanisms for assessing effectiveness and making mid-course corrections.

Thorny Issues Abound
- Classroom observations, scoring of open-ended assessments, and similar data collection are costly, and people trained to do this work are scarce.
- Valid, reliable measures that are feasible for large-scale administration are lacking, and developing new measures requires considerable resources and expertise.
- Principals, teachers, and parents are reluctant to take time away from instruction to administer tests beyond those already used.
- Strong research designs are needed to make the case that any measured gains are attributable to the treatment. The fact that students of participating teachers scored higher in the spring than in the fall isn't convincing; you expect students to learn mathematics/science each year. Showing that students of participating teachers scored higher than similar students of non-participating teachers isn't necessarily convincing either; the teachers who chose to participate may have been better teachers to begin with. (Both problems are illustrated in the sketch after this list.)
- Strong designs, including random-assignment studies and careful quasi-experiments that can rule out rival hypotheses such as these, are not trivial to design and implement, especially if they require sophisticated multi-level analyses.
- Even if individual projects have strong evaluations, it is difficult to aggregate results across projects that focus on different parts of the logic model and/or use different measures.
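Here is a minimal sketch, with entirely hypothetical score data, of the two comparison problems above: why a fall-to-spring gain alone proves little, and how comparing gains against a comparison group (a simple difference-in-differences) nets out expected yearly growth while still leaving the selection problem untouched.

```python
# Hypothetical average test scores (not real MSP data).
fall_treatment,  spring_treatment  = 48.0, 62.0   # students of participating teachers
fall_comparison, spring_comparison = 47.5, 58.0   # similar students of non-participants

gain_treatment  = spring_treatment  - fall_treatment    # 14.0
gain_comparison = spring_comparison - fall_comparison   # 10.5

# Naive claim: "students gained 14 points, so the PD worked."
# But the comparison group also gained 10.5 points just from a year
# of schooling. The gain plausibly attributable to the program is
# only the excess over that expected growth:
did_estimate = gain_treatment - gain_comparison         # 3.5
print(f"Difference-in-differences estimate: {did_estimate:.1f} points")

# Note: even this does not rule out selection; if teachers volunteered,
# their students' gains might have been larger regardless of the PD.
```

In practice this comparison would sit inside a multi-level analysis (students nested in teachers, teachers in schools), using tools such as statsmodels' MixedLM in Python or lme4 in R, rather than a two-number subtraction; the sketch shows only the core logic.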

Program evaluations can solve some of these problems
There are two different approaches to program evaluation. In the first, the state hires a group to design and implement a statewide program evaluation.

Advantages
The state can select a group with the necessary expertise to design and implement the evaluation statewide. Program evaluation allows for aggregation of results across projects, as well as the potential for learning about what works, for whom, and under what conditions (one simple pooling approach is sketched below).
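As a sketch of how aggregation can work when projects use different tests, the hypothetical example below converts each project's result to a standardized mean difference so that results share a scale, then pools them with inverse-variance weights (a basic fixed-effect meta-analysis). All numbers are invented; and no rescaling fixes the harder problem of projects measuring different parts of the logic model.

```python
# Hypothetical per-project results, each on its own test and scale:
# (name, treatment mean, comparison mean, pooled SD, n_treatment, n_comparison)
projects = [
    ("Project A",  62.0,  58.0, 12.0, 150, 140),
    ("Project B",   3.1,   2.8,  0.9,  90,  95),
    ("Project C", 410.0, 404.0, 35.0, 220, 210),
]

weighted_sum, weight_total = 0.0, 0.0
for name, m_t, m_c, sd, n_t, n_c in projects:
    d = (m_t - m_c) / sd                                  # standardized mean difference
    var = (n_t + n_c) / (n_t * n_c) + d**2 / (2 * (n_t + n_c))  # approx. variance of d
    w = 1.0 / var                                         # inverse-variance weight
    weighted_sum += w * d
    weight_total += w
    print(f"{name}: d = {d:.2f}")

print(f"Pooled effect across projects: {weighted_sum / weight_total:.2f}")
```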

Project-Based Evaluation With Common Components
A second approach that allows aggregation of results is project-based evaluations with some or all data collection common across projects (a hypothetical common record format is sketched below). In this approach, the state hires a group to design the evaluation, select or develop instruments, train project evaluators, and analyze the results. Common components are more feasible when the projects are similar in both goals and activities.

One advantage of externally coordinated project-based evaluations, compared to program evaluations where the external group collects the data, is that they help develop the capacity of project evaluators. Disadvantages include the difficulty of ensuring quality control in data collection, and the fact that collecting common data competes for resources that might otherwise be used to evaluate the quality of project-specific activities.
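To make "common components" concrete, here is a minimal sketch of a shared record format that every project evaluator might be asked to submit so the state can pool results. The field names are hypothetical illustrations, not an actual MSP data schema.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class CommonTeacherRecord:
    """One de-identified teacher record in a hypothetical shared format."""
    project_id: str                     # which MSP project submitted the record
    teacher_id: str                     # de-identified teacher code
    pd_hours: float                     # dosage of professional development received
    content_pretest: Optional[float]    # score on the common teacher content measure
    content_posttest: Optional[float]   # None if the teacher was not post-tested
    grade_band: str                     # e.g., "K-5", "6-8", "9-12"

# Example of one submitted record (all values invented):
record = CommonTeacherRecord("msp-07", "t-0412", 80.0, 14.0, 19.0, "6-8")
print(record)
```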

Regardless of evaluation approach
Project/program evaluators need instruments appropriate to the goals of the MSPs. Given the emphasis on deepening teacher content knowledge in state MSPs, measures of teacher content knowledge that can be used on a large scale are particularly important (one routine check on such a measure is sketched below).
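For instance, one routine check on a large-scale teacher content measure is its internal consistency. The sketch below computes Cronbach's alpha from hypothetical item-level scores; it is illustrative only, since reliability is necessary but far from sufficient evidence that an instrument validly measures teacher content knowledge.

```python
from statistics import pvariance

# Hypothetical scored responses: rows = teachers, columns = items
# (1 = correct, 0 = incorrect). Real instruments would have far more of both.
responses = [
    [1, 1, 0, 1, 1],
    [0, 1, 0, 0, 1],
    [1, 1, 1, 1, 1],
    [0, 0, 0, 1, 0],
    [1, 0, 1, 1, 1],
    [1, 1, 1, 0, 1],
]

k = len(responses[0])                                   # number of items
item_vars = [pvariance([row[j] for row in responses]) for j in range(k)]
total_var = pvariance([sum(row) for row in responses])  # variance of total scores

# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / total-score variance)
alpha = (k / (k - 1)) * (1 - sum(item_vars) / total_var)
print(f"Cronbach's alpha ≈ {alpha:.2f}")
```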

MSP KMD resources for designing and evaluating PD
- Instrument database
- Knowledge reviews (see excerpt)