MSP Regional Meeting February 13-15, 2008 Calli Holaway-Johnson, Ph.D. Charles Stegman, Ph.D. National Office for Research on Measurement and Evaluation Systems (NORMES) University of Arkansas

Choosing an evaluator
In choosing an evaluator, consider the evaluator's
– Knowledge of educational processes
– Prior experience with program evaluations
– Expertise in statistics and research design
The evaluator and evaluation design should be an integral part of writing the proposal.

Effective Evaluation Components
Measures of teacher content knowledge
– Utilization of existing measures versus program-developed measures
– Validity and reliability of instruments
– Pre- and post-tests (see the illustrative sketch below)
Measures of student achievement
– State, district, or school-mandated tests
– Norm-referenced and criterion-referenced tests
– Teacher/program-developed tests
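A minimal sketch of the kind of analysis this slide implies, not a method prescribed by the presenters: checking an instrument's internal consistency (Cronbach's alpha) and testing pre/post gains on teacher content knowledge. All data, sample sizes, and score scales below are invented placeholders.

```python
# Hypothetical sketch: instrument reliability plus a paired pre/post gain test.
# All data here are synthetic; real analyses would use actual teacher scores.
import numpy as np
from scipy import stats

def cronbach_alpha(items: np.ndarray) -> float:
    """Internal-consistency reliability; rows = teachers, columns = items."""
    k = items.shape[1]
    sum_item_var = items.var(axis=0, ddof=1).sum()  # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)       # variance of total scores
    return (k / (k - 1)) * (1 - sum_item_var / total_var)

rng = np.random.default_rng(seed=0)
items = rng.integers(0, 2, size=(30, 20)).astype(float)  # 30 teachers x 20 items (synthetic)
pre = rng.normal(50, 10, size=30)                        # synthetic pre-test totals
post = pre + rng.normal(5, 8, size=30)                   # synthetic post-test totals

t, p = stats.ttest_rel(post, pre)       # paired pre/post t-test
gain = post - pre
d = gain.mean() / gain.std(ddof=1)      # paired-samples effect size
print(f"alpha = {cronbach_alpha(items):.2f}")
print(f"t = {t:.2f}, p = {p:.4f}, d = {d:.2f}")
```

Reporting an effect size alongside the p-value matters here because MSP cohorts are often small, and a significant t-test alone says little about the size of the gain.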

Data Collection
Make sure the data you want to collect will allow you to determine the effectiveness of your program.
– Data collected should relate directly to project goals and objectives.
Consider the availability of data as you are writing your proposal.
Be honest with administrators and teachers about how data will be used.

Fidelity of Implementation
Evaluation should include collection of data that indicate
– How the program was implemented (versus how the program was proposed)
   • Challenges
   • Successes
– How the project team will use results to adapt future program goals and/or plans
– How program implementation affected data collection and/or interpretation

Using the Results of Evaluation
Be objective in your interpretation.
Evaluation results should address whether outcomes were directly related to your program (one way to probe this is sketched below).
Review results to determine if changes need to be made:
– Content/program focus
– Program implementation
– Data collection techniques and/or instruments
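A hypothetical sketch, not the presenters' method: one simple way to ask whether outcomes are attributable to the program is to compare gain scores for participating versus non-participating classrooms. The groups, sizes, and gains below are synthetic placeholders.

```python
# Hypothetical comparison-group analysis: are gains larger for program
# participants than for a comparison group? All data are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=1)
gain_program = rng.normal(8, 6, size=40)      # synthetic participant gain scores
gain_comparison = rng.normal(3, 6, size=40)   # synthetic comparison-group gains

# Welch's t-test (does not assume equal variances across groups)
t, p = stats.ttest_ind(gain_program, gain_comparison, equal_var=False)
pooled_sd = np.sqrt((gain_program.var(ddof=1) + gain_comparison.var(ddof=1)) / 2)
d = (gain_program.mean() - gain_comparison.mean()) / pooled_sd
print(f"Welch t = {t:.2f}, p = {p:.4f}, d = {d:.2f}")
```

Without some comparison group, a pre/post gain alone cannot separate program effects from ordinary instruction, maturation, or other statewide initiatives.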

Sharing Results
Sharing evaluation protocols and outcomes among projects helps create a learning community within a state.
Evaluation results can be shared through statewide meetings with project directors and evaluators, as well as electronically.

Contact Information
National Office for Research on Measurement and Evaluation Systems (NORMES)
Charles Stegman, Ph.D.
Calli Holaway-Johnson, Ph.D.
340 N. West Avenue, WAAX 302
Fayetteville, AR
(479)
Fax: (479)

Questions to be discussed
What criteria were used in selecting an evaluator?
How involved was your evaluator in the planning of the project?
What data were collected for your evaluation?
Who was responsible for ensuring that the appropriate data were collected?
How did the implementation of your project impact the evaluation?
What evaluation findings contributed to your understanding of the effectiveness of your program?
What was the most successful aspect of your evaluation?
What was the most challenging aspect of your evaluation?
How were the results of your evaluation utilized by project staff?

Disclaimer
The instructional practices and assessments discussed or shown in this presentation are not intended as an endorsement by the U.S. Department of Education.