Evaluation of Math-Science Partnership Projects (or how to find out if you’re really getting your money’s worth)


Why Should States Require Good Evaluations of MSP Projects?
– To determine if the project’s objectives contribute to State education goals
– To find out how activities are implemented during the year
– To monitor the project’s progress toward achieving its objectives
– To determine if the project ultimately reaches its objectives (and if not, why not)

A good local evaluation can...
– Provide evidence that is directly relevant to the district’s students, teachers, and schools
– Provide immediate feedback to improve ongoing projects
– Provide information for making informed decisions about allocating local resources

Developing the State RFP (Or, how to ask for something so that you get what you want)

What do you want to see in a good evaluation?
– Clear objectives with measures that directly assess the targets of each objective
– Documentation of program implementation and progress
– An evaluation design that can clearly show whether program activities themselves are the cause of any changes in target outcomes

Teacher-Focused Objectives
– Increase the number of mathematics and science teachers who participate in content-based professional development activities
– Increase teachers’ content knowledge in mathematics or science

Student-Focused Objectives
– Improve student academic achievement on the state mathematics and science assessments

Measuring Progress
For each objective, there should be at least one measure (or indicator) that directly assesses the objective’s target outcome.

Measuring Progress: Example
– Course-specific content test – YES
– Teacher certification math content test – YES (but not a math pedagogy test)
– Teacher self-report of learning or course satisfaction – NO
To measure an increase in teachers’ math content knowledge, there must be a direct measure of teachers’ math content knowledge.

Measuring Progress: Example
– State mathematics achievement test – YES
– Student self-report of learning or interest in mathematics – NO
To measure improvement in students’ mathematics achievement, there must be a direct measure of students’ mathematics achievement.

Documenting the Program’s Implementation and Progress
– Who are the participants?
– Were activities carried out as planned and on what timeline?
– If problems were noted, how were they corrected?
– Do early data show progress toward the expected outcomes?

How do you determine whether the project activities themselves actually produce changes in the target outcomes? (Where’s the beef?)

Evaluation Design
– Baseline data are essential
– A comparison group is important
– Random assignment is the only sure method for determining program effectiveness
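
A minimal sketch, with hypothetical numbers, of how the first two elements work together: with baseline and follow-up scores for both groups, the program’s effect can be estimated as a difference in gains rather than a raw difference in final scores. The schools, scores, and gains below are illustrative assumptions, not data from any MSP project.

    # Hypothetical mean scores on a teacher content-knowledge test.
    intervention_pre, intervention_post = 52.0, 61.0  # program participants
    comparison_pre, comparison_post = 50.0, 55.0      # matched non-participants

    # A post-only comparison credits the program with 6 points...
    naive_estimate = intervention_post - comparison_post              # 6.0
    # ...but the comparison group also gained without the program, so the
    # difference-in-differences estimate of the program's effect is 4 points.
    did_estimate = ((intervention_post - intervention_pre)
                    - (comparison_post - comparison_pre))             # 4.0

    print(f"naive: {naive_estimate:.1f}, difference-in-differences: {did_estimate:.1f}")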

What is random assignment?
Intervention and comparison groups are constructed by randomly assigning some teachers, schools, or districts to participate in the program activities and others not to participate.
Random assignment is not the same as random selection (e.g., randomly choosing 5 of the schools that already use Curriculum X to compare with 5 randomly chosen schools that already use Curriculum Y).
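
A minimal sketch of the distinction, assuming a hypothetical pool of volunteer schools: random assignment lets chance decide which eligible schools receive the program, whereas random selection merely samples from groups that already differ.

    import random

    # Hypothetical pool: all of these schools are eligible for the program
    # and willing to participate, but none has received it yet.
    eligible_schools = [f"School {i}" for i in range(1, 21)]

    # Random ASSIGNMENT: chance alone decides who gets the program, so the
    # two groups differ only by chance at baseline.
    random.shuffle(eligible_schools)
    intervention_group = eligible_schools[:10]  # receive MSP activities
    comparison_group = eligible_schools[10:]    # do not receive them

    # Random SELECTION (not the same thing): sampling 5 schools that already
    # use Curriculum X and 5 that already use Curriculum Y compares groups
    # that chose their own curricula, so outcome differences may reflect
    # that choice rather than the curriculum itself.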

The Random Assignment Difference: The Career Academy Study
In a recent study, 73% of students who voluntarily enrolled in a high school technical education program called Career Academy graduated on time. Completion rates for students from the National Education Longitudinal Survey who followed a career technical curriculum or a general curriculum in high school were 64% and 54%, respectively. BUT students in the Career Academy study who had been randomly assigned to the control condition graduated at a rate of 72%, not significantly different from the students in the Career Academy intervention group.
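
Restated with the slide’s own figures, the comparison shows why the randomized control group matters: the naive estimate looks large, while the experimental estimate is essentially zero because the control students were volunteers too.

    # On-time graduation rates from the slide above.
    career_academy_group = 0.73   # randomly assigned to Career Academy
    randomized_control = 0.72     # volunteered, randomly assigned to control
    nels_career_technical = 0.64  # NELS students, career technical curriculum
    nels_general = 0.54           # NELS students, general curriculum

    # Comparing volunteers with non-experimental NELS students suggests a
    # 9- to 19-point advantage...
    naive_estimate = career_academy_group - nels_career_technical      # 0.09
    # ...but comparing against volunteers randomly assigned to control shows
    # an effect of about 1 point, not statistically significant.
    experimental_estimate = career_academy_group - randomized_control  # 0.01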

Career Academies

If not random assignment, then what?
Use a comparison group of students, schools, or districts that is carefully matched to the targeted population, prior to the implementation of the intervention, on academic achievement levels, demographics, and other characteristics thought to be relevant to the intervention (e.g., teachers’ years of classroom experience).

If not random assignment, then what?
– Be sure to identify both the intervention and comparison groups, and the outcome measures, before the intervention is administered.
– Finally, be sure that the comparison group is not composed of students or schools that had the opportunity to participate in the intervention but declined.
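
A minimal sketch of the matching approach described above, using hypothetical schools and only two baseline characteristics. A real evaluation would match on more characteristics, use a principled distance or propensity-score method, and fix the matches before any outcomes are observed.

    # Hypothetical baseline records: (mean achievement score, share of
    # students eligible for free/reduced-price lunch).
    participants = {
        "School A": (48.0, 0.62),
        "School B": (55.0, 0.40),
    }
    candidates = {  # non-participants that never had the chance to decline
        "School P": (47.5, 0.60),
        "School Q": (60.0, 0.20),
        "School R": (54.0, 0.45),
    }

    def distance(a, b):
        # Unweighted distance over the two baseline characteristics; the
        # poverty share is rescaled so both dimensions count comparably.
        return ((a[0] - b[0]) ** 2 + (100 * (a[1] - b[1])) ** 2) ** 0.5

    # Pair each participating school with its nearest non-participant.
    matches = {
        school: min(candidates, key=lambda c: distance(profile, candidates[c]))
        for school, profile in participants.items()
    }
    print(matches)  # {'School A': 'School P', 'School B': 'School R'}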

Writing the Evaluation Component: Measures and Data Collection
– Require objectives with measures (indicators) that directly relate to the objectives
– Require baseline data (existing or from a project-administered pre-test)
– Require data that document what was implemented and how the program was implemented

Writing the Evaluation Component: Evaluation Design
– Require an evaluation design that can determine whether the project activities themselves produce changes in the target outcomes
– Encourage use of random assignment designs
– Encourage applicants to seek assistance from consultants who have experience in conducting impact evaluations of programs

Review of Plans: Are Outcomes Linked to Objectives?
– Are objectives stated in measurable terms?
– Is progress toward each objective measured by a specific indicator or indicators that directly relate to the objective?
– Do the identified indicators cover all of the key outcomes?

Review of Plans: Will Data Be Used to Improve the Program?
– Will evaluation data be collected throughout the project?
– Will evaluation data be used to inform project activities?
– Is the timeline for collection of evaluation data integrated with the overall project timeline?
– Will the data the applicant plans to collect provide information about various components of the project?

Review of Plans: Will the Evaluation Assess the Impact of the Program?
Does the design allow the applicant to determine that observed changes in outcomes are due to the program?
– Do they collect or use baseline data?
– Do they include a control or comparison group in their evaluation design?
– Do they use random assignment?

Review of Plans: Do Project Personnel Have Expertise in Impact Evaluations?
– Do they involve an experienced evaluator, or does someone on their staff have sufficient experience in quantitative program evaluation?
– Does the evaluator have a sufficient time investment to carry out the evaluation over the life of the program?

Who can help review the evaluation component of the MSP proposals?
University faculty with expertise in quantitative program evaluation:
– Public policy
– Public health
– Prevention science
– Psychology
Evaluators with expertise and experience with random assignment evaluations