1 MSP Regional Meeting February 13-15, 2008 Calli Holaway-Johnson, Ph.D. Charles Stegman, Ph.D. National Office for Research on Measurement and Evaluation Systems (NORMES) University of Arkansas

2 Choosing an Evaluator
In choosing an evaluator, consider the evaluator's:
– Knowledge of educational processes
– Prior experience with program evaluations
– Expertise in statistics and research design
The evaluator and evaluation design should be an integral part of writing the proposal.

3 Effective Evaluation Components
Measures of teacher content knowledge
– Use of existing measures versus program-developed measures
– Validity and reliability of instruments
– Pre- and post-tests (a simple comparison sketch follows this slide)
Measures of student achievement
– State, district, or school-mandated tests
– Norm-referenced and criterion-referenced tests
– Teacher/program-developed tests
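Where pre- and post-test scores are available for the same teachers, a paired comparison is one common way to summarize growth. The sketch below is a minimal illustration only, not part of the original presentation: the scores are hypothetical, and the choice of a paired t-test (via SciPy) is an assumption about the analysis, not a NORMES recommendation.

    # Minimal sketch: comparing paired pre/post teacher content-knowledge scores.
    # Assumes each teacher has one pre-test and one post-test score, listed in the same order.
    from statistics import mean
    from scipy import stats  # SciPy is an assumption; any statistics package would do

    pre_scores = [52, 61, 48, 70, 55, 63]    # hypothetical pre-test scores
    post_scores = [58, 66, 55, 74, 60, 71]   # hypothetical post-test scores

    gains = [post - pre for pre, post in zip(pre_scores, post_scores)]
    t_stat, p_value = stats.ttest_rel(post_scores, pre_scores)  # paired t-test on matched scores

    print(f"Mean gain: {mean(gains):.1f} points")
    print(f"Paired t-test: t = {t_stat:.2f}, p = {p_value:.3f}")

In practice, the appropriate analysis depends on the instrument's reliability and the evaluation design; this sketch only shows the mechanics of a matched pre/post comparison.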

4 Data Collection
Make sure the data you want to collect will allow you to determine the effectiveness of your program.
– Data collected should relate directly to project goals and objectives.
Consider the availability of data as you are writing your proposal.
Be honest with administrators and teachers about how data will be used.

5 Fidelity of Implementation
Evaluation should include collection of data that indicate:
– How the program was implemented (versus how the program was proposed)
   · Challenges
   · Successes
– How the project team will use results to adapt future program goals and/or plans
– How program implementation affected data collection and/or interpretation

6 Using the Results of Evaluation
Be objective in your interpretation.
Evaluation results should address whether outcomes were directly related to your program.
Review results to determine if changes need to be made to:
– Content/program focus
– Program implementation
– Data collection techniques and/or instruments

7 Sharing Results
Sharing evaluation protocols and outcomes among projects helps create a learning community within a state.
Evaluation results can be shared through statewide meetings with project directors and evaluators, as well as electronically.

8 Contact Information
National Office for Research on Measurement and Evaluation Systems (NORMES)
Charles Stegman, Ph.D. (cstegman@uark.edu)
Calli Holaway-Johnson, Ph.D. (cajohns@uark.edu)
340 N. West Avenue, WAAX 302, Fayetteville, AR 72701
Phone: (479) 575-5593, Fax: (479) 575-5185
http://normes.uark.edu

9 Questions to Be Discussed
– What criteria were used in selecting an evaluator?
– How involved was your evaluator in the planning of the project?
– What data were collected for your evaluation?
– Who was responsible for ensuring that the appropriate data were collected?
– How did the implementation of your project impact the evaluation?
– What evaluation findings contributed to your understanding of the effectiveness of your program?
– What was the most successful aspect of your evaluation?
– What was the most challenging aspect of your evaluation?
– How were the results of your evaluation utilized by project staff?

10 Disclaimer
The instructional practices and assessments discussed or shown in this presentation are not intended as an endorsement by the U.S. Department of Education.

