Presentation on theme: "Evaluators are not only faced with methodological challenges but also ethical challenges on a daily basis."— Presentation transcript:

2  Evaluators face not only methodological challenges but also ethical challenges on a daily basis.

3  Ethics in program evaluation refers to ensuring that the actions of the program evaluator do not cause harm, or potential harm, to program participants, vested stakeholders, or the greater community.

4  Established in 1975, the Joint Standards Committee was created to develop a set of standards to ensure the highest quality of program evaluation in the educational setting.

5  The Joint Standards Committee is made up of several organizations. The American Evaluation Association (AEA) is one of those contributing organizations and sends delegates to Joint Standards Committee meetings.

6  The standards are broken down into five main areas: 1) Utility, 2) Feasibility, 3) Propriety, 4) Accuracy, and 5) Evaluation accountability. Use the link below to access the standards: American Evaluation Association. (n.d.). Programme Accountability Standards. Retrieved from http://www.eval.org/p/cm/ld/fid=103

7  The purpose of these standards is to increase the likelihood that stakeholders will find both the process and the product of the evaluation valuable.

8  The purpose of these standards is to ensure that the evaluation is conducted realistically and efficiently, using appropriate project management techniques and using resources appropriately.

9  These standards are designed to support what is fair, legal, and right in program evaluation.

10  The purpose of this standard is to ensure that evaluations are both dependable and truthful in their data collection and findings.

11  These standards call for both rigorous documentation of evaluations and the use of internal and external meta-evaluations in order to improve the ongoing processes and products associated with evaluation.

12  An evaluation approach is the process by which the evaluator goes about collecting data.

13  Objective-based Evaluation
- Most evaluation today is objective-based
- Evaluation objectives are aligned with program goals
- Typically there are 5-8 evaluation objectives
- After aligning the evaluation objectives to the program goals, the evaluator sets out to document the degree to which program goals have been accomplished
- Uses evaluation matrices and logic models

14  An evaluation matrix is a very effective tool for evaluators to use
- Provides a plan or “blueprint” for the evaluation
- A table where the evaluator delineates the evaluation objectives, tools, timeline, and stakeholders
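The matrix described above can be sketched as a simple data structure. This is only an illustration: the column names follow the slide (objectives, tools, timeline, stakeholders), and all row contents are hypothetical.

```python
# A minimal sketch of an evaluation matrix as a list of rows.
# Column names follow the slide; row contents are hypothetical.
evaluation_matrix = [
    {
        "objective": "Reduce office referrals by 20% per year",
        "tools": ["referral logs", "teacher survey"],
        "timeline": "quarterly",
        "stakeholders": ["teachers", "administrators"],
    },
    {
        "objective": "Increase after-school program attendance",
        "tools": ["sign-in sheets"],
        "timeline": "monthly",
        "stakeholders": ["program staff", "parents"],
    },
]

# Print one planning line per objective
for row in evaluation_matrix:
    print(row["objective"], "->", row["timeline"])
```

In practice the same four columns would be kept in a spreadsheet; the point is simply that each evaluation objective gets its own row linking it to tools, a timeline, and stakeholders.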

15  A quantitative delineation of a program goal.
- Example benchmark: Students participating in the after-school program will have a 20% decrease in office referrals each year.
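A benchmark like this can be checked with simple arithmetic. A sketch in Python, using hypothetical referral counts:

```python
def percent_decrease(baseline, current):
    """Percent decrease from a baseline count to a current count."""
    return (baseline - current) / baseline * 100

# Hypothetical counts for illustration only
baseline_referrals = 50  # office referrals in the prior year
current_referrals = 38   # office referrals this year

decrease = percent_decrease(baseline_referrals, current_referrals)
met_benchmark = decrease >= 20  # the 20% benchmark from the example

print(f"{decrease:.0f}% decrease; benchmark met: {met_benchmark}")
# -> 24% decrease; benchmark met: True
```

Framing the benchmark this way also makes clear what data the evaluator must collect: a baseline referral count and a comparable count for each program year.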

16  Allows for keeping close track of “linkages” between program goals and outcomes
- However, this approach could bias evaluators because the program goals are so clearly defined

17  An unorthodox approach to program evaluation
- The evaluator does not know the program goals
- Conducts observations and collects data to determine what the evaluator thinks the program goals are, based on evidence

18  Evaluators are not biased by the stated goals of the program
- However, the current emphasis on accountability and outcomes makes goal-free evaluation difficult to implement while keeping the program in compliance with the funding source

19  The evaluator and data collection methods are driven not by objectives but by “burning” questions
- These questions are usually asked by a decision-making body or administrative group, not by participants or those most “affected” by the programming

20  The CIPP approach is the most noted decision-based approach
- This approach uses both formative and summative evaluation data through a prescribed framework
- Four steps or phases guide the evaluation process: Context, Input, Process, and Product

21  Context is the first component of the CIPP model. In this section the evaluator focuses on studying the context or situation in which the program will take place.

22  Example context questions:
- What do teachers and staff think we need to address with this program?
- What do teachers, staff, and the greater school community believe are the underlying elements of students’ behavior issues during the school day?
- What is currently not working in our building’s student behavior program?

23  The second component of the model is input. Input allows the evaluator the opportunity to examine the relationship between the resources available (e.g., money, staff, equipment) and the program’s proposed activities.
- The question that has to be answered at this juncture is: Will the current budget/funding support the proposed activities?

24  Process evaluation is the third component of the CIPP model. This component asks the question: Are we doing the program as planned?

25  Product evaluation is the fourth and final component of the CIPP model.
- Product evaluation focuses evaluation efforts on the final outcomes of the program and on determining whether the program met its stated goals and objectives
- This component is primarily summative evaluation and answers the question: Was the program successful?
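The four CIPP phases and the guiding question each answers, as described on the slides above, can be collected in order. This is a sketch of the framework with the questions paraphrased from the slides, not an official formulation.

```python
# The four CIPP phases, in order, with the guiding question each
# answers (paraphrased from the slides; not official wording).
CIPP_PHASES = {
    "Context": "What situation or needs will the program address?",
    "Input": "Will the current budget/funding support the proposed activities?",
    "Process": "Are we doing the program as planned?",
    "Product": "Was the program successful?",
}

# Walk the phases in their prescribed order
for phase, question in CIPP_PHASES.items():
    print(f"{phase}: {question}")
```

The ordering matters: context and input questions are asked before the program runs (largely formative), while process and product questions are asked during and after it (increasingly summative).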

26  The evaluator “teaches” stakeholders to evaluate the program (or aspects of it) that serves them
- E.g., students collecting data to evaluate their after-school program (Youth Participatory Evaluation; Flores, 2008)

27  Empowers underrepresented groups
- Provides a unique perspective on the data, program, and evaluation process that an external evaluator would not be able to “capture”

28  Participatory evaluators may stray from the goals of the program or fail to collect data that is of critical interest to the funding source
- Participatory evaluators may lack the technical expertise to collect, analyze, and interpret data, so data validity could be compromised

29  The evaluator’s role is to develop or select the criteria that will be used to judge the program or product
- Scriven also believed that the purpose of this approach was to present the evaluation findings and let the consumers (as well as potential consumers) make the final decision about whether or not to use the program or products

30  More of an eclectic approach used by today’s evaluators
- Evaluators take “bits and pieces” of the above approaches and use them appropriately in order to extend and support quality evaluations for funders, stakeholders, and the greater community

