Community RESOURCE DEVELOPMENT


1 Community RESOURCE DEVELOPMENT
(DCE3411) Associate Prof. Dr. Roziah Mohd Rasdi Dept. of Professional Development & Continuing Education Faculty of Educational Studies Universiti Putra Malaysia

2 EVALUATION IN CD PROGRAM

3 MEANING A process of making a judgement on the worth of an implemented program. The judgement is made by comparing what is seen/observed (the evidence) with a standard criterion.

4 The meaning of evaluation is further strengthened by the following characteristics:
A continuous process - from the beginning, through the midway point, to the final stage. Evaluation is a learning process for the participants involved.

5 Evaluation is a process of measuring performance
Through it, strengths and weaknesses are identified. Performance can be measured quantitatively and qualitatively. An ideal model of evaluation involves input, output, and impact.

6 PURPOSE To see the achievement of objectives
Data are collected on the performance of the program. The data are analysed and compared with the stated objectives of the program.

7 The result of the comparison is stated, e.g. 90% or 80% of the objective is achieved. The result is used for follow-up activities.
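Stating the result as a percentage of the objective achieved is a simple calculation. A minimal sketch, where the target and observed figures are hypothetical examples (not taken from any real program):

```python
# Sketch: stating an evaluation result as a percentage of the
# objective achieved. All figures below are hypothetical examples.

def achievement(observed: float, target: float) -> float:
    """Percentage of the program objective achieved."""
    return round(observed / target * 100, 1)

# e.g. objective: enrol 200 children in a balanced-diet program;
# evidence collected: 180 children enrolled.
result = achievement(observed=180, target=200)
print(f"{result}% of the objective is achieved")  # prints "90.0% of the objective is achieved"
```

The same comparison can be repeated for each stated objective, and the results then feed the follow-up activities mentioned above.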

8 Cont.. As proof of budget/resource utilization
Most CD programs receive budgets from sponsors (NGOs, institutions). Program participants must be accountable for the budgets used.

9 Change brought about by the program should be proven
Evaluation results are one way to prove the utilization of the budget/resources. The evaluation result is submitted to the sponsor.

10 Cont.. Evaluation as Data Bank
Evaluation needs data to be gathered continuously. Data gathered through, e.g., surveys and observations are kept in a “data bank” from which they can be retrieved when needed.

11 A good evaluation should be based on up-to-date data (not obsolete data).
An example of a data bank is the Department of Statistics, which conducts continuous data collection on the various sectors of development. Digital facilities (computers) facilitate the management of data for development.

12 Cont.. Evaluation as a Strategy in Management
Planning and implementation are both activities in management. Management requires the careful use of resources, and it needs on-going information about the program.

13 Management answers questions such as:
Is the program formulated according to the problems and interests of the community? What activities should be prioritised? What should be done if there is a natural calamity?

14 Cont.. Evaluation as a Strategy for Program Improvement
From evaluation, the weaknesses of the program are known. A weakness is the gap between the present status and the desired status.

15 Cont.. Evaluation as a basis for follow-up activities:
Results of evaluation are used for future activities of the program. They are also used for policy reform in order to bring better impact, and for duplication purposes (a similar program in a different community and locality). Evaluation is also a means to gain recognition.

16 Steps in Evaluation
1. Define the focus of evaluation 2. Collect data (evidence) 3. Analyse the data and make a judgement 4. Report the result of the evaluation

17 Step 1 : Define the focus of evaluation
Answer the following questions: What is the objective of the evaluation? What criteria, and which indicators within each criterion, are to be used?

18 What are the data (evidence), and the sources of the data?
Who are the evaluators (internal or external)? How is the result reported, and to whom?

19 Step 2: Collect the data (Evidence)
This is done after the criteria and their indicators are known. Data are collected using the same techniques as in situational analysis or research, e.g. surveys, observation, and document reviews.

20 Sources of Data:
1. Depending on the objectives of the evaluation. If the objective is to see the impact of a balanced-diet program among children, then the sources of data are the children and their mothers or parents.
2. Participants of the program
3. Program facilitators/CD workers/social workers/implementers
4. Relevant reports, such as meeting minutes

21 Step 3: Analyse the data and do judgment
The basis of analysis and judgement is comparing the present status of the program with what it ought to be. It is done one by one for each criterion or indicator.

22 e.g.: Balanced Diet Criterion – Participation Indicators:
i. Attendance at meetings ii. Active participation iii. Giving feedback
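Step 3 above (compare the present status with what it ought to be, indicator by indicator) can be sketched for this Participation criterion. The observed rates and the standards below are invented for illustration only:

```python
# Sketch: judging one criterion (Participation) indicator by indicator,
# by comparing observed evidence with a standard. All values hypothetical.

indicators = {
    "attendance_at_meetings": (0.85, 0.75),  # (observed rate, standard)
    "active_participation":   (0.60, 0.50),
    "gives_feedback":         (0.40, 0.50),
}

for name, (observed, standard) in indicators.items():
    # The judgement is the gap between present and desired status.
    judgement = "satisfactory" if observed >= standard else "weak"
    print(f"{name}: observed {observed:.0%} vs standard {standard:.0%} -> {judgement}")
```

Each "weak" line identifies a gap that the follow-up activities should address.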

23 Analysis and judgment can be done quantitatively and qualitatively
Qualitative criteria, e.g.: i) Appropriateness – according to problems and needs - Easy or difficult to follow - According to the mandate of the organization

24 ii) Effectiveness – how is the achievement of objectives
- Impact on income or other indicators - Impact on the community’s psychological change, such as attitude, awareness, and knowledge

25 iii) Efficiency – is the program implemented within the planned duration? Delayed or faster? What is the ratio of input to output? iv) Significance – to the community or the organization? Is it commensurate with the resources used?
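The efficiency questions above reduce to two small calculations: a schedule check and an input-output ratio. A hedged sketch, with invented figures (durations in months, amounts in an unspecified currency):

```python
# Sketch of the efficiency criterion: schedule check and input-output
# ratio. All figures below are hypothetical examples.

planned_months, actual_months = 12, 14
budget_used, value_of_outputs = 50_000, 80_000  # inputs vs outputs

if actual_months > planned_months:
    schedule = "delayed"
elif actual_months < planned_months:
    schedule = "faster than planned"
else:
    schedule = "on time"

output_per_input = value_of_outputs / budget_used

print(f"Schedule: {schedule} ({actual_months} months vs {planned_months} planned)")
print(f"Output-to-input ratio: {output_per_input:.2f}")
```

A ratio above 1 suggests the outputs were worth more than the resources used, which also speaks to the significance question.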

26 Step 4: Report the Evaluation Results
Everyone (every participant) has the right to know the evaluation results. Reports take various forms – academic (journal articles, papers) and non-academic (bulletins, news through the mass media).

27 Reports are channelled to departments or ministries for policy formulation, especially results that need immediate action. Some reports are made for sponsors, with certain specifications for reporting.

28 MODEL OF EVALUATION

29 Hierarchical Criteria Model (classical model by Bennett, 1976)

30 Input All physical and non-physical resources, including human resources (participants). Indicators involved in the evaluation of this criterion are: total resources used, maintenance of the resources, participants' skills in using the resources, how the resources are mobilized, etc. Inputs are the prime movers in any program.

31 Activity Evaluate activities at all stages – initiation, implementation, and evaluation. The activities listed in the plan of work or calendar of activities are used. Judgement on the activities is expressed, e.g., as satisfactory or excellent. Participation Total number involved. Pattern of involvement. Continuity – continuous or seasonal.

32 Reaction Response and acceptance of the people, shown by their commitment and interest. Cognitive and Affective Change Shown by their interest, value, and attitudinal changes.

33 Skills Change Impact as a result of cognitive and affective changes, especially involvement in the use of technological innovations in the CD program, e.g. better use of hydroponic farming, proper use of computers for information sharing in the villages. Skills change is more difficult to measure and takes a longer time. Final Outcome Achievement of objectives, at the end of the program.

34 Model Context, Input, Process and Product (CIPP) by Stufflebeam (2000)
Context To see the appropriateness of the program based on its situation, such as environmental characteristics and the community's problems. Seen at the macro level. The historical background of the area is relevant. The basis for the other types of evaluation (input, process, and product).

35 Input Looks at the handling of inputs, including human resources, activities and their sequence, support services, and budget use. Micro level. Makes use of the calendar of activities. Includes input-output analysis.

36 Process Also called on-going evaluation, formative evaluation, monitoring, or operational evaluation. Objectives: to identify weaknesses, to predict the results of implementation activities, and to find remedies for the weaknesses. Needs staff to do evaluation on an on-going basis; data/information are collected formally and informally.

37 Product Normally called final evaluation or summative evaluation. Measures the achievement of program objectives. The effectiveness of the context, input, and process evaluations will affect the product evaluation. Product evaluation tells the level of achievement, but process and input evaluation explain why that level was achieved. An overall evaluation should look at all four aspects of CIPP.

38 Internal Program Evaluators
Planners, implementers, and all who are involved in the program; the whole community.

39 Advantages They know the ins and outs of the program (they experienced it), including its weaknesses and strengths. Disadvantages May be biased, highlighting only the good points of the program.

40 External Evaluators External Consultant
Someone who comes from outside the program. A specialist in the area who knows the subject matter very well.

41 Advantages Very objective. Capable of assessing critical issues. Disadvantages May give extreme results. Does not experience the practical side of the program. Dependent on documents. High cost.

