
1 “Real World” Monitoring and Evaluation in SA*
Presenter: Mary Cole (development-evaluation.com)
* M. Bamberger, J. Rugh and L. Mabry, 2006: Real World Evaluation. Sage.

2 Presentation Outline
- An ideal: the evaluation design matrix
- Real World Evaluation constraints
- A South African evaluation scenario
- Real World Evaluation in context
- The 7-step Real World Evaluation approach
- Steps 1-5: planning, budget, time, data, politics
- A Real World Evaluation worksheet for SA

3 Evaluation Design Matrix: A Planning Instrument for Evaluations
Major issues being addressed:
Major assumptions being made:
Columns of the matrix (one row per evaluation question; see the sketch below):
Question | Sub-Question | Type of Question | Design | Measures or Indicators | Target or Standard | Data Sources | Sample | Data Collection Instrument | Data Analysis | Comment
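
The matrix is easiest to handle as one structured record per evaluation question. Below is a minimal Python sketch, not from the source: the field names simply mirror the column headers above, and the example row is entirely hypothetical.

```python
from dataclasses import dataclass, asdict

@dataclass
class DesignMatrixRow:
    """One row of the evaluation design matrix (one evaluation question)."""
    question: str
    sub_question: str
    question_type: str            # e.g. descriptive, normative, impact
    design: str                   # e.g. quasi-experimental, case study
    measures_or_indicators: str
    target_or_standard: str
    data_sources: str
    sample: str
    data_collection_instrument: str
    data_analysis: str
    comment: str = ""

# Hypothetical example row, for illustration only
row = DesignMatrixRow(
    question="Has the programme improved service delivery?",
    sub_question="Which nodes show measurable change?",
    question_type="impact",
    design="quasi-experimental",
    measures_or_indicators="households with access to basic services",
    target_or_standard="baseline + 20%",
    data_sources="administrative records, mini survey",
    sample="purposive sample of nodes",
    data_collection_instrument="structured questionnaire",
    data_analysis="before/after comparison",
)
print(asdict(row))
```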

4 RWE constraints when evaluations are formulated, planned, conducted, disseminated and used:
- Budget
- Time
- Data
- Political
including the problems, pressures and influences that come with each (and others?).

5 Authors’ RWE scenarios:
- The evaluator is involved from project start, but with data/design restrictions: no baseline data, no comparison group, and funder/client/stakeholder views constraining methods, evidence and sharing.
- The evaluator is involved only once the project is operational or completed: again no baseline data, no comparison group, and funder/client/stakeholder views constraining methods, evidence and sharing.

6 A SA Scenario: Medium Term Review (MTR) of the Integrated XYZ Program
Purpose: The XYZ Program (piloted at 13 sites across all 9 provinces) involves the 3 spheres of government working together differently and better to achieve the public policy objective. The MTR must examine whether the Strategy has achieved its objectives, assess the design and implementation of the Program, document lessons learnt, and make recommendations on how to streamline and enhance the XYZ Program over the remaining 5 years.
Scope: Conduct the MTR at Presidency, national, provincial and local levels. Review data (including a quasi-Logframe with indicators and a nodal baseline data-gathering questionnaire) and documents, complemented by consultations, stakeholder/beneficiary interviews, a mini survey, field visits and workshops involving the Presidency, the sector portfolio committee, political and technical champions, and traditional leadership, covering all 13 nodes.
Timing: 10 weeks maximum for the MTR, during January to March 2007, inclusive of primary and secondary research, analysis, consultation, workshop sessions and write-up of all reports to draft stage. “The service provider will have to hit the ground running.”
Expertise required: (a) Leader: international development, poverty eradication, inter-governmental relations, M&E, PCM, Logframes; at least a PhD. (b) Member: institutional capacity building, public sector transformation/reform; Masters degree.
Tender price: stated as R800,000.00 in total.

7 RWE and SA experience: are there contextual constraints?
- Budget? Unrealistic considering the scope; no allowances for travel/accommodation.
- Time? Unrealistic time period?
- Data? What data sources? No logframes available; no baseline data; badly designed tools.
- Political? Hierarchy of government departments and traditional leaders; ?
- Other? Evaluators with very specific skills? The provinces, the geography of SA, variety between provinces, languages, cultures etc.? Capacity? ?

8 The 7-step RWE approach
1: Planning and scoping the evaluation
2: Addressing budget constraints
3: Addressing time constraints
4: Addressing data constraints
5: Addressing political constraints
6: Strategy for strong design and validity
7: Strategy for evaluation use by the client

9 RWE Approach

10 Step 1: Planning & scoping the evaluation
- Purpose? e.g. impact/outcome
- Program theory? e.g. Logframe
- Time? e.g. end-of-FY surplus
- Budget? e.g. budget balance
- Data? e.g. M&E system
- Political? e.g. Cabinet Lekgotla
- Client info needs? e.g. evaluation questions
- Process? e.g. participatory

11 Step 2: Budget constraints
- Simplify the evaluation design: fewer interviews/locations?
- Clarify client info needs: omit non-essential info?
- Use existing data: find a good secondary source?
- Reduce sample size: purposive sample? (see the sampling sketch after this list)
- Collect data economically: rapid assessment?
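
A minimal Python sketch of the purposive-sampling option, not from the source: the site records and the eligibility criterion are hypothetical, and the point is only that a small, deliberately chosen sample can replace a large random one when the budget is tight.

```python
import random

def purposive_sample(sites, criteria, k):
    """Select up to k sites that satisfy the selection criteria.

    A purposive (non-random) sample deliberately picks information-rich
    cases instead of a large random sample, trading statistical
    generalisability for lower data-collection cost.
    """
    eligible = [site for site in sites if criteria(site)]
    random.shuffle(eligible)  # break ties among equally eligible sites
    return eligible[:k]

# Hypothetical site records: (name, province, has_baseline_data)
sites = [
    ("Node A", "Limpopo", True),
    ("Node B", "Eastern Cape", False),
    ("Node C", "KwaZulu-Natal", True),
    ("Node D", "Limpopo", False),
]

# Visit at most 2 sites, restricted to those with some baseline data.
chosen = purposive_sample(sites, criteria=lambda s: s[2], k=2)
print(chosen)
```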

12 Step 3: Time constraints
- The first 5 options are the same as for budget constraints
- Reduce pressure on the consultant: supportive management?
- More resource people: commit staff resources?
- Results-based monitoring: outcomes in the M&E system?
- Use modern technology: e.g. hand-held computers, statistical software

13 Step 4: Data constraints
- Reconstructing baseline data: established PDI info?
- Comparison groups, some options: service delivery vs no service delivery? (see the sketch after this list)
- Collecting data on sensitive topics: HIV/AIDS evaluation?
- Collecting data on difficult-to-reach groups: EPWP temp workers?
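
The “service delivery vs no service delivery” option amounts to a comparison-group estimate. Below is a minimal difference-in-differences sketch in Python, assuming baseline values have been reconstructed from respondent recall or secondary sources; every number is hypothetical.

```python
from statistics import mean

# Hypothetical outcome scores. The "baseline" values stand in for data
# reconstructed after the fact (recall, secondary sources), a common
# RWE workaround when no baseline survey was done.
served     = {"baseline": [2.1, 2.4, 2.0, 2.3], "now": [3.4, 3.1, 3.6, 3.0]}
not_served = {"baseline": [2.2, 2.0, 2.5, 2.1], "now": [2.4, 2.3, 2.6, 2.2]}

def change(group):
    """Average change from (reconstructed) baseline to now."""
    return mean(group["now"]) - mean(group["baseline"])

# Difference-in-differences: change in the service-delivery group minus
# change in the no-services comparison group.
effect = change(served) - change(not_served)
print(f"Estimated programme effect: {effect:.2f}")
```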

14 Step 5: Priorities & perspectives: reconciling the political influences
Political issues arise:
- at the start of an evaluation: how grant fund policy is applied?
- during an evaluation: awareness of corruption and neglect of duty?
- in reporting and use: evaluation report not disseminated?

15 Step 6: Strong evaluation design and validity of conclusions
Validity and adequacy are affected by:
- an appropriate evaluation focus
- the availability of data
- how well the data supports the findings
- the competence of the evaluation team
- specific quantitative and qualitative considerations
Validity can be improved at the design stage, during implementation and in report preparation.

16 Step 7: Making evaluation useful
Underutilization is due to: bad timing; unresponsiveness to stakeholder info needs; the right questions not being addressed; weak methodology; evaluations that are too expensive and demanding; and lack of local capacity.
Improved utilization needs:
- agreeing the right questions at the scoping stage
- giving feedback to the client (impact + formative)
- building M&E capacity during implementation
- communicating and disseminating results
- following up with an action plan on the recommendations

17 RWE Worksheet for South Africa?
Name of Evaluation:
Part I: Description of the Evaluation
1. Stage of the study at which the worksheet was prepared
2. Objectives of the evaluation
3. Evaluation design
4. RWE constraints addressed in the evaluation design
5. Options for addressing time and resource constraints
6. Threats to validity & adequacy of design and conclusions
7. Recommendations
Part II: Assessment of 5 Adequacy Dimensions of Evaluation Design
8. Objectivity/confirmability
9. Reliability/dependability
10. Internal validity/credibility/authenticity
11. External validity/transferability/fittingness
12. Utilization/application/action orientation
Part III (for quasi-experimental designs):
13. Strength of design with respect to four types of threat to validity
Part IV: Analysis and Discussion of Each Threat to Validity
14. Threat (give name and number)

18 Beebe, J. (2001): Rapid Assessment Process: An Introduction. Walnut Creek, CA: AltaMira Press.
“Rapid M&E methods” involve:
- intensive team-based ethnographic enquiry
- systematic use of triangulation (see the sketch below)
- iterative data analysis
- additional data collection
to quickly develop a preliminary understanding of a situation from an insider perspective.
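
Systematic triangulation can be made concrete as a convergence check across independent sources. Below is a minimal Python sketch, not from Beebe; the source names and verdicts are hypothetical.

```python
from collections import Counter

def triangulate(findings):
    """Compare one finding across independent data sources.

    findings maps source name -> that source's verdict on a single claim.
    Returns the majority verdict and whether the sources fully agree: a
    finding is only treated as solid when independent sources converge.
    """
    verdicts = Counter(findings.values())
    verdict, count = verdicts.most_common(1)[0]
    return verdict, count == len(findings)

# Hypothetical verdicts on the claim "services improved at Node A"
sources = {
    "key-informant interviews": "improved",
    "mini survey": "improved",
    "administrative records": "no change",
}

verdict, unanimous = triangulate(sources)
print(verdict, "(unanimous)" if unanimous else "(sources disagree: collect more data)")
```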

