Presentation transcript: "Methods: Pointers for good practice"

1. Methods: Pointers for good practice
- Ensure that the method used is adequately described
- Use a multi-method approach and cross-check where possible (triangulation)
- Assess the intervention against appropriate international standards and law
- Talk to primary stakeholders
- Disaggregate, e.g. by sex, socioeconomic group and ethnicity (a sketch of disaggregation in code follows this slide)
- Ensure a focus on social process and causality
- Make clear any evaluator bias
(Reference: ALNAP guide on evaluating humanitarian action, Beck, 2006)
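Disaggregation, as recommended above, is easy to make concrete. Below is a minimal Python sketch, assuming hypothetical survey records; the field names (sex, group, received_aid) and the coverage indicator are invented for illustration and are not drawn from the ALNAP guide.

```python
from collections import defaultdict

# Hypothetical survey records: each dict is one respondent.
records = [
    {"sex": "F", "group": "landless", "received_aid": True},
    {"sex": "F", "group": "smallholder", "received_aid": False},
    {"sex": "M", "group": "landless", "received_aid": True},
    {"sex": "M", "group": "smallholder", "received_aid": True},
    {"sex": "F", "group": "landless", "received_aid": False},
]

def coverage_by(records, key):
    """Share of respondents who received assistance, disaggregated by `key`."""
    totals, hits = defaultdict(int), defaultdict(int)
    for r in records:
        totals[r[key]] += 1
        hits[r[key]] += r["received_aid"]  # True counts as 1, False as 0
    return {k: hits[k] / totals[k] for k in totals}

print(coverage_by(records, "sex"))    # {'F': 0.33..., 'M': 1.0}
print(coverage_by(records, "group"))  # {'landless': 0.67..., 'smallholder': 0.5}
```

The same pattern extends to any grouping variable, such as ethnicity, and makes visible the differences that an aggregate figure would hide.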

2. Methodology: documentation review
- Start with a documentation review: build on what has been done before and what is already known
- Often referred to as 'secondary data' or a literature review
- What kind of documentation would you search for and include?
- NB: Allow time in your evaluation design for the documentation review

3. Defining quantitative and qualitative indicators
- Quantitative indicators: indicators that can be measured in numeric terms, usually through scientific techniques such as surveys
- Qualitative indicators: indicators that rely on descriptive data, generated through techniques such as focus groups, interviews and PRA techniques

4. Quantitative and qualitative: striking the right balance
- Evaluations are often biased towards quantitative indicators: they carry a scientific air of accuracy and are easy to communicate, but they are often focused on inputs and outputs
- Measuring the intangibles, e.g. local perceptions and attitudes, usually relies on qualitative indicators; these tend to relate to impact and outcome and are very useful for establishing cause and effect
- Evaluation is a science and an art!

5. 'Informal' vs 'formal' data collection methods
- 'Formal' methods: the procedure is clearly defined from the outset, e.g. a formal survey or direct measurement
- 'Informal' methods: less precise procedures, relying to a large extent on experience, intuition and subjective judgement, e.g. focus group interviews and semi-structured interviews

6. 'Informal' data collection methods
Don't think you must identify all indicators in advance! Examples of informal data collection methods:
- Focus group discussions
- Semi-structured interviews, using a checklist of questions
- PRA techniques, e.g. timelines, ranking, institutional mapping
- Direct observation
NB: Use triangulation, i.e. multiple information sources to generate information about one topic or issue (a minimal cross-check is sketched below). Pay attention to how you analyse and write up qualitative data.
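The triangulation point in the NB above can be illustrated with a small cross-check. This is a minimal sketch, assuming three hypothetical sources that each estimate the same indicator; the sources, figures and the 0.10 tolerance are all invented for illustration.

```python
# Compare independent estimates of the same indicator (share of
# households with access to clean water) and flag large disagreement.
estimates = {
    "household survey": 0.62,
    "key informant interviews": 0.55,
    "agency monitoring data": 0.70,
}

values = list(estimates.values())
spread = max(values) - min(values)
mean = sum(values) / len(values)

print(f"range of estimates: {min(values):.2f}-{max(values):.2f} (mean {mean:.2f})")
if spread > 0.10:  # illustrative tolerance, not a prescribed threshold
    print("sources disagree noticeably: investigate before reporting one figure")
```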

7. Examples of PRA techniques
- Timelines: very useful for capturing how local people perceive events, and their cause and effect
- Ranking, e.g. of the most useful types of relief assistance (a simple aggregation of group rankings is sketched below)
- Proportional piling, e.g. of sources of livelihood and how they have changed
- Institutional mapping
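The ranking technique lends itself to a short worked example. Below is a sketch that aggregates hypothetical focus-group rankings of relief items with a simple Borda count (top rank earns the most points); the items and rankings are invented, and Borda scoring is one common convention rather than a prescribed PRA method.

```python
from collections import Counter

# Three focus groups each rank four relief items, most useful first.
rankings = [
    ["food", "shelter", "water", "blankets"],
    ["water", "food", "shelter", "blankets"],
    ["food", "water", "blankets", "shelter"],
]

scores = Counter()
for ranking in rankings:
    for position, item in enumerate(ranking):
        scores[item] += len(ranking) - position  # rank 1 of 4 earns 4 points

for item, score in scores.most_common():
    print(item, score)  # food 11, water 9, shelter 6, blankets 4
```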

8. Focus group discussions
- A small group representing a particular section of the population (e.g. men, women, or a particular ethnic group)
- Ideal number: 6-10 participants
- Informal style, using a checklist of questions
- Use open questions: what, how, when, who, (why)
- Can be combined with PRA techniques, e.g. proportional piling, institutional mapping

9. Example of quantifying information collected in a focus group discussion (relative values)
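The transcript does not reproduce the table from this slide, but the underlying arithmetic of 'relative values' in proportional piling is simple: normalise each pile to a share of its own total. A minimal sketch with hypothetical livelihood sources and counter counts:

```python
# A focus group splits counters across livelihood sources before and
# after the crisis. Normalising each round to percentages makes the two
# rounds comparable even when different numbers of counters were used.
before = {"farming": 60, "casual labour": 20, "remittances": 10, "petty trade": 10}
after = {"farming": 15, "casual labour": 30, "remittances": 25, "petty trade": 5}

def shares(piles):
    total = sum(piles.values())
    return {source: round(100 * n / total, 1) for source, n in piles.items()}

for label, piles in (("before", before), ("after", after)):
    print(label, shares(piles))
```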

10. To be added by facilitator: suggest inserting photos of evaluation field work

11. Checklist when selecting evaluation methods
- Are the methods proposed consistent with the time and resources available for the evaluation?
- Will the methods provide the type and quality of evaluation findings required by the stakeholders?
- Have specific questions or hypotheses relating to each evaluation criterion been generated during the inception stage of the evaluation?
- Will the methods selected by the evaluators provide valid and reliable information which will allow these questions to be answered?
- Are the methods to be used clearly described in the evaluation proposal?
(Source: Evaluation guidelines, Evaluation Department, DfID, 2000)

12. Data analysis
The purpose of analysis is to transform the data into credible evidence about the development intervention and its performance. Typically, the analytical process involves three steps (a minimal sketch follows this slide):
- Organising the data for analysis, i.e. data preparation
- Describing the data, e.g. generating findings of fact
- Interpreting the data, e.g. assessing the findings against criteria
(Source: Danida evaluation guidelines, 2006)
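A minimal sketch of the three steps on hypothetical data; the raw responses, the 'yes' coverage indicator and the 80% target are all invented for illustration and are not from the Danida guidelines.

```python
# Hypothetical raw survey responses to "Did you receive assistance?"
raw = [" yes", "NO", "Yes ", "yes", None, "no", "YES"]

# 1. Organise: normalise responses and drop missing values.
prepared = [r.strip().lower() for r in raw if r is not None]

# 2. Describe: generate a finding of fact.
coverage = prepared.count("yes") / len(prepared)
print(f"{coverage:.0%} of respondents report receiving assistance")

# 3. Interpret: assess the finding against a criterion.
target = 0.80  # illustrative target, not a Danida standard
verdict = "meets" if coverage >= target else "falls short of"
print(f"Coverage {verdict} the {target:.0%} target")
```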

