Evaluating agricultural value chain programs: How we mix our methods

1 Evaluating agricultural value chain programs: How we mix our methods
Marieke de Ruyter de Wildt, AEA Conference, San Antonio, November 2010

2 From Agricultural Value Chains to Systems
Chain actors: inputs, farmers, brokers, processors, traders, retailers, consumers
Supporting services: inputs, land, energy, price information, R&D
Regulatory environment: standards, regulation, sector policies, international prices, trade agreements, tariff policies, speculation

3 Change in Approach, Change in Impact Patterns
Before: direct delivery to chain actors. Now: developing market systems around chain actors. (Chart: outreach plotted against time, with the end of the program marked.)

4 Guideline 1: Critical Ingredients
Any evaluation should at least have:
- A logic model (beliefs, activities, and results)
- Methods that can face scrutiny
- Insights that allow replication

5 Guideline 2: Test Against Validity Threats
Explore the robustness of these ingredients from different angles:
- Construct validity: are concepts properly defined and operationalized?
- Internal validity: resolve issues of causality and attribution
- Statistical conclusion validity: when using statistics, do it properly
- External validity: under what conditions do the conclusions apply?
Source: Shadish, W. R., Cook, T. D., et al. (2002). Experimental and Quasi-Experimental Designs for Generalized Causal Inference.

6 Combining Ingredients and Validity Threats
A matrix mapping the three critical ingredients (logic model, method, replication) against the four validity threats (construct, internal, statistical conclusion, external); the following slides fill in how each ingredient addresses them.

7 1. Focus: Define the Logic Model
On what basis do we expect success?
What 'level' of definition suits our evaluative question?
What are the critical assumptions, obviously about impact but also about the assumed causal links (the arrows)?
How do we test the logic model and counterfactuals (critics)?
Addresses threats to construct and internal validity.

8 2. Method: Mix Methods to Anticipate Validity Threats
1. Negotiate a core methodology that fits the main evaluative questions and 'real-world constraints': how can we anticipate implementation issues?
2. Add methods for the assumptions in:
A. Program theory: are concepts precise enough to measure?
B. Methodology: is the timing right? Are there enough indicators? Is the control group biased or sufficiently clean (spillover)?
Addresses threats to construct and statistical conclusion validity.

9 3. Replication: Explore the Conditions That Make It Work
Reflect on common elements across pilots: have we defined a 'generalisation domain'?
Focus on mechanisms in context: what works for whom under what conditions?
Do we have methods that allow more general conclusions?
Addresses threats to construct and external validity.

10 Example: training coffee farmers in Vietnam

11 1. Logic Model: Critical Assumptions and Methodological Implications
Logic chain: training farmers leads to more knowledge on good practices, which leads to better agricultural practices, which leads to more income, better quality, and less damage (PPP).
Assumption: training is fairly homogeneous. Implication: realist case comparison of access criteria, modules, and delivery.
Assumption: more knowledge leads to better practices. Implication: realist case comparison of the mechanisms through which knowledge is applied.
Assumption: more training is better. Implication: compare intensive training to less intensive training (factor or cluster analysis).
Reduces threats to construct and internal validity.
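As a rough illustration of the last implication, the sketch below groups farmers by training intensity with a cluster analysis and compares an adoption outcome across the clusters. It is an assumption-laden example, not material from the evaluation: the data are simulated and column names such as hours_trained and adoption_score are hypothetical.

```python
# Hypothetical sketch: group farmers by training intensity with k-means, then
# compare a practice-adoption outcome across the resulting clusters.
# All data are simulated; column names are illustrative assumptions.
import numpy as np
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 300
farmers = pd.DataFrame({
    "hours_trained": rng.integers(0, 40, n),     # training intensity feature
    "modules_completed": rng.integers(0, 6, n),  # training intensity feature
})
# Simulated outcome that rises with training intensity (illustration only).
farmers["adoption_score"] = (
    0.5 * farmers["hours_trained"]
    + 2.0 * farmers["modules_completed"]
    + rng.normal(0, 5, n)
)

# Cluster on the intensity features only, then compare outcomes by cluster.
features = StandardScaler().fit_transform(farmers[["hours_trained", "modules_completed"]])
farmers["intensity_cluster"] = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
print(farmers.groupby("intensity_cluster")["adoption_score"].agg(["mean", "count"]))
```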

12 2. Method
Core method: difference-in-differences to scan for results in PPP.
Added mixed methods:
- For key assumptions in the program theory: realist case studies to scan for unexpected outcomes (e.g. an increase in self-esteem), considering mediating and moderating variables (thanks to Kathleen, Research Works).
- For methodological assumptions: a nested survey (power analysis) and pilots (data availability).
Reduces threats to construct and statistical conclusion validity.
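The core method named here can be sketched as a two-period difference-in-differences regression. The example below is a minimal, hypothetical illustration on simulated data (the variable names income, trained, and post are assumptions, not from the evaluation); the coefficient on the trained:post interaction is the difference-in-differences estimate.

```python
# Hypothetical sketch: two-period difference-in-differences on simulated data.
# Variable names (income, trained, post) are illustrative assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 400
df = pd.DataFrame({
    "trained": rng.integers(0, 2, n),   # 1 = farmer received training
    "post": np.repeat([0, 1], n // 2),  # 0 = baseline, 1 = follow-up round
})
# Simulated income with a true treatment effect of 10 (illustration only).
df["income"] = (
    100
    + 5 * df["trained"]                 # pre-existing group difference
    + 8 * df["post"]                    # common time trend
    + 10 * df["trained"] * df["post"]   # effect of training after the program
    + rng.normal(0, 15, n)
)

# The coefficient on the interaction term is the difference-in-differences estimate.
model = smf.ols("income ~ trained * post", data=df).fit()
print(model.params["trained:post"], model.pvalues["trained:post"])
```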

13 3. Replication
Reduces threats to construct and external validity.

14 Conclusions
Single-method research may be good for publication in top journals, but it rarely generates convincing evidence for the agents involved.
Evaluation design needs to be:
- Theory-based (clarify the evaluative questions)
- Mixed-method (minimize validity threats)
- Policy-relevant (make sense of diversity)
Considering validity threats up front helps to find a more robust mix of methods.
