1 Impact Evaluation in the Real World One non-experimental design for evaluating behavioral HIV prevention campaigns

2 Implementation realities
The BCC (behaviour change communication) program:
– Has already started
– Builds on the previous campaign (it is not the first one addressing behaviour)
– Is being rolled out in communities that have other HIV prevention interventions
– There are endogenous 'interventions' (e.g. conversations on the way to school, or in the waiting line at the clinic)
– Diffusion is a good thing
– Cannot (and does not want to) control implementation

3 [image-only slide]

4 [image-only slide]

5 Difference in Differences Example

6 Difference between participants and non-participants at each point in time:
Before: 57.50 - 46.37 = 11.13
After: 66.37 - 62.90 = 3.47
Effect = 3.47 - 11.13 = -7.66

7 Change from before to after within each group:
Participants: 66.37 - 57.50 = 8.87
Non-participants: 62.90 - 46.37 = 16.53
Effect = 8.87 - 16.53 = -7.66
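The two slides above compute the same difference-in-differences estimate in two orders. A minimal sketch in Python using the four cell means from the slides; note that assigning the 57.50/66.37 pair to participants and the 46.37/62.90 pair to non-participants is inferred from the effect formulas and should be read as an assumption.

```python
# Difference-in-differences computed two equivalent ways (slides 6 and 7).
# Group labels are inferred from the effect formulas, not stated on the slides.
participants = {"before": 57.50, "after": 66.37}
non_participants = {"before": 46.37, "after": 62.90}

# Slide 6: difference between the groups, before and after
gap_before = participants["before"] - non_participants["before"]   # 11.13
gap_after = participants["after"] - non_participants["after"]      # 3.47
effect_1 = gap_after - gap_before                                  # -7.66

# Slide 7: change over time within each group
change_participants = participants["after"] - participants["before"]              # 8.87
change_non_participants = non_participants["after"] - non_participants["before"]  # 16.53
effect_2 = change_participants - change_non_participants                          # -7.66

print(round(effect_1, 2), round(effect_2, 2))  # both -7.66
```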

8 Counterfactual assumption: without the intervention, participants' and non-participants' pregnancy rates would follow the same trend

9 [Graph: applying the non-participants' change (16.5) to the participants' baseline gives a counterfactual of 74.0]

10 [Graph: observed outcome for participants (66.37) versus the counterfactual (74.0); effect = -7.6]
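A minimal sketch of how the counterfactual on slides 9 and 10 follows from the parallel-trends assumption on slide 8. The baseline and change values come from the earlier slides; the participant/non-participant assignment is the same inferred assumption as above.

```python
# Counterfactual under parallel trends: without the intervention, participants
# would have followed the non-participants' trend.
participants_before, participants_after = 57.50, 66.37
non_participants_change = 62.90 - 46.37                          # 16.53 (slide 7)

counterfactual = participants_before + non_participants_change   # ~74.0 (slide 9)
effect = participants_after - counterfactual                      # ~-7.66 (slide 10)

print(round(counterfactual, 1), round(effect, 2))  # 74.0 -7.66
```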

11 Matching Example

12 Implementation realities
BCC program:
– Has already started
– Builds on the previous campaign (not the first one addressing behaviour)
– Is being rolled out in communities that have other HIV prevention interventions
– There are endogenous 'interventions' (e.g. conversations on the way to school, or in the waiting line at the clinic)
– Diffusion is a good thing
– Cannot (and does not want to) control implementation

13 What do we need to know?
Can a specific set of communication messages change a specific set of sexual behaviours?
What magnitude of behaviour change will produce what magnitude of change in incidence?

14 Approach decided on
NON-intervention approach:
– We are NOT trying to prove that one campaign works, but whether a specific set of messages works, irrespective of how the messages are delivered or transmitted
Observation approach:
– We are not trying to force one intervention to work, and we are not focusing on the implementation of one intervention

15 So what WILL we do?
Non-experimental design:
– The researcher does not manipulate the independent variable (message exposure)
– No control group in the community; the control group is created statistically through matching
Collect exposure, behavioural and biological data from a random sample of individuals and their sexual partners
Develop a measure of intensity of exposure ('doses' of exposure)
Determine the probability of having a specific dose of exposure
Match individuals with similar covariates but different doses of exposure
Compare biological and behavioural outcomes

16 So what WILL we do?
1. Survey to measure demographic covariates (or use population survey data)
2. Measure the type and intensity of exposure to messages:
– Different doses of exposure to MCP (multiple and concurrent partnerships) campaign messages among the population
– Detailed measurement of the method of exposure during surveys: direct channels (e.g. number of times messages were heard on the radio) AND indirect channels (conversations with friends, relatives, etc., shown to be important in accounting for HIV declines in Uganda)
– Construct a message exposure scale (low vs. high, or more detailed) using statistical techniques such as principal components analysis (see the sketch after this slide)
– Every individual gets a single score for message exposure
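A minimal sketch of step 2, assuming principal components analysis is used to collapse several exposure items into a single score. The item names and the synthetic counts are illustrative placeholders, not the study's actual instrument.

```python
# Construct a single message-exposure score from several exposure items.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 500

# Hypothetical exposure items: direct channels (radio, posters) and
# indirect channels (conversations with friends and relatives)
items = np.column_stack([
    rng.poisson(3, n),  # times heard the messages on the radio
    rng.poisson(1, n),  # times saw a poster or billboard
    rng.poisson(2, n),  # conversations with friends about the messages
    rng.poisson(1, n),  # conversations with relatives about the messages
])

# First principal component of the standardised items = scalar exposure score
score = PCA(n_components=1).fit_transform(StandardScaler().fit_transform(items)).ravel()

# One option: dichotomise into 'low' vs 'high' exposure at the median
high_exposure = (score > np.median(score)).astype(int)
print(score[:5], high_exposure[:5])
```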

17 So what WILL we do?
3. Survey to measure exposure, behavioural outcomes, couple and social network norms, and HIV incidence among a random selection of individuals
4. Nested sub-study to trace the partners of those who reported one or more sexual partners, and to collect the same data from them
5. Parallel measurement of 'social norms' through hearsay ethnography or other methods

18 So what WILL we do?
6. Analyses (a sketch of the matching step follows this slide):
– Use covariates to calculate each individual's propensity (a scalar summary of all covariates) to receive a specific 'dose of treatment' (the message exposure scale)
– Match pairs of participants (index cases and their sexual partners) with similar propensity scores but different doses of treatment (control and treatment groups)
– Calculate impact (behavioural and biological outcomes) by comparing mean outcomes across participants and their matched pairs
7. Modeling:
– Has the density of the sexual network changed over time, and to what extent?
– How much behaviour change is needed, over what period of time and in how many individuals, to bring about what level of reduction in new infections?
– What are the individual and combined effects of male circumcision (MC), ART, increased condom use, and MCP reductions on the number of new infections?
– What is the ideal 'mix' of interventions to implement?
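A minimal sketch of the analysis in step 6, run on synthetic data. The covariates, the logistic propensity model, the binary high/low dose, and one-to-one nearest-neighbour matching are all illustrative assumptions standing in for the study's actual specification.

```python
# Propensity-score matching on a binary 'dose' of message exposure.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 1000

# Hypothetical covariates X and a binary dose of message exposure
age = rng.uniform(15, 35, n)
education = rng.integers(0, 12, n)
urban = rng.integers(0, 2, n)
X = np.column_stack([age, education, urban])
exposure = rng.binomial(1, 1 / (1 + np.exp(-(0.05 * age + 0.1 * education - 2))))

# Hypothetical behavioural outcome (e.g. condom use) with a built-in exposure effect
outcome = rng.binomial(1, 0.2 + 0.1 * exposure + 0.01 * education)

# 1. Propensity score: probability of high exposure given covariates
ps = LogisticRegression(max_iter=1000).fit(X, exposure).predict_proba(X)[:, 1]

# 2. Match each exposed individual to the unexposed individual with the
#    closest propensity score (one-to-one, with replacement)
treated, control = np.where(exposure == 1)[0], np.where(exposure == 0)[0]
matches = control[np.abs(ps[treated][:, None] - ps[control][None, :]).argmin(axis=1)]

# 3. Impact estimate: mean outcome difference across matched pairs
att = (outcome[treated] - outcome[matches]).mean()
print(f"Estimated effect among the exposed: {att:.3f}")
```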

19 [Figure: density of propensity scores for the 'low exposure' and 'high exposure' groups, plotted against the probability of exposure given covariates X, from low to high; a sketch of this overlap check follows below]
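A minimal sketch of the overlap check the figure illustrates, using beta-distributed values as illustrative stand-ins for propensity scores estimated from real covariates.

```python
# Compare the density of propensity scores in the low- and high-exposure groups.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(2)
ps_low = rng.beta(2, 5, 400)   # low-exposure group: scores concentrated to the left
ps_high = rng.beta(5, 2, 400)  # high-exposure group: scores concentrated to the right

plt.hist(ps_low, bins=30, density=True, alpha=0.5, label="'Low exposure'")
plt.hist(ps_high, bins=30, density=True, alpha=0.5, label="'High exposure'")
plt.xlabel("Probability of exposure given X")
plt.ylabel("Density")
plt.legend()
plt.show()
```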

20 What we will know
Can a specific set of communication messages (delivered in different ways) change a specific set of sexual behaviours?
What magnitude of behaviour change will produce what magnitude of change in incidence?

