By the end of the lecture, fellows will be able to:
- Define bias
- Identify different types of bias
- Explain how bias affects risk estimates
- Critique study designs for bias
- Develop strategies to minimise bias
What do epidemiologists do?
- Measure effects
- Attempt to define a cause
- Implement public health measures
An epidemiologic study gives an estimate of the truth. An epidemiologic measure could be, for example, a rate or risk.
Estimated effect: the truth?
Mayonnaise -> Salmonella: RR = 4.3
True association? Bias? Chance? Confounding?
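The RR on this slide can be reproduced from a 2x2 cohort table. The counts below are hypothetical, chosen only so that the arithmetic yields RR = 4.3:

```python
# Relative risk (RR) from a hypothetical outbreak 2x2 table.
# These counts are illustrative, not from the lecture.
ate_ill, ate_well = 43, 57      # ate mayonnaise: ill / not ill
not_ill, not_well = 10, 90      # did not eat mayonnaise: ill / not ill

risk_exposed = ate_ill / (ate_ill + ate_well)      # 0.43
risk_unexposed = not_ill / (not_ill + not_well)    # 0.10
rr = risk_exposed / risk_unexposed
print(round(rr, 1))  # 4.3
```

The question the rest of the lecture asks is whether such an estimate reflects a true association, or bias, chance, or confounding.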
Warning!
Chance and confounding can be evaluated quantitatively; bias is much more difficult to evaluate.
- Minimise bias through the design and conduct of the study
- Increasing the sample size will not eliminate bias
Despite all preventive efforts, bias should always be considered among the alternative explanations of a finding. Remember that bias:
- may mask an association or create a spurious one
- may cause over- or underestimation of the effect size
A study that suffers from bias lacks internal validity.
Definition of bias
Any systematic error in the design or conduct of an epidemiological study resulting in a conclusion that differs from the truth.
Errors in epidemiological studies
[Figure: error plotted against study size - random error (chance) shrinks as the study grows, while systematic error (bias) stays constant. Source: Rothman, 2002]
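The distinction can be sketched numerically: with a biased measurement, a larger sample only makes the estimate converge more tightly on the wrong value. The prevalence and bias magnitude below are assumed purely for illustration:

```python
import random

# Sketch: random error shrinks with sample size; systematic error does not.
# The true prevalence and the bias magnitude are assumed values.
random.seed(1)
TRUE_PREV = 0.30
BIAS = 0.10  # systematic over-ascertainment, e.g. a miscalibrated test

def estimate(n):
    """Estimate prevalence from n subjects using the biased measurement."""
    hits = sum(random.random() < TRUE_PREV + BIAS for _ in range(n))
    return hits / n

for n in (100, 10_000, 1_000_000):
    print(n, round(estimate(n), 3))
# As n grows the estimate converges - but towards 0.40, not the true 0.30.
```

This is why increasing sample size reduces random error but never eliminates bias.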
Main sources of bias
- Selection bias
- Information bias
Selection bias
Two main reasons:
- Selection of study subjects
- Factors affecting study participation
It arises when the association between exposure and disease differs between those who participate and those who do not.
Types of selection bias
- Sampling bias
- Ascertainment bias: referral, admission; diagnostic/surveillance
- Participation bias: self-selection (volunteerism); non-response, refusal; survival
Notes:
- Sampling bias is systematic error due to a non-random sample of a population, making some members less likely to be included than others and yielding a biased sample.
- Ascertainment bias: surveillance (referral, admission, e.g. a tertiary referral centre); diagnostic (only those who are tested are diagnosed; pre-screening of trial participants).
- Participation bias: self-selection (a new screening test is more likely to be taken up by the health-conscious or the "worried well"); non-response, refusal (e.g. moved out of the area, tobacco smokers); survival (discounting trial subjects/tests that did not run to completion).
Selection of controls
Estimate the association of alcohol intake and cirrhosis: OR = 6
How representative are hospitalised trauma patients of the population that gave rise to the cases?
Selection of controls
OR = ad/bc = 36
A higher proportion of controls drank alcohol in the trauma ward than in the non-trauma ward.
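As a sketch of how control selection moves the estimate, the counts below are hypothetical, chosen so that alcohol-heavy trauma-ward controls pull the OR from 36 down to the 6 seen on the previous slide:

```python
# Odds ratio OR = a*d / (b*c) from a case-control 2x2 table.
# All counts are hypothetical, chosen to reproduce the ORs on the slides.
def odds_ratio(a, b, c, d):
    """a: exposed cases, b: exposed controls,
    c: unexposed cases, d: unexposed controls."""
    return (a * d) / (b * c)

a, c = 90, 10  # cirrhosis cases: 90 drinkers, 10 non-drinkers

# Population-based controls (20 of 100 drink alcohol):
print(odds_ratio(a, 20, c, 80))   # 36.0
# Trauma-ward controls, who drink more (60 of 100 drink alcohol):
print(odds_ratio(a, 60, c, 40))   # 6.0
```

Inflating cell b (exposed controls) shrinks the OR, so poorly chosen controls can mask a real association.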
Some worked examples
Work in pairs. In 2 minutes:
- Identify the reason for bias
- How will it affect your study estimate?
- Discuss strategies to minimise the bias
Oral contraceptives and uterine cancer
Diagnostic bias: you are aware that OC use can cause breakthrough bleeding.
OC use -> breakthrough bleeding -> increased chance of testing for, and detecting, uterine cancer.
Overestimation of cell "a" -> overestimation of the OR.
Alternative: a prospective cohort study with the clinician blinded to the exposure.
Asbestos and lung cancer
Admission bias: Prof. "Pulmo", head of a specialist respiratory referral unit, has 145 publications on asbestos and lung cancer. His lung cancer cases exposed to asbestos are not representative of lung cancer cases in general.
Overestimation of cell "a" -> overestimation of the OR.
Alternatives:
- Strict admission criteria?
- Strict testing criteria?
- Study in another hospital?
- Controls with other lung diseases?
Healthy worker effect
Association between occupational exposure X and disease Y: the general population includes those who do not work due to illness, so an employed (exposed) cohort tends to look healthier than the general population it is compared with.
Source: Rothman, 2002
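A minimal numerical sketch of the healthy worker effect, with assumed (not real) death counts and person-years:

```python
# Sketch: the healthy worker effect. All rates below are hypothetical.
# The general population mixes fit non-workers with people too ill to work,
# so a truly harmless occupational exposure can look protective.
deaths_workers, py_workers = 40, 10_000   # exposed workforce
deaths_fit, py_fit = 40, 10_000           # non-working but fit
deaths_ill, py_ill = 60, 2_000            # not working due to illness

rate_workers = deaths_workers / py_workers                      # 0.004
rate_general = (deaths_fit + deaths_ill) / (py_fit + py_ill)    # ~0.0083
print(round(rate_workers / rate_general, 2))  # 0.48: exposure looks "protective"
```

The rate ratio below 1 here reflects who is able to work, not any benefit of exposure X.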
Prospective cohort study - Year 1
[2x2 table: lung cancer (yes/no) by smoking status (smoker/non-smoker)]
Loss to follow-up - Year 2
[2x2 table: lung cancer (yes/no) by smoking status (smoker/non-smoker)]
50% of the cases that smoked were lost to follow-up.
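The effect of this differential loss can be sketched with hypothetical cohort numbers; losing half of the smoking cases roughly halves the apparent risk in smokers:

```python
# Sketch: differential loss to follow-up attenuates the risk ratio.
# Cohort sizes and case counts are hypothetical.
def risk_ratio(cases_exp, n_exp, cases_unexp, n_unexp):
    return (cases_exp / n_exp) / (cases_unexp / n_unexp)

# Year 1: complete follow-up
print(risk_ratio(100, 1000, 10, 1000))  # 10.0

# Year 2: 50% of the smoking cases are lost to follow-up
lost = 50
print(round(risk_ratio(100 - lost, 1000 - lost, 10, 1000), 2))  # 5.26
```

Because the loss is related to both exposure and outcome, the Year 2 estimate underestimates the true association.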
Minimising selection bias
- Clear definition of the study population
- Explicit case, control and exposure definitions
- Case-control studies: cases and controls from the same population, with the same possibility of exposure
- Cohort studies: select exposed and non-exposed without knowing disease status
Information bias
Arises during data collection from differences in the measurement:
- of exposure data between cases and controls
- of outcome data between exposed and unexposed
Information bias
Arises if the information about or from study subjects is erroneous.
Information bias
Three main types:
- Recall bias
- Interviewer bias
- Misclassification
Recall bias
Cases remember exposure differently than controls, e.g. risk of malformation: mothers of children with malformations remember past exposures better than mothers of healthy children.
Overestimation of cell "a" -> overestimation of the OR.
What to do?
- Check other sources of information
- Questionnaire
Interviewer bias
The investigator asks cases and controls about exposure differently, e.g. soft cheese and listeriosis: the investigator, who knows the hypothesis, may probe listeriosis cases more closely about consumption of soft cheese.

                           Cases of listeriosis   Controls
Eats soft cheese                    a                 b
Does not eat soft cheese            c                 d

Overestimation of cell "a" -> overestimation of the OR.
Misclassification
Measurement error leads to assigning the wrong exposure or outcome category.
Misclassification: systematic error
- Misclassification of exposure DIFFERS between cases and controls
- Misclassification of outcome DIFFERS between exposed and non-exposed
=> The measure of association can be distorted in any direction.
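Differential misclassification can be sketched by applying different recall probabilities to cases and controls; the true counts and the recall percentages below are hypothetical:

```python
# Sketch: differential exposure misclassification distorts the OR.
# True counts and recall sensitivities are hypothetical.
def odds_ratio(a, b, c, d):
    return (a * d) / (b * c)

# Truth: 40/100 cases and 20/100 controls truly exposed
a, b, c, d = 40, 20, 60, 80
print(round(odds_ratio(a, b, c, d), 2))  # 2.67

# Cases report 95% of true exposures, controls only 70% (recall bias);
# unreported exposures are classed as unexposed.
a2, b2 = a * 0.95, b * 0.70
c2, d2 = (a + c) - a2, (b + d) - b2
print(round(odds_ratio(a2, b2, c2, d2), 2))  # 3.76, inflated
```

With non-differential (equal) misclassification the OR would instead tend towards 1; when the error differs between groups, as here, it can push the estimate in either direction.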
Minimising information bias
- Standardise measurement instruments (questionnaires) and train staff
- Administer instruments equally to cases and controls, exposed and unexposed
- Use multiple sources of information
Summary: controls for bias
- Choose a study design that minimises the chance of bias
- Clear case and exposure definitions
- Define clear categories within groups (e.g. age groups)
- Set up strict guidelines for data collection
- Train interviewers
Summary: controls for bias (continued)
- Direct measurement: registries, case records
- Optimise the questionnaire
- Minimise loss to follow-up
The epidemiologist's role
- Reduce error in your study design
- Interpret studies with open eyes:
  - Be aware of sources of study error
  - Question whether they have been addressed
Bias: the take-home message
- Bias should be prevented, at the PROTOCOL stage
- It is difficult to correct for bias at the analysis stage
If bias is present:
- the measure of the true association is incorrect
- it should be taken into account when interpreting results: what is its magnitude, and is it an over- or underestimation?