Presentation on theme: "Total Survey Error Design, Implementation, and Evaluation"— Presentation transcript:
1 Total Survey Error Design, Implementation, and Evaluation
Paul P. Biemer, RTI International and University of North Carolina at Chapel Hill
2 Modern View of Survey Design
Surveys should be designed to maximize total survey quality within timeliness and budget constraints. But how, when…
- survey budgets are severely constrained,
- data must be produced and disseminated in a timely fashion,
- public interest in participating in surveys has been declining worldwide for years, and
- even when participation is obtained, responses may not be accurate?
This is the challenge for survey research in the 21st century.
3 Outline
- What is total survey quality? How does it differ from total survey error?
- How can surveys be designed to maximize total survey quality?
- What is the total survey error paradigm, and what does it say about the design, implementation, and evaluation of surveys?
4 Users and Producers Have Very Different Perspectives on Survey Quality
Producers place high priority on:
- Accuracy – total survey error is minimized
- Credibility – credible methodologies; trustworthy data
Users place higher priority on:
- Timeliness – data deliveries adhere to schedules
- Relevance – data satisfy user needs
- Accessibility – access to data is user friendly
- Interpretability – documentation is clear; metadata are well managed
5 Users Also Demand…
- Comparability – valid demographic, spatial, and temporal comparisons
- Coherence – estimates from different sources can be reliably combined
- Completeness – data are rich enough to satisfy the analysis objectives without undue burden on respondents
6 TSQ Optimally Balances Producer and User Requirements
(Diagram balancing the producer-priority dimensions – accuracy, credibility – against the user-priority dimensions – timeliness, relevance, accessibility, interpretability, comparability, coherence, completeness.)
7 The Total Survey Quality Paradigm
- Identifies measurable and achievable objectives for each user-defined dimension of quality
- Determines costs and resources required to achieve these objectives
- Maximizes survey accuracy within the remaining budget
8 Accuracy Is Maximized by Minimizing Total Survey Error
Sampling error:
- Sampling scheme
- Sample size
- Estimator choice
Nonsampling error:
- Specification
- Nonresponse
- Frame
- Measurement
- Data processing
Each error source can be systematic (bias) or variable (variance), and the two combine in the mean squared error (MSE):
MSE = Bias² + Variance
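The decomposition on this slide can be checked numerically. The sketch below uses purely illustrative numbers (not from any real survey): it simulates repeated estimates of a known population value with a built-in systematic error and a built-in variable error, then verifies that the empirical MSE equals squared bias plus variance.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical simulation: repeated survey estimates of a known
# population mean, with both systematic error (bias) and
# variable error (variance) added in.
true_value = 50.0
n_replications = 10_000
estimates = true_value + 2.0 + rng.normal(0.0, 3.0, n_replications)
#                        ^ systematic error    ^ variable error

bias = estimates.mean() - true_value
variance = estimates.var()
mse = np.mean((estimates - true_value) ** 2)

# MSE decomposes exactly into squared bias plus variance.
assert np.isclose(mse, bias**2 + variance)
```

Because the simulated systematic error is 2.0 and the error standard deviation is 3.0, the recovered bias is close to 2 and the variance close to 9, showing how a large bias can dominate the MSE even when sampling variance is well controlled.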
9 Optimal Design for Total Survey Quality
Minimize total survey error (accuracy):
- Sampling error
- Specification error
- Nonresponse error
- Frame error
- Measurement error
- Data processing error
Subject to budget and time constraints and the other quality dimensions:
- Timeliness
- Accessibility
- Interpretability
- Comparability
- Credibility
10 Designing Surveys to Minimize Total Survey Error
Objective – minimum mean squared error (MSE) subject to cost and timeliness constraints
Major bias contributors:
- concept misspecification
- frame noncoverage
- nonresponse
- measurement bias
- editing errors
Major variance contributors:
- sampling error
- measurement unreliability
- interviewer error
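The objective above, minimum MSE subject to a cost constraint, can be illustrated with a toy trade-off: a fixed budget is split between sample size (which drives variance down) and nonresponse follow-up effort (which drives bias down). Every cost and error parameter below is an invented assumption for illustration, not a survey constant.

```python
# Toy allocation: split a fixed budget between sample size (reduces
# variance) and nonresponse follow-up effort (reduces bias).
# All numbers are illustrative assumptions.
BUDGET = 100_000.0
BASE_COST = 30.0        # assumed cost per sampled case
FOLLOWUP_COST = 50.0    # assumed extra cost per case at full follow-up
SIGMA2 = 400.0          # assumed unit variance of the survey variable
BASE_BIAS = 3.0         # assumed bias with no nonresponse follow-up

def mse(effort):
    """MSE of the estimate as a function of follow-up effort in [0, 1]."""
    # Spending more per case on follow-up shrinks the affordable sample.
    n = BUDGET / (BASE_COST + FOLLOWUP_COST * effort)
    bias = BASE_BIAS * (1.0 - effort)
    return bias**2 + SIGMA2 / n

# Grid search over follow-up effort levels 0.00, 0.01, ..., 1.00.
best_effort = min((e / 100 for e in range(101)), key=mse)
print(best_effort)
```

Under these assumed numbers the optimum puts nearly all of the marginal budget into bias reduction, which mirrors the slide's point that bias contributors, not sampling variance alone, often dominate total survey error.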
11 Key Design Principles
- Design robustness – accuracy does not change appreciably as the survey design features change; i.e., the optimum is “flat” over a range of alternate designs
- Effect generalizability – design features found to be optimal for one survey are often generalizable to other similar surveys
(Chart: accuracy curve with a flat region around the optimum.)
12 Key Design Principles (repeat of slide 11; chart annotated with the loss in accuracy away from the optimum)
13 Implications for Design
- Compile information on TSE (e.g., quality profiles)
- Identify major contributors to TSE
- Allocate resources to control these errors
- Use results from the literature and other similar surveys to guide the design
- Develop an effective process for modifying the design during implementation to achieve optimality
- Embed experiments and conduct studies to obtain data on TSE for future surveys
14 Design Implementation Strategies
The initial survey design must be modified or adapted during implementation to control costs and maximize quality. Four strategies for reducing costs and errors in real time:
- Continuous quality improvement
- Responsive design
- Six Sigma
- Adaptive total design and implementation
(Diagram: initial quality → final quality.)
15 Continuous Quality Improvement (CQI)
1. Prepare a workflow diagram of the process and identify key process variables.
2. Identify characteristics of the process that are critical to quality (CTQ).
3. Develop real-time, reliable metrics for the cost and quality of each CTQ.
4. Continuously monitor costs and quality metrics during the process.
5. Intervene as necessary to ensure that quality and costs are within acceptable limits.
16 Responsive Design Strategy
Developed for face-to-face data collection (Groves & Heeringa, 2006). Similar to CQI but includes three phases:
- Experimental phase – tests major design options (e.g., split-sample designs to test incentive levels)
- Main data collection phase – implements the design selected in the first phase; continues until “phase capacity” is reached
- NRFU phase – special methods implemented to reduce nonresponse bias and control data collection costs (NR double sampling, higher incentives, more intensive followup)
Phase capacity – the point at which efforts to reduce NR bias under the current protocol are no longer cost effective.
Makes innovative use of paradata for CQI.
17 Six Sigma
- Developed by Motorola in the 1980s
- Extends ideas of Total Quality Management (TQM) and continuous quality improvement (CQI)
- Has mostly been applied in business and manufacturing
- Definition: “A comprehensive and flexible system for achieving, sustaining and maximizing business success,…uniquely driven by a close understanding of customer needs, disciplined use of facts, data, and statistical analysis, and diligent attention to managing, improving, and reinventing business processes.” – Pande, et al. (2000, p. xi)
18 Strengths of Six Sigma
- Provides a systematic, highly effective approach for quality improvement (DMAIC).
- Focuses on attributes of a process that are most important to the client.
- Emphasizes decision making based on data analysis.
- Strives for verifiable and sustainable improvements for both costs and quality.
- Contains a rich set of techniques and tools for monitoring, controlling, and improving a process.
19 Weaknesses of Six Sigma
- Can be expensive to implement.
- Achieving 3.4 defects per million opportunities is an impossible goal for many survey processes.
- Often requires data that do not exist and cannot be obtained affordably.
- Terminology and some techniques are too business and manufacturing oriented. This obscures its applicability to survey work.
- Uses a lot of jargon.
20 Six Sigma’s DMAIC Strategy
- Define the problem.
- Measure key aspects of the process and collect relevant data.
- Analyze the data to determine root causes of the problem.
- Improve the process based upon results from the data analysis.
- Control the process by continuously monitoring metrics from the process.
21 Typical Survey Design and Implementation Process
(Flowchart:) Develop a survey design with design options A, B, etc. → pretest design and options → select and implement the best design option → monitor critical-to-quality design attributes (CTQs) → budget or schedule exhausted? If no, modify the design to maximize accuracy while meeting cost and timeliness objectives and continue monitoring; if yes, proceed to post-survey processing, adjustment, and file preparation → pre-release quality evaluations → data release.
22 Six Sigma Focuses Primarily on These Activities
(Repeats the flowchart from slide 21; the CTQ-monitoring and design-modification activities are the focus.)
23 Adaptive Total Design and Implementation
- An approach for continuously monitoring survey processes to control errors, improve quality, and reduce costs.
- Adaptive in that it combines the real-time error control features of CQI, responsive design, and Six Sigma strategies.
- Total in that it simultaneously monitors multiple sources, e.g.:
  - Sampling frame and sampling
  - Response quality
  - Nonresponse bias reduction
  - Field production
  - Costs and timeliness
24 Six Sigma Tools and Concepts
- Workflow diagram
- Common vs. special cause variation
- Process control chart
- Dashboard
- Fishbone diagram
- Pareto chart
Many others are available (see Breyfogle, 2003).
25 Workflow Diagram for Sampling and Initial Interview Attempt
(Flowchart:) Compute domain sample size → compute current eligibility rate → compute required sample per PSU → select sample lines to send to field → assign case priorities → transmit to FSs in the field → assign cases to FIs → conduct travel efficiency sessions → optimize work sequence order → FI places initial contact attempt → Contact? → Interview? → if contact but no interview, set an appointment → complete the record-of-calls (ROC) log.
Critical-to-quality key:
1. Achieve target sample sizes for each domain
2. Distribute sample to PSUs to minimize design effects
3. FI workloads must be adequate
4. Ensure high response propensity for high-priority cases
5. Minimize FI travel costs through work sequence optimization
6. Ensure that high-priority cases are worked fully
7. Ensure good cooperation at first contact
8. Record of calls (ROC) is completed accurately
9. Schedule a firm appointment after each contact
26 CTQs and Metrics for Frame Construction and Sampling
CTQs:
- Maximize frame coverage
- Maximize within-unit coverage
- Detect/control duplications and ineligibles
- Effectively post-stratify the sample for bias reduction
- Use auxiliary data and efficient estimators to minimize sampling error
- Minimize design effects for key analytic domains
- Achieve target sample allocations for key domains
- Optimally allocate sample to strata and sampling stages
Process metrics:
- Ineligibility rates by domain, PSU, and overall
- Achieved # interviews by domain
- # HUs identified by q.c.
- Projected vs. actual coverage
- Design effects by domain
- Screener mean propensity
- Interview mean propensity
- # active cases
27 CTQs and Metrics for Observation Quality
CTQs:
- Detect/control post-survey measurement errors
- Identify/repair problematic survey questions
- Detect/control response errors
- Minimize interviewer biases and variances
Process metrics:
- CARI results by interviewer and overall
- Interviewer exception report
- Missing data item frequency by interviewer
- Replicate measurement analysis summary
- Interview length by interviewer
- CARI refusal rate by FI, by phase
28 CTQs and Metrics for Nonresponse Followup
CTQs:
- Maximize response rates
- Minimize nonresponse bias
- Effectively adjust for unit nonresponse
- Effectively impute missing data for key items
Process metrics:
- Overall Phase 3 sampling rate by PSU and overall
- Response rate for high-priority cases
- Hours per converted nonrespondent (refusal vs. other)
- Projected WRR by PSU and overall (actual vs. expected)
- Projected design effects by domain
- Budgeted vs. actual hours charged for Phase II
29 CTQs and Metrics for Costs, Production, and Timeliness
CTQs:
- Maximize interviewing efficiency
- Maximize effectiveness of refusal conversion attempts
- Complete call histories accurately and completely
- Minimize hours per completed screener
- Minimize hours per completed interview
- Maintain planned costs per quarter
- Maintain planned schedule for sample completion per quarter
Process metrics:
- Cost per interview
- Dollars spent vs. dollars budgeted by interviewer
- Dollars spent vs. value of work conducted by interviewer
- Cost breakdown (by phase and overall)
- Number of cases interviewed (actual vs. budgeted)
- Calls per hour (actual vs. expected)
- Refusal conversion rates by interviewer
- Hours charged (actual vs. expected)
- Level of effort per case by interviewer and overall
- Hours per completed screener
- Hours per completed interview
30 Special vs. Common Cause Variation
- Special causes – assignable to events and circumstances that are extraordinary, rare, and unexpected (e.g., the frame was not sorted prior to sampling). Addressed by actions specific to the cause, leaving the design of the process essentially unchanged.
- Common causes – naturally occurring random disturbances that are inherent in any process and cannot be avoided (e.g., normal fluctuations of response across regions and months). Action designed to address a common cause is neither required nor advisable; it leads to process “tampering.”
35 Process Control Chart with More Extreme Values
(Chart: a point falling outside the control limits is flagged as a special cause.)
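The special-cause flagging illustrated in the control-chart slides can be sketched in a few lines. The weekly response rates below are hypothetical. The sketch follows the standard individuals-chart convention of estimating process sigma from the average moving range divided by the constant d2 = 1.128, so that the limits are not inflated by the very outlier the chart is meant to detect.

```python
# Hypothetical weekly screener response rates for one region.
rates = [0.71, 0.69, 0.73, 0.70, 0.72, 0.68, 0.74, 0.70, 0.40, 0.71]

center = sum(rates) / len(rates)

# Individuals chart: estimate sigma from the average moving range
# between consecutive points (d2 = 1.128 for subgroups of size 2).
moving_ranges = [abs(b - a) for a, b in zip(rates, rates[1:])]
sigma_hat = (sum(moving_ranges) / len(moving_ranges)) / 1.128
ucl = center + 3 * sigma_hat  # upper control limit
lcl = center - 3 * sigma_hat  # lower control limit

# Points outside the 3-sigma limits are candidate special causes;
# everything inside is treated as common-cause variation (no tampering).
special_causes = [(week, r) for week, r in enumerate(rates, start=1)
                  if not (lcl <= r <= ucl)]
print(special_causes)  # week 9 falls below the lower control limit
```

Only the week-9 drop is flagged; reacting to the in-limit wiggles would be exactly the "tampering" the previous slide warns against.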
36 Example of a Dashboard: Interviewer Efficiency – Contacting and Locating
37 Dashboard Showing Weighted Response Rates, Interview Costs, Interviewer Exceptions and Production
38 Other Useful Tools
Cause-and-effect (fishbone) diagrams:
- Help to identify all possible root causes of a problem
- An important component of the Measure stage of DMAIC
(Example fishbone diagram for interviewer turnover, with cause categories Economy, Supervision, Job characteristics, and Personal reasons, and causes including availability of higher-paying jobs, lack of steady work, low pay, lack of benefits, reward system, poor supervision, conflict with supervisor, inadequate training, job difficulty, family situation, misinformation from other FIs, and unrealistic employee expectations.)
39 Other Useful Tools (cont’d)
Pareto chart:
- Useful for identifying the “vital few” sources of process deficiencies
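The "vital few" idea can be made concrete: sort deficiency counts in descending order and accumulate causes until a chosen cumulative share (commonly 80%) is reached. The edit-failure tallies below are hypothetical.

```python
from collections import Counter

# Hypothetical tallies of edit-failure causes from one processing run.
failures = Counter({
    "skip-pattern violation": 420,
    "out-of-range value": 310,
    "inconsistent dates": 95,
    "invalid code": 40,
    "duplicate record": 25,
    "other": 10,
})

total = sum(failures.values())
cumulative, vital_few = 0, []
# most_common() yields causes in descending count order, i.e. the
# left-to-right bar order of a Pareto chart.
for cause, count in failures.most_common():
    cumulative += count
    vital_few.append(cause)
    if cumulative / total >= 0.80:   # classic 80/20 cutoff
        break

print(vital_few)
```

With these numbers, two of the six causes account for over 80% of all failures, so improvement effort would be concentrated there first.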
40 Total Survey Error Evaluation
- Addresses several dimensions of total survey quality.
- Essential for optimizing resource allocations to reduce the errors.
- In experimentation, needed to compare the quality of alternative methods.
- Provides valuable information on data quality for gauging uncertainty in estimates, interpreting analysis results, and building confidence and credibility in the data.
41 Primary Methods
Nonresponse bias studies (required by OMB for some surveys):
- Evaluate differences between respondents and nonrespondents for key survey items
- Frame data or prior waves provide data on nonrespondents
- Model-based approaches for nonignorable nonresponse bias
Measurement bias studies:
- Record check studies
- Reconciled reinterviews
- Internal and external consistency checks
- Test–retest reinterview approaches
- Embedded repeated measures analysis (e.g., structural equation modeling, latent class analysis)
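The respondent-vs-nonrespondent comparison described above can be sketched with the standard deterministic approximation: the bias of the respondent mean is roughly the nonresponse rate times the difference between respondent and nonrespondent means. The values below are invented, standing in for a frame variable known for both groups.

```python
# Hypothetical frame variable (e.g., an income measure) available for
# both respondents and nonrespondents, as in a nonresponse bias study.
resp = [52.0, 48.0, 55.0, 50.0, 51.0, 49.0]   # respondent values
nonresp = [40.0, 42.0, 38.0, 44.0]            # nonrespondent values

n = len(resp) + len(nonresp)
nr_rate = len(nonresp) / n
mean_resp = sum(resp) / len(resp)
mean_nonresp = sum(nonresp) / len(nonresp)

# Deterministic approximation:
#   bias(respondent mean) ≈ nonresponse rate × (resp mean − nonresp mean)
bias = nr_rate * (mean_resp - mean_nonresp)
print(round(bias, 2))
```

A large estimated bias like this one signals that unadjusted respondent means would overstate the population value, motivating the adjustment and imputation CTQs of the earlier slides.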
42 Primary Methods (continued)
Other methods:
- Frame undercoverage evaluations
- Editing error (pre- and post-editing comparisons)
- Cognitive methods for detecting comprehension errors, recall problems, data sensitivity, etc.
- Subject matter expert reviews of concepts vs. question meaning
Process data summaries:
- Response rate analysis
- Data entry error rates
- Edit failure rates
- Missing data rates
- Post-survey adjustment factors
43 Major Take-Home Points
- Survey quality is multi-dimensional, including both data user and producer dimensions.
- Accuracy is maximized subject to cost and timeliness constraints.
- Survey design optimization begins with the initial design and extends throughout implementation and post-survey processing.
- ATDI combines CQI, responsive design, and Six Sigma strategies to provide a comprehensive approach for real-time reduction of total survey error and costs.
- Survey evaluation is an essential component of the total survey error framework.
44 “It is not enough to do your best; you must know what to do, and then do your best.” – W. Edwards Deming