Methodology and Data Quality of the ANES 2008-2009 Panel Study: Lessons for Future Internet Panels Matthew DeBell Stanford University.


1 Methodology and Data Quality of the ANES 2008-2009 Panel Study: Lessons for Future Internet Panels
Matthew DeBell, Stanford University

2 Acknowledgments
National Science Foundation
Stanford University & the University of Michigan
Knowledge Networks collected the data
ANES people:
– Jon A. Krosnick & Arthur Lupia, PIs
– Vincent Hutchings, Associate PI
– Matthew DeBell, Stanford project director
– lots of others!

3 This talk
Describe an Internet panel study: the ANES 2008-2009 Panel Study
Evaluate data quality and related factors
Methodological lessons

4 3 problems to frame the talk

5 3 problems to frame the talk
Coverage error
– Inaccurate representation

6 3 problems to frame the talk
Coverage error
– Inaccurate representation
Nonresponse bias
– Inaccurate representation

7 3 problems to frame the talk
Coverage error
– Inaccurate representation
Nonresponse bias
– Inaccurate representation
Attrition
– Loss of representation
– Loss of power

8 Other issues
Conditioning
– Loss of representation
Measurement error
– Reporting error, satisficing, other mode-related effects

9 This talk
Describe an Internet panel study: the ANES 2008-2009 Panel Study
Evaluate data quality and related factors
Methodological lessons

10 American National Election Studies
Presidential elections in the USA, 1948-present
Standard mode is face-to-face
Two-wave panel: before and after each presidential election

11 ANES 2008-2009 Panel Study

12 Design
Sequential mixed-mode
– Phone recruitment
– Internet panel
21 waves online (25-30 minutes each)
– 7 ANES waves
– 14 off-waves of non-political KN data
Internet provided to Rs without access

13 MSN TV 2

14 Recruitment
RDD recruitment
– Landline only
Two cohorts
– Cohort 1 started January 2008
– Cohort 2 started September 2008
One person per household
$10 per month incentive
Recruitment-stage response rate (AAPOR RR3): 42%

15 Mixed-Mode Recruitment
Internet-only recruitment fallback
– Cases not enrolled by telephone (2,992) were mailed a letter asking them to complete the recruitment survey online. 119 did so.

16 Communications (first 8 months)
D-0: invitation
D+3: reminder
D+6: reminder #2
D+14: phone reminder (repeated weekly)

17 Revised communications
D-3: notice
D-0: invitation
D+3: reminder
D+6: reminder #2
D+11: reminder #3
D+13: phone reminder (repeated weekly)
D+19: reminder #4 (repeated every 10 days)

18 This talk
Describe an Internet panel study: the ANES 2008-2009 Panel Study
Evaluate data quality and related factors
Methodological lessons

19 Part 2: Quality Evaluation
Will show:
– Number of cases
– Dropout recovery
– Case validation
– Response rate
– Panel retention / attrition numbers
– Attrition effects
– Accuracy of estimates

20 n
                   cohort 1   cohort 2   total
Wave 1  (Jan 08)     1,624
Wave 2  (Feb 08)     1,458
Wave 6  (Jun 08)     1,421
Wave 9  (Sep 08)     1,488      1,106    2,594
Wave 10 (Oct 08)     1,511      1,126    2,637
Wave 11 (Nov 08)     1,508      1,167    2,675
Wave 13 (Jan 09)     1,453      1,094    2,547
Wave 17 (May 09)     1,387      1,016    2,403

21 Panelist Recovery Experiment

22 Panelist Recovery Action
$50 offers to all 282 panel dropouts.
– Dropouts had completed the Profile or Wave 1 or Wave 2 but not Waves 7-9.
Result highlights:
– W10: 132 completions (47 percent)
– W11: 132 completions (47 percent)
– W17: 129 completions (46 percent)

23 Case Validation
Telephone surveys with a 10% subsample of Rs to each of the 7 planned ANES waves
1,482 interviews
We found…

24 Case Validation
Telephone surveys with a 10% subsample of Rs to each of the 7 planned ANES waves
1,482 interviews
100% confirmed names and participation

25 But…
Only half to two-thirds recalled the topic of the previous survey
– Probably due to excessive delay in validation calls.
Imperfect item reliability
– 18 instances of sex inconsistency
– 91 instances of year-of-birth inconsistency
Many appear to be data entry errors
Error rates appear consistent with face-to-face surveys

26 Response Rates
RR1 (min), RR3 (est), and RR5 (max), for the recruitment stage and each wave
[the table's numeric values did not survive the transcript]
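The min/est/max columns correspond to AAPOR response rates RR1, RR3, and RR5: RR1 counts every unknown-eligibility case as eligible, RR3 weights those cases by an estimated eligibility rate e, and RR5 discards them, so RR1 ≤ RR3 ≤ RR5. A minimal sketch of the three formulas (the disposition counts below are hypothetical, not the ANES numbers):

```python
def aapor_response_rates(I, P, R, NC, O, U, e):
    """AAPOR response rates from final disposition counts.

    I  = complete interviews         P  = partial interviews
    R  = refusals and break-offs     NC = non-contacts
    O  = other eligible non-interviews
    U  = cases of unknown eligibility
    e  = estimated share of U that is eligible (used by RR3)
    """
    known_eligible = I + P + R + NC + O
    rr1 = I / (known_eligible + U)        # all unknowns treated as eligible (minimum)
    rr3 = I / (known_eligible + e * U)    # e-weighted unknowns (best estimate)
    rr5 = I / known_eligible              # unknowns treated as ineligible (maximum)
    return rr1, rr3, rr5

# Hypothetical counts for illustration only
rr1, rr3, rr5 = aapor_response_rates(I=1000, P=0, R=600, NC=400, O=100, U=500, e=0.5)
assert rr1 <= rr3 <= rr5
```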

27 Retention: ANES vs. KnowledgePanel
ANES panel retention was better than KP
– [graphic & numbers not for public distribution]

28 Retention Rates at Wave 11 (n = 2,675)
Retention
– From Recruitment: 63 percent (2,649)
– From Profile: 84 percent (2,439)
– From Wave 1: 85 percent (1,381)
– From Wave 10: 95 percent (2,500)

29 Completions at Wave 11 (Nov 08)
                      total            cohort 1
Total Completed:
– Recruitment         2,649 (99%)      1,483 (98%)
– Profile             2,439 (91%)      1,328 (88%)
– Wave 1              1,381 (52%)      1,381 (92%)
– Wave 10                   (93%)      1,433 (95%)
– Waves 9 & 10              (87%)      1,345 (89%)
– All ANES stages     1,058 (40%)      1,058 (70%)
– All stages (1-10)     738 (28%)        738 (49%)

30 Other Retention Rates
Mean wave-to-wave retention: 91 percent
Cohort 1 Rs who completed all ANES waves through May 2009 (1, 2, 6, 9, 10, 11, 17): 68 percent (939)
83% of May 2009 completers had completed Waves 9, 10, and 11.
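Retention ratios like these can be recomputed from the cohort-1 wave sizes on the n slide above. A quick sketch (note the caveat: successive reported waves are not always adjacent, and dropout recovery lets n rise again, as it does between Waves 6 and 9, so these ratios only approximate a wave-to-wave retention chain):

```python
def wave_over_wave(ns):
    """Ratio of each wave's n to the previous reported wave's n."""
    return [ns[i] / ns[i - 1] for i in range(1, len(ns))]

# Cohort-1 ns for reported waves 1, 2, 6, 9, 10, 11, 13, 17 (from the n slide)
ns = [1624, 1458, 1421, 1488, 1511, 1508, 1453, 1387]
rates = wave_over_wave(ns)

retained_w1_to_w17 = ns[-1] / ns[0]  # about 0.85 of the Wave 1 n
recovered = max(rates) > 1           # True: the Wave 9 n exceeds the Wave 6 n
```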

31 Attrition effects
1,623 Rs completed Wave 1
1,258 of these also completed Wave 17
– 78 percent retention; 22 percent attrition
Ran frequencies on all Wave 1 variables for these two groups and…

32 Attrition effects
1,623 Rs completed Wave 1
1,258 of these also completed Wave 17
– 78 percent retention; 22 percent attrition
Ran frequencies on all Wave 1 variables for these two groups
Average difference: 1.3 points.
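A sketch of that comparison for a single variable: take the Wave 1 distribution among all Wave 1 completers and among the subset who also completed Wave 17, then average the absolute category differences in percentage points. The category shares below are hypothetical; the study's average across all Wave 1 variables was 1.3 points.

```python
def avg_point_difference(dist_all, dist_retained):
    """Mean absolute difference across categories, in percentage points."""
    assert dist_all.keys() == dist_retained.keys()
    return sum(abs(dist_all[k] - dist_retained[k]) for k in dist_all) / len(dist_all)

# Hypothetical Wave 1 shares (percent) for one variable
all_wave1 = {"approve": 48.0, "disapprove": 40.0, "neither": 12.0}
retainers = {"approve": 49.5, "disapprove": 39.0, "neither": 11.5}
diff = avg_point_difference(all_wave1, retainers)  # 1.0 points here
```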

33 Differential attrition (Wave 17)
Factor             W1    W17    Diff
[…] year olds
Males
HS dropouts
Home renters
Non-voters
No Obama affect
[numeric values did not survive the transcript]

34 Accuracy of Estimates (1 of 2)
Benchmark to CPS. 43 statistics examined, for:
– Age, sex, race, ethnicity, race/ethnicity, education, home tenure, household size, marital status, household income, presidential vote choice, voter turnout
Estimates are within 5 points of the benchmark for 84 percent (36 of 43) of statistics examined.

35 Accuracy of Estimates (2 of 2)
>5-point errors for a few statistics
– Renters (-9.3)
– One-person households (-5.4)
– Married (+10.2)
– Income >$100,000/yr (-6.6)
Average error: 2.1 points
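The two accuracy summaries (share of statistics within 5 points of the benchmark, and average absolute error) are simple to compute from (estimate, benchmark) pairs. A sketch with hypothetical values; the study itself used 43 statistics benchmarked to the CPS:

```python
def benchmark_summary(pairs, threshold=5.0):
    """pairs: (survey estimate, benchmark) in percent. Returns the share of
    statistics within `threshold` points and the mean absolute error."""
    errors = [est - bench for est, bench in pairs]
    within = sum(abs(e) <= threshold for e in errors) / len(errors)
    mean_abs = sum(abs(e) for e in errors) / len(errors)
    return within, mean_abs

# Hypothetical (estimate, benchmark) pairs, in percent
pairs = [(41.2, 40.0), (55.0, 65.2), (12.1, 12.8), (30.0, 27.5)]
within, mean_abs = benchmark_summary(pairs)  # within = 0.75 here
```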

36 This talk
Describe an Internet panel study: the ANES 2008-2009 Panel Study
Evaluate data quality and related factors
Methodological lessons

37 Part 3: Lessons
Concerning:
– Data quality
– Measuring quality immediately
– Promoting quality

38 Data Quality
Data quality from the telephone-recruited panel is consistent with expectations for a high-quality telephone survey
– Good accuracy
– Only moderate attrition

39 Measuring Quality Immediately
Monitor attrition during the panel
Validation interviews
– Should happen immediately after web completion
Concern: respondent identity

40 Promoting Quality
Fight nonresponse bias
– high-quality recruitment
– multiple contacts over a long period of time
– incentives
Fight attrition
– pleasant content
– incentives
– dropout recovery with added incentives
Fight conditioning with variety

41 Final Thoughts
Other possibilities
– Targeted incentives to combat NR bias
– Oversamples at recruitment to combat NR
Landline RDD is obsolete!
Literacy

42 Thank you
Matthew DeBell

