1 NONSAMPLING ERROR RESEARCH IN PRACTICE
J. Michael Brick and Graham Kalton, Westat

2 OUTLINE
Review sources of nonsampling error
Discuss examples of nonsampling error research:
– NHES
– YATS
– NALS
– RCGS
– NIPRCS
Discuss how we choose which methodological studies to conduct

3 TOTAL SURVEY ERROR
Sampling error
Nonsampling error
– Missing data
  Coverage error
  Nonresponse error
– Measurement error
  Response error
  Processing (coding, data entry) error

4 COVERAGE ERROR
Undercoverage due to missing persons within households
Undercoverage due to missing households
– Studies of estimation methods to reduce bias
– Studies of efficient designs with lower coverage rates
– Studies of the level of coverage bias for specific topics

5 NATIONAL HOUSEHOLD EDUCATION SURVEY (NHES)
A repeated RDD survey on education topics.
Two topics of interest in 1989 were high school dropouts and preschool enrollment of 3 to 5 year olds.
Concerns about undercoverage led to an evaluation using data from a CPS supplement that covered these topics and could be classified by telephone status.

6 NHES COVERAGE BIAS ESTIMATES
Coverage rates: 14-21 yr olds = 92%; 3-5 yr olds = 88%

Characteristic            Tel. (%)   Nontel. (%)   Bias (simple)   Bias (post.)
High school dropout          7.2        33.6           -23%            -15%
Enrolled in preschool       58.9        41.6             3%              0%
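As a rough check on the "simple" bias column, the sketch below computes the relative bias of a telephone-only estimate from the coverage rate and the two subgroup estimates on the slide. The formula used (telephone mean minus the coverage-weighted overall mean, divided by the overall mean) is an assumption about how the slide's figure was derived, and the helper name is mine; it reproduces the dropout figure to within rounding, while the preschool figure comes out near +4% rather than 3%, presumably because of rounding in the published inputs.

```python
# Relative bias of a telephone-only estimate, using the slide's inputs.
# Assumes the "simple" bias is (telephone mean - overall mean) / overall mean,
# where the overall mean weights the telephone and nontelephone means by the
# coverage rate; the poststratified column needs data not shown on the slide.

def relative_coverage_bias(coverage_rate, y_tel, y_nontel):
    """Relative bias (as a fraction) of the telephone-only estimate."""
    y_all = coverage_rate * y_tel + (1 - coverage_rate) * y_nontel
    return (y_tel - y_all) / y_all

# High school dropouts, 14-21 year olds (coverage 92%)
print(f"{relative_coverage_bias(0.92, 7.2, 33.6):+.0%}")   # about -23%

# Preschool enrollment, 3-5 year olds (coverage 88%)
print(f"{relative_coverage_bias(0.88, 58.9, 41.6):+.0%}")  # about +4% (slide shows 3%)
```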

7 NONRESPONSE ERROR
Nonresponse bias studies to evaluate the level of nonresponse bias in estimates, based on:
– Frame data,
– Nonresponse follow-ups,
– Simulations.
Studies evaluating estimation methods (e.g., use of different auxiliary variables) to reduce bias; a minimal weighting-adjustment sketch follows below.
Studies evaluating methods of increasing response rates.
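As an illustration of the estimation-methods line of work, the sketch below shows a basic weighting-class nonresponse adjustment: within cells of an auxiliary variable known for respondents and nonrespondents alike, respondents' base weights are inflated by the inverse of the weighted response rate. The cell variable, data, and function are all hypothetical; bias is reduced only to the extent that the auxiliary variable is related to both response propensity and the survey outcome.

```python
# Hypothetical weighting-class nonresponse adjustment: within each cell,
# respondent base weights are multiplied by (total base weight in cell) /
# (respondent base weight in cell).
from collections import defaultdict

def adjust_weights(cases):
    """cases: list of dicts with 'cell', 'base_weight', 'responded' (bool).
    Returns adjusted weights for respondents, None for nonrespondents."""
    total = defaultdict(float)
    resp = defaultdict(float)
    for c in cases:
        total[c["cell"]] += c["base_weight"]
        if c["responded"]:
            resp[c["cell"]] += c["base_weight"]
    factors = {cell: total[cell] / resp[cell] for cell in total if resp[cell] > 0}
    return [c["base_weight"] * factors[c["cell"]] if c["responded"] else None
            for c in cases]

# Toy example: response is lower in cell "B", so its respondents get a larger factor.
sample = [
    {"cell": "A", "base_weight": 100.0, "responded": True},
    {"cell": "A", "base_weight": 100.0, "responded": True},
    {"cell": "A", "base_weight": 100.0, "responded": False},
    {"cell": "B", "base_weight": 100.0, "responded": True},
    {"cell": "B", "base_weight": 100.0, "responded": False},
    {"cell": "B", "base_weight": 100.0, "responded": False},
]
print(adjust_weights(sample))  # cell A factor = 1.5, cell B factor = 3.0
```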

8 NHES:2003 INCENTIVE EXPERIMENT

Incentive (initial/refusal conv.)   Relative resp. rate   Cost ratio at refusal subsampling rate
                                                          1.0     0.7     0.5
1. $0/$0                            1.00                  1.0
2. $0/$2                            1.07                  1.1
3. $0/$5                            1.12                  1.2     1.1     1.2
4. $2/$0                            1.09                  1.2             1.3
5. $5/$0                            1.13                  1.6             1.7
6. $2/$2                            1.13                  1.3

9 NATIONAL ADULT LITERACY SURVEY (NALS) 1992
Adults were interviewed and given literacy tests.
Concern that nonresponse was related to literacy.
An incentive experiment offered $0, $20, and $35.
Response rates for the $20 and $35 groups were about 9 percentage points higher than for $0; for minorities, about 20 points higher.
Literacy scores were substantially higher for the $0 group than for the $20 and $35 groups.
Data collection cost was lowest for the $20 incentive.

10 RESPONSE ERROR
Studies evaluating the level of errors due to:
– Recall
– Questionnaire design
– Sensitive items
– Interviewers

11 NATIONAL IMMUNIZATION PROVIDER RECORD CHECK STUDY
Parents reported children's immunizations in a supplement to the NHIS.
Concerns about the accuracy of the parent reports (especially when reported from recall rather than from shot cards) led to checks with medical providers.
Provider and parent reports were reconciled to create "best" values, which are treated as "true" values.

12 GROSS AND NET DIFFERENCE RATES
Gross difference rate: gdr = (B + C)/N
Net difference rate (bias): ndr = (B − C)/N
Here B and C are the two off-diagonal (discordant) counts in the 2×2 cross-classification of the survey report against the record or reinterview value, and N is the total number of cases.
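A minimal sketch of the two rates, computed from the four cells of that 2×2 table; the counts below are illustrative only, not data from the record check study:

```python
# Gross and net difference rates from a 2x2 cross-classification of the
# survey report (rows) against the record-check value (columns).
# Cell b = reported yes / record no; cell c = reported no / record yes.

def difference_rates(a, b, c, d):
    """Return (gross difference rate, net difference rate) as fractions."""
    n = a + b + c + d
    gdr = (b + c) / n        # total disagreement, in either direction
    ndr = (b - c) / n        # net bias: negative indicates underreporting
    return gdr, ndr

# Illustrative counts only:
gdr, ndr = difference_rates(a=600, b=30, c=120, d=250)
print(f"gdr = {gdr:.1%}, ndr = {ndr:.1%}")  # gdr = 15.0%, ndr = -9.0%
```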

13 NDR AND GDR FOR DTP, BY USE OF SHOT CARD, 1994-1996
Parents substantially underreported DTP immunizations.
Greater underreporting when shot cards were used.
Greater accuracy when shot cards were used.

14 NHES 1995 REINTERVIEW STUDY
The 1995 Adult Education Survey included a response variance reinterview (n = 1,109 out of 19,722).
21% reported work-related (WR) activities: gdr = 12.5%, ndr = −5.7%
22% reported personal development (PD) activities: gdr = 14.3%, ndr = −1.2%

15 NHES INTENSIVE BIAS STUDY
Used an intensive, cognitive-type reinterview to determine "true" values.
A small sample (n = 206) was chosen to explore reporting of adult education (AE) participation in WR and PD activities.

16 YOUTH ATTITUDE TRACKING STUDY (YATS)
An annual cross-sectional RDD survey of 16-24 year olds conducted for the DoD to track attitudes toward military service.
The design was shifted to include a panel component.
Estimated annual enlistment propensities declined because panel members reported lower propensities to enlist.

17 YATS ADVISORY GROUP
Panel attrition and panel conditioning were the main error sources considered.
Few variables were consistently related to panel attrition and enlistment propensity.
Revised weighting adjustments did not narrow the gap between the RDD and panel estimates.
DoD reverted to a fully cross-sectional design.

18 THE 1991 RECENT COLLEGE GRADUATE SURVEY (RCGS)
The RCGS included:
– A nonresponse study,
– A reinterview study,
– An interviewer variance study,
– A record check study, and
– Other evaluation studies.
Strong assumptions of additive errors were made to model the mean square error of estimates (a schematic sketch follows below).
The major contribution is an understanding of the general magnitude of errors by source.
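The additive-error approach can be sketched schematically: if each source contributes a bias and a variance component, assumed independent and additive, the mean square error is approximated by the squared sum of the biases plus the sum of the variances. The component values below are made up purely for illustration and are not RCGS results.

```python
# Schematic additive-error accounting for a survey estimate: each error source
# contributes a bias and a variance component, assumed independent and additive.
# Component values are hypothetical, for illustration only.

def total_mse(components):
    """components: dict of source -> (bias, variance). Returns (bias^2, variance, mse)."""
    total_bias = sum(b for b, _ in components.values())
    total_var = sum(v for _, v in components.values())
    return total_bias**2, total_var, total_bias**2 + total_var

components = {
    "sampling":    (0.0, 1.5),
    "coverage":    (-0.4, 0.0),
    "nonresponse": (-0.8, 0.1),
    "measurement": (0.3, 0.6),
}
bias_sq, var, mse = total_mse(components)
print(f"bias^2 = {bias_sq:.2f}, variance = {var:.2f}, mse = {mse:.2f}")
# bias^2 = 0.81, variance = 2.20, mse = 3.01
```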

19 FACTORS INFLUENCING RESEARCH CHOICES
Study the major error sources for the specific survey design.
Include substantively important variables.
Conduct studies with the potential for assessing current estimates and/or designing future surveys.
Take advantage of opportunities for research:
– Small studies can be valuable.
– Inexpensive studies on low-priority issues, or studies using less rigorous methods, can be worthwhile.

20 References

Brick, J.M., Burke, J., and West, W. (1992). Telephone undercoverage bias of 14- to 21-year-olds and 3- to 5-year-olds (Technical Report No. 2, NCES 92-101). Washington, DC: U.S. Department of Education.

Brick, J.M., Cahalan, M., Gray, L., and Severynse, J. (1994). A study of selected nonsampling errors in the 1991 Survey of Recent College Graduates (NCES 95-640). Washington, DC: U.S. Department of Education, Office of Educational Research and Improvement.

Brick, J.M., Hagedorn, M.C., Montaquila, J., Roth, S.B., and Chapman, C. (2004). Using an experiment to design an RDD survey. Proceedings of the Survey Methods Section of the American Statistical Association [CD-ROM], 4923-4928.

Brick, J.M., Kalton, G., Nixon, M., Givens, J., and Ezzati-Rice, T. (2000). Statistical issues in a record check study of childhood immunization. Proceedings of the 1999 Federal Committee on Statistical Methodology Research Conference (Statistical Policy Working Paper 30), 625-634.

Brick, J.M., and Morganstein, D. (1996). Estimating response bias in an adult education survey. Proceedings of the Survey Research Methods Section of the American Statistical Association, 728-733.

Brick, J.M., Wernimont, J., and Montes, M. (1996). The 1995 National Household Education Survey: Reinterview results for the adult education component (NCES 96-14). Washington, DC: U.S. Department of Education, Office of Educational Research and Improvement.

Mohadjer, L., Berlin, M., Rieger, S., Waksberg, J., Rock, D., Yamamoto, K., Kirsch, I., and Kolstad, A. (1997). The role of incentives in literacy survey research. In Tuijnman, Kirsch, and Wagner (Eds.), Adult Basic Skills: Innovations in Measurement and Policy Analysis (Chapter 10, pp. 209-244). Hampton Press.

