
OMB Standards and Guidelines for Statistical Surveys: Examining Nonresponse Bias
Brian A. Harris-Kojetin, Ph.D., Statistical and Science Policy, Office of Management and Budget
The views expressed in this presentation are those of the author and do not represent changes in OMB policy.

Background
OMB is charged to develop and oversee the implementation of Government-wide policies, principles, standards, and guidelines concerning:
- statistical collection procedures and methods
- statistical data classifications
- statistical information presentation and dissemination
- timely release of statistical data
(44 USC 3504(e))

OMB Guidance and Standards
- Standards and Guidelines for Statistical Surveys, issued in final form in September 2006
- Questions and Answers When Designing Surveys for Information Collections, issued in January 2006
Both are available at www.whitehouse.gov/omb/ under Statistical Programs and Standards.

OMB Q&A Guidance
- Audience: all Federal agencies conducting and sponsoring collections of information that use statistical methods (broadly defined)
- Assumes little knowledge of the clearance process, survey methodology, and statistics
- Contains 81 Q&As

Standards for Statistical Surveys
A revision and update of Statistical Policy Directives 1 and 2, Standards for Statistical Surveys and Publication of Statistics, last updated in 1974, when:
- all surveys were paper and pencil
- the standards still contained references to punch cards

Standards Development Process
- An interagency team was formed as a subcommittee of the FCSM, charged with reviewing and making recommendations to OMB on updates and revisions to the standards
- Representatives were solicited from all ICSP agencies
- The subcommittee reviewed the current standards as well as standards at statistical agencies and at some other National Statistical Institutes

Standards Development Process
- Draft subcommittee recommendations were reviewed by the FCSM, then by the ICSP agencies
- Proposed standards were submitted to OMB
- OMB issued a Federal Register Notice with a 60-day public comment period
- OMB reviewed and addressed the public comments

Standards and Guidelines for Statistical Surveys
The standards are supplemented with guidelines, or best practices, that help agencies interpret and fulfill the goals of each standard.

Framework for Standards
- Development of Concepts, Methods, and Design
- Collection of Data
- Processing and Editing of Data
- Production of Estimates and Projections
- Data Analysis
- Review Procedures
- Dissemination of Information Products

Standards for Statistical Surveys
Standard 1.3 Survey Response Rates: Agencies must design the survey to achieve the highest practical rates of response, commensurate with the importance of survey uses, respondent burden, and data collection costs, to ensure that survey results are representative of the target population so that they can be used with confidence to inform decisions. Nonresponse bias analyses must be conducted when unit or item response rates suggest the potential for bias to occur.

Survey Response Rates Guidelines
- Guideline 1.3.3: Prior to data collection, identify expected unit response rates at each stage of data collection, based on content, use, mode, and type of survey.
- Guideline 1.3.4: Plan for a nonresponse bias analysis if the expected unit response rate is below 80 percent.
- Guideline 1.3.5: Plan for a nonresponse bias analysis if the expected item response rate is below 70 percent for any items used in a report.

Standards for Statistical Surveys
Standard 3.2 Nonresponse Analysis and Response Rate Calculation: Agencies must appropriately measure, adjust for, report, and analyze unit and item nonresponse to assess their effects on data quality and to inform users. Response rates must be computed using standard formulas to measure the proportion of the eligible sample that is represented by the responding units in each study, as an indicator of potential nonresponse bias.
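As a minimal sketch of the standard response rate formulas the Standard refers to (function names are illustrative, not from the OMB document; the basic unweighted unit response rate is complete interviews divided by eligible sample units):

```python
def unit_response_rate(completes, eligible):
    """Unweighted unit response rate: complete interviews / eligible units."""
    if eligible <= 0:
        raise ValueError("eligible must be positive")
    return completes / eligible

def item_response_rate(item_answered, unit_respondents):
    """Item response rate: answers to an item / responding units asked it."""
    return item_answered / unit_respondents

# Hypothetical survey: 640 completes out of 800 eligible units.
rr = unit_response_rate(640, 800)
print(f"unit response rate: {rr:.0%}")  # prints "unit response rate: 80%"
```

In practice agencies use the full standard formulas (which also handle cases of unknown eligibility), but the proportion above is the core quantity being reported.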

Nonresponse Analysis
Guideline 3.2.9: Given a survey with an overall unit response rate of less than 80 percent (using unit response rates as defined above), conduct an analysis of nonresponse bias, with an assessment of whether the data are missing completely at random.

Nonresponse Analysis
For a sample mean, an estimate of the bias of the sample respondent mean is given by:

bias(ȳ_r) = ȳ_r - ȳ_n = (n_nr / n)(ȳ_r - ȳ_nr)

where:
- ȳ_n = the mean based on all sample cases
- ȳ_r = the mean based only on respondent cases
- ȳ_nr = the mean based only on nonrespondent cases
- n = the number of cases in the sample
- n_nr = the number of nonrespondent cases
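A numeric sketch of this bias formula, with all values hypothetical: the estimated bias of the respondent mean is the nonresponse rate times the gap between respondent and nonrespondent means.

```python
def nonresponse_bias(y_r, y_nr, n, n_nr):
    """Estimated bias of the respondent mean:
    bias(y_r) = (n_nr / n) * (y_r - y_nr)."""
    return (n_nr / n) * (y_r - y_nr)

# Hypothetical survey: 1,000 sampled units, 250 nonrespondents;
# respondent mean income 52,000, nonrespondent mean 48,000.
bias = nonresponse_bias(52_000, 48_000, 1_000, 250)
print(bias)  # prints 1000.0: the respondent mean overstates by 1,000
```

Note that the bias shrinks toward zero either as the response rate rises (n_nr/n falls) or as respondents and nonrespondents become more alike (ȳ_r − ȳ_nr falls), which is why both response rates and respondent/nonrespondent comparisons matter.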

Nonresponse Analysis
- For a multistage (or wave) survey, focus the nonresponse bias analysis on each stage, with particular attention to the "problem" stages.
- A variety of methods can be used to examine nonresponse bias; for example, compare respondents and nonrespondents across subgroups using available sampling frame variables.
- Comparing respondents to known characteristics of the population from an external source can provide an indication of possible bias, especially if the characteristics in question are related to the survey's key variables.
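A minimal sketch of the respondent/nonrespondent comparison on a frame variable (all data hypothetical; in practice the variable comes from the sampling frame, e.g. establishment size):

```python
# Hypothetical frame data: (employment_size, responded) per sampled unit.
frame = [
    (120, True), (45, True), (300, False), (60, True),
    (15, True), (500, False), (80, True), (200, False),
]

def group_mean(rows, responded):
    """Mean of the frame variable for respondents or nonrespondents."""
    vals = [size for size, r in rows if r is responded]
    return sum(vals) / len(vals)

resp_mean = group_mean(frame, True)      # mean size among respondents
nonresp_mean = group_mean(frame, False)  # mean size among nonrespondents
print(resp_mean, nonresp_mean)
# A large gap suggests response is related to establishment size,
# and hence a risk of bias in size-related survey estimates.
```

Formal versions of this comparison would add a significance test, but the basic diagnostic is simply whether respondents and nonrespondents differ on variables known for both groups.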

Implementation of Standards
- Applying the standards requires judgment, balancing the uses of the information against available resources.
- Agencies need to provide sufficient information in their Information Collection Requests to OMB to demonstrate whether they are meeting the standards.
- The uses of the information will be considered; an agency should provide reasons why a standard could not be met and describe the actions taken to address potential issues.

Next Steps
- FCSM has identified nonresponse bias as a key area to examine.
- A workshop is planned for Spring 2008 on how to conduct nonresponse bias analyses, providing examples of analyses done for different Federal surveys, with separate tracks for household and establishment surveys.
- There are also two ongoing interest groups on household and establishment survey nonresponse.

Overview of the Q&A Guidance
- Purpose of this guidance
- Submission of ICRs to OMB
- Scope of the information collection
- Choice of methods
- Sampling
- Modes of data collection
- Questionnaire design and development

Overview of the Q&A Guidance
- OMB statistical standards
- Informing respondents about their participation and confidentiality
- Response rates and incentives
- Analysis and reporting
- Studies using stated preference methods
- Glossaries

Response Rates and Nonresponse Bias
- Why are response rates important?
- How should response rates be calculated?
- What are acceptable response rates for different kinds of collections?
- How can agencies examine potential nonresponse bias?

Why are response rates important?
- They are a common indicator of data quality and field performance.
- Nonresponse can occur for a number of reasons with different implications, such as refusals and noncontacts.
- Response rates are a useful indicator of the risk of nonresponse bias.

What are acceptable response rates for different kinds of collections?
- Surveys collecting "influential information" (see the Information Quality Guidelines) should achieve high response rates.
- Agencies need to consider how they will use the data and how the methods chosen will achieve acceptable response rates and data quality.

What are acceptable response rates?
In their Information Collection Requests, agencies need to:
- provide expected response rates,
- describe how the response rates were determined,
- describe the steps taken to maximize the response rate, and
- if the expected response rate is less than 80 percent, include plans to evaluate potential nonresponse bias.

How can agencies examine potential nonresponse bias?
Nonresponse bias analyses can include:
- computing response rates for different subgroups
- comparing nonrespondents and respondents on sampling frame variables
- comparing initial refusers with initial respondents
- subsampling nonrespondents for more extensive follow-up efforts, and using that information to estimate the characteristics of nonrespondents
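For the first method in the list above, a minimal sketch of computing response rates by subgroup (the region names and counts are hypothetical):

```python
# Hypothetical eligible-sample and complete-interview counts by subgroup.
eligible = {"Northeast": 200, "South": 300, "West": 250}
completes = {"Northeast": 170, "South": 210, "West": 225}

# Response rate per subgroup: completes / eligible units.
rates = {group: completes[group] / eligible[group] for group in eligible}
for group, rate in sorted(rates.items()):
    print(f"{group}: {rate:.0%}")
# Uneven rates across subgroups (here 70% to 90%) flag where
# nonresponse, and hence potential bias, is concentrated.
```

Markedly different subgroup rates do not prove bias by themselves, but they identify the subgroups where the respondent/nonrespondent comparisons above are most worth pursuing.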