
Maintenance of Selective Editing in ONS Business Surveys Daniel Lewis.

1 Maintenance of Selective Editing in ONS Business Surveys Daniel Lewis

2 Overview
- Introduction
- Selective editing in ONS
- The need to review selective editing
- Threshold review
- Process for maintaining selective editing
- Next steps

3 Introduction
- ONS uses selective editing on nine key business surveys
- Aim to focus editing effort where it improves output quality
- Three different implementations:
  - Original, based on prioritising edit failures
  - New “standard” method, inspired by the Australian approach
  - Selekt – Swedish software
- Important to regularly review thresholds and maintain editing systems

4 Selective editing in ONS – standard method
- Calculate a score for each key variable (item):
  item score = weight * | returned value - expected value | * 100 / previous period domain estimate
- Expected value is usually the previous period value, when available
- Otherwise use the relationship to a register variable
- Calculate the unit score by taking the average of the item scores
- Follow up businesses with a unit score larger than the (domain) threshold
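
As a worked illustration of the score above, here is a minimal Python sketch that computes item scores and a unit score for one business. The weights, values, domain estimate and threshold are invented for the example and are not taken from the presentation.

```python
# Minimal sketch of the standard selective editing score described above.
# All numbers (weights, values, domain estimate, threshold) are illustrative.

def item_score(weight, returned, expected, prev_domain_estimate):
    """Weighted absolute difference from the expected value, expressed as a
    percentage of the previous period domain estimate."""
    return weight * abs(returned - expected) * 100 / prev_domain_estimate

def unit_score(item_scores):
    """Unit score: the average of the item scores for one business."""
    return sum(item_scores) / len(item_scores)

# One business with two key variables.
scores = [
    item_score(weight=12.0, returned=5400, expected=5000, prev_domain_estimate=250000),
    item_score(weight=12.0, returned=48, expected=50, prev_domain_estimate=2400),
]
domain_threshold = 10.0  # illustrative domain threshold
follow_up = unit_score(scores) > domain_threshold
print(round(unit_score(scores), 2), follow_up)  # 1.46 False
```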

5 Selective editing in ONS – standard method
- Thresholds set for each output domain by analysis of past data
- Aim to keep “pseudo-bias” no greater than 1% for each domain
  (i.e. the bias introduced by not editing all suspicious responses)
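
One way to read this threshold-setting step, sketched below under stated assumptions: given a fully edited back series, the pseudo-bias for a candidate threshold is the weighted editing error left in by units that would not be followed up, and the chosen threshold is the largest one keeping that within 1%. The column names (unit_score, weight, returned, edited) are illustrative, not ONS system fields.

```python
# Sketch of threshold setting from a fully edited back series for one domain.
# Column names are illustrative assumptions.
import pandas as pd

def pseudo_bias_pct(domain_data: pd.DataFrame, threshold: float) -> float:
    """Pseudo-bias: % error left in the domain estimate by not editing
    units whose unit score is at or below the threshold."""
    unedited = domain_data[domain_data["unit_score"] <= threshold]
    residual = (unedited["weight"] * (unedited["returned"] - unedited["edited"])).sum()
    estimate = (domain_data["weight"] * domain_data["edited"]).sum()
    return 100 * abs(residual) / estimate

def choose_threshold(domain_data, candidates, max_bias=1.0):
    """Largest candidate threshold whose pseudo-bias stays within max_bias %."""
    ok = [t for t in candidates if pseudo_bias_pct(domain_data, t) <= max_bias]
    return max(ok) if ok else min(candidates)
```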

6 Selective editing in ONS – Selekt
- Standard method works well when there is high sample overlap between variables and not too many key variables
- Annual Business Survey (ABS) has many variables and less than 50% overlap
- Selekt used instead
- Flexible in making use of available data
- Score split into three parts:
  - Suspicion
  - Impact
  - Importance
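
To illustrate the idea of a score built from separate parts (a toy sketch only, not the actual Selekt formula or its parameterisation): a value is only prioritised when it looks wrong (suspicion), would move the estimate (impact), and belongs to a variable or domain that matters (importance).

```python
# Toy illustration of a three-part score; not the Selekt formula.

def combined_score(suspicion: float, impact: float, importance: float) -> float:
    """Combine the parts multiplicatively so that a low value in any one
    component pulls the overall priority down."""
    return suspicion * impact * importance

# A very suspicious but low-impact value ranks below a moderately
# suspicious, high-impact one.
print(combined_score(0.9, 0.1, 1.0))  # 0.09
print(combined_score(0.5, 0.8, 1.0))  # 0.4
```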

7 Selective editing in ONS – Selekt
- To use Selekt, need to specify many parameters and choose which variables (or combinations of variables) to include in the score
- Tested options using an iterative approach, aiming to minimise failures whilst keeping quality acceptable
- Settings also partially informed by survey team experience
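
A schematic of that kind of iterative tuning, under the assumption that each candidate setting can be evaluated on past data for its number of follow-ups ("failures") and resulting pseudo-bias; the evaluation function and candidate settings below are hypothetical placeholders, not Selekt parameters.

```python
# Schematic tuning loop: among candidate settings whose estimated
# pseudo-bias is acceptable, pick the one producing the fewest follow-ups.
# `evaluate` is a hypothetical placeholder for a run over past data.

def pick_settings(candidates, evaluate, max_bias=1.0):
    results = [(evaluate(c), c) for c in candidates]
    acceptable = [(r["failures"], c) for r, c in results if r["pseudo_bias"] <= max_bias]
    if not acceptable:
        return None
    return min(acceptable, key=lambda pair: pair[0])[1]

# Example with a stub evaluation over three candidate thresholds.
candidates = [{"threshold": t} for t in (5, 10, 20)]
stub = {5: (400, 0.3), 10: (250, 0.8), 20: (120, 1.6)}
evaluate = lambda c: dict(zip(("failures", "pseudo_bias"), stub[c["threshold"]]))
print(pick_settings(candidates, evaluate))  # {'threshold': 10}
```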

8 The need to review selective editing
- Selective editing offers good efficiency savings, but systems are often set up based on initial analysis of data and not reviewed
- Over time, problems often arise with the use of selective editing on a survey
- Need to consider:
  - Unforeseen data issues
  - Cultural issues
  - Threshold review

9 Unforeseen issues and changes
- It may be necessary to review selective editing due to unforeseen data issues or changes to the questions or coverage of a survey
- Possible to fix these issues, but need a process to identify and deal with them quickly

10 Cultural issues
- Also important to address any cultural issues with the use of selective editing
- When the editing or results team do not trust or understand the process, they may perform their own additional editing
- Important to embed the culture – workshops explaining the methods and discussing issues often work well

11 Threshold review
- Key challenge in maintaining selective editing is keeping thresholds and parameters up to date
- Original analysis to set thresholds relies on a fully edited dataset to evaluate performance
- Once selective editing is in place, we no longer have access to such a dataset
- Two options to deal with this:
  - Make model assumptions about the data and their error structure
  - Sample and re-contact some businesses that passed selective editing

12 Threshold review
- Developing a method to re-set thresholds based on a sample of businesses passing selective editing
- Initial study in 2012 to evaluate performance of thresholds in Retail Sales selective editing
- Average monthly sample size around 3,500, with around 900 failing selective editing
- Agreed resource to re-contact an additional 600 businesses as a one-off exercise
- Sampled proportional to

13 Threshold review
- Three approaches considered for sampling businesses within each domain (see the sketch below):
  1. Simple random sample
  2. Stratified random sample using unit score as stratification variable
  3. Split domain into two strata based on unit score, fully enumerate the largest scores
- Approaches tested using RSI data before selective editing was implemented
- Suggestion from Sweden (not tested) to use Poisson sampling with probabilities proportional to score
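
A sketch of the three designs for one domain's passing units, assuming a pandas DataFrame with an illustrative unit_score column; the sample size, number of strata and score cut-off are invented for the example.

```python
# Sketch of the three candidate sub-sampling designs within one domain.
# Column names, sample size n, number of strata and the cut-off are
# illustrative assumptions.
import pandas as pd

def simple_random_sample(passed: pd.DataFrame, n: int) -> pd.DataFrame:
    """1. Simple random sample of n passing businesses."""
    return passed.sample(n=n, random_state=1)

def stratified_by_score(passed: pd.DataFrame, n: int, strata: int = 4) -> pd.DataFrame:
    """2. Stratified random sample, strata formed from unit-score quantiles."""
    codes = pd.qcut(passed["unit_score"], q=strata, labels=False, duplicates="drop")
    per_stratum = max(n // strata, 1)
    parts = [grp.sample(n=min(per_stratum, len(grp)), random_state=1)
             for _, grp in passed.groupby(codes)]
    return pd.concat(parts)

def top_scores_plus_srs(passed: pd.DataFrame, n: int, cutoff: float) -> pd.DataFrame:
    """3. Two strata: fully enumerate units above the cut-off score and take
    a simple random sample from the rest to make up n in total."""
    top = passed[passed["unit_score"] > cutoff]
    rest = passed[passed["unit_score"] <= cutoff]
    return pd.concat([top, rest.sample(n=max(n - len(top), 0), random_state=1)])
```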

14 Threshold review
- Performance of the tested approaches assessed using two estimates of pseudo-bias:
  - Full sample estimate
  - Sub-sample estimate
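
The presentation does not give the formulas for these two estimates, so the sketch below is one plausible reading under stated assumptions: the full sample estimate sums the weighted editing error over every passing unit (only possible on the fully edited test data), while the sub-sample estimate weights each re-contacted unit's error up by the inverse of its selection probability. The field names are illustrative.

```python
# One plausible reading of the two pseudo-bias estimates (not taken from
# the presentation); units are dicts with illustrative field names.

def full_sample_pseudo_bias(passing_units, domain_estimate):
    """Weighted editing error summed over all units that passed selective
    editing, as a % of the domain estimate (needs fully edited data)."""
    error = sum(u["weight"] * (u["returned"] - u["edited"]) for u in passing_units)
    return 100 * abs(error) / domain_estimate

def sub_sample_pseudo_bias(recontacted_units, domain_estimate):
    """Same quantity estimated from the re-contacted sub-sample, weighting
    each unit's error up by the inverse of its selection probability."""
    error = sum(u["weight"] * (u["returned"] - u["edited"]) / u["selection_prob"]
                for u in recontacted_units)
    return 100 * abs(error) / domain_estimate
```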

15 Threshold review
- Bias estimates compared for 35 months of RSI data
- Simple random sample within each domain was most accurate for estimating pseudo-bias
- Disadvantage that there is no guarantee of getting units that narrowly pass selective editing
- Implemented the method in practice and discovered no domains with pseudo-bias above 1%
- In this case, no need to change thresholds
- Fortunate, as we still need to develop a method to enable new threshold analysis!

16 Process for maintaining selective editing
- Would like a well-defined process to ensure thresholds are regularly reviewed, data issues are identified and dealt with, and the selective editing culture is properly embedded
- Agreed that sub-annual surveys will have thresholds reviewed every three years, and annual surveys every five years
- Any ongoing selective editing issues will be identified as part of the new regular Survey Action Plan meetings

18 Next steps
- Develop and pilot the method for testing thresholds using RSI data
- Review any other issues with RSI selective editing
- Following implementation of a successful approach, create a timetable for reviewing all surveys with selective editing
- Aim to fully implement the process by end 2014

