
1 Quality Measures for ONS population estimates: Introduction. Local Insight Reference Panels, Autumn 2014

2 Session Summary: plausibility ranges; administrative sources and demographic analysis comparison tool; measures of uncertainty; visualisation tool

3 Reference periods for the measures. [Table: measure by year (2001-2013) for the Plausibility Ranges, Admin/demographic indicators, Uncertainty, and the Visualisation Tool, showing the reference period each covers.]

4 How accurate do you think our estimates are? A. A perfect measure of the population. B. Within +/- 1%. C. Within +/- 2%. D. Within +/- 5%. E. Within +/- 10%. F. Within +/- 15%. G. Within +/- 20%. H. >20%, little relation to the true population. For: 2011 Census (all persons); 2011 rolled forward (all persons); 2011 Census (25-29 year olds); 2011 Census (25-29 year old males)

5 Accuracy of Population Estimates for 2011. Note: average = average weighted by population size

6 Your awareness of quality tools Had you heard of the Quality tools before this session? Have you used any of them? 6

7 Part 1: Using Administrative Data to Set Plausibility Ranges for Population Estimates - Assessment Following the 2011 Census

8 Background. Updates the work carried out in 2012, which used the 2009 mid-year estimates; one of several initiatives taken forward for the quality assurance of mid-year estimates. The release of the 2011 Census estimates allowed the methods to be evaluated, using the same methodology as the 2012 report: how the ranges performed against both the 2011 Census estimates and the MYEs for 2011. Covers only those aged 0-15

9 What are plausibility ranges? A plausibility range is the setting of upper and lower limits, calculated using administrative data, within which the population estimates could reasonably be expected to fall. [Diagram: a range with lower and upper limits; values fall within or outside the range.]
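The definition above can be sketched in a few lines. This is a minimal illustration with hypothetical numbers, not the ONS implementation: given lower and upper limits derived from administrative data, classify where an estimate sits.

```python
# Minimal sketch (hypothetical numbers, not the ONS implementation): check
# whether a population estimate falls within a plausibility range whose
# limits were derived from administrative data.
def classify(estimate, lower, upper):
    """Say where an estimate sits relative to a plausibility range."""
    if estimate < lower:
        return "below range"
    if estimate > upper:
        return "above range"
    return "within range"

# Assumed range for one LA age group, e.g. built from school census counts
print(classify(12_100, lower=11_800, upper=12_600))  # within range
```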

10 What are plausibility ranges? [Diagram: the plausibility range built from SC, PR and CB counts, shown against the MYE and the Census estimate with its confidence interval.] SC = School Census, PR = Patient Register, CB = Child Benefit

11 Does the Census validate the ranges? [Chart: number of LAs whose Census estimate fell below the plausibility range, within it (lower 25%, middle, upper 25%) or above it, by age group: under 1s, 1-4 yrs, 5-7 yrs, 8-11 yrs, 12-15 yrs.]

12 Summary of findings. Absolute approach findings: around 1/5th of LAs' Census estimates fell outside of the plausibility ranges; for those that fell within the ranges there was most agreement between the 2011 MYEs and the Census estimates; for those that fell outside, on average the Census estimates were closer to the plausibility ranges than the MYEs. Relative approach findings: around 12% (42 LAs) were false positives, LAs which were not a problem but were flagged as areas of concern; around 14% (49 LAs) were false negatives, LAs which potentially should have been flagged but were not; little useful information about the under 1 and 1-4 year age groups; some useful information about the 5-15 year age group

13 Limitations. Data sources: ranges are only as good as the administrative sources used to calculate them. Census variability: current methodology compares a point with a range rather than comparing a range with a range. Methodology: more sensitive at picking up over-estimates than under-estimates. Age grouping: following a cohort is difficult; different sized age groups. Specific areas: for example, armed forces and areas with high levels of independent schools

14 Plausibility Ranges: 5 minutes' discussion. Were you aware of the original report? Have you seen the revised report? Do you agree with our conclusions?

15 Key Points. Around 1/5th of LAs' plausibility ranges were not validated by the 2011 Census. Some useful information for 5-15 year olds; little useful information for 0-4 year olds. Plausibility ranges are not advised for future use as they currently stand. The use of tolerance ranges (as in the Census) is not ruled out for future use

16 Part 2 Mid-year estimate QA tool 16

17 Background to MYE QA tool. Quality assurance of the 2011 Census made extensive use of admin data and demographic analysis; MYE QA made some use. We wanted to carry out something similar for the MYEs while taking into account the speed of release and the resources available. The solution: take the most useful and appropriate elements and use those - a mixed-mode approach, carrying out the sort of analysis our stakeholders do

18 The MYE QA Tool Comparing MYEs for 2013 with..... 18

19 Comparing MYEs for 2013 with..... Admin data 19

20 Understanding the quality of admin data 20

21 Coherence between counts from admin sources and MYEs. Coverage and definitional differences: the school census under-represents the resident population; changes to eligibility for child benefit; areas with special populations; timing. PR list inflation; PR list cleaning reduces list inflation

22 The MYE QA Tool Comparing MYEs for 2013 with..... 22

23 Comparing MYEs for 2013 with..... 2011 and 2012 MYEs on a period basis 23

24 Comparing MYEs for 2013 with..... 2011 MYEs on a cohort basis 24

25 Quality assuring estimates for women to quality assure estimates for men (1). Admin data for working-age males is generally weaker than for working-age females. The availability of data on fertility provides an additional means of looking at estimates of females. QA of estimates for females is therefore more comprehensive than for males. Use confidence around the estimates for females to allow QA of males, via sex-ratios

26 Quality assuring estimates for women to quality assure estimates for men (2) 26

27 Using sex-ratios for QA. Sex-ratio = males/females. Analysis of sex-ratios over the decade between 2001 and 2011 shows these can be a strong indication of issues with the MYEs. Comparison of sex-ratios for 2001 and 2011 (Census based) shows the distribution of sex-ratios is broadly constant. Use the distribution of sex-ratios in 2011 to evaluate local authorities over the decade
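The idea above - evaluate each LA's ratio against the 2011 distribution - can be sketched as follows. The LA names, ratios, and reference mean/standard deviation are all hypothetical figures, not ONS data.

```python
# Illustrative sketch (hypothetical figures, not ONS code): use the 2011
# distribution of sex-ratios as a reference, and flag any LA whose ratio
# falls more than n_sd reference standard deviations from the reference mean.
def flag_unusual(ratios, ref_mean, ref_sd, n_sd=2.0):
    """Return the LAs whose male/female ratio is unusual against a
    reference distribution (here, an assumed 2011 one)."""
    return [la for la, r in ratios.items() if abs(r - ref_mean) > n_sd * ref_sd]

# Assumed 2011 reference: mean ratio 0.97, standard deviation 0.03
ratios = {"LA-A": 0.98, "LA-B": 0.95, "LA-C": 1.10}
print(flag_unusual(ratios, 0.97, 0.03))  # ['LA-C']
```

Using a fixed reference distribution rather than the current year's own mean avoids an outlier inflating the standard deviation and masking itself.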

28 Spread of sex-ratios, given by the standard deviation. Note: excludes the Isles of Scilly

29 Using sex-ratios What does the real distribution of sex-ratios look like? 29

30 Using sex-ratios How does Ceredigion compare? 30

31 Summary of MYE QA tool. Necessity of a mixed-mode approach: patient register, child benefit, state pensions, school census; sex-ratios; fertility; change over time. Data presented on both a period and a cohort basis. Published alongside the MYEs on day of release. Access to the same data for each local authority (lower and upper tier), regions, and England and Wales

32 Improving the process. The 2013 MYEs represent the first time we've run through this QA process. The main issue is time: the volume of estimates to QA more than fills the time available, so increasing the amount of time to allow for contingency would be useful. For 2014 it is hoped to implement some prioritisation of "more tricky" local authorities; there is also the potential to automate some of the checks. More resources

33 Evaluation. As part of the development of the tool we talked to stakeholders; future developments require further engagement. Usefulness to stakeholders outside of ONS: what else could be included? What could be clarified? Via StatUserNet and the Population Statistics Community, and via LIRPs!

34 What do you think? Have you looked at or used the MYE QA tool? Your experiences? From what you’ve seen today is this something you would find useful? What else would you like to see? Do you do something similar? 34

35 Part 3 Measuring uncertainty in the ONS mid-year estimates 35

36 Uncertainty measures - work in progress. Measuring uncertainty around the mid-year population estimates allows users to evaluate change over time. We have already published uncertainty measures for 2002-10 as research statistics, using modelled immigration and a school boarder adjustment. We are now reviewing the methods to take into account (1) recent changes in the way the MYEs are calculated and (2) new information from the 2011 Census

37 Summary of uncertainty measures (as % of MYE) 37

38 Uncertainty measures - our approach. Methods have been developed in collaboration with academics at Southampton University. We use a simulations-based approach to measure variability around the MYEs. Our methods mirror the complexity of current population estimates, which involve using administrative, survey and census data and a range of statistical techniques

39 [Diagram: MYEs are produced by the cohort component method - base population plus/minus natural change, international migration, internal migration and other changes. Uncertainty estimates come from bootstrapping to create 1,000 simulations, deriving 95% CIs for the MYEs; natural change and other changes are assumed to have no variance.]

40 Bootstrapping international immigration. MYEs: the International Passenger Survey gives a national estimate, split by type into workers, students, others and UK returners; admin data (MWS, HESA/BIS/WAG, PRDS, Census) are used to distribute these to LAs, giving 348 local authority estimates of international immigrants, which are recombined to create LA totals. Uncertainty estimates: 1,000 simulations from the IPS and 1,000 simulated admin counts for each migrant type in each LA; apply admin-based proportions to the IPS estimate for each migrant type to derive counts of each migrant type in each LA (1,000 worker counts, 1,000 student counts, 1,000 'other' counts, 1,000 UK returner counts); sum these to produce 1,000 LA totals. The 26th and 975th ranked values provide the uncertainty interval
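The final step of the slide - summing simulated counts per migrant type and reading off the 26th and 975th ranked values - can be sketched with synthetic data. The migrant-type means and the 10% noise level are assumptions for illustration, not the ONS system.

```python
# Illustrative sketch with synthetic data (not the ONS system): sum simulated
# counts for each migrant type to get 1,000 simulated LA totals, then take the
# 26th and 975th ranked values as the 95% uncertainty interval, as on the slide.
import random

random.seed(42)

def interval_95(simulated_totals):
    """26th and 975th ranked values of 1,000 simulations."""
    ranked = sorted(simulated_totals)
    return ranked[25], ranked[974]  # 0-based indices for ranks 26 and 975

# Synthetic stand-in: worker/student/other/returner counts for one LA,
# each drawn with 10% relative noise around an assumed mean
sims = [sum(random.gauss(mu, 0.1 * mu) for mu in (500, 300, 150, 50))
        for _ in range(1000)]
lower, upper = interval_95(sims)
print(f"95% interval: {lower:.0f} to {upper:.0f}")
```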

41 What is bootstrapping? International immigration over time 41

42 International immigration bootstrap 42

43 2 bootstraps 43

44 More bootstraps 44

45 Lots of bootstraps 45

46 International Immigration 46

47 International Immigration 47

48 Apportionment to LA - streams 48

49 Apportionment to LA - streams 49

50 e.g. Foreign-born worker in-migrants

51 1 bootstrap

52 2 bootstraps

53 More bootstraps

54 Lots of bootstraps

55 Summary all bootstraps for allocation of IPS to LA

56 Summary of resultant estimates by LA


58 Key Points. Produces 95% confidence intervals around the overall estimates for each LA (no age/sex breakdown). Uses bootstrapping (simulation). Only covers variance due to the 2001 Census, internal migration and international migration

59 What do you think? Your experiences of looking at/using data from the previous release? Is it useful to have the Confidence intervals around total population? 59

60 Part 4: Visualising the causes of discrepancies between rolled-forward and Census-based mid-year estimates

61 Background. Based on QA work for the 2011 Census and on dealing with LA queries on differences between Census estimates and MYEs. A consistent approach was needed for all LAs: it explores MYEs at component level, seeks to understand the processes used to derive MYEs, and uses comparators where available

62 Aims of work. 1. To provide a consistent way of explaining the most likely causes of discrepancies between Census-based and rolled-forward mid-year estimates. 2. To provide a consistent way of understanding the most likely causes of bias in the mid-year estimates rolled forward from 2011

63 Basic Principles. Within MYEs, overall discrepancies are a result of discrepancies at the component-of-change level. Multiple components set up a complex web of effects and may cause compensating differences. Comparators only allow us to indicate potential discrepancies. Highlighting a potential discrepancy at the component level can improve understanding of the estimates and may aid the improvement of their quality. The visualisation aims to make a complex analysis quick and easy to understand for a non-technical user
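The compensating-differences point is easy to show with made-up numbers: two component discrepancies can cancel, so the total change looks right even though both components are off.

```python
# Illustrative sketch with made-up numbers (not ONS data): discrepancies in
# two components can compensate, leaving the total change unchanged even
# though the individual components are both wrong.
published = {"births": 1200, "deaths": 800, "net_migration": 500}
# Alternative series: deaths 100 higher, net migration 100 higher
alternative = {"births": 1200, "deaths": 900, "net_migration": 600}

def total_change(c):
    """Population change = births - deaths + net migration."""
    return c["births"] - c["deaths"] + c["net_migration"]

print(total_change(published), total_change(alternative))  # both 900
```

This is why the tool has to look component by component rather than only at the total.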

64 Basic principles Tool to explain risk/flow interaction Auto-text to explain method and output to user 64

65 Basic Principles. Develop an understanding of how each component may be deficient or how it compares to an external benchmark. Construct an alternative population estimate for each issue. Compare the alternative to the published series to determine impact. Calculate Z-scores to indicate whether each LA/sex/age group is unusual for each component
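The Z-score step described above can be sketched as follows. The LA names and component values are hypothetical figures for illustration, not ONS data, and the flagging threshold of 2 is an assumption.

```python
# Illustrative sketch (hypothetical component values, not ONS code): compute a
# Z-score for each LA against the mean and standard deviation across LAs, and
# flag the extremes as potential discrepancies in that component.
import statistics

def z_scores(values):
    """Map each LA to (value - mean across LAs) / sd across LAs."""
    mean = statistics.mean(values.values())
    sd = statistics.stdev(values.values())
    return {la: (v - mean) / sd for la, v in values.items()}

# Hypothetical figures for one component (e.g. a migration rate) by LA
component = {"LA-A": 1.2, "LA-B": 0.9, "LA-C": 1.1, "LA-D": 4.0,
             "LA-E": 1.0, "LA-F": 1.3, "LA-G": 0.8, "LA-H": 1.1}
scores = z_scores(component)
flagged = [la for la, z in scores.items() if abs(z) > 2.0]
print(flagged)  # ['LA-D']
```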

66 Basic Principles. Values by age and sex are compared against a mean across LAs, using a standard deviation across LAs. Values at the ends of the distributions are most likely to show a potential discrepancy. Uses survey and administrative data: 2001/2011 Census, 2001/2011 Census response rates, MYEs 2001-2011, Patient Register 2001-2011, Flag4 2006-2010, IPS 2001-2011, internal migration 2001-2011

67 Demonstration Compounding Compensating Messaging 67

68 Moving Forward... Ongoing work includes: Using the 2011 Census as a base and exploring current and forthcoming mid-year estimates Conversion of the processing element to SAS Auto-text to aid users in understanding of output Fine tuning Differential sensitivity of indicators (school boarders more sensitive than other components) Utilising other intelligence (such as the characteristics of each LA). 68

69 Key Points. Aims to provide reasonably informed intelligence about where there may be issues with the mid-year estimates. Based on intelligence gathered from Census QA and from dealing with stakeholder queries. Articulates the complexity of compounding and compensating errors. Provides intelligence by sex and quinary age group. Feeds into the development of improved methods

70 What do you think? This provides indicative rather than definitive intelligence about potential issues. Is this type of information useful to you? How would you use it? We intend to publish the interactive tool along with papers outlining the methods. What do you think? Does this look plausible to you? Would you like to help?

71 Quality tools summary. ONS have pursued four very different ways of assessing the quality of MYEs: plausibility ranges, no longer being taken forward; the MYE QA tool for the 2013 MYEs, available now; new confidence intervals, coming soon; the visualisation tool, in development

72 Different tools for different roles. Confidence intervals aim to provide a statistically robust measure of accuracy. The MYE QA tool aims to find where MYEs look unusual against administrative and demographic comparators. The visualisation tool aims to show which components are likely to be contributing to discrepancies between MYEs and reality

73 What do you think? Are these quality tools what you were expecting? Coherence between tools Usefulness of different approaches How would you use these tools? 73

74 Where to find further information.
Plausibility ranges: http://www.ons.gov.uk/ons/guide-method/method-quality/specific/population-and-migration/population-statistics-research-unit--psru-/latest-publications-from-the-population-statistics-research-unit/index.html
MYE QA Tool: http://www.ons.gov.uk/ons/publications/re-reference-tables.html?edition=tcm%3A77-322718
Uncertainty in LA MYEs: http://www.ons.gov.uk/ons/guide-method/method-quality/imps/latest news/uncertainty-in-la-mypes/index.html
Visualisation tool (coming soon): Mark.auckland@ons.gsi.gov.uk

