A Panel Pilot Study for the English Business Survey. Presented by Yi Zhang, Department for Business, Innovation and Skills.


Overview
- Introduction to EBS
  - Why, Who, How
  - Dissemination
- Panel Pilot Study
  - Pros and cons of a panel approach
  - Objective
  - Methodology: timescale, sample selection and profile, fieldwork
  - Results: analysis of responses; mode effects (a propensity weighting approach)
  - Conclusions and next steps
- Comments and questions

Introduction to EBS

EBS - Why?
- Need to maintain a regular flow of timely sub-national data after the close-down of the RDAs
- EBS seen as the best way to provide this
- Survey intended to complement existing sources, not replace them
- In accordance with the transparency agenda, data will be freely available for users to conduct their own analysis

Introduction to EBS
EBS - Who?
- Data collected by TNS-BMRB, an independent contractor commissioned by BIS
- Three thousand workplaces surveyed every month
- Covers all nine English regions
- Sample drawn from the Inter-Departmental Business Register
- Covers all sectors, including the public sector

Introduction to EBS
EBS - How?
- Short telephone-based survey
- Designed to be as light-touch as possible
- Approximately minutes to complete
- Voluntary
- Directional questions: can be completed without consulting detailed material
- Currently uses a cross-sectional approach
- Panel pilot study undertaken to explore the possibility of moving to a panel from year 2 onwards

EBS Dissemination
- Monthly Statistical Releases
- Planned Quarterly Statistical Releases
- Online reporting tool (under development)
- Balance statistics (i.e. higher minus lower) by workplace size, Jan 12 vs. Oct 11

Panel Pilot Study

Pros of a Panel Approach
- Increase analytical capability
  - Track individual workplaces over time and across samples
  - Enhance indicators of economic activity and behaviour
- Improve quality and reliability
  - More large businesses can be interviewed
  - Less variation in the sample between waves
- Cost effectiveness
  - Cheaper to re-contact the same respondents

Cons of a Panel Approach
- Potentially increases respondent burden: a maximum of four interviews per year for each workplace, as opposed to one currently
- May cause a break in the series so far; the size of any break is unknown but could be investigated in the early stages of the panel. It is expected to be small, given that the questions remain the same and the panel would be built up gradually

Panel Pilot Study - Objectives

Objectives of the Pilot Study
- To estimate the proportion of workplaces willing to participate in a panel survey (by size, region and sector)
- To estimate the proportion of workplaces willing to respond online
- To explore any mode effects between the online and telephone data

Panel Pilot Study – Method

Timescale for the Pilot Study
- November 2011: Main fieldwork starts
- January 2012: Select sample for panel re-contact
- February 2012: Run panel experiment (contact 1): online invite followed by telephone contact
- March 2012: Interim analysis of results
- April 2012: Run panel experiment (contact 2): online invite followed by telephone contact
- Early May 2012: Analyse results
- May-June 2012: Discussion of panel results and assessment of pros and cons of a changed approach
- By end of June 2012: Decision taken as to whether to adopt a panel approach

Sample Selection
- 3,081 interviews in the November main fieldwork
- 2,357 agreed to be re-contacted
- 1,000 selected in Re-contact 1 (957 continued in Re-contact 2); 1,357 remain
- A1: workplaces with an email address (Re-contact 1: 688; Re-contact 2: 662)
- A2: workplaces who provided no email address, to be contacted by phone (Re-contact 1: 312; Re-contact 2: 295)
- B: workplaces with email addresses, contacted to boost online response (Re-contact 1: 926; Re-contact 2: 914)

Sample Profile by Region and Sector
- Similar profiles across the selected sample, those who agreed to re-contact, and the November fieldwork sample within each region and sector

Sample Profile by Workplace Size
- Profile of those who agreed to re-contact mirrored the main November sample very closely by workplace size
- A relatively higher proportion of 50+ workplaces was selected for the panel, to encourage more large workplaces to participate

Panel Fieldwork
- Two re-contact panels, each one calendar month: Feb and Apr 2012
- Overlap of respondents between the two panels: ~40%
- Same design for each follow-up
  - Two groups: A1 (mixed modes), A2 (CATI only)
  - Two modes: online (CAWI), telephone (CATI)
- [Fieldwork diagram: over weeks 1-4, group A1 (supplied email address and phone number) starts on CAWI then moves to CATI & CAWI; group A2 (supplied phone number only) is CATI throughout]

Panel Pilot Study – Results

Results 1: Conversion rates and online take-up
- Conversion rates were consistently lower in the April panel across workplaces of different sizes, regions and sectors
- No consistent pattern between groups: the conversion rate was higher for the CATI-only group (A2) than for the mixed-mode group (A1) in the February panel, but the rates for A1 and A2 were similar in the April panel, due to a lower conversion rate among large workplaces in the A2 group in April
- Rates for the online mode were much lower than for CATI in both panels

Results 2: Conversion rates by workplace size and industry
- Workplaces with 250+ employment were less likely to respond in both panels
- Those in Education, Health, and Public Admin and Defence were more willing to participate in both panels

Results 3: Conversion rates by region Regional variation for both panels – London and the North West were consistently less likely to participate

Mode Effect – Propensity Weighting Approach
- To investigate whether there was an online (treatment) vs. telephone (control) mode effect
- Propensity score weighting
  - Aim: control for differences in profile between the two groups
  - Method: a logistic regression model to produce the propensity weights, applied to the telephone group to match the profile of workplaces that completed the survey online
  - Y: mode of completion
  - X: workplace characteristics (employment size, region, sector, single/multi site, age)
  - Weight for telephone cases: w = p / (1 - p), where p is the estimated propensity of completing online
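The weighting step can be sketched in a few lines. This is an illustrative sketch with hypothetical numbers, not the actual BIS/TNS-BMRB implementation: given an estimated propensity of completing online, each telephone (control) respondent receives weight p / (1 - p), the standard form for matching a control group's profile to the treated (online) group.

```python
def propensity_weights(propensities):
    """w = p / (1 - p) for each telephone case, where p is the
    estimated probability of that workplace completing online."""
    return [p / (1.0 - p) for p in propensities]

# Hypothetical telephone sample: 60 small workplaces where P(online) = 0.4
# and 90 large workplaces where P(online) = 0.1.
phone_sample = [("small", 0.4)] * 60 + [("large", 0.1)] * 90
weights = propensity_weights([p for _, p in phone_sample])

# Weighted share of small workplaces in the telephone sample now matches
# the online sample profile implied by these propensities (80% small).
w_small = sum(w for (band, _), w in zip(phone_sample, weights) if band == "small")
share_small = w_small / sum(weights)
print(round(share_small, 2))  # 0.8
```

In practice the propensities would come from the logistic regression described on the slide, with workplace characteristics as predictors.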

Mode Effect – Propensity Weighting Approach (2)
- Only workplace size was predictive of being in the online sample: the online sample had a higher proportion of small workplaces than the telephone sample
- After weighting, the workplace-size profile of the telephone sample closely matched that of the online sample, while the region and industry profiles remained similar

Results: Mode Effect
- Differences between the two modes, but no consistent pattern
  - small online sample
  - samples not aligned as closely as they need to be after propensity score weighting
- Significantly different outcomes on key questions between the online and weighted telephone samples
  - Those responding by telephone were more likely to agree to re-contact in future and to having their data linked
  - Responses to the key questions differed, but not consistently negatively or positively; e.g. those responding by telephone were less likely to say the level of business activity or volume of output last month was 'The same' compared with 3 months before
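A comparison of key-question outcomes between the online and weighted telephone samples could be sketched as a two-proportion z-test with a Kish effective sample size for the weighted group; the slides do not detail the actual test used, so the approach, names and data below are illustrative assumptions.

```python
import math

def weighted_prop(responses, weights):
    """Weighted proportion of respondents coded 1 (e.g. answered 'The same')."""
    return sum(r * w for r, w in zip(responses, weights)) / sum(weights)

def kish_eff_n(weights):
    """Kish effective sample size: (sum w)^2 / sum w^2."""
    s = sum(weights)
    return s * s / sum(w * w for w in weights)

def two_prop_z(p1, n1, p2, n2):
    """z statistic for the difference of two proportions, pooled SE."""
    p = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Made-up data: online cases carry weight 1; telephone cases carry the
# propensity weights from the earlier step.
online = [1] * 30 + [0] * 70        # 30% answered 'The same'
phone = [1] * 20 + [0] * 80         # 20% answered 'The same' (unweighted)
phone_w = [2.0] * 50 + [0.5] * 50   # hypothetical propensity weights

p_online = weighted_prop(online, [1.0] * len(online))
p_phone = weighted_prop(phone, phone_w)
z = two_prop_z(p_online, len(online), p_phone, kish_eff_n(phone_w))
```

Using the effective rather than the raw sample size for the weighted group avoids overstating the precision of the weighted telephone estimates.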

Panel Pilot Study – Conclusions and Next Steps

Conclusions
- Workplaces would engage with a panel survey: conversion rates were higher than predicted. If a panel were adopted, ~50% of interviews would come from the re-contact sample in the first follow-up month
- Online is not a popular mode: only a small proportion of any mixed-mode panel would be online respondents
- Indications of a possible mode effect exist, but with no coherent pattern. More agreements to re-contact and to data linkage via the telephone mode than the online mode

Next Steps
- Results point to a telephone-only panel approach
- Further analysis of pilot results
- Put to the EBS SG for a decision about whether to change the design for the November 2012 fieldwork

Thanks for your attention! Comments or Questions?