
1 An introduction to surveys – the basics
U3A Cambridge, Summer term, July 4th 2014
Jill Tuffnell

2 Scope & coverage
– My credentials
– Overview of the statistical framework & key concepts: confidence interval, confidence level and sample size
– Survey design and methodology
– Response rates
– Examples
– Questions

3 My background
– Degrees in economics & statistics; operational research
– Work for the National Coal Board, local government & a public policy consultancy
– Headed the Research Group at Cambs County Council; specialist in labour market and Census analysis, with involvement in many surveys – ranging from new settlements to Travellers to fire-fighters to footpaths

4 (Brief) overview of statistics (i)
– A kit of tools enabling us to measure, interpret and analyse data – quantitative & qualitative
– Sample surveys are based on probability theory
– Sample size has a non-linear relationship with the size of the population being sampled
– The 'best' surveys require every member of a population to have the same, random chance of being surveyed (see the sketch below); it is important to measure response rates; the same principles apply to Censuses
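Drawing such a sample is straightforward once a sampling frame exists: every member of the frame gets an equal chance of selection. A minimal sketch in Python, using a purely hypothetical address list as the frame:

```python
import random

# Hypothetical sampling frame: one entry per address in the population.
sampling_frame = [f"Address {i}" for i in range(1, 5001)]

# Simple random sample without replacement: every address has the same
# chance of selection. Seeding the generator makes the draw reproducible.
rng = random.Random(2014)
sample = rng.sample(sampling_frame, k=1176)

print(len(sample), sample[:3])
```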

5 (Brief) overview of statistics (ii)
– Confidence interval (margin of error): the result is reported as lying within, for example, + or – 5%, or + or – 3%
– Confidence level: measures how 'sure' we can be – e.g. that we are 95% certain the true value lies within that interval
– Sample size: generally, the larger the sample, the narrower the confidence interval
– See sample size calculator (a sketch of the underlying formula follows the worked example on the next slide)

6 (Brief) overview of statistics (iii)
– Example: 5,000 in your total population
– Question on, e.g., involvement in volunteering
– Want a sample that generates a 95% confidence level within a + or – 2.5% confidence interval
– Requires a random sample yielding 1,176 responses
– Actual sample numbers must allow for non-response
– But a population of 75,000 would only require 1,506 responses; a population of 1,000,000 only 1,534
– These response numbers must apply to every sub-population of interest – e.g. women, or those aged over 70
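The figures above are what the standard sample size formula (Cochran's formula with a finite population correction) produces; that formula is an assumption here, since the slide does not name the calculator used, but it reproduces the quoted numbers. A minimal sketch in Python:

```python
def sample_size(population, margin=0.025, z=1.96, p=0.5):
    """Responses needed for a given margin of error and confidence level.

    population : size of the group being sampled
    margin     : confidence interval half-width (0.025 means +/- 2.5%)
    z          : z-score for the confidence level (1.96 for ~95%)
    p          : assumed proportion; 0.5 gives the largest (safest) sample
    """
    # Sample size for an effectively infinite population
    n0 = (z ** 2) * p * (1 - p) / margin ** 2
    # Finite population correction
    n = n0 / (1 + (n0 - 1) / population)
    # Round to the nearest whole response (matches the slide's figures)
    return round(n)

# Required responses grow only slowly with the size of the population
for pop in (5_000, 75_000, 1_000_000):
    print(f"{pop:>9,} population: {sample_size(pop):,} responses")
# 5,000 -> 1,176; 75,000 -> 1,506; 1,000,000 -> 1,534
```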

7 Survey design
Depends on the 'population' being sampled, e.g.:
– Households
– Population
– Businesses
– Footpaths
Sampling frame (the total 'population'):
– Electoral register (now restricted access)
– Address list
– Telephone book – BT list very restricted; random dialling?
– Business rates; Companies House; Inter-Departmental Business Register
– Record of public rights of way

8 Main types of survey
– Random or quota
– Face to face: interviewer
– Postal – written questionnaire
– Telephone
– Online/email
It is critical to measure who responds and who does not; non-response is a major issue and can seriously distort results. Results must be adjusted for non-response (see the weighting sketch under slide 16 below).

9 Face to face
– A sample randomly chosen from the total population of interest is best – but 'quota' sampling is often used, e.g. for street surveys
– Collect information on key demographics such as sex, age, ethnicity, and a proxy for socio-economic group, such as job or tenure
– Results can then be adjusted for the response rates of different sectors
– Household-based surveys are generally the most robust, but expensive: they must involve 'call-backs' for non-response
– Problems with restricted access to, and coverage of, electoral rolls
– Language issues; children are also rarely covered
– If you can afford it – the 'Rolls Royce' of surveys

10 Postal
– Requires a comprehensive and up-to-date address list (students & private lets are an issue!); there is no national address list
– Problems with multiple-occupancy homes & institutions
– The issue of language is important
– Non-response is very high; how to attract responses?
– Often requires second and third mailings, so the time frame can be long
– Offer incentives, such as inclusion in a prize draw, or the promise of feedback on results
– Always provide stamped, addressed envelopes
– Keep it as short as possible
– Known response bias – owner-occupiers, professionals and the elderly respond at high rates; private renters, the unemployed and the young at low rates

11 Telephone
– Most adults have a phone
– But there is no comprehensive sampling frame/population list, even for landlines
– Landlines alone are inadequate – no coverage of young people, who mainly use mobiles
– Text surveys are now developing
– Response bias – so it has to be accounted for
– Random dialling?
– Best suited to marketing – but increasingly used for short surveys, especially by political parties

12 On-line surveys
– Data protection may prevent access to email addresses
– Bias due to the user profile – against the elderly, the poor & children
– Have to adjust for differential response rates
– Survey overload is an increasing problem
– More use of incentives (e.g. entry to a prize draw)
– Free software (e.g. SurveyMonkey) suggests surveys are easy – but they can be awful!
– Cheap to run, as data can be downloaded easily into survey analysis programs

13 Example – National Trust 'visitor experience' survey
– Moved from a written questionnaire of visitors to an on-line survey, based on a sample of visitors drawn from the barcodes of membership cards 'zapped' and then emailed where possible
– Discovered a 'drop' in satisfaction levels
– But the card survey was biased – it was mainly offered to 'happy', smiling visitors!
– The survey is restricted to existing members – rather than the wider audience of 'one-off' visitors and prospective members; not very helpful if you want to find out how to attract more members
– NT 'visitor experience' targets for properties are set very high

14 Questionnaire design (i)
– Identifying what to include: the initial use of 'focus' group(s) to bring out key issues is valuable
– But beware recruiting the 'loony' brigade!
– The order of options/questions is important – most people lose interest/concentration towards the end; the order can be rotated (see the sketch below)
– Try to avoid giving an odd number of choices – e.g. excellent, good, average, poor, very poor – as people tend to go for the middle option: 'average'
– Avoid 'leading' questions: 'You enjoy your visits to Wimpole Hall, don't you?' Yes or No
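Rotating option order is easy to script for an online or interviewer-led survey; a minimal illustration in Python, where the answer options are hypothetical examples rather than anything from the slides:

```python
import random

# Hypothetical, non-ordinal answer options; with a long list, the items
# shown last tend to be neglected, so the starting point is rotated.
REASONS_FOR_VISIT = ["gardens", "house tours", "cafe", "walks", "events", "shop"]

def rotated_options(respondent_id, options):
    """Rotate the option order so each item appears first for some respondents."""
    offset = respondent_id % len(options)
    return options[offset:] + options[:offset]

def shuffled_options(options):
    """A fully random shuffle per respondent is an alternative to rotation."""
    return random.sample(options, k=len(options))

for rid in range(3):
    print(rid, rotated_options(rid, REASONS_FOR_VISIT))
```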

15 Questionnaire design (ii)
– Many people will avoid the top & bottom choices in a range; it is therefore unrealistic for the NT to set targets based on 'top' responses alone (i.e. scores on 'very satisfied' alone, not counting 'satisfied'). 7 rather than 5 options?
– Change the order of questions over the survey as a whole
– Always give respondents the opportunity to complete an 'any other comments/views' section
– There is no point in carrying out a survey to support a pre-determined approach – a waste of money!
– Keep it as short as possible & feed back the results

16 Dealing with response bias
Respondents do not fully mirror the population sampled – response bias. Even the Population Census (covering 100% of homes) has a biased response (low among young men living in shared, rented homes). If you know the true population shares you can adjust, e.g.:
– Elderly home owners are over-represented
– Young private renters in multiple-occupancy homes are under-represented
– Social renters are under-represented
– The unemployed are under-represented
– Small service businesses are under-represented in business surveys
A simple weighting sketch follows.
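One common way to make this adjustment is post-stratification weighting: each respondent is weighted by the ratio of their group's share of the true population to its share of the achieved sample. A minimal sketch, using made-up tenure shares purely for illustration:

```python
# Post-stratification weights: population share / sample share for each group.
# All of the shares below are invented for illustration only.
population_share = {"owner-occupier": 0.60, "private renter": 0.25, "social renter": 0.15}
sample_share = {"owner-occupier": 0.75, "private renter": 0.10, "social renter": 0.15}

# Over-represented groups get weights below 1, under-represented ones above 1
# (here roughly 0.8, 2.5 and 1.0 respectively).
weights = {g: population_share[g] / sample_share[g] for g in population_share}

def weighted_yes_share(responses):
    """Weighted share answering 'yes', given (group, answered_yes) pairs."""
    total = sum(weights[group] for group, _ in responses)
    yes = sum(weights[group] for group, answered_yes in responses if answered_yes)
    return yes / total

responses = [("owner-occupier", True), ("owner-occupier", True),
             ("private renter", False), ("social renter", True)]
print(round(weighted_yes_share(responses), 3))
```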

17 Examples of interesting surveys (i)
Future housing/site needs of Travellers living in parts of the East of England, carried out in 2007 and looking 15 years ahead. Very expensive! Involved specialist academics from ARU and the University of Buckingham.
– Face to face was the only realistic option
– Identified key Travellers who had the respect of the community(ies)
– Recruited and trained Travellers to carry it out
– Had to be person to person because of illiteracy
– Used hand-held recorders
– Issues of women only interviewing female Travellers and men interviewing males
– Questions were restricted because of community concerns

18 Examples of interesting surveys (ii)
– Cambourne new settlement in Cambs: surveyed in 2007; known private/social split of households
– Postal, with a prize draw & free post-back; two waves were required to boost social-rented sector responses (questionnaires tracked by code)
– Quantitative & qualitative issues covered
– Critical demographics on who moved in: sex, age, tenure, jobs, occupations, place of work, previous home location & tenure; compared with Census & administrative data to identify response bias
– Also views on the community and how it was developing; shortcomings; plus points; why people moved there
– Feeds into planning for other new communities, such as Northstowe
– All new housing estates in Cambs have been surveyed since 2007
– Results are available on the Cambs CC website

19 Cambourne survey results 2007
– People were attracted by the Comberton Village College secondary catchment – this brought in far more families with children aged 9+ than expected
– Identified more private renters than expected
– High levels of commuting to Papworth and Cambridge; also London commuters via St Neots
– Mainly professionals, with most adults working
– Residents liked the environment overall
– But they were fed up with living on a building site, with few of the promised community facilities provided

20 Implications for U3AC surveys
– A biased response from members is likely: but we are the demographic group most likely to complete surveys!
– Should we be surveying eligible non-members? How do we identify and approach them? Offer incentives?
– Use Census 2011 data, compared with our membership list, by age/sex/qualifications/tenure. This data is generally of high quality!
– Use on-line surveys, as most potential new members will have email

21 Your questions/views Over to you!

