Designing and Using a Behaviour code frame to assess multiple styles of survey items
Alice McGee and Michelle Gray

Presentation outline
Background to study
Aims of research
Methodology
Designing a behaviour code frame
Using the behaviour code frame
Analysing the data
Lessons learned

Background to study
English Longitudinal Study of Ageing (ELSA)
Dependent Interviewing (DI)
Two types of data item:
  - Feed-forward (DI)
  - Non-feed-forward (non-DI)
Little evaluation of the impact of DI on data quality conducted to date

Aims of research
Research aims:
  - To assess how DI affects data quality
  - To explore how respondents react to feed-forward phrases
  - To find out whether this varies with the nature and sensitivity of the topic
Methodological aim:
  - To explore the combination of CARI and behaviour coding as methodological tools

Methodology
Computer Assisted Recorded Interviewing (CARI)
  - The computer acts as a sophisticated tape recorder
  - Unobtrusively records the interaction
Behaviour coding
  - Codes systematically applied to interviewer and respondent behaviours
  - Used to uncover and assess problems with questions
The two methods were combined for this study

Designing the code frame

Principles for good design
Code frame adapted from Cannell et al. (1989)
Short and straightforward
Few, easy-to-apply codes
Discrete
Broad rather than specific

Behaviours coded
Question-asking behaviour for interviewers
Immediate response behaviour for respondents
Whether a partner intervened (concurrent interviews)
Final outcome of the entire exchange

Two behaviour code frames
Two code frames were designed:
  - DI (feed-forward) items
  - Non-DI (non-feed-forward) items
First-level exchange (initial utterance): code what occurred before the other person speaks

Code frame

Behaviours coded
Interviewer / Interviewer feed-forward
Respondent / Respondent feed-forward
Whether partner intervened
Final outcome
One code per behaviour

Interviewer codes
Exact wording/slight change  01
*Major change  02
*Omission  03
*Question became a statement  04
*Inaudible interviewer/other  05
Not applicable  99
* denotes where notes must be made

Interviewer feed-forward codes
FF item read as worded/slight change  01
*FF statement became a question  02
*FF question became a statement  03
*Other major change  04
*Omission  05
*Inaudible interviewer/other  06
Not applicable  99
* denotes where notes must be made

Respondent codes
Adequate answer  01
*Inadequate answer/elaboration  02
*Clarification  03
Question re-read  04
Don't know  05
Refusal  06
*Inaudible respondent/other  07
Not applicable  99
* denotes where notes must be made

Respondent feed-forward codes
*Affirmed FF item - adequate  01
*Disputed FF item - adequate  02
*Inadequate answer/elaboration  03
*Clarification  04
Question re-read  05
Don't know  06
Refusal  07
*Inaudible respondent/other  08
Not applicable  99
* denotes where notes must be made

Partner intervention codes
*Yes  01
No  02
Not applicable (no partner present)  99
* denotes where notes must be made
This code was used where the respondent's partner intervened and subsequently answered for the respondent

Final outcome codes
Adequate answer  01
*Inadequate answer  02
Don't know  03
Refusal  04
*Inaudible/other  99
* denotes where notes must be made
Codes whether the final answer met the objective of the question
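To make the structure of these frames concrete, here is a minimal sketch in Python of how they could be represented, with a flag marking the asterisked codes that require a coder note. The data structure and function names are illustrative assumptions for this sketch, not the coding software actually used in the study.

```python
# Illustrative representation of the behaviour code frames listed above.
# The structure and names are assumptions for this sketch, not the study's software.

# Each frame is a list of (code, label, requires_note) tuples; requires_note
# corresponds to the asterisked codes where the coder must make a note.
CODE_FRAMES = {
    "interviewer": [
        ("01", "Exact wording/slight change", False),
        ("02", "Major change", True),
        ("03", "Omission", True),
        ("04", "Question became a statement", True),
        ("05", "Inaudible interviewer/other", True),
        ("99", "Not applicable", False),
    ],
    "interviewer_ff": [
        ("01", "FF item read as worded/slight change", False),
        ("02", "FF statement became a question", True),
        ("03", "FF question became a statement", True),
        ("04", "Other major change", True),
        ("05", "Omission", True),
        ("06", "Inaudible interviewer/other", True),
        ("99", "Not applicable", False),
    ],
    # The respondent, respondent feed-forward, partner intervention and
    # final outcome frames would follow the same pattern.
}

def frame_for(behaviour: str, is_di_item: bool) -> list:
    """Pick the frame: the feed-forward frames apply to DI items only."""
    return CODE_FRAMES[f"{behaviour}_ff" if is_di_item else behaviour]

def requires_note(behaviour: str, is_di_item: bool, code: str) -> bool:
    """True if the chosen code is one the coder must annotate."""
    for c, _label, note in frame_for(behaviour, is_di_item):
        if c == code:
            return note
    raise ValueError(f"Unknown code {code!r} for behaviour {behaviour!r}")
```

For example, requires_note("interviewer", True, "02") returns True, flagging that a note is needed when a feed-forward statement became a question.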

Technical details

CARI equipment
Equipment testing
  - External microphones
CARI built into the Blaise program
Recording switched on and off at the relevant items
Sound files automatically generated and saved
Sound files removed from interviewer laptops:
  - Macro run
  - Data sticks (USB)

Behaviour coding system
Conducted within Blaise
Coding program designed for this purpose
  - Westat testnote software
Three windows displayed simultaneously:
  - Blaise interviewing screen
  - Coding entry screen
  - Sound file (.wav)
Coders automatically routed through the interview
Tags used to skip to the relevant data items

Using the code frame

Sound file

Blaise interviewing screen

Coding program

Data preparation and analysis

Organising the data
Two types of data:
  - Behaviour codes (quantitative)
  - Coder notes on non-standard behaviours (qualitative)
All data automatically stored in a tab-delimited Excel file
One Excel file produced for each coder
Excel files amalgamated
Exported into SPSS
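As a rough illustration of the amalgamation step, one tab-delimited export per coder could be combined in Python along the following lines. This is a sketch only: the study exported to SPSS, and the directory, file names and added column here are assumptions.

```python
# Sketch of combining one tab-delimited export per coder into a single dataset.
# Directory, file pattern and column names are assumptions for illustration.
from pathlib import Path
import pandas as pd

frames = []
for path in sorted(Path("coder_exports").glob("*.txt")):
    df = pd.read_csv(path, sep="\t")   # tab-delimited export for one coder
    df["coder"] = path.stem            # keep track of which coder produced it
    frames.append(df)

combined = pd.concat(frames, ignore_index=True)

# Write one combined file that a stats package such as SPSS can import.
combined.to_csv("behaviour_codes_combined.csv", index=False)
```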

Data preparation
More cleaning than expected
Two main problems:
  - Duplicate files (limitations of the system used)
  - Incorrect code frame used at interviewer and respondent behaviours (DI and non-DI items)
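Both cleaning problems can be expressed as simple checks on the combined dataset. The sketch below continues from the amalgamation example above; the columns serial, item, frame_used and item_type are hypothetical, and this is not the cleaning syntax used in the study.

```python
# Sketch of the two cleaning steps described above; column names are assumed.

# 1. Drop duplicate records created by limitations of the coding system.
deduped = combined.drop_duplicates(subset=["coder", "serial", "item"], keep="first")

# 2. Flag rows where the wrong code frame was applied, e.g. a feed-forward
#    frame used at a non-DI item (or the reverse).
wrong_frame = deduped[deduped["frame_used"] != deduped["item_type"]]
print(f"{len(wrong_frame)} records coded with the wrong frame")
```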

Analysing the data
SPSS
Frequencies and crosstabulations
Coder notes provided additional context
Very small base sizes at some items due to routing
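The study ran these analyses in SPSS; for illustration only, equivalent frequencies and crosstabulations could be produced in Python as below, continuing from the cleaned dataset sketched above (column names remain assumptions).

```python
import pandas as pd

# Frequencies of the interviewer behaviour codes.
freq = deduped["interviewer_code"].value_counts()

# Crosstabulation of interviewer behaviour by DI vs non-DI items.
xtab = pd.crosstab(deduped["interviewer_code"], deduped["item_type"])

# Routing leaves very small bases at some items, so check per-item counts first.
base_sizes = deduped.groupby("item").size()
print(base_sizes[base_sizes < 30])   # arbitrary threshold for a "small" base
```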

Advantages and disadvantages of our approach and lessons learned

What worked
CARI:
  - Unobtrusive in nature
  - Minimal impact on interviewers and respondents
Behaviour coding:
  - Able to run statistical analyses
  - Able to draw conclusions
  - Method of coding easier than paper (routing)

What didn't work
CARI:
  - High number of inaudible or hard-to-hear cases (1/3 of respondents)
  - Purchased speakers to help
Next time…
  - Fully re-test microphones
  - Probe respondents' reasons for not giving consent to being recorded

What didn't work
Behaviour coding:
  - Lengthy and costly process
    - Coding (approximately 45 minutes per interview)
    - Data cleaning
  - Over-complex code frame
  - Coding method found cumbersome, limited and error-prone
  - Coder judgement not measured

Next time...
One code frame only
Build in sufficient time for each stage
Clear rationale for behaviour coding
Inter-coder reliability test (Kappa score; see the sketch below)
Adequate sample for uncommon questions
Create a more sophisticated, less error-prone coding system
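For the inter-coder reliability test, Cohen's kappa can be computed from a subsample of exchanges coded independently by two coders. A minimal sketch, assuming the cleaned dataset from the earlier examples and hypothetical coder labels and column names; this is not part of the original study.

```python
# Sketch of an inter-coder reliability check on doubly-coded exchanges.
# Coder labels and column names are assumptions for illustration.
from sklearn.metrics import cohen_kappa_score

a = deduped[deduped["coder"] == "coder_a"].set_index(["serial", "item"])
b = deduped[deduped["coder"] == "coder_b"].set_index(["serial", "item"])

# Pair up the exchanges both coders saw, then compare their interviewer codes.
paired = a.join(b, lsuffix="_a", rsuffix="_b", how="inner")
kappa = cohen_kappa_score(paired["interviewer_code_a"], paired["interviewer_code_b"])
print(f"Cohen's kappa for interviewer codes: {kappa:.2f}")
```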

Discussion & Questions...