
1 Quantitative Methods: Questionnaire Design

2 Research Stages
Stage 1: Research aims
Stage 2: Literature
Stage 3: Research design
Stage 4: Instrumentation
Stage 5: Piloting
Stage 6: Data collection
Stage 7: Data cleaning and data analysis
Stage 8: Research report

3 What is a questionnaire?
A questionnaire is a structured instrument for collecting primary data on populations of interest in applied or theory-based research. A well-designed questionnaire helps respondents provide complete and accurate information.

4 Main types of questionnaires
- Mail/online questionnaires
- Structured interview schedules

5 Methods of administering
- Interview: face to face or telephone
- Mail or other distribution method
- Computer-based

6 Interview/telephone
- Sampling implications
- Time of interview, sample selection
- Minimise interviewer effect
- Standardise the interview schedule (scripted)
- Example: Year_After_9-11.doc

7 Mail or distribution
- Sampling implications
- Not supervised
- Clarity of questions
- Complexity versus simplicity
- Room for comments and problems with the questionnaire

8 Online or computer administered
- Sampling implications
- Not supervised
- Clarity of questions
- Can incorporate complex pathways of questionnaire items, e.g. if the answer is yes, go to Q5; if the answer is no, go to Q7 (a sketch of such routing follows below)
- Room for comments and problems with the questionnaire
- Missing responses checked
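
For computer-administered questionnaires, this branching ("skip logic") can be expressed directly in code. The following is a minimal Python sketch under assumed question IDs and routing rules; it is an illustration, not part of any specific instrument.

```python
# Minimal sketch of skip logic for a computer-administered questionnaire.
# Question IDs, wording and routing rules below are hypothetical examples.

QUESTIONS = {
    "Q4": "Have you skipped any classes in the last two weeks?",
    "Q5": "How many classes did you skip?",
    "Q7": "How often do you arrive late to school?",
}

# Routing table: (question, answer) -> next question to ask.
ROUTES = {
    ("Q4", "yes"): "Q5",
    ("Q4", "no"): "Q7",
}

def next_question(current_id, answer):
    """Return the ID of the next question, or None to continue in order."""
    return ROUTES.get((current_id, answer.lower()))

def administer():
    order = ["Q4", "Q5", "Q7"]
    i = 0
    responses = {}
    while i < len(order):
        qid = order[i]
        answer = input(QUESTIONS[qid] + " ").strip()
        if answer == "":                 # flag missing responses immediately
            print("Please provide an answer.")
            continue
        responses[qid] = answer
        jump = next_question(qid, answer)
        i = order.index(jump) if jump else i + 1
    return responses
```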

9 Steps
1. Establish a table of specifications; panel and revise if necessary
2. Write the questions
   a. Determine the general question content needed to obtain each piece of the desired information
   b. Determine the form of response for each question
   c. Choose the exact question wording
   d. Panel and revise the questions if necessary
3. Prepare the questionnaire layout for printing
   a. Arrange the questions into an effective sequence
   b. Specify the physical characteristics of the questionnaire (paper type, number of questions per page, etc.)
   c. Panel and revise the questions and the whole questionnaire
4. Pre-test and pilot the questionnaire; analyse and revise the questions and the whole questionnaire if needed

10 Steps: Establish a table of specifications
1. Identify the program objectives for which the questionnaire is being developed
2. Operationalise the objectives
3. Identify the population to be addressed
4. Identify the methods of administration
5. Establish the link between research questions, information needed, source of information and methods of data collection
6. Decide on how to measure each variable
7. Establish a table of specifications

11 Identify the program objectives for which the questionnaire is being developed
- What are the general program objectives or research questions?
- What are the specific research questions?
- What are the hypotheses? (Draw a diagram.)

12 Example: PISA Contextual Questionnaire (PISA_Questionnaire_TechnicalReport.pdf)

13 PISA Research Themes (Table 3.1). Examples:
- Student engagement with mathematics
- Mathematics self-efficacy
- Mathematics self-concept
- Mathematics anxiety
- Interest in and enjoyment of mathematics
- Instrumental motivation to learn mathematics
- Study time in mathematics

14 Example report: Student engagement in schools (StudentEngagementOECDWillms.pdf)

15 Determine specific information needed
- List the dimensions and variables that you should measure
- List the specific information you hope to collect

16 Establishing the link between information needed, source of information and methods of data collection
- Information needed / variables to be measured
- Source of information
- Methods of data collection
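
One convenient way to keep this link explicit is to record the table of specifications as structured data. The sketch below is illustrative only; the variables, sources and methods listed are hypothetical examples, not taken from the slides.

```python
# Illustrative table of specifications linking variables, sources and methods.
# The entries below are hypothetical examples.

from dataclasses import dataclass

@dataclass
class SpecRow:
    information_needed: str   # variable to be measured
    source: str               # who can provide the information
    method: str               # how the data will be collected

table_of_specifications = [
    SpecRow("Sense of belonging", "Students", "Self-completion questionnaire"),
    SpecRow("Participation (absence, lateness)", "Students", "Self-completion questionnaire"),
    SpecRow("School climate", "Principals", "Online questionnaire"),
]

for row in table_of_specifications:
    print(f"{row.information_needed:35} | {row.source:12} | {row.method}")
```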

17 Identify the population to be addressed
- Source of information
- Who is appropriate to provide the necessary information?
- Characteristics of the target population

18 Identify the methods of administration
- Think in advance about how the questionnaire will be administered

19 How to measure each variable
- Decide whether the variable is directly observable or latent
- Is the variable measured by one item?
- Is the variable a composite of a number of items (indicators)?

20 Example: Student engagement in schools (see Willms report)
- Define student engagement
- Student engagement is measured by two components: sense of belonging and participation
- What are the indicators of sense of belonging and participation?

21 How to measure each variable: latent variable (construct)
- Is the variable measured through a set of indicators?
- If yes, what are the possible dimensions and/or indicators?
- How is each indicator measured?
- Produce a table showing how to measure each indicator

22 Questionnaire design: Latent variables and indicators
Diagram: a latent variable (construct) is not directly observable; its indicators are directly observable.

23 Questionnaire design: Construct development
- Step 1: Define a meaning for your construct. It should have a narrow focus, capable of sustaining precise measurement.
- Step 2: Develop appropriate items for this construct.
- Step 3: Test the hypothesis that the items do indeed imply the meaning of the construct as defined.
- Step 4: Revise the items. (Barrett, 2002)

24 How to develop indicators (items)
1. Draft the first items
2. Panel the indicators

25 Example: How to measure sense of belonging (draft items)
A. I feel like an outsider (or left out of things)
B. I make friends easily
C. I feel like I belong
D. I feel awkward and out of place
E. Other students seem to like me
F. I feel lonely
G. I do not want to go to school
H. I often feel bored
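
Items like these are usually combined into a single score. A minimal sketch, assuming a 4-point agreement scale and assuming that the negatively worded items (A, D, F) are reverse-coded so a higher score always means a stronger sense of belonging; the response values are made up.

```python
import numpy as np

# One respondent's answers to items A-F on a 4-point scale
# (1 = strongly disagree ... 4 = strongly agree). Values are made up.
responses = {"A": 2, "B": 3, "C": 4, "D": 1, "E": 3, "F": 2}

# Assumption: negatively worded items are reverse-coded.
NEGATIVE_ITEMS = {"A", "D", "F"}   # outsider, awkward, lonely
SCALE_MAX = 4

def belonging_score(answers):
    coded = [
        (SCALE_MAX + 1 - v) if item in NEGATIVE_ITEMS else v
        for item, v in answers.items()
    ]
    return float(np.mean(coded))

print(belonging_score(responses))   # simple mean of the recoded items
```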

26 Example: How to measure participation
Measured by the frequency of absence, class-skipping and late arrival at school during the two weeks prior to the PISA 2000 survey.

27 Panel and review the table of specifications for the program/project
- Do the variables cover all the information needed for the program?
- Do the indicators cover all the dimensions of the variable measured?

28 Data analysis to check the appropriateness of the indicators
Example on student engagement in schools (p. 64, Willms): a factor analysis of the responses found two factors. One is based on the first six items and describes whether students feel accepted and included by their classmates; the second is based primarily on the last two items and describes whether students like school and find it interesting. The analysis also revealed that the six belonging items contributed almost equally to the first factor. Therefore, the measure of sense of belonging used in this report is based on a Rasch scaling of the first six items.
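
An exploratory factor analysis is one quick way to run this kind of check on your own items. The sketch below uses scikit-learn's FactorAnalysis on simulated data; it is not the analysis Willms reports (which used Rasch scaling for the final measure), only an illustration of checking whether the items split into the expected two factors.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)

# Simulated responses to 8 items for 500 students: the first six items
# load on one latent trait ("belonging"), the last two on another
# ("liking school"). Real data would come from the questionnaire.
n = 500
belonging = rng.normal(size=n)
liking = rng.normal(size=n)
X = np.column_stack(
    [belonging + rng.normal(scale=0.6, size=n) for _ in range(6)]
    + [liking + rng.normal(scale=0.6, size=n) for _ in range(2)]
)

# rotation="varimax" requires scikit-learn >= 0.24
fa = FactorAnalysis(n_components=2, rotation="varimax")
fa.fit(X)

# Loadings: rows are factors, columns are items. We expect items 1-6 to
# load mainly on one factor and items 7-8 on the other.
print(np.round(fa.components_, 2))
```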

29 Reliability and Validity

30 Reliability and validity
Diagram of three cases: (A) high validity and reliability, (B) low reliability, (C) high reliability but low validity.

31 Validity
- Validity is the ability of an instrument to measure what it is designed to measure (Smith, 1991).
- Validity refers to the extent to which an empirical measure adequately reflects the real meaning of the concept under consideration (Babbie, 1990: 33).

32 Example: Student engagement in schools (p. 18, Willms)
Participation is measured by the frequency of absence, class-skipping and late arrival at school during the two weeks prior to the PISA 2000 survey. There are two issues concerning the validity of the participation measure. One is that the measure could be more extensive: it was measured in this study with a rather narrow focus on student absenteeism. The second pertains to how participation is measured: a number of students may have missed school because of illness or for other legitimate reasons.

33 Reliability - 1
- Reliability is concerned with how much error is included in the evidence.
- If there is no error in the measurement, the same measurement should be consistent over time and context.
- The reliability of a measure refers to the consistency of measurement for repeated measurements of the same phenomenon (Willms, p. 65).

34 Reliability - 2: Internal consistency reliability
Description: concerned with how well the items act together to elicit a consistent type of response. Often referred to as coefficient alpha.
Limitations: requires statistical procedures to estimate reliability; does not capture sources of error such as variation over time; assumes all items tap into one single dimension.
Usage: important to establish when designing a scale.
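
Coefficient alpha (Cronbach's alpha) can be computed directly from an item-response matrix. A minimal sketch, assuming respondents in rows and the items of one scale in columns; the sample scores are made up.

```python
import numpy as np

def cronbach_alpha(item_scores):
    """Coefficient alpha for a respondents x items score matrix."""
    k = item_scores.shape[1]                         # number of items
    item_vars = item_scores.var(axis=0, ddof=1)      # variance of each item
    total_var = item_scores.sum(axis=1).var(ddof=1)  # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Made-up responses: 6 respondents x 4 items on a 1-5 agreement scale.
scores = np.array([
    [4, 5, 4, 4],
    [2, 2, 3, 2],
    [5, 5, 5, 4],
    [3, 3, 2, 3],
    [4, 4, 4, 5],
    [1, 2, 1, 2],
])
print(round(cronbach_alpha(scores), 2))
```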

35 Reliability example (Willms, p. 65)
The measures of sense of belonging and participation are highly reliable at the country level: the reliability coefficients are 0.99 for both sense of belonging and participation.

36 Writing questions

37 Questionnaire design: Design the items
Issues to consider for each item:
- What information do I want to get? Is it factual or non-factual?
- How do I ask it?
- What kind of responses do I want to get? How do I want the respondents to answer my question (format)?
- How will I code this item? Will I include the coding in the item format? (See the sketch below.)
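
Deciding the coding at the same time as the item wording avoids ambiguity later. Below is a small illustrative sketch of a codebook entry for a single Likert-type item; the item text, codes and missing-value convention are hypothetical.

```python
# Hypothetical codebook entry for one item, decided at design time.
ITEM_Q12 = {
    "text": "I feel like I belong at this school.",
    "format": "Likert, 4 points",
    "codes": {                      # response label -> numeric code
        "Strongly disagree": 1,
        "Disagree": 2,
        "Agree": 3,
        "Strongly agree": 4,
    },
    "missing_code": 9,              # code reserved for omitted responses
}

def code_response(item, label):
    """Map a response label to its numeric code, falling back to the missing code."""
    return item["codes"].get(label, item["missing_code"])

print(code_response(ITEM_Q12, "Agree"))      # 3
print(code_response(ITEM_Q12, ""))           # 9 (missing)
```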

38 Questionnaire design: Measurement types
- Nominal
- Ordinal
- Interval

39 Questionnaire design: Question types
To facilitate question writing, it is important to know the types of questions. Questions can be classified in two ways:
- By response format
- By type of information: factual or non-factual (e.g., attitude)

40 Questionnaire design: Question types, classified by response format
- Closed questions
- Open-ended questions

41 Questionnaire design: Open-ended questions
Open response types, e.g.:
- Explain why you left school.
- What were your reasons for leaving school?

42 Questionnaire design: Open-ended questions, advantages
- People can express their exact opinions and feelings
- Do not limit the range of possible answers
- Can produce responses that draw attention to a situation or outcome not anticipated when constructing the questionnaire
- Useful for testing hypotheses about ideas or awareness

43 Questionnaire design: Open-ended questions, disadvantages
- Difficult and time-consuming to answer (require much effort from respondents)
- Difficult and time-consuming to analyse

44 Questionnaire design: Closed questions
Alternative answers are provided and respondents are asked to choose from the list of provided answers.

45 Questionnaire design: Types of closed questions
- Checklists
- Two-way questions
- Multiple-choice questions
- Ranking scales
- Scaling questions

46 Questionnaire design: Checklists
A checklist is used to verify the presence or absence of some phenomenon.

47 Questionnaire design: Checklists, examples
- Which of these materials did you use?
- Which of these activities did you engage in?
- Which of these are the steps of conducting the project?

48 Questionnaire design: What is a good checklist?
- Contains all the relevant options
- It is helpful to provide an "Other" option for respondents to fill in at the end of a checklist

49 Questionnaire design: Two-way questions
Measure a dichotomous variable: respondents are asked to choose one of two alternatives, e.g. Yes/No, Agree/Disagree, For/Against, Good/Bad, Like/Dislike, Approve/Disapprove.

50 Questionnaire design: Multiple-choice questions
- MCQs are useful when there are several possible responses and you want to ensure that the respondent is aware of all the possibilities.
- Alternatives in an MCQ should be mutually exclusive categories.

51 Questionnaire design: Ranking scales
This format gives you an indication of how a respondent ranks a number of things. It is useful when there is a limited number of things you would like to have ranked.

52 Questionnaire design: Scaling questions
Questions with ratings on a latent scale.

53 Questionnaire design: Advantages of closed questions
- Quicker and easier to answer than open questions
- More questions can be asked in a given length of time
- Can deal with a large number of respondents
- Low cost
- Make group comparisons easy
- Avoid the need for interviewer training

54 Questionnaire design: Disadvantages of closed questions
- Loss of spontaneous responses
- May introduce bias by forcing respondents to choose between given alternatives
- May irritate respondents
- Relatively difficult to design

55 Questionnaire design: Types of questions, classified by type of information
- Factual questions
- Non-factual questions (attitudes, stereotypes, beliefs, awareness)

56 Questionnaire design: Factual questions
- Can be verified
- Single variable
- Relatively easy to design

57 Questionnaire design: Non-factual questions
- Difficult to verify
- Latent variable
- Relatively difficult to design

58 Questionnaire design: Issues to consider when writing factual questions
- Do the respondents have the necessary information to answer the question? (Knowledge, memory.)
- Will the respondents provide the information willingly? (Sensitive issues.)

59 Questionnaire design: Question wording
- Use simple words: "The catalogue system is too difficult for most readers to master" vs. "I can never find the books I want" (more direct, more appealing)
- Avoid acronyms, abbreviations, jargon and technical terms
- Avoid ambiguous words or words with many meanings: "Have you ever assessed your colleagues' teaching?"
- Avoid leading questions: "You haven't skipped any lessons this semester, have you?"

60 Questionnaire design: Question wording
- Avoid double-barrelled questions: "Do you buy weekly and monthly magazines/newspapers?"
- Avoid implicit assumptions: "When did you last borrow a video tape?"; "Did your siblings' decision to leave school influence your decision to leave school?"
- Don't overtax the respondents' memory

61 Questionnaire design: Question wording
- Avoid proverbs or well-known sayings
- Avoid loaded words (heavily value-laden terms): "Do you think union bosses should be allowed so much power?"
- Attitude statements work well when respondents recognise them and the statements force them to think

62 Questionnaire design: Selecting the type of question
Consider:
- The number of respondents
- The amount and types of information needed
- The characteristics of respondents (knowledge, age, culture, religion)
- The amount of time you have to process and interpret the data
- Your knowledge of the issues (the extent to which you can anticipate the range of possible answers)
- Your methods of data analysis

63 Preparing the questionnaire layout; panelling, pre-testing and piloting

64 Questionnaire design: Spacing
Allocate sufficient space for answers. Space requirements should be considered for:
- Open-ended questions
- Scaling questions
- Coding

65 Questionnaire design: Instructions
Three levels: general, section and question instructions.

66 Questionnaire design: General instructions
- Reason(s) for the questionnaire
- A statement about anonymity
- The sample design, to indicate how the respondent was chosen
- How to return the questionnaire, if it is mailed
- A contact person
- What will happen to the results
- Thanks

67 Questionnaire design: Question instructions
- How to answer the questions
- Make sure that the instructions and the questions correspond

68 Questionnaire design: Order of the questions
Very important, but there is no single correct order.

69 Questionnaire design: Suggestions
1. Begin with easy and non-threatening questions
2. Do not begin with open-ended questions
3. Arrange questions from general to specific
4. Group questions into sections or topics
5. Use filter questions to ensure that respondents answer only the questions relevant to them
6. Arrange attitude statements in a more or less random order
7. Keep the questionnaire as short as possible

70 Questionnaire design: Consistency of questionnaire layout
- Try to use a similar format for questions
- Distinguish different instruction levels

71 Questionnaire design: Panelling and reviewing
- Relevance of questions to the topic (check against the table of specifications)
- Wording (instructions, questions)
- Layout: spacing, instructions, order, consistency

72 Questionnaire design: Pre-testing
Test the questionnaire:
- Do the respondents understand the questions?
- Are there any difficulties?
- Are there any sensitive questions?
- Is the question order appropriate?
- Does the researcher understand the respondents' responses?

73 Questionnaire design: Who will be involved in pre-testing?
A very small sample of the target population.

74 Questionnaire design: How to conduct pre-testing
- Step 1: Brief the respondents about the questionnaire
- Step 2: Record the respondents' process of completing the questionnaire, through observation, video recording or audio recording, to find signs of difficulty or distraction and to time the questionnaire

75 Questionnaire design: How to conduct pre-testing (continued)
- Step 3: Debrief the respondents about the questions in the questionnaire: Any difficulties? Why? Which questions were easy? Why? Any suggestions for improvement?
- Step 4: Revise the questions if needed

76 Questionnaire design: Pilot
Test the whole process:
- Questionnaire
- Methods of administering and collecting the questionnaires
- Response rate/missing data
- Item analysis
- Data analysis

77 Questionnaire design: Analysis
- Look at the frequency of options in each question: too many "uncertain" or "don't know" responses, or too many skipped or omitted items, are bad signs in a pilot study (see the sketch below)
- Check the reliability of the scales constructed
- Decide whether to remove or replace items of scales (for measuring latent variables)
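
This frequency and missing-data check can be scripted. A minimal sketch, assuming the pilot responses sit in a pandas DataFrame with one column per item; the item names and answers are hypothetical.

```python
import pandas as pd

# Hypothetical pilot data: 5 respondents, 3 items; None marks omitted answers.
pilot = pd.DataFrame({
    "Q1": ["Agree", "Agree", "Don't know", "Disagree", None],
    "Q2": ["Yes", "No", "Yes", None, None],
    "Q3": ["Agree", "Don't know", "Don't know", "Agree", "Agree"],
})

for item in pilot.columns:
    counts = pilot[item].value_counts(dropna=False)    # option frequencies
    missing_rate = pilot[item].isna().mean()           # share of omitted answers
    dk_rate = (pilot[item] == "Don't know").mean()     # share of "don't know"
    print(f"{item}: missing {missing_rate:.0%}, don't know {dk_rate:.0%}")
    print(counts.to_string(), "\n")
```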

78 Questionnaire design: Revise and prepare the final version
