1 Best Practice Ideas: Questionnaire Surveys V. 1 March 26, 2009

2 Best Practices: Questionnaire Survey

What are the goals and benefits of a Questionnaire Survey?
- The baseline questionnaire survey follows the Concept Modelling, Threat Ranking, BRAVO and Results Chain process, but comes before the BROP. Hence, it is a key link among (1) the LAP-led development of the Theory of Change, (2) the community-generated and validated Concept Model, and (3) the creation of the BROP.
- The survey can help validate the assumptions about the benefits/barriers needed to ensure behaviour change takes place, and can validate, improve or discount proposed strategies.
- The survey provides data to understand the target audience's and broader community's (1) use of various media and trusted sources of information, and (2) level of understanding of, commitment to, and attitudes towards the environment, which informs decisions about where to place messages and what messages need to be created.
- That data, when compared to post-campaign survey data, also provides a measure of how successful the campaign was in achieving its SMART objectives.

What are the top things great Questionnaire Surveys have?
- A written research plan
- A clear link to the campaign's objectives, so that the data can help in campaign design, test assumptions and be used to measure the campaign's impact
- Questions that are clear, unbiased and appropriate, and that have been pre-tested
- Clear and plentiful instructions for the enumerators written into the questionnaire so that they administer the survey correctly
- Enumerators who are trained in the survey and comfortable working with different levels/types of respondents in the community, and a community that is comfortable working with them
- A sample size that is valid across all the major target audiences (see the sample-size sketch after this slide)
- A control/comparison group
- Objective data interpretation that is not used to validate pre-conceived notions
- Not strictly quantitative questions only, but also good open-ended questions that contextualize the responses and surface underlying issues
- Completed in Survey Pro and reviewed by Rare staff

What are the most common pitfalls?
- Questions are poorly written and don't follow the guidelines set out in the manual; for example, they are biased or ask two questions in a single question
- Questions are too long and complex for the community
- The questionnaire touches on socially sensitive issues, which introduces bias
- The survey sample is not selected using an accepted protocol, giving an unrepresentative sample
- Enumerators don't understand the importance of taking their time and asking all the questions in the same way for every respondent
- There is not enough time to conduct planning, field work, data entry and analysis
- The analysis is conducted as if the survey were the only data that existed, rather than using other data to help interpret the results
- Failure to pre-test questions
- Poor use of Survey Pro (jumbled files, no cross-tabs, no filters, etc.)

Management – what are the signs that a campaign is apt to have a poor Questionnaire Survey?
- The CM has not planned the logistics: how to recruit and supervise GOOD volunteer enumerators, train them, get the completed surveys back, and get the data entered into the computer. They need a plan.
- Not enough volunteer enumerators have been recruited to meet the need for male/female interviewers, the correct ethnic or language groups of the communities, etc. Up to 20 enumerators may be needed, working in male/female pairs.
- There is evidence of questionnaire or sampling bias, questions have not been pre-tested, or the survey has not been reviewed by Rare (allow 1 week for the review).
- The LAP does not see the value of the survey.
- Too many parties are involved in writing the survey, and each introduces their own questions, needs and objectives that don't relate to the campaign. (However, the LAP should be allowed to add some questions, which contributes to their buy-in to the survey process.)
- Wording is not appropriate for the community, questions are biased or unclear, or questions are not tied to campaign needs.
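A minimal sketch of the sample-size check referenced above, assuming Cochran's formula for a proportion with a finite-population correction; the 95% confidence level, ±5% margin of error and population of 3,000 are illustrative assumptions, not figures from Rare's manual:

```python
import math

def cochran_sample_size(population, z=1.96, margin=0.05, p=0.5):
    """Sample size needed to estimate a proportion.

    z=1.96 corresponds to roughly 95% confidence, p=0.5 is the most
    conservative guess for the true proportion, and margin is the
    acceptable error (here +/-5 percentage points). These defaults
    are illustrative assumptions only.
    """
    n0 = (z ** 2) * p * (1 - p) / (margin ** 2)   # size for a very large population
    n = n0 / (1 + (n0 - 1) / population)          # finite-population correction
    return math.ceil(n)

# A hypothetical target audience of 3,000 adults would need about 341
# completed interviews at these settings.
print(cochran_sample_size(3000))
```

Running the same check separately for each major target audience (and for the control/comparison group) quickly shows whether the planned number of enumerators and field days is realistic.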

3 Additional Comments and Ideas

Structuring: Megan's thoughts (minimum of a 3-week process)
1. Survey Planning (where, what communities, logistics, volunteers)
   - What is the sampling frame, and is there a strategy for conducting the field work?
   - Are there enough volunteers of the right type (male/female; ethnic or language group) for the community to be surveyed? Are the volunteers likely to do a good job (unbiased sampling, no faking, etc.)?
2. Review of the survey itself – this is the area with the most need for QA
   - How does it relate back to the objectives, validate assumptions and help design the campaign?
   - Are the questions biased? Are they sharp and clear?
   - Have the questions been pre-tested?
3. Input and Data Analysis
   - Input the data correctly into Survey Pro
   - Interpret it correctly – half full/half empty (see the analysis sketch after this slide)
4. Data Management & Storage – backup, backup … and backup. The nature of these files is to get corrupted. Please upload them to the resource page for each campaign on RarePlanet.

Overlay Surveys with Other Data (Keith) – if there is a belief that the data may be biased, use other sources of data to help confirm data quality. While you can't easily fuse two surveys, one can be used to contextualize another.

Good enumerators:
- Account for site-specific issues, e.g. the role of women: do women need to interview women, and men interview men?
- It may be necessary to have two enumerators working as a team (safer, more official, helps interaction, helps to manage the process, helps avoid cheating), and it may help to have men and women work together.
- The person needs to be willing to take the time to understand the purpose of the survey, and must be well versed in it.
- Youth/college students are often good enumerators because they have energy and enthusiasm, want experience, are willing to talk with anyone, and the work can contribute to their academic marks.
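A minimal sketch of the cross-tab and baseline-vs-post-campaign comparison mentioned above. Rare campaigns enter and analyse data in Survey Pro; this sketch assumes the data has been exported to CSV and uses pandas purely for illustration, and the file and column names (baseline_survey.csv, target_audience, attitude_question, aware_of_regulation) are hypothetical:

```python
import pandas as pd

# Hypothetical CSV exports; the authoritative data lives in Survey Pro.
pre = pd.read_csv("baseline_survey.csv")
post = pd.read_csv("post_campaign_survey.csv")

# Cross-tabulate an attitude question by target audience, shown as
# proportions within each audience (a cross tab with filters, in
# Survey Pro terms, would simply slice the rows first).
crosstab = pd.crosstab(
    pre["target_audience"],
    pre["attitude_question"],
    normalize="index",
)
print(crosstab.round(2))

# Compare one SMART-objective indicator before and after the campaign.
pre_pct = (pre["aware_of_regulation"] == "yes").mean() * 100
post_pct = (post["aware_of_regulation"] == "yes").mean() * 100
print(f"Awareness: {pre_pct:.1f}% at baseline, {post_pct:.1f}% post-campaign")
```

The before/after comparison is only meaningful if both samples were drawn with the same protocol and the questions were asked the same way, which is why the sampling and enumerator-training points on the previous slide matter.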

