Presentation on theme: "Nathan Lindsay & Larry Bunce February 19, 2014"— Presentation transcript:
1 "Developing Surveys to Measure Student Satisfaction and Learning Outcomes"
Nathan Lindsay & Larry Bunce, February 19, 2014
Adapted from presentations given by Jennie Gambach, Assistant Director of Assessment Programs at Campus Labs; Annemieke Rice, Director of Campus Success for Campus Labs; and Amy Feder, Assessment Consultant at Campus Labs.
2 There are many types of surveys to consider…
- Satisfaction surveys
- Learning outcome surveys
- Needs assessments
- Exit surveys
- Alumni surveys
- User surveys
- Non-user surveys
- Student/faculty/staff/general public surveys
- Other?
3 Steps in survey design
1. Outline topic(s) and draft items
2. Choose response formats
3. Write and edit items
4. Determine item sequence
5. Determine physical characteristics of survey
6. Review and revise survey
7. Pilot test survey and revise
5 Begin with the end in mind…
- What do you want/need to show?
- Why do you need to show it?
- Who is the source of your data?
- How will you use the data?
- Who will need to see results?
6 The purpose of this assessment is…
- To better understand what the needs of our veteran students are, and how the new Veteran's Center can meet them
- To evaluate whether students achieved the stated learning outcomes of our workshop, and what additional training needs they have
- To demonstrate to stakeholders the impact that living in the residence halls has on student development
- To assess student awareness of services in order to develop our marketing and communications plan
7 D E S I G N
D – Determine your purpose
E – Examine past assessments
S
I
G
N
8 Examine past assessments
"Those who cannot remember the past are condemned to repeat it." – George Santayana, philosopher, 1905
- Did you use the data? If not, what kept you from examining it?
- If you used the data…
  - What was useful?
  - Was any of the data difficult to analyze?
  - Were there questions you wished you had asked?
  - Did any question wording make you unsure of what the data meant?
  - What feedback did you receive from those who participated?
9 D E S I G N
D – Determine your purpose
E – Examine past assessments
S – Select the appropriate method
I
G
N
- What type of data do you need?
- Has someone already collected the information you are looking to gather?
- (How) can you access the existing data?
- (How) can you use the existing data?
- Is there potential for collaboration with another individual, program, or department?
- How can you best collect this data?
10 Select an appropriate method
- Indirect vs. direct
- Quantitative vs. qualitative
- Formative vs. summative
- Population vs. sample
11 Quantitative vs. Qualitative
Quantitative:
- Focus on numbers/numeric values
- Who, what, where, when
- Match with outcomes about knowledge and comprehension (define, classify, recall, recognize)
- Allows for measurement of variables
- Uses statistical data analysis
- May generalize to the greater population with larger samples
- Easily replicated
Qualitative:
- Focus on text/narrative from respondents
- Why, how
- Match with outcomes about application, analysis, synthesis, and evaluation
- Seeks to explain and understand
- Ability to capture "elusive" evidence of student learning and development
12 Sampling
- Population: the whole group. Example: survey goes to the entire campus. If surveying the entire campus, use sparingly and coordinate with Institutional Research.
- Sample: a subsection of that group. Example: survey goes to 30% of campus.
Sampling is a way to obtain information about a large group by examining a smaller selection (the sample) of group members. If the sampling is conducted correctly, the results will be representative of the group as a whole; it is therefore just as effective as sending your assessment to the whole group, with the added benefit of limiting the number of surveys individual students receive.
13 Sampling strategies
- Simple random sample: gives everyone in the sampling population an equal chance of selection; a probability sample. Example: asking your registrar's office to give you a random sample.
- Stratified random sample: breaks the total sample into subpopulations and then selects randomly from each stratum. Example: splitting a demographic file by academic year to randomly choose from each year (an often-used strategy for our Consortium projects). This can be done by any number of demographic characteristics depending on your needs.
- Systematic sample: Example: from a list of 300 people, start with the 5th person and use every 3rd person after that.
The type of sample you need depends on your particular assessment project. Your assessment consultant at Campus Labs is happy to give you advice on sampling, and there are often in-house experts in research-oriented academic departments or Institutional Research who can support you if you need assistance in this area.
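The three strategies above can be sketched in a few lines of Python. This is a minimal illustration only; the 300-person roster and its class-year field are invented for the example:

```python
import random

random.seed(1)  # fixed seed so the illustration is reproducible

# Hypothetical roster: 300 students tagged with a class year
roster = [{"id": i, "year": ["FR", "SO", "JR", "SR"][i % 4]} for i in range(300)]

# Simple random sample: everyone has an equal chance of selection
simple = random.sample(roster, 30)

# Stratified random sample: split by class year, then draw randomly
# from each stratum
strata = {}
for student in roster:
    strata.setdefault(student["year"], []).append(student)
stratified = [s for group in strata.values() for s in random.sample(group, 8)]

# Systematic sample: start with the 5th person, then every 3rd after that
systematic = roster[4::3]
```

In practice the roster would come from the registrar or Institutional Research rather than being generated in code.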
14 Sample suggestions
Number of Students in Population | Random Sample Size (suggestion)
1,000 | 278
500 | 217
350 | 184
200 | 132
100 | 80
50 | 44
Based on a 5% margin of error. Suggestions from Assessing Student Learning by Linda Suskie.
Sample size is the desired number of respondents, NOT the number of individuals invited to participate. As we all know, if you send a survey to 278 students, it's not very likely that all 278 will respond. For this reason, if your population is relatively small, sampling may be less of a factor, and the number of participants you give the survey to will vary depending on the method of administration. For example, if you have a population of 100 student leaders and you give them an assessment in person at the first student leadership conference of the year, which they are all required to attend, then just giving the assessment to 80 individuals might work. On the other hand, if you are emailing the survey to them after the fact and you know your survey response rate is usually only around 50%, you are probably going to send the survey to everyone in that population.
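The table's numbers can be approximately reproduced with the standard sample-size formula at a 5% margin of error (95% confidence, maximum variability), using the finite-population correction. This function is an illustration, not Suskie's own calculation; its results agree with the table to within one respondent:

```python
def sample_size(population, margin=0.05, z=1.96, p=0.5):
    """Suggested number of respondents for a finite population."""
    n0 = (z ** 2) * p * (1 - p) / margin ** 2   # infinite-population estimate
    n = n0 / (1 + (n0 - 1) / population)        # finite-population correction
    return round(n)

for n in (1000, 500, 350, 200, 100, 50):
    print(n, sample_size(n))   # close to the table: 278, 217, 183, 132, 80, 44
```

Remember that these are desired respondent counts; you will usually need to invite more people than this.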
15 Direct Methods vs. Indirect Methods
- Direct methods: any process employed to gather data which requires subjects to display their knowledge, behavior, or thought processes.
- Indirect methods: any process employed to gather data which asks subjects to reflect upon their knowledge, behaviors, or thought processes.
Indirect example: "I know where to go on campus if I have questions about which courses to register for in the fall." (Strongly agree / Moderately agree / Neither agree nor disagree / Moderately disagree / Strongly disagree)
Direct example: "Where on campus would you go, or who would you consult with, if you had questions about which courses to register for in the fall?"
16 Formative vs. Summative
Formative:
- Conducted during the program
- Purpose is to provide feedback
- Used to shape, modify, or improve the program
Summative:
- Conducted after the program
- Makes a judgment on quality or worth, or compares to a standard
- Can be incorporated into future plans
Interaction: In our example, what do we need? Talk about pros/cons.
17 Is a survey right for you?
Pros:
- Can include large numbers of respondents
- Relatively fast and easy to collect data
- Lots of resources available
- Requires minimal resources
- Fast to analyze
- Good for surface-level or basic data
Cons:
- Survey fatigue and response rates
- Non-responsive
- Limited in the types of questions that can be asked
- Lacks depth in data
- Requires skill in both designing questions and analyzing data properly
18 Focus Groups
Group discussions where the facilitator supplies the topics and monitors the discussion. The purpose is to gather information about a specific (or focused) topic in a group environment, allowing for discussion and interaction by participants. Similar to interviews, but use them when the group interaction will contribute to a richer conversation. Really about getting the group talking amongst themselves.
19 Is a focus group right for you?
Pros:
- Helps to understand perceptions, beliefs, and thought processes
- Small number of participants
- Encourages group interaction and building upon ideas
- Responsive in nature
- Relatively low costs involved
Cons:
- Getting participants (think of times/places)
- Data collection and analysis take time
- Data is only as good as the facilitator
- Beware of bias in analysis and reporting
- Meant to tell a story; may not help if numbers are needed
- Data is not meant to be generalizable
20 Quick, 1-Minute Assessments
- On a notecard, write a real-world example of how you can apply what you learned.
- Pass an envelope containing notecards with quiz questions. Students pick one and have 60 seconds to answer and pass it along.
- At the end of a workshop, ask students to write down one thing they learned and one lingering question.
21 Is a quick assessment right for you?
Pros:
- Provides a quick summary of takeaways from the student perspective
- Quickly identifies areas of weakness and strength for formative assessment
- Can track changes over time (short-term)
- Non-verbal (provides classroom feedback from all students)
- Captures the student voice
- Short time commitment
- Provides immediate feedback
Cons:
- Non-responsive
- Short (so you may lose specifics)
- Sometimes hard to interpret
- Needs very specific prompts in order to get "good" data
- Must plan logistics ahead of time and leave time during the program/course
- May need to be collected over time
22 Mixed Methods
- Look for the same results across multiple data collections
- Build upon or relate results from one assessment to another
- Use data from one method (e.g., a survey) to inform another method (e.g., a focus group)
- Able to increase the scope, number, and types of questions
23 D E S I G N
D – Determine your purpose
E – Examine past assessments
S – Select the appropriate method
I – Identify ethical/logistical considerations
G
N
24 Identify ethical/logistical considerations
- Do you have the necessary resources and brain power?
- Do you need to go through the IRB?
- Do you need to identify respondents for follow-up, merging of data, tracking of cohorts, or pre/post analysis?
- Do you need to include demographic questions to drill down or separate data?
- Who needs to be involved at the planning stage to avoid problems when results are in? Does anyone need to approve the project?
- Are there any political issues to be aware of?
25 D E S I G N
D – Determine your purpose
E – Examine past assessments
S – Select the appropriate method
I – Identify ethical/logistical considerations
G – Generate the best question and answer format
N
26 What to consider
- Scales that match
- Mutually exclusive answer choices
- Exhaustive answer choices
- Neutral/not applicable/non-response options, e.g.: Choose not to respond; Don't know; Not applicable; Unable to judge; No opinion; Neutral; Neither ___ nor ___
27 Pairing Question Text with Answer Choices
Question text should be compatible with the answer choices.
Example questions:
- "How satisfied were you with the following?"
- "Did you enjoy the Black History Month speaker?"
Example answer scales:
- Strongly agree / Somewhat agree / Somewhat disagree / Strongly disagree
- Excellent / Good / Fair / Poor
Example items: Meals at the conference; Location of the conference; Date of the conference
28 Mutually Exclusive Answer Choices
Response options should never overlap.
e.g., "How many hours per week do you work?" with overlapping options: 0-10, 10-20, 20-30, 30-40
Response options should exist independently of one another.
e.g., "Which of the following statements describes your peer mentor?"
- He/she is helpful and supportive
- He/she is difficult to get a hold of
29 Exhaustive Answer Choices
Respondents should always be able to choose an answer.
e.g., "How often do you use the University website?" with only: Daily; 2-3 times a week; Weekly; Monthly (no option for those who use it less often or never)
30 Non-response options
Always consider a non-response option: Choose not to respond; Don't know; Not applicable; Unable to judge; No opinion; Neutral; Neither ___ nor ___
Customize the non-response option when possible.
e.g., "How would you rate the leadership session?" Excellent / Good / Fair / Poor / Did not attend
31 Pitfalls to avoid
- Socially desirable responding – responses based on social norms. Can never be eliminated; consider sensitive topics like race, drug and alcohol use, sexual activity, and other areas with clear social expectations.
- Leading questions – suggesting there is a correct answer. e.g., "Why would it be good to eliminate smoking on campus?"
- Double-barreled questions – asking more than one question at once. e.g., "What were the strengths and weaknesses of orientation?"
- Double negatives – negative phrasing that makes responding difficult. e.g., "I do not feel welcome in my residence hall."
32 Response Formats
Open-ended responses:
- Free response (text)
- Numeric
- Yes/No with "please explain"
Types of multiple choice responses:
- Yes/No
- Single response
- Multiple response (e.g., check all that apply, select 3)
- Ranking
- Scales
33 Yes/No
When to use:
- There is no response between "Yes" and "No". e.g., "Have you ever lived on campus?"
- You consciously want to force a choice even if other options might exist. e.g., "Would you visit the Health Center again?"
When not to use:
- There could be a range of responses. e.g., "Was the staff meeting helpful?"
34 Single response
When to use:
- All respondents would have only one response. e.g., "What is your class year?"
- You consciously want to force only one response. e.g., "What is the most important factor for improving the Rec Center?"
When not to use:
- More than one response could apply to respondents. e.g., "Why didn't you talk to your RA about your concern?"
35 Multiple response
Options: "Check all that apply" or "Select (N)"
When to use:
- More than one answer choice might be applicable. e.g., "How did you hear about the Cultural Dinner? (Check all that apply)"
- You want to limit/force a certain number of responses. e.g., "What were your primary reasons for attending? (Select up to 3)"
When not to use:
- It's important for respondents to be associated with only one response. e.g., "What is your race/ethnicity?"
36 Ranking
When to use:
- You want to see the importance of items relative to one another. e.g., "Please rank how important the following amenities are to you in your residence (1 = most important)."
- You are prepared to do the analysis and interpretation!
When not to use:
- You want to see the individual importance of each item. e.g., "How important are the following amenities to you?"
37 Scales
When to use:
- You want to capture a range of responses. e.g., "How satisfied were you with your meeting?"
- You would like statistics. e.g., 4 = strongly agree, 3 = agree, 2 = disagree, 1 = strongly disagree
When not to use:
- The question is truly a Yes/No question. e.g., "My mother has a college degree."
38 Scales
Bipolar – positive and negative (with or without a midpoint): Very safe / Somewhat safe / Somewhat unsafe / Very unsafe
Unipolar – no negative: A great deal / Considerably / Moderately / Slightly / Not at all
Consider:
- Number of points
- Inclusion of a neutral option
- Whether labels are needed
- Order (e.g., 1, 2, 3, 4, 5 or 5, 4, 3, 2, 1)
40 D E S I G N
D – Determine your purpose
E – Examine past assessments
S – Select the appropriate method
I – Identify ethical/logistical considerations
G – Generate the best question and answer format
N – Note the purpose for each data point
41 Note the reason for each data point
- Bubble next to each question
- Compare against your purpose to identify gaps
- Look for overlap
- Eliminate "nice to know"
- Helps with ordering
- Retain for the analysis step
42 DATA COLLECTION METHODS
After thinking through these pre-considerations, the next step is choosing how to collect the data.
43 Paper surveys
Things to consider:
- Captive audience
- Administrator available for questions
- No technology issues or benefits
- Data entry necessary
With all the technology available to us, it's easy to overlook paper surveys, but sometimes they can be a really great option. If you have a captive audience (perhaps students taking your class, employers at a career fair, or students attending a workshop presentation), then passing out a paper survey can really boost your response rate. Other benefits are that you don't need to worry about technology issues, and you will often be there if there are any questions. In terms of cons, it's not as eco-friendly, and the big detriment is that data entry will be necessary; however, we try to make this as easy as possible for you in the Baseline system.
44 Web surveys
Things to consider:
- No data entry
- Technology issues and benefits
- Immediate results
- Can be anonymous or identified
- Not a captive audience
As opposed to paper or mobile surveys, you don't necessarily have a captive audience, but you get the benefit of having the data automatically entered into the system, and responses can be either anonymous or identified, as we discussed earlier.
45 Data Collection Methods
Web
- Pros: No data entry; accuracy is excellent; technology benefits (e.g., display rules, required questions); immediate results; anonymous
- Cons: Audience is not usually captive; possible misinterpretation (can't ask questions); technology issues; response sample may be unrepresentative
Mobile
- Pros: Accuracy is good; captive audience; administrator is available for questions
- Cons: Limited formatting; anonymity is questionable
Paper
- Pros: No technology issues
- Cons: No benefits of technology; accuracy can be compromised; data entry necessary
46 SURVEY FATIGUE
So let's talk about survey fatigue, because everything I've said about survey administration so far doesn't mean much if no one agrees to take your survey.
47 General information
- Survey response rates have been falling
- It is difficult to contact people
- Refusals to participate are increasing
Strategies for correcting low response rates:
- Weight the data for non-response
- Implement strategies to increase response rates
We know that survey response rates have been falling over the past decades. You can correct for low response rates after data has been collected by weighting the data for non-response, but that requires some sophisticated statistical analysis, so it's generally better to implement strategies that increase responses up front.
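Weighting for non-response can be sketched as simple post-stratification: each respondent group receives a weight of (known population share) / (observed respondent share). The group names, shares, and counts below are invented purely for illustration:

```python
# Known population shares (e.g., from Institutional Research records)
population_share = {"on_campus": 0.40, "off_campus": 0.60}

# Hypothetical respondent counts from the survey
respondents = {"on_campus": 55, "off_campus": 45}
total = sum(respondents.values())

# Post-stratification weight: population share / respondent share
weights = {
    group: population_share[group] / (count / total)
    for group, count in respondents.items()
}
# On-campus students over-responded here, so their answers are weighted
# down (weight < 1); off-campus answers are weighted up (weight > 1).
```

Real weighting schemes get more sophisticated (multiple demographic cells, trimming extreme weights), which is why the slide recommends statistical help for this route.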
48 Non-response may not be random
A correlation exists between demographic characteristics and survey response. Higher response has been found among certain sub-populations:
- Women
- Caucasians
- Students of high academic ability
- Students living on campus
- Math or science majors
The research here is inconsistent, but some studies suggest a correlation between demographic characteristics and survey response.
49 IMPROVING RESPONSE RATES
So from what we know of the research and theories of survey response, what can we do to improve response rates?
50 Specific techniques
- Survey length
- Preannouncement
- Invitation text
- Reminders
- Timing of administration
- Incentives
- Confidentiality statements
- Salience
- Request for help
- Sponsorship
- Deadlines
We'll go through a number of these techniques, from moderating survey length to including deadlines.
51 Survey length
Greater attrition at 22 questions or 13 minutes.
What to consider:
- Excluding "nice to know"
- Eliminating what you already know
- Outlining how results will be used
- Number of open-ended questions
- Number of required questions
As a caveat, this is very dependent on the individual assessment and campus in question, but looking holistically, research has shown greater attrition around the 22-question or 13-minute mark. This isn't an either/or scenario: a 22-question survey that is half open-ended questions could take someone 30 minutes to complete, and you will still experience greater attrition. If you are looking to cut down the length of your survey, a really good exercise is to go through each question one by one and outline how you plan to use the results, so that you can cut the questions that are non-essential. You can also limit the number of open-ended questions and the number of required questions, if it is more important for you to have a student get through to the end, even if that means they skip a couple of questions.
52 Invitations
- Importance/purpose
- Relevancy to the respondent
- Request for help
- How and by whom results are used
- How long it will take to respond
- Deadline
- Incentives/compensation
- Contact information
53 Timing of contact/administration
- Avoid busy times or holidays
- Send an email preannouncement 2-3 days prior to the survey mailing
- The first half of the semester/term may be better if you are surveying in an academic environment
In general, it is best to avoid both the extremely busy times and the slower holiday times when students might not be on campus or checking their email. We have also found that the first half of the semester is usually more effective in the academic environment, which makes sense, because that is before things get too hectic. This is where it is really important to know your own campus culture, however, because the best time to send surveys does vary; for example, we know one campus where sending assessments in the last week after finals, but before graduation, has worked pretty well.
54 Piloting
1. Take it as if you were a respondent
2. Seek reviews from colleagues with no prior knowledge
3. Administer to a sample of the actual population being studied, gathering feedback via:
   - A focus group
   - Questions at the end of the survey
   - Observing
55 Reliability & Validity
Reliability – yielding the same results repeatedly
- Test/re-test – consistency over time
- Inter-rater – consistency between people
Validity – accurately measuring a concept
- Internal – confidence that results are due to the independent variable
- External – results can be generalized
- Face validity – does this seem like a good measure?
If a survey is valid, it is almost always reliable!
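Test/re-test reliability is commonly quantified as the correlation between two administrations of the same instrument: the closer the correlation is to 1, the more stable the measure. A minimal sketch, with the eight respondents' 1-5 ratings invented for the example:

```python
def pearson_r(xs, ys):
    """Pearson correlation between two equal-length score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Hypothetical ratings from the same eight students, two weeks apart
time1 = [4, 5, 3, 4, 2, 5, 3, 4]
time2 = [4, 4, 3, 5, 2, 5, 3, 4]

r = pearson_r(time1, time2)   # close to 1 suggests a stable measure
```

Inter-rater reliability can be estimated the same way, correlating two raters' scores of the same set of responses.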
59 Some Final Advice
- Google your area to see what other surveys have been conducted
- Contact Larry Bunce (Director of Institutional Research) or Nathan Lindsay for help in designing your survey
- Online surveys should be coordinated through the Office of Institutional Research
60 QUESTIONS?
Nathan Lindsay, Assistant Vice Provost for Assessment, University of Missouri-Kansas City