1 Off-the-Shelf or Homegrown? Selecting the Appropriate Type of Survey for Your Assessment Needs
Jennifer R. Keup, Director, National Resource Center for The First-Year Experience and Students in Transition

2 "Institutional data are meaningless without a comparison group."

3 "My institution is unique in its programs and goals."

4 "The main outcome of interest on my campus is student development."

5 Goals for Today
- Introduce & discuss the Ory (1994) model for comparing and contrasting local vs. commercially developed instruments
- Identify elements of your institutional culture & structure that would influence the decision
- Discuss myths with respect to survey administration
- Share examples of: the most prominent national surveys for first-year assessment; software and services available to facilitate institutional assessment

6 What Do We Mean?
"Off-the-Shelf": Often commercially developed; scope includes multiple institutions; primarily pre-set content. Examples: CIRP, NSSE, EBI.
"Homegrown": Developed locally; focused on the institution; content developed and adapted by the campus/unit. Examples: program review, utilization/satisfaction surveys for specific programs.
The two types form a continuum rather than a strict dichotomy.

7 Questions to Ask
- What is my budget?
- What is my timeline?
- What are my analytical capabilities?
- Who needs to see these data?
- How will this fit with my other responsibilities?
- Who needs to make decisions with these data?

8 Ory (1994) Model for Comparing and Contrasting Local vs. Commercially Developed Instruments

9 Six-Factor Comparison
- Purpose
- Match
- Logistics*
- Institutional Acceptance
- Quality
- Respondent Motivation to Return the Instrument

10 Purpose
Why are we doing this study, and how will the results be used?
Off-the-Shelf: Allows comparison to a national norm group. Examples: comparison to a peer or aspirant group; benchmarking; contextualizing a broad higher education issue or mandate.
Homegrown: Allows thorough diagnostic coverage of local goals and interests. Examples: satisfaction with a campus program; achievement of departmental goals; program review.

11 Match
Off-the-Shelf: May provide incomplete coverage of local goals & content.
Homegrown: Tailored to local goals and content. Local questions!
- What are the program/institutional goals, outcomes, and areas of interest?
- Does an existing instrument meet my needs?
- Does the survey address my purpose?
- Can the existing instrument be adapted to meet my needs?

12 Institutional Acceptance
Off-the-Shelf: Professional quality & national use may enhance acceptance; failure to completely cover local goals and content may inhibit acceptance.
Homegrown: Local development can encourage local ownership and acceptance; concerns about quality may interfere with acceptance.
- How will the results be received by the intended audience?
- Who needs to make decisions with these data?
- What is the assessment culture? (Politics, IRB)

13 Quality
Off-the-Shelf: Tends to have better psychometrics; professional quality may compensate for incomplete coverage of local goals and objectives.
Homegrown: Must fully test psychometric properties; must create a professional appearance; lack of professional quality may affect results and institutional acceptance.
- What is the track record of quality?
- What is the psychometric soundness of the instrument?
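The point that a homegrown instrument "must fully test psychometric properties" can be made concrete: one common internal-consistency check for a multi-item scale is Cronbach's alpha, which a campus can compute on pilot data before full administration. A minimal sketch in Python (the pilot responses below are illustrative, not from the presentation):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a matrix of scores shaped
    (n_respondents, n_items)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

# Hypothetical pilot data: 5 respondents x 4 Likert-scale items
pilot = [[4, 5, 4, 5],
         [2, 3, 2, 2],
         [5, 5, 4, 4],
         [3, 3, 3, 4],
         [1, 2, 2, 1]]

print(round(cronbach_alpha(pilot), 2))  # prints 0.96
```

A conventional (though debated) rule of thumb is that alpha of roughly 0.7 or higher suggests acceptable internal consistency for a scale; established off-the-shelf instruments typically publish such statistics, which is exactly the "track record of quality" the slide asks about.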

14 Respondent Motivation to Return the Instrument
Off-the-Shelf: Can create instant credibility; sometimes provides institutional or individual incentives.
Homegrown: Local specificity may yield greater respondent "buy-in"; local instruments may not "impress" people; can create a student perception of immediate impact.
- What will yield the highest response rate? (Incentives)

15 Logistics (10 Considerations)
- Availability
- Preparation time
- Expertise
- Cost
- Scoring
- Testing time
- Test & question types
- Ease of administration
- Availability of norms
- Reporting
The devil is in the details!

16 Logistics (continued)
Does a survey currently exist for our needs?
Availability: Off-the-Shelf: if you can afford it, the survey is available. Homegrown: "If you build it they (i.e., data) will come," but it takes time & resources to develop.
Preparation time: Off-the-Shelf: short. Homegrown: can take considerable time.
- What is the survey timeline? Is it feasible? Have you considered administration planning?

17 Logistics (continued)
Expertise: Off-the-Shelf: a fully developed protocol allows one to administer after reading the manual. Homegrown: takes content, measurement, and administrative experience (psychometrics!).
Scoring: Off-the-Shelf: can be delayed if scoring is done off campus; must adhere to the administration cycle. Homegrown: can be immediate; related to expertise.

18 Logistics (continued)
Testing time: Off-the-Shelf: fixed based upon content and administration protocol. Homegrown: flexible, as long as the survey meets institutional & programmatic needs.
Test type: Off-the-Shelf: type of test and questions are predetermined. Homegrown: allows flexibility in type of test (objective/open-ended) and type of question (multiple choice, rank ordering, etc.).
- If administering in class, do you have faculty buy-in?

19 Logistics (continued)
Ease of administration: Off-the-Shelf: requires standardized administration and special training for testers. Homegrown: allows greater flexibility. (IRB)
Norms: Off-the-Shelf: national & inter-institutional comparison. Homegrown: intra-institutional comparison.
Reporting: Off-the-Shelf: standard formats that don't always relate to the institution. Homegrown: institutional tailoring of results and reporting.

20 Logistics (continued)
Cost, Off-the-Shelf: primary costs are the purchase price; other costs include scoring, data, specialized reporting, and human resources to coordinate campus administration. This is a recurring cost.
Cost, Homegrown: primary costs are development costs: instrument development, ensuring psychometric properties, scoring & recording data, and reporting findings; other costs include software/hardware and training. This is primarily a one-time investment.

21 [Diagram: the six factors — Purpose, Match, Logistics, Acceptance, Response, Quality]

22 Off-the-Shelf vs. Homegrown Myths
- You can only gather comparison data from national (off-the-shelf) surveys.
- It is cheaper to develop and administer a homegrown survey.
- Off-the-shelf surveys don't require any work.
- Homegrown surveys are hard.
- You don't need IRB approval for local assessment.
- Off-the-shelf surveys study all the important topics.

23 FYE Assessment Examples
Off-the-Shelf: CIRP Freshman Survey; Your First College Year (YFCY) Survey; NSSE; Educational Benchmarking Incorporated (EBI).
Homegrown: Services: Eduventures, Student Voice. Software: Zoomerang, Survey Monkey.

24 Continuum of Assessment
From off-the-shelf to homegrown: CIRP, NSSE, EBI, Eduventures, Student Voice, Survey Monkey, Zoomerang.
