Student Opinion Survey
It's required… how do we make meaning out of it? How can results be the start of action?

This is radical: we need to decide in advance what we want from the SOS, not just be unhappy (or happy) with what we get.

A possible first step
Find a partner: a functional office like Student Life or Academic Services.

How in the heck do we do that?
–Identify areas of concern.
–Look at other assessments: what makes us anxious?
–Look at data from past SOS administrations: what do we not want to share?
–What in the environment/atmosphere/milieu keeps us up at night?

Be strategic with local questions
–Look at all the opportunities for local questions.
–In a given time frame, keep the local questions largely the same.
–Compare from one survey to the next (a sketch of this comparison follows).
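
As an illustration of that comparison (not from the slides), here is a minimal sketch assuming each administration's local-question results live in a CSV with hypothetical columns question, year, and mean_rating:

```python
import pandas as pd

# Hypothetical file and column names, for illustration only.
results = pd.read_csv("sos_local_questions.csv")  # columns: question, year, mean_rating

# One row per local question, one column per administration (year).
by_year = results.pivot_table(index="question", columns="year", values="mean_rating")

# Change from the previous administration to the most recent one.
by_year["change"] = by_year.iloc[:, -1] - by_year.iloc[:, -2]
print(by_year.sort_values("change"))
```

Keeping the local questions largely the same is what makes this kind of year-over-year comparison meaningful at all.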

Four Phases
1. Prep by IR
–Decide local questions
–Decide areas of interest
2. Decide on the meaning of the data
–To what will you respond?
–How will you report?
3. Dissemination of the data
–Who will see the data?
–When?
–In what format?
4. Take action
–Who gets to say "Houston, we have a problem"?
–Who gets to say "This is what we'll do to respond"?

Prep Respondents
Yes, have a campaign:
–"In 2006, you said X, Y, Z."
–"We did A, B, C to address them."
–Respondents want to know that what they said made a difference.
–Respondents are volunteers, and the best volunteers are prepared for the task.

Prep Respondents
–In the very few areas you are specifically interested in, explain to respondents what the content of an item means.
–Give them a rubric: what does "Very Satisfied" mean? What does "Very Dissatisfied" mean?

Now Here’s a Rubric!
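
As a purely hypothetical illustration (the wording below is assumed, not from the original slide), endpoint definitions for the scale might be written out explicitly:

```python
# Hypothetical rubric: the wording below is illustrative, not from the slide.
RUBRIC = {
    "Very Satisfied": "The office consistently met my needs; "
                      "I would recommend it to other students.",
    "Very Dissatisfied": "The office consistently failed to meet my needs; "
                         "I would warn other students away.",
}

# A rubric like this can be printed alongside the survey instructions so that
# every respondent applies the scale endpoints the same way.
```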

Prep Ourselves
–To what will we pay attention? Decide in advance.
–Keep it to a small number (yes, that means everything is not #1).
–Create clusters of questions around these areas, using local questions.

Prep Ourselves
–It's not just an IR job; it needs a high level of involvement.
–To whom will you report results? Those are the people and offices you need to work with before the administration.
–Urge these folks to commit to action.

Build an Appetite for the Data
–Who are the natural customers of SOS data?
–What is the best way to provide the data to them?
–How do we get feedback about what they did in response to the data?

Prep Ourselves
–Conduct focus groups with students a year before the survey administration.
–Focus on the areas with which you are concerned, or the areas in which you were unhappy with the results.

Make Meaning via Triangulation of the Data
–Results from the specific administration
–Comparisons of means to past administrations: trends
–Sector rankings
–Always display data in context (a sketch of this triangulation follows).
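
A minimal sketch of that triangulation, with hypothetical numbers and names (nothing here is from the actual SOS data):

```python
import pandas as pd

# Hypothetical inputs, for illustration only.
own = pd.DataFrame({
    "year": [2000, 2002, 2004, 2006],
    "advising_mean": [3.4, 3.5, 3.3, 3.2],
})
sector = pd.Series(  # peer campuses' 2006 means on the same item
    {"Campus A": 3.9, "Campus B": 3.7, "Us": 3.2, "Campus C": 3.5}
)

current = own["advising_mean"].iloc[-1]     # this administration's result
trend = own["advising_mean"].diff().mean()  # average change across administrations
rank = sector.rank(ascending=False)["Us"]   # sector ranking (1 = best)

print(f"current={current}, avg change={trend:.2f}, sector rank={rank:.0f} of {len(sector)}")
```

No single one of these numbers means much on its own; reporting all three together is what "display data in context" looks like in practice.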

Understand the Limitations of the Instrument
–The SOS is a once-over-lightly survey of many items.
–Most content areas have one question.
–It is administered broadly in a semi-controlled environment.
–It points us to more research, or to action where general perceptions are validated.
–It is neither everything nor nothing.

Everything is Relative (thanks, Mr. Einstein)
–If we are doing 'badly' in areas that are of top priority to us, then we need action.
–If the results are mixed, then we need to find out more information.
–If the area is not in our top 10, then 'who cares'?
–A plan to improve would involve changes in those top-priority areas (the first rule above), and also a metaplan (these rules are sketched below).
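
Written as explicit rules, that triage might look like the following sketch (the flag names and the fall-through case are assumptions, not from the slides):

```python
def next_step(doing_badly: bool, results_mixed: bool, top_priority: bool) -> str:
    """Illustrative triage of a single SOS item, following the rules above."""
    if top_priority and doing_badly:
        return "act"           # 'badly' in a top-priority area: we need action
    if results_mixed:
        return "investigate"   # mixed results: find out more information
    if not top_priority:
        return "deprioritize"  # not in our top 10: 'who cares'?
    return "monitor"           # doing fine in a priority area: keep watching
```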

Conversation Based on Results of SUNY New Paltz 2006 Data
A few 2006 SOS positives:
–We are #1 in "Contribution to understanding/appreciating ethnic/cultural differences."
–We do well in helping students develop an open mind to the opinions of others.
–What do we do to respond to these findings?

A few New Paltz not-so-positives:
–We rank #12 out of 12 in the comprehensive sector in "helping students get a part-time job."
–Plus, our average mean is the lowest in four SOS administrations.
–Under what conditions would we do nothing about this finding?

SUNY NP ranked #12 out of 12 in satisfaction with advising.
There is no clear trend in comparing means over time (one way to check this is sketched below).
What should we do?
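
One way to make "no clear trend" concrete is to fit a line through the administration means; a sketch with made-up numbers (the actual New Paltz values are not in this transcript):

```python
import numpy as np

# Hypothetical advising-satisfaction means from four administrations.
years = np.array([2000, 2002, 2004, 2006])
means = np.array([3.4, 3.2, 3.5, 3.3])

slope = np.polyfit(years, means, 1)[0]  # per-year change in the mean
r = np.corrcoef(years, means)[0, 1]     # how linear the pattern is

# With only four points, treat this as descriptive, not a significance test.
print(f"slope={slope:+.3f} per year, r={r:+.2f}")
```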

SUNY NP is #12 out of 12 on "Purposes for which the student activity fee is used."
What should we do? What was going on at the time?

Meaning & Action
–You can't take action without meaning.
–You can't find meaning without a plan.
–You can just administer the survey and then be faced with a mess of responses that don't yield much meaning. The result: no action.