Interpreting, Editing, and Communicating MISO Survey Results


1 Interpreting, Editing, and Communicating MISO Survey Results
Fred Folmer, Research & Instruction Librarian / Special Projects Coordinator, Connecticut College

2 What is MISO?
Stands for Measuring Information Service Outcomes
A quantitative, Web-based survey that helps libraries and technology organizations in higher education evaluate their services
Nonprofit survey provider, based at Bryn Mawr College
In 2014, approximately 40 institutions participated

3 What does MISO measure?
Asks about satisfaction with, and importance of, numerous services provided by libraries and information technology organizations
Asks respondents to rate service providers on whether they are responsive, reliable, knowledgeable and friendly
Asks how well informed respondents feel about various topics
Asks what skills respondents would be interested in learning
Students: what devices they own, and whether they use them for academic or personal purposes

4 Questions/issues to be explored
Lots of data!
Different results for different populations
Longitudinal and comparative results
Sifting through it
Communication issues, especially within a merged organization

5 Top 10, Satisfaction, Faculty, 2014

6 Top 10, Importance, Faculty, 2014

7 Conclusions?
Before drawing a conclusion or getting too far, it's wise to:
Look at the actual numbers: are tech numbers really problematic, or just lower than library numbers? By how much?
Consider the differences between technical support services and library services; this is especially true in a merged organization.
Take all of this in prior to communicating conclusions.

8 Very high levels of satisfaction
Respondents could rate satisfaction with services as "dissatisfied" (1), "somewhat dissatisfied" (2), "somewhat satisfied" (3) or "satisfied" (4)
More than 98 percent of services received a mean satisfaction of at least 3 ("somewhat satisfied") from all constituencies: faculty, staff and students
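As a back-of-the-envelope illustration, a summary like the one above can be computed directly from per-respondent ratings on MISO's 1-4 scale. This is a minimal sketch; the service names and ratings below are invented, not actual MISO data.

```python
# Sketch: mean satisfaction per service on a 1-4 scale, plus the share of
# services whose mean is at least 3 ("somewhat satisfied").
# All ratings here are invented for illustration only.
ratings = {
    "Circulation": [4, 4, 3, 4],
    "Reference": [3, 4, 4, 4],
    "IT Service Desk": [3, 3, 4, 2],
    "Wireless": [2, 3, 3, 2],
}

# Mean rating per service.
means = {svc: sum(r) / len(r) for svc, r in ratings.items()}

# Share of services at or above "somewhat satisfied" on average.
share_at_least_3 = sum(m >= 3 for m in means.values()) / len(means)

print(means)
print(f"{share_at_least_3:.0%} of services have mean satisfaction >= 3")
```

With real survey data, the same calculation would be run separately for each constituency (faculty, staff, students) before comparing against the "at least somewhat satisfied" threshold.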

9 Overall perceptions of staff were very positive
Respondents were asked whether they thought staff in various IS areas (archives, circulation, reference, instructional technology, computer support, phone support, IT Service Desk) were:
Responsive
Reliable
Knowledgeable
Friendly
Some differences among areas, but majorities agreed in each case.

10 Student perceptions of reference staff

11 Student perceptions of IT Service Desk staff

12 Overall perceptions of staff (cont’d.)
What we communicated: all staff areas received an average score of more than 3 out of 4 ("somewhat agree" or better) that the criteria were met, across all respondent populations.

13 Comparing satisfaction with importance

14 Relatively high importance/relatively low satisfaction quadrant, faculty
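The quadrant view behind slides 13-14 amounts to splitting services on importance and satisfaction cutoffs and flagging the high-importance/low-satisfaction cell. A minimal sketch of that classification; the service names, mean scores, and cutoffs below are hypothetical, not actual MISO results.

```python
# Sketch: classify services into importance/satisfaction quadrants.
# Scores are hypothetical means on MISO's 4-point scales.
services = {
    "Wireless network": {"importance": 3.8, "satisfaction": 2.9},
    "Library website": {"importance": 3.2, "satisfaction": 3.5},
    "Printing": {"importance": 3.6, "satisfaction": 3.1},
    "Equipment loans": {"importance": 2.4, "satisfaction": 3.4},
}

def quadrant(imp, sat, imp_cut=3.0, sat_cut=3.0):
    """Return a quadrant label relative to the chosen cutoffs."""
    hi_imp = imp >= imp_cut
    hi_sat = sat >= sat_cut
    if hi_imp and not hi_sat:
        return "high importance / low satisfaction"  # area for attention
    if hi_imp and hi_sat:
        return "high importance / high satisfaction"
    if not hi_imp and hi_sat:
        return "low importance / high satisfaction"
    return "low importance / low satisfaction"

for name, scores in services.items():
    print(name, "->", quadrant(scores["importance"], scores["satisfaction"]))
```

In practice the cutoffs might instead be the median importance and median satisfaction across all services, so each quadrant is defined relative to the institution's own results rather than a fixed scale midpoint.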

15 Other ways to discern patterns
Use past results for longitudinal data
Do-it-yourself analysis of MISO's results (has advantages and disadvantages)
Combinations of results from different questions in the survey
MISO's tool to compare results with other institutions
Combinations of all of the above!

16 Several areas of tech service improvement over time
Among faculty, mean satisfaction with wireless performance improved 3.38 percent over 2012
Among students, mean satisfaction with wireless performance improved 6.84 percent over 2012
Among staff, mean satisfaction with the IT Service Desk improved by percent
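The longitudinal figures above boil down to a percent change between two survey waves' mean scores. A minimal sketch of that arithmetic; the 2012 and 2014 means below are made up solely to illustrate how a figure like "improved 3.38 percent over 2012" is derived.

```python
# Sketch: percent change in a mean satisfaction score between survey waves.
# Both means are hypothetical values on MISO's 4-point scale.
def percent_change(old_mean, new_mean):
    """Relative change of new_mean versus old_mean, in percent."""
    return (new_mean - old_mean) / old_mean * 100

mean_2012 = 2.96  # hypothetical 2012 faculty mean, wireless performance
mean_2014 = 3.06  # hypothetical 2014 faculty mean

print(f"{percent_change(mean_2012, mean_2014):.2f}% change")
```

Note that a percent change in a mean on a bounded 4-point scale understates shifts near the top of the scale, which is one reason to look at the underlying means and distributions, not just the percentage.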

17 Increased importance for digital image collections
OVER TIME: Faculty and students rated the importance of digital image collections higher than in 2012.
Faculty mean score increased 9.21 percent
Student mean score increased 9.28 percent
CROSS-INSTITUTIONAL: Faculty and student mean scores for the importance of digital image collections were higher than those of peer institutions.
Faculty means of 2.62 versus 2.3
Student means of 2.59 versus 2.23

18 Several indicators point toward information security awareness as an area for attention
Faculty, staff and students named information security as an area about which they felt least informed. Respondents who said they felt either "not informed" or only "somewhat informed" included:
71.19 percent of faculty
70.39 percent of staff
74.83 percent of students

19 Information security awareness, cont’d.
Other related categories about which majorities of respondents said they were either "not informed" or only "somewhat informed":
Computer viruses and spyware
Technology-related privacy issues
Data backup

20 Data backup also possibly an area for attention
42 percent of students and 39 percent of staff said they never back up their data.
The most common answer among those who do back up their data was "once or twice" a semester.
Majorities either "interested" or "very interested" in learning more about data backup:
57.69 percent of students
60.59 percent of staff
64.41 percent of faculty

21 Greater faculty interest in instructional technology
CURRENT QUESTIONS: Majorities interested in learning more about:
Technology in meeting spaces/classrooms
Our learning platform (Moodle)
OVER TIME: 11.62 percent increase over 2012 in faculty interest in classroom technology
ACROSS INSTITUTIONS: Higher mean scores than peer group in:
Interest in learning about Moodle
Importance of instructional technology support
Interest in learning about technology in meeting spaces and classrooms

22 Communication strategies, spring 2014
Issued preliminary report to IS Committee (made up of faculty and selected IS staff)
Issued similar report to leadership team of library and IT directors
Some items (printing services, wireless) incorporated into strategic goals for the following year

23 Communication strategies, fall 2015
Issued revised/final report to IS staff at all-IS meeting, emphasizing "good news" and trends
Final report to IS Committee; detailed reports on website

24 VP wrote column for newsletter

25 Communication tips
Be thorough when interpreting numbers
Don't just settle for a first impression
Be prudent about what's shared and with whom
Look for trends and aggregations

