
1 Corning CC and CCSSE: What We Experienced and How We Handled It Maren N. Hess Director of Institutional Research AIRPO Winter Conference Syracuse – January 12, 2007

2 History of CCSSE participation for CCC
–CCC participated in CCSSE in Spring 2004 as part of a CCTI consortium
  152 other institutions participated in 2004
  75 institutions were classified as "Small" (Fall 2002 IPEDS enrollment <4500)
–Spring 2007, no consortium
–Spring 2009, for SUNY's SCBA

3 CCSSE Mechanics
–Well organized by the University of Texas at Austin
–Communication plan to campus well defined
  Templates included
  Roles for Institutional Research easily discerned
–Files required to submit to CCSSE (a sketch of assembling the course file follows)
  Course master file of all credit-bearing activity in the Spring semester, including remedials, at census date
  Codebook
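For IR offices that script this step, below is a minimal sketch of how the census-date course master extract might be assembled. The file and column names (spring_sections_census.csv, credit_hours, section_id, and so on) are hypothetical placeholders for whatever the campus student information system exports; the actual required layout is defined in the CCSSE instruction manual.

```python
# Hypothetical sketch: build the census-date course master file for CCSSE.
# All names below are illustrative, not CCSSE's actual file specification.
import pandas as pd

# All Spring sections active at the census date, including remedial courses.
sections = pd.read_csv("spring_sections_census.csv")

# Keep credit-bearing activity. (Assumption: remedial sections carry
# institutional credit hours in this extract, so they pass the filter.)
master = sections[sections["credit_hours"] > 0]

# Keep the fields a section-sampling file typically needs:
# identifier, title, meeting time, and enrollment at census.
master = master[["section_id", "course_title", "meeting_time", "census_enrollment"]]

master.to_csv("ccsse_course_master.csv", index=False)
print(f"{len(master)} credit-bearing sections written")
```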

4 CCSSE Mechanics (cont.) –CCSSE selects the sections to be surveyed –CCSSE sends the survey packets, already divided for each section –IR schedules the dates/times for in-class survey administration –IR or faculty can administer the survey

5 CCSSE
–Survey timeline starts in November
–Web interface for survey results
–CCSSE selects sections to be surveyed
SOS
–Survey timeline starts in January
–No web interface for survey results
–IR selects sections to be surveyed

6 CCSSE deliverables –“First Look” –Frequency Report: All students –CCSSE standard reports on means and frequencies –CCSSE benchmarks for the college –CD (including raw data in Excel format)

7 CCSSE Benchmarks –"Each benchmark score was computed by averaging the scores on survey items that comprise that benchmark. To compensate for disproportionately large numbers of full-time students in the sample, all means used in the creation of the benchmarks are weighted by full- and part-time status. Benchmark scores are standardized so that the weighted mean across all students is 50 and the standard deviation across all participating students is 25. Institutions' benchmark scores are computed by taking the weighted average of their students' standardized scores."
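Read as a computation, the quoted paragraph amounts to a weighted z-score rescaled to mean 50 and standard deviation 25, followed by a weighted average within each institution. The sketch below is one way to express that, assuming a respondent-level DataFrame; the column names (benchmark_raw, enrollment_weight, college) are illustrative, not CCSSE's actual variable names.

```python
# Sketch of the benchmark standardization described in the CCSSE quote above.
# Assumes one row per respondent; column names are assumptions.
import numpy as np
import pandas as pd

def weighted_mean(x, w):
    return np.average(x, weights=w)

def weighted_std(x, w):
    m = weighted_mean(x, w)
    return np.sqrt(np.average((x - m) ** 2, weights=w))

def institution_benchmarks(df):
    # benchmark_raw: average of the survey items that make up one benchmark
    # enrollment_weight: compensates for over-representation of full-time students
    mu = weighted_mean(df["benchmark_raw"], df["enrollment_weight"])
    sd = weighted_std(df["benchmark_raw"], df["enrollment_weight"])

    # Standardize so the weighted mean across all students is 50, SD is 25.
    df = df.assign(std_score=50 + 25 * (df["benchmark_raw"] - mu) / sd)

    # Each college's benchmark: weighted average of its students' standardized scores.
    return df.groupby("college").apply(
        lambda g: weighted_mean(g["std_score"], g["enrollment_weight"])
    )
```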

8 CCSSE Benchmark Areas –Active and Collaborative Learning –Student Effort –Academic Challenge –Student-Faculty Interaction –Support for Learners

9 CCSSE Benchmark Areas –Active and Collaborative Learning Students learn more when they are actively involved in their education and have opportunities to think about and apply what they are learning in different settings. –Student Effort –Academic Challenge –Student-Faculty Interaction –Support for Learners

10 CCSSE Benchmark Areas –Active and Collaborative Learning –Student Effort Students’ own behaviors contribute significantly to their learning and the likelihood that they will successfully attain their educational goals. –Academic Challenge –Student-Faculty Interaction –Support for Learners

11 CCSSE Benchmark Areas –Active and Collaborative Learning –Student Effort –Academic Challenge Challenging intellectual and creative work is central to student learning and collegiate quality. –Student-Faculty Interaction –Support for Learners

12 CCSSE Benchmark Areas –Active and Collaborative Learning –Student Effort –Academic Challenge –Student-Faculty Interaction In general, the more contact students have with their teachers, the more likely they are to learn effectively and persist toward achievement of their educational goals. –Support for Learners

13 CCSSE Benchmark Areas –Active and Collaborative Learning –Student Effort –Academic Challenge –Student-Faculty Interaction –Support for Learners Students perform better and are more satisfied at colleges that are committed to their success and cultivate positive working and social relationships among different groups on campus.

14 Support for Learners Questions –Providing the support you need to help you succeed at this college –Encouraging contact among students from different economic, social, and racial or ethnic backgrounds –Helping you cope with your non-academic responsibilities (work, family, etc.) –Providing the support you need to thrive socially –Providing the financial support you need to afford your education –Frequency: Academic advising / planning –Frequency: Career Counseling

15 [image-only slide; no transcript text]

16 IR received a 3" binder chock-full of data… where to begin?
–Adopted the Research Brief design from the Connecticut State System, seen at a NEAIR conference in Portsmouth, NH (Nov. 2004)
–A series of 10 Research Briefs was shared with the college community over a period of several months
  The first five briefs dealt with the benchmark areas
  The last five briefs covered all other questions

17 Research Brief format
–Introduction
–Benchmark description
–Summary Data (see the sketch following this slide)
  All Respondents, FT only, PT only
  By Credit Hours Earned (<30, >=30)
–Individual Survey Items
  All Respondents, FT only, PT only
  Included Likert scale responses
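For readers who want to reproduce these breakdowns locally, below is a minimal sketch of the subgroup summaries. It assumes the raw respondent data from the CD (slide 6) has been saved as an Excel file and that the Likert items are coded numerically; the file name, item name, and column names are hypothetical.

```python
# Hypothetical sketch of the Research Brief subgroup summaries.
# File, item, and column names are assumptions, not CCSSE's delivered layout.
import pandas as pd

responses = pd.read_excel("ccsse_raw_data.xlsx")  # raw data shipped on the CD

ITEM = "faculty_email_frequency"  # one Likert-scaled item, coded numerically

groups = {
    "All Respondents": responses,
    "Full-time": responses[responses["enroll_status"] == "FT"],
    "Part-time": responses[responses["enroll_status"] == "PT"],
    "Earned < 30 credits": responses[responses["credits_earned"] < 30],
    "Earned >= 30 credits": responses[responses["credits_earned"] >= 30],
}

for label, g in groups.items():
    # Percent choosing each Likert category, plus the item mean.
    dist = g[ITEM].value_counts(normalize=True).sort_index() * 100
    print(f"{label}: n={len(g)}, mean={g[ITEM].mean():.2f}")
    print(dist.round(1).to_string(), "\n")
```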

18 All college employees can access the briefs on the internal portal.

19 What did we find?
–All students: Student-Faculty Interaction at the 100th percentile for all colleges, at the 90th percentile for small colleges
  90th percentile for Part-time students
  80th percentile for Full-time students
–The weighting incorporated by CCSSE to calculate the standardized means could disguise information in the detail
–Data on rural colleges did not markedly differ from that of small colleges

20 What did we find?
–All students: Academic Challenge at the 90th percentile for all colleges, at the 80th percentile for small colleges
  30th percentile for Part-time students at all colleges
  90th percentile for Full-time students at all colleges
–The weighting incorporated by CCSSE to calculate the standardized means could disguise information in the detail (see the disaggregation sketch following this slide)
–Data on rural colleges did not markedly differ from that of small colleges
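To illustrate how a single weighted, standardized institution score can mask subgroup gaps like the full-time/part-time differences above, here is a small disaggregation sketch. The file and column names (ccsse_raw_data.xlsx, benchmark, std_score, enroll_status) are assumptions for illustration only.

```python
# Sketch: report each benchmark overall and by enrollment status side by side,
# so a strong combined score cannot hide a lagging subgroup.
import pandas as pd

scores = pd.read_excel("ccsse_raw_data.xlsx")  # respondent-level standardized scores (assumed)

overall = scores.groupby("benchmark")["std_score"].mean()
by_status = scores.pivot_table(
    values="std_score", index="benchmark", columns="enroll_status", aggfunc="mean"
)

report = by_status.assign(All=overall)
print(report.round(1))
```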

21 How have we used CCSSE results?
–Supported the review of student services delivery
  Created additional services for students at off-campus sites
–Created "More After Four," a program of accelerated delivery of evening courses for working adults
–Encouraged collaboration with an external consultant on an environmental scan and an internal scan
–Supported SUNY SOS results where items overlapped

22 Shared with external audiences –Requested by the League for Innovation in the Community College to create an Alert Report

23 Other Materials for this Session –CCSSE Instruction Manual 2004 –CCSSE Institutional Report 2004 –Community College Student Report 2004 (survey instrument) –CCC Administration Guides 2004 –CCC Research Brief 4: Student-Faculty Interaction –CCC Research Brief 8: Skills, Career Plans, Educational Goals, and Extra-curricular Activities

24 Questions? Maren N. Hess Director of Institutional Research (607) 962-9587 Maren.Hess@corning-cc.edu

