Presentation on theme: "Director, Institutional Research"— Presentation transcript:

1 Using Community College Survey of Student Engagement (CCSSE) for Institutional Improvement
Daniel Martinez, PhD, Director, Institutional Research
2/27/2014

2 CCSSE Background
Started in 2001
Research based
Student & faculty surveys
Why student engagement? Learning and retention.

3 CCSSE Survey at COD
Administered in 2008, 2010, and 2012.

Year   Sample Size   Headcount
2008   895           10,255
2010   916           11,039
2012   1,021         9,579
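The sample sizes above can also be read as a share of headcount. A minimal sketch in Python using the slide's numbers; the participation percentage itself is not stated on the slide and is computed here only for illustration:

```python
# Survey sample sizes and college headcounts from slide 3.
administrations = {
    2008: (895, 10_255),
    2010: (916, 11_039),
    2012: (1_021, 9_579),
}

def participation_share(sample: int, headcount: int) -> float:
    """Respondents as a percentage of total headcount."""
    return 100 * sample / headcount

for year, (sample, headcount) in administrations.items():
    print(f"{year}: {participation_share(sample, headcount):.1f}% of headcount surveyed")
```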

4 CCSSE: A Tool for Improvement
There are three ways to use CCSSE:
- Benchmarks
- Direct information
- Institutional measures

5 CCSSE Benchmarks

6 CCSSE Benchmarks for Effective Educational Practice
The five CCSSE benchmarks are:
- Active and Collaborative Learning
- Student Effort
- Academic Challenge
- Student-Faculty Interaction
- Support for Learners

7 Benchmarking – and Reaching for Excellence
The most important comparison: where you are now versus where you want to be. CCSSE offers five ways that colleges can use benchmarks to better understand their performance and to reach for excellence. They can:
1. Compare their performance to the national average, and at the same time resist the average. Comparing themselves to the average of participating colleges (the 50 mark) is a start, but colleges should then assess their performance on the individual survey items that make up each benchmark. Most colleges will find areas for improvement at the item level.
2. Compare themselves to high-performing colleges. A college might, for example, aspire to be at or above the 85th percentile on some or all benchmarks. Colleges also can learn by examining the practices of high-performing peers.
3. Measure their overall performance against results for their least-engaged group. A college might aspire to make sure all subgroups (e.g., full-time and part-time students; developmental students; students across all racial, ethnic, and income groups) engage in their education at similarly high levels.
4. Examine areas that their colleges value strongly. They might focus, for example, on survey items related to service to high-risk students, or on those related to academic rigor (e.g., are they asking students to read and write enough?).
5. Make the most important comparison: where they are now, compared with where they want to be. This is the mark of an institution committed to continuous improvement.

8 COD:

9 COD:

10 COD:

11 Active and Collaborative Learning

12 Student Effort

13 Academic Challenge

14 Student-Faculty Interaction

15 Support for Learners Survey results reveal both areas in which colleges are doing well and areas for improvement in creating multiple, intentional connections with students, beginning with the first point of contact with the college. For example, nearly nine in 10 SENSE respondents (88%) agree or strongly agree that they knew how to get in touch with their instructors outside of class. But more than two-thirds (68%) of SENSE respondents and 47% of CCSSE respondents report that they never discussed ideas from readings or classes with instructors outside of class. These results clearly indicate opportunities for colleges to increase their intentionality in seeking to build meaningful connections with students.

16 Active and Collaborative Learning

17 Student Effort

18 Academic Challenge

19 Student-Faculty Interaction

20 Support for Learners

21 Select Comparisons: COD vs. Others

22 Select Comparisons: COD vs. Others

23 Select Comparisons: COD vs. Others

24 Select Comparisons: COD vs. Others

25 Select Comparisons: COD vs. Others

26 Select Comparisons: COD vs. Others

27 Select Comparisons: COD vs. Others

28 Select Comparisons: COD vs. Scorecard Colleges

29 Select Comparisons: COD vs. Scorecard Colleges

30 Select Comparisons: COD vs. Scorecard Colleges

31 Select Comparisons: COD vs. Scorecard Colleges

32 Select Comparisons: COD vs. Scorecard Colleges

33 Select Comparisons: COD vs. Scorecard Colleges

34 Select Comparisons: COD vs. Scorecard Colleges

35 Direct Information

36 Academic advising/planning (2012)
Direct information is data taken from the survey itself. For example, questions 13a-k measure the frequency of use of, satisfaction with, and perceived importance of various student services. This is an example of one of those questions.

37 Institutional Measures

The questions on the CCSSE have been mapped to the WASC-ACCJC accreditation standards by CCSSE. However, this is their interpretation and is not a hard-and-fast rule: questions can be added or deleted, and standards not addressed here could be mapped to CCSSE items if the institution felt it was a good match. This is a useful way to see which standards are addressed via the CCSSE and, by the same token, which standards may need alternative forms of data or evidence. It also means that other college goals can be supported via the CCSSE; the questions just need to be mapped to the goal.
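The mapping described above can be represented as a simple lookup from standard (or college goal) to survey items, which makes it easy to see which standards have no mapped items and therefore need alternative evidence. A minimal sketch; the standard names and item lists below are illustrative placeholders, not CCSSE's actual mapping:

```python
# Illustrative mapping of accreditation standards (or college goals) to
# CCSSE survey items. Item lists are hypothetical; an empty list marks a
# standard that needs alternative forms of data/evidence.
standard_to_items = {
    "IIA1a": ["4l", "4o", "8b", "9b", "12e"],  # abbreviated example list
    "IIA1b": ["5b", "5d"],                      # hypothetical
    "IIB3": [],                                 # no CCSSE items mapped
}

def unmapped_standards(mapping: dict[str, list[str]]) -> list[str]:
    """Return standards with no mapped survey items."""
    return [std for std, items in mapping.items() if not items]

print(unmapped_standards(standard_to_items))
```

The same structure works for any college goal: add a key for the goal, list the survey items the institution judges relevant, and the coverage check comes for free.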

39 IIA1a
When questions are mapped to a goal, the measures can be standardized for comparison purposes. For instance, CCSSE says that accreditation standard IIA1a is measured using the following survey items: 4l, 4o, 8b, 8d, 8e, 8f, 8g, 8h, 8i, 9b, 9d, 12a, 12b, 12c, 12d, 12e, 12f, 12g, 12h, 12i, 12n, 12o, 13d2&3, 13e2&3, 13h2&3, 13k1&2&3, 14c, 17a, 17b, 17d, 17e, 17f, 22. Not all of the items are on the same metric. When all of the responses to these items are added together, scores range from 33 to 130. These various scales can be normalized (that is, recoded to range from 0 to 1) using the following formula: normalized score = (raw score - minimum score) / (maximum score - minimum score). In this example, that is (raw score - 33) / (130 - 33). The mean raw score on this measure in 2011 was 79.79, and the normalized score, shown here, was 48.34.
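The normalization step above can be written directly from the formula on the slide. A minimal sketch; the minimum of 33 and maximum of 130 come from the IIA1a example:

```python
def normalize(raw: float, minimum: float = 33, maximum: float = 130) -> float:
    """Min-max normalization: recode a raw score to the 0-1 range."""
    return (raw - minimum) / (maximum - minimum)

# The endpoints map to 0 and 1 by construction.
print(normalize(33))    # prints 0.0
print(normalize(130))   # prints 1.0
print(round(normalize(79.79), 4))  # 2011 mean raw score from the slide
```

Multiplying by 100 gives the 0-100 scale the slide reports. Note that normalizing the 2011 mean of 79.79 directly gives about 48.2 rather than the reported 48.34; the small gap presumably reflects rounding or averaging per-respondent normalized scores instead of normalizing the mean.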

40 Critical Thinking
- Analyzing and solving complex problems: 5b. Analyzing the basic elements of an idea, experience, or theory; 5e. Applying theories or concepts to practical problems or in new situations.
- Constructing sound arguments and evaluating the arguments of others: 5d. Making judgments about the value or soundness of information, arguments, or methods.
- Considering and evaluating rival hypotheses.
- Recognizing and assessing evidence from a variety of sources.
- Generalizing appropriately from specific cases.
- Integrating knowledge across a range of academic and everyday contexts: 4d. Worked on a paper or project that required integrating ideas or information from various sources; 5c. Synthesizing and organizing ideas, information, or experiences in new ways.
- Identifying your own and others' assumptions, biases, and their consequences.
- OVERALL: 12e. Thinking critically and analytically.

41 Next Steps?

