Director, Institutional Research

Similar presentations
Maximizing Your NSSE & CCSSE Results

Indiana State University Assessment of General Education Objectives Using Indicators From National Survey of Student Engagement (NSSE)
Engagement By Design: Focus on Developmental Education Community College Survey of Student Engagement 2004 Findings.
Critical Thinking In Every Classroom Teaching Academy: New Faculty Orientation August 11, 2007.
Achievement of Educational Outcomes: Seniors’ Self-evaluations from 2004 & 2007 National Surveys of Student Engagement (NSSE) Cathy Sanders Director of.
Benchmarking Effective Educational Practice Community Colleges of the State University of New York April, 2005.
BENCHMARKING EFFECTIVE EDUCATIONAL PRACTICE IN COMMUNITY COLLEGES What We’re Learning. What Lies Ahead.
Topic #3 - CRITICAL THINKING Key Evidence 1 Provided by Amarillo College Offices of Institutional Research and Outcomes Assessments.
Derek Herrmann & Ryan Smith University Assessment Services.
St. Petersburg College CCSSE 2011 Findings Board of Trustees Meeting.
SENSE 2013 Findings for College of Southern Idaho.
Community College Survey of Student Engagement CCSSE 2014.
Mountain View College Spring 2008 CCSSE Results Community College Survey of Student Engagement 2008 Findings.
MARTIN COMMUNITY COLLEGE ACHIEVING THE DREAM COMMUNITY COLLEGES COUNT IIPS Conference Charlotte, North Carolina July 24-26, 2006 Session: AtD – Use of.
CCSSE 2013 Findings for Cuesta College San Luis Obispo County Community College District.
Note: CCSSE survey items included in benchmarks are listed at the end of this presentation 1. Active and Collaborative Learning Students learn more when.
Gallaudet Institutional Research Report: National Survey of Student Engagement Pat Hulsebosch: Executive Director – Office of Academic Quality Faculty.
Strategic Conversation: A Commitment to Student Engagement.
CCSSE 2010: SVC Benchmark Data Note: Benchmark survey items are listed in the Appendix (slides 9-14)
NSSE 2013 How to Use Results (or “Why you should care about NSSE”) 8/26/
Community College Survey of Student Engagement (CCSSE) Benchmarks of Effective Educational Practice Summary Report Background: The Community College Survey.
Making Connections Dimensions of Student Engagement 2010 Findings.
Office of Institutional Research CCSSE & Active and Collaborative Learning.
Looking Inside The “Oakland Experience” Another way to look at NSSE Data April 20, 2009.
Student Engagement as Policy Direction: Community College Survey of Student Engagement (CCSSE) Skagit Valley College Board of Trustees Policy GP-4 – Education.
De Anza College 2009 Community College Survey of Student Engagement Presented to the Academic Senate February 28, 2011 Prepared by Mallory Newell Institutional.
Student Engagement and Academic Performance: Identifying Effective Practices to Improve Student Success Shuqi Wu Leeward Community College Hawaii Strategy.
De Anza College 2009 Community College Survey of Student Engagement Presented to the Academic Senate January 10, 2011 Prepared by Mallory Newell Institutional.
Assessment Presentation Day For Faculty Cindy J. Speaker, Ph.D. Wells College August 21, 2006.
RESULTS OF THE 2009 ADMINISTRATION OF THE COMMUNITY COLLEGE SURVEY OF STUDENT ENGAGEMENT Office of Institutional Effectiveness, April 2010.
CCSSE 2015 Findings for OSU Institute of Technology.
CCSSE 2014 Findings Southern Crescent Technical College.
CCSSE 2012 Findings for Southern Crescent Technical College.
RESULTS OF THE 2009 ADMINISTRATION OF THE COMMUNITY COLLEGE SURVEY OF STUDENT ENGAGEMENT Office of Institutional Effectiveness, September 2009.
Del Mar College Utilizing the Results of the 2007 Community College Survey of Student Engagement CCSSE Office of Institutional Research and Effectiveness.
Learning Communities at Ventura College. What are learning communities? Interdisciplinary learning Importance of sense of community for learning Student.
Good teaching for diverse learners
What Is This Intentional Learning Thing?
M-LANG project  Ref. n NO01-KA Interactive Exchange Workshop on how to use response systems and ICT tools for creating interactive learning.
“Bridging General Education and the Major: Critical Thinking, the Mid-Curriculum, and Learning Gains Assessment” Dr. Jane Detweiler, Associate Dean, College.
Closing the Experience Gap March 30, 2017
Student Engagement Data in the UK: Policy and Practice
Jackson College CCSSE & CCFSSE Findings Community College Survey of Student Engagement Community College Faculty Survey of Student Engagement Administered:
Student Engagement at Orange Coast College
NSSE Results for Faculty
The University of Texas-Pan American
Reflecting on Your Teaching – Using Learning outcomes to Critically Inform Your Teaching Content Alan Somerville ASSOCIATE DEAN LEARNING AND TEACHING,
General Education Assessment
Formative assessment through class discussion
Critical Thinking Skills In English
Derek Herrmann & Ryan Smith University Assessment Services
Your Institutional Report Step by Step
Helping US Become Knowledge-Able About Student Engagement
North Seattle College All College Meeting
NISD Leadership Academy
Imagine Success Engaging Entering Students Innovations 2009
Assessment of Learning in Student-Centered Courses
Closing the Loop on Student Feedback
Workforce Engagement Survey
Director, Institutional Research
ILOs, CCSSE, & Student Engagement
Student Equity Planning August 28, rd Meeting
The Heart of Student Success
Comparison of Learning Paradigms: Learner-Centered vs
Faculty In-Service Week
2013 NSSE Results.
Learning Community II Survey
Student Learning Outcomes at CSUDH
Characteristics of a historian
CCSSE 2015 Findings for OSU Institute of Technology
Presentation transcript:

Using Community College Survey of Student Engagement (CCSSE) for Institutional Improvement. Daniel Martinez, PhD, Director, Institutional Research. 2/27/2014

CCSSE Background: Started in 2001. Research based. Includes both student and faculty surveys. Why student engagement? Because engagement is linked to learning and retention.

CCSSE Survey at COD: Administered in 2008, 2010, and 2012.
Year   Sample Size   Headcount
2008   895           10,255
2010   916           11,039
2012   1,021         9,579
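For illustration only, the sample-to-headcount ratio for each administration can be computed from the table above; treating it as a rough "coverage" figure is this sketch's framing, not CCSSE's official response-rate calculation.

```python
# Sample size and headcount for each CCSSE administration at COD
# (figures from the table above). The "coverage" ratio is an illustrative
# calculation, not an official CCSSE response-rate statistic.
administrations = {
    2008: {"sample": 895, "headcount": 10_255},
    2010: {"sample": 916, "headcount": 11_039},
    2012: {"sample": 1_021, "headcount": 9_579},
}

for year in sorted(administrations):
    n = administrations[year]
    coverage = n["sample"] / n["headcount"]
    print(f"{year}: {n['sample']:,} of {n['headcount']:,} students ({coverage:.1%})")
```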

CCSSE: A Tool for Improvement. There are three ways to use CCSSE:
- Benchmarks
- Direct information
- Institutional measures

CCSSE Benchmarks

CCSSE Benchmarks for Effective Educational Practice. The five CCSSE benchmarks are:
- Active and Collaborative Learning
- Student Effort
- Academic Challenge
- Student-Faculty Interaction
- Support for Learners

Benchmarking – and Reaching for Excellence. The most important comparison: where you are now, compared with where you want to be. CCSSE offers five ways that colleges can use benchmarks to better understand their performance and to reach for excellence. They can:
- Compare their performance to the national average, and at the same time resist the average. Comparing themselves to the average of participating colleges (the 50 mark) is a start, but colleges should then assess their performance on the individual survey items that make up the benchmark. Most colleges will find areas for improvement at the item level.
- Compare themselves to high-performing colleges. A college might, for example, aspire to be at or above the 85th percentile on some or all benchmarks. Colleges also can learn by examining the practices of high-performing peers.
- Measure their overall performance against results for their least-engaged group. A college might aspire to make sure all subgroups (e.g., full-time and part-time students; developmental students; students across all racial, ethnic, and income groups) engage in their education at similarly high levels.
- Examine areas that their colleges value strongly. They might focus, for example, on survey items related to service to high-risk students or on those related to academic rigor (e.g., are they asking students to read and write enough?).
- Make the most important comparison: where they are now, compared with where they want to be. This is the mark of an institution committed to continuous improvement.
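A minimal sketch of the first three comparisons follows, assuming hypothetical benchmark scores and targets. The only grounded figure is the 50-point mean to which CCSSE standardizes each benchmark (the "50 mark" above); the 85th-percentile values and subgroup scores are placeholders, not COD results.

```python
# Hypothetical benchmark scores for illustration only. CCSSE standardizes each
# benchmark so that the mean of participating colleges is 50; every other
# number below is a made-up placeholder.
college_scores = {
    "Active and Collaborative Learning": 48.2,
    "Student Effort": 51.7,
    "Academic Challenge": 49.5,
    "Student-Faculty Interaction": 46.9,
    "Support for Learners": 52.3,
}

percentile_85 = {name: 57.0 for name in college_scores}          # placeholder targets
part_time_scores = {name: s - 4.0 for name, s in college_scores.items()}  # placeholder subgroup

NATIONAL_MEAN = 50.0

for name, score in college_scores.items():
    vs_mean = score - NATIONAL_MEAN
    vs_target = score - percentile_85[name]
    subgroup_gap = score - part_time_scores[name]
    print(f"{name}: {score:.1f} "
          f"(vs. national mean {vs_mean:+.1f}, vs. 85th percentile {vs_target:+.1f}, "
          f"gap to least-engaged group {subgroup_gap:+.1f})")
```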

COD: 2008-2012

COD: 2008-2012

COD: 2008-2012

Active and Collaborative Learning

Student Effort

Academic Challenge

Student-Faculty Interaction

Support for Learners. Survey results reveal both areas in which colleges are doing well and areas for improvement in creating multiple, intentional connections with students, beginning with the first point of contact with the college. For example, nearly nine in 10 SENSE respondents (88%) agree or strongly agree that they knew how to get in touch with their instructors outside of class. But more than two-thirds (68%) of SENSE respondents and 47% of CCSSE respondents report that they never discussed ideas from readings or classes with instructors outside of class. These results clearly indicate opportunities for colleges to increase their intentionality in seeking to build meaningful connections with students.

Active and Collaborative Learning

Student Effort

Academic Challenge

Student-Faculty Interaction

Support for Learners

Select Comparisons: COD vs. Others

Select Comparisons: COD vs. Others

Select Comparisons: COD vs. Others

Select Comparisons: COD vs. Others

Select Comparisons: COD vs. Others

Select Comparisons: COD vs. Others

Select Comparisons: COD vs. Others

Select Comparisons: COD vs. Scorecard Colleges

Select Comparisons: COD vs. Scorecard Colleges

Select Comparisons: COD vs. Scorecard Colleges

Select Comparisons: COD vs. Scorecard Colleges

Select Comparisons: COD vs. Scorecard Colleges

Select Comparisons: COD vs. Scorecard Colleges

Select Comparisons: COD vs. Scorecard Colleges

Direct Information

Academic advising/planning (2012). Direct information is data taken from the survey itself. For example, questions 13a-k measure the frequency of use of, satisfaction with, and importance of various student services; the item shown here, academic advising/planning, is one example.
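As an illustration of working with this kind of direct information, here is a minimal sketch that tabulates made-up responses to a three-part student-services item. The response labels only approximate the survey's scales, and none of the figures are COD data.

```python
from collections import Counter

# Made-up responses to a three-part student-services item such as academic
# advising/planning: how often it was used, how satisfied students were, and
# how important it is to them. Labels approximate the survey's scales.
responses = [
    {"use": "Sometimes", "satisfaction": "Somewhat", "importance": "Very"},
    {"use": "Often", "satisfaction": "Very", "importance": "Very"},
    {"use": "Rarely/Never", "satisfaction": "N.A.", "importance": "Somewhat"},
    {"use": "Sometimes", "satisfaction": "Somewhat", "importance": "Very"},
]

for part in ("use", "satisfaction", "importance"):
    counts = Counter(r[part] for r in responses)
    total = sum(counts.values())
    summary = ", ".join(f"{label}: {n / total:.0%}" for label, n in counts.most_common())
    print(f"{part}: {summary}")
```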

Institutional Measures

The questions on the CCSSE have been mapped to the WASC-ACCJC accreditation standards by CCSSE. However, this mapping is CCSSE's interpretation, not a hard-and-fast rule: questions can be added or removed, and standards not addressed here could be mapped to CCSSE items if the institution feels it is a good match. This is a useful way to see which standards are addressed via the CCSSE and, by the same token, which standards may need alternative forms of data or evidence. It also means that other college goals can be supported via the CCSSE; the questions just need to be mapped to the goal.
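One lightweight way to work with such a mapping is a crosswalk dictionary from each standard or college goal to its associated survey items. This is a sketch under that assumption: the IIA1a list is abbreviated (the full list appears on the next slide), and the critical-thinking entry uses the items mapped on the "Critical Thinking" slide later in this presentation.

```python
# Sketch of a standards/goals-to-items crosswalk. "IIA1a" is abbreviated here;
# the "Goal: critical thinking" items come from the Critical Thinking slide.
crosswalk = {
    "IIA1a": ["4l", "4o", "8b", "8d", "8e", "9b", "12a", "12e", "17a", "22"],
    "Goal: critical thinking": ["5b", "5c", "5d", "5e", "4d", "12e"],
}

def items_for(standard_or_goal: str) -> list[str]:
    """Return the CCSSE items mapped to a standard or college goal (empty if unmapped)."""
    return crosswalk.get(standard_or_goal, [])

print(items_for("IIA1a"))        # items CCSSE associates with the standard
print(items_for("Standard X"))   # [] -> this standard needs alternative evidence
```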

IIA1a. When questions are mapped to a goal, the measures can be standardized for comparison purposes. For instance, CCSSE says that accreditation standard IIA1a is measured using the following survey items: 4l, 4o, 8b, 8d, 8e, 8f, 8g, 8h, 8i, 9b, 9d, 12a, 12b, 12c, 12d, 12e, 12f, 12g, 12h, 12i, 12n, 12o, 13d2&3, 13e2&3, 13h2&3, 13k1&2&3, 14c, 17a, 17b, 17d, 17e, 17f, 22. Not all of these items are on the same metric. When the responses to these items are added together, the possible range of scores is 33-130. These varying scales can be normalized (that is, recoded to range from 0 to 1) using the formula: normalized score = (raw score - minimum score) / (maximum score - minimum score); in this example, (raw score - 33) / (130 - 33). The mean raw score on this measure for 2010 and 2011 was 79.48 and 79.79, respectively. The corresponding normalized scores, shown here on a 0-100 scale (i.e., multiplied by 100), are 47.90 and 48.34, respectively.
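A minimal sketch of that normalization, using the minimum (33), maximum (130), and mean raw scores quoted above; the function name is illustrative, and small differences from the slide's reported 47.90 and 48.34 reflect rounding or transcription in the source figures.

```python
def normalize(raw_score: float, min_score: float = 33.0, max_score: float = 130.0) -> float:
    """Recode a summed raw score onto a 0-1 scale: (raw - min) / (max - min)."""
    return (raw_score - min_score) / (max_score - min_score)

# Mean raw scores quoted on the slide for the two administrations.
for year, mean_raw in ((2010, 79.48), (2011, 79.79)):
    score = normalize(mean_raw)
    print(f"{year}: raw {mean_raw:.2f} -> normalized {score:.4f} "
          f"({score * 100:.2f} on a 0-100 scale)")
```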

Critical Thinking
- Analyzing and solving complex problems: 5b. Analyzing the basic elements of an idea, experience, or theory; 5e. Applying theories or concepts to practical problems or in new situations
- Constructing sound arguments and evaluating the arguments of others: 5d. Making judgments about the value or soundness of information, arguments, or methods
- Considering and evaluating rival hypotheses
- Recognizing and assessing evidence from a variety of sources
- Generalizing appropriately from specific cases
- Integrating knowledge across a range of academic and everyday contexts: 4d. Worked on a paper or project that required integrating ideas or information from various sources; 5c. Synthesizing and organizing ideas, information, or experiences in new ways
- Identifying your own and others’ assumptions, biases, and their consequences
- OVERALL: 12e. Thinking critically and analytically

Next Steps?