Assessing SAGES with NSSE data
Office of Institutional Research
September 25th, 2007

Introduction
Today’s talk presents an analysis of the impact of SAGES participation on a number of NSSE survey items. We focus on first-year responses only, because no full-implementation classes have graduated yet. The report first covers the NSSE benchmark scales, then discusses individual NSSE items related to SAGES learning goals.

Analysis
The seven years of NSSE data collected by Case have been divided into four groups:
◦ Pre-SAGES (2000, 2001)
◦ Students in the SAGES pilot
◦ Students not in the SAGES pilot
◦ Full-implementation classes
An analysis of covariance (ANCOVA) was conducted to test for group differences on the NSSE benchmarks and individual survey items. All analyses control for first-year reported major. (A sketch of a model of this kind follows below.)
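The slides do not show the model itself, so the following is a minimal sketch, under stated assumptions, of the kind of ANCOVA-style comparison described above, written in Python with pandas and statsmodels. The file name and column names (benchmark_score, sages_group, major) are hypothetical placeholders, not names from the original analysis.

    # Hypothetical sketch of the group comparison described above: an OLS
    # model of a benchmark score on SAGES group, controlling for
    # first-year reported major as a categorical covariate.
    import pandas as pd
    import statsmodels.api as sm
    from statsmodels.formula.api import ols

    # One row per first-year respondent (placeholder file and columns).
    df = pd.read_csv("nsse_first_year.csv")

    # C(...) treats both predictors as categorical factors.
    model = ols("benchmark_score ~ C(sages_group) + C(major)", data=df).fit()

    # Type II ANOVA table: the C(sages_group) row tests whether the four
    # groups differ after adjusting for major.
    print(sm.stats.anova_lm(model, typ=2))

In a model like this, a significant group effect would typically be followed by pairwise contrasts to identify which groups differ, which is how findings such as "the pilot scored higher than all other groups" are usually established.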

NSSE Benchmarks
To condense NSSE’s 80+ survey items into scales that are easy to discuss and analyze, NSSE has developed five conceptually and statistically validated “benchmark” scales. The four examined here are “Active and Collaborative Learning,” “Academic Challenge,” “Student-Faculty Interaction,” and “Supportive Campus Environment.” In 2005 NSSE changed the calculation of the fifth benchmark, “Enriching Educational Experiences,” making pre-2005 longitudinal comparisons of that scale unreliable. (A sketch of how a scale of this kind can be scored follows below.)
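As a concrete illustration of how a benchmark-style scale can be scored, the sketch below rescales each contributing item to a 0-100 range and averages the rescaled items per respondent. The item names and the 1-4 response range are assumptions for illustration; NSSE’s published scoring documentation is the authoritative source for the actual formulas.

    # Illustrative benchmark-style scoring: rescale each item to 0-100,
    # then average the rescaled items for each respondent.
    import pandas as pd

    # Hypothetical item columns on a 1-4 frequency scale
    # (never / sometimes / often / very often).
    ITEMS = ["asked_questions", "class_presentations",
             "group_work_in_class", "group_work_outside_class"]

    def rescale(item: pd.Series, low: float = 1, high: float = 4) -> pd.Series:
        """Map an item from its response range onto 0-100."""
        return (item - low) / (high - low) * 100.0

    def benchmark_score(df: pd.DataFrame) -> pd.Series:
        """Mean of the rescaled items; missing responses are skipped."""
        return df[ITEMS].apply(rescale).mean(axis=1, skipna=True)

A per-respondent score of this kind is what the ANCOVA sketched above would take as its dependent variable.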

Active and Collaborative Learning
Measures the extent to which students engage in classroom activities and collaborate with others to solve problems. Items include: asked questions in class, made class presentations, worked with other students on projects during class, and worked with other students outside of class.

Active and Collaborative Learning
Results indicated that those in the SAGES pilot had significantly higher scores than all other students. Additionally, those in the full implementation of SAGES had, on average, significantly higher scores than pre-SAGES students.

Academic Challenge
Measures the extent to which students exert—and institutions demand—academic effort. Items include: time spent preparing for class, number of assigned textbooks, and the extent to which the campus environment is perceived to emphasize academics.

Academic Challenge
Results revealed no group differences on the Academic Challenge measure.

Student-Faculty Interaction
Measures the extent to which students learn and solve problems by interacting with faculty members. Items include: the extent to which students discussed grades with faculty, worked with faculty members on activities other than coursework, and discussed career plans with faculty.

Student-Faculty Interaction
Results indicated that those in the SAGES pilot had significantly higher scores than those in the full-implementation classes. There were no significant differences among the remaining three groups.

Supportive Campus Environment
Measures the extent to which students believe that the institution is committed to their success and cultivates positive relations among different groups on campus. Items include: quality of relationships with faculty, quality of relationships with peers, and the extent to which the campus environment is perceived to provide support to succeed socially and academically.

Supportive Campus Environment
Results indicated that those in the SAGES pilot had significantly higher scores than pre-SAGES students and students in the full-implementation classes. There were no significant differences among the remaining three groups.

Individual Items
SAGES goals focus on classroom participation, developing writing and speaking skills, and academic advising. The following slides examine six individual NSSE items relevant to these goals:
◦ Asked questions in class
◦ Gave a class presentation
◦ My experience at Case has contributed to my ability to write clearly and effectively
◦ My experience at Case has contributed to my ability to speak clearly and effectively
◦ Satisfaction with University administration
◦ Satisfaction with academic advising

Individual Items
For both of these items, those in the SAGES pilot outperformed all other groups; students in the full-implementation classes nonetheless had significantly higher scores than pre-SAGES students.

Individual Items
For both of these items, there were no differences between those in the SAGES pilot and those in the full-implementation classes. Both groups outperformed the other two groups.

Individual Items
For “Satisfaction with Advising,” there were no differences among the groups. For “Satisfaction with Administration,” those in the SAGES pilot had significantly higher scores than all other groups, while those in the full-implementation classes had significantly lower scores than all other groups.

Conclusions – Benchmarks
For three of the four benchmark scales, students in the SAGES pilot significantly outperformed at least one other group. Of the four benchmark scales, only one—Active and Collaborative Learning—significantly increased from pre-SAGES to the full-implementation classes.

Conclusions – Individual items
Students in the full-implementation classes scored significantly higher than pre-SAGES students on a number of items:
◦ Asked questions in class
◦ Gave a class presentation
◦ My experience at Case has contributed to my ability to write clearly and effectively
◦ My experience at Case has contributed to my ability to speak clearly and effectively
There were no group differences in ratings of satisfaction with advising. There was a significant drop in ratings of satisfaction with administration from pre-SAGES to the full-implementation classes.

Limitations
NSSE is a survey of the entire experience at Case, not an assessment of SAGES. These analyses statistically control for first-year expected major only. Significant differences from pre- to post-full-implementation of SAGES can be inferred to be due in part to the change in curriculum, but may also be due to unmeasured—or unmeasurable—influences.

Thank You!
Questions? Concerns?
Contact: Tom Geaghan