Updating the National Survey of Student Engagement: Analyses of the NSSE 2.0 Pilots Allison BrckaLorenz Bob Gonyea Angie Miller.

Similar presentations
Week 6: Differentiated instruction, engaged learning, teaching for understanding, community of inquiry.

Student Experiences with Information Technology and their Relationship to Other Aspects of Student Engagement Thomas F. Nelson Laird and George D. Kuh.
Frameworks for Assessment of Student Learning: Questions of Concept, Practice, and Detail Christine Siegel, Ph.D. Associate Professor of School Psychology.
ANDREA BROWN DIRECTOR, PROGRAM ASSESSMENT AND INSTITUTIONAL RESEARCH How to Use Student Survey Data (NSSE) to Improve Your Teaching.
Now That They Stay, What Next?: Using NSSE Results to Enhance the Impact of the Undergraduate Experience.
2008 National Survey of Student Engagement – SUNY Oneonta Patty Francis Steve Perry Fall 2008.
Maximizing Your NSSE & CCSSE Results
Gary Whisenand Director, Institutional Research August 26, 2011.
Prepared by: Fawn Skarsten Director Institutional Analysis.
DATA UPDATES FACULTY PRESENTATION September 2009.
Gallaudet University Results on National Survey of Student Engagement Office of Institutional Research August, 2007.
2012 National Survey of Student Engagement Jeremy D. Penn & John D. Hathcoat.
Shimon Sarraf, Research Analyst Center for Postsecondary Research, Indiana University Bloomington Session for NSSE “Veterans” Regional NSSE User’s Workshop.
Students who are… …engaged in the classroom – pass; …engaged in their academic program - return; …engaged in deep learning – graduate. What constitutes.
NSSE 2014: Accolades and Action Items Faculty Senate Nov. 20, 2014 Patrick Barlow, Ph.D., Assessment Coordinator.
Faculty Lend a Helping Hand to Student Success: Measuring Student-Faculty Interactions Amber D. Lambert, Ph.D. Louis M. Rocconi, Ph.D. Amy K. Ribera, Ph.D.
Global Awareness and Student Engagement Allison BrckaLorenz Jim Gieser Program presented at the 2011 ASHE Annual Conference in Charlotte, North Carolina.
Basic Reports and Data Dissemination Strategies Regional Users’ Workshop October 6-7, 2005.
Urban Universities: Student Characteristics and Engagement Donna Hawley Martha Shawver.
Faculty Survey of Student Engagement Using What Faculty Say about Improving Their Teaching Thomas F. Nelson Laird, IUB Jennifer Buckley, IUB Megan Palmer,
Student Learning Assessment Assessment is an ongoing process aimed at understanding & improving student learning Formative Assessment – Ongoing feedback.
Shimon Sarraf Center for Postsecondary Research, Indiana University Bloomington Using NSSE to Answer Assessment Questions Regional User’s Workshop October.
Visioning and Fostering Quality Teaching and Learning in Higher Education in Ontario Council of Ontario Educational Developers: Judy Britnell (Ryerson)
INACOL National Standards for Quality Online Teaching, Version 2.
Faculty Survey of Student Engagement Getting Faculty Involved in the Student Engagement Conversation: The Faculty Survey of Student Engagement Thomas.
Want to be first in your CLASSE? Investigating Student Engagement in Your Courses.
2008 – 2014 Results Chris Willis East Stroudsburg University Office of Assessment and Accreditation Spring 2015
Report of the Results of the Faculty Survey of Student Engagement William E. Knight and Jie Wu Office of Institutional Research Presentation to the Faculty.
St. Petersburg College CCSSE 2011 Findings Board of Trustees Meeting.
BCSSE 2013 Institutional Report Concordia University Chicago BCSSE 2013 Institutional Report Concordia University Chicago Elizabeth Owolabi, Ph.D. Director.
Results of AUC’s NSSE Administration in 2011 Office of Institutional Research February 9, 2012.
NSSE – Results & Connections Institutional Research & Academic Resources California State Polytechnic University, Pomona October 2, 2013 – Academic Senate.
Presentation of Results NSSE 2003 Florida Gulf Coast University Office of Planning and Institutional Performance.
Selected Results of NSSE 2003: University of Kentucky December 3, 2003.
Mountain View College Spring 2008 CCSSE Results Community College Survey of Student Engagement 2008 Findings.
MARTIN COMMUNITY COLLEGE ACHIEVING THE DREAM COMMUNITY COLLEGES COUNT IIPS Conference Charlotte, North Carolina July 24-26, 2006 Session: AtD – Use of.
Results from The College at Brockport 2014 NSSE Survey Presentation to President’s Advisory Council– 3/4/15.
An Introduction: NSSE and the Concept of Student Engagement.
2009 National Survey of Student Engagement (NSSE) Report Institutional Research & Information November 18, 2009.
Before & After: What Undergraduates and Alumni Say About Their College Experience and Outcomes Angie L. Miller, NSSE & SNAAP Research Analyst Amber D.
Gallaudet Institutional Research Report: National Survey of Student Engagement Pat Hulsebosch: Executive Director – Office of Academic Quality Faculty.
APSU 2009 National Survey of Student Engagement Patricia Mulkeen Office of Institutional Research and Effectiveness.
Results from the National Survey of Student Engagement Margaret Harrigan Office of Academic Planning and Analysis University of Wisconsin-Madison.
Assessing SAGES with NSSE data Office of Institutional Research September 25 th, 2007.
ESU’s NSSE 2013 Overview Joann Stryker Office of Institutional Research and Assessment University Senate, March 2014.
National Survey of Student Engagement 2009 Missouri Valley College January 6, 2010.
BEAMS – Using NSSE Data: Understanding the Benchmark Reports.
This CCFSSE Drop-In Overview Presentation Template can be customized using your college’s CCFSSE/CCSSE results. Please review the “Notes” section accompanying.
NSSE 2005 CSUMB Report California State University at Monterey Bay Office of Institutional Effectiveness Office of Assessment and Research.
Looking Inside The “Oakland Experience” Another way to look at NSSE Data April 20, 2009.
Highlights of NSSE 2001: University of Kentucky December 10, 2001.
Student Engagement and Academic Performance: Identifying Effective Practices to Improve Student Success Shuqi Wu Leeward Community College Hawaii Strategy.
NSSE Working Student Study Assessment Day Presentation Office of Assessment Fitchburg State College.
UNDERSTANDING 2012 NATIONAL SURVEY OF STUDENT ENGAGEMENT (NSSE) RESULTS Nicholls State University October 17, 2012.
RESULTS OF THE 2009 ADMINISTRATION OF THE COMMUNITY COLLEGE SURVEY OF STUDENT ENGAGEMENT Office of Institutional Effectiveness, April 2010.
CCSSE 2014 Findings Southern Crescent Technical College.
Center for Institutional Effectiveness LaMont Rouse, Ph.D. Fall 2015.
RESULTS OF THE 2009 ADMINISTRATION OF THE COMMUNITY COLLEGE SURVEY OF STUDENT ENGAGEMENT Office of Institutional Effectiveness, September 2009.
Office of Institutional Research and Effectiveness The University of Texas-Pan American National Survey of Student Engagement 2003, 2004, 2005, 2006.
The University of Texas-Pan American Susan Griffith, Ph.D. Executive Director National Survey of Student Engagement 2003 Results & Recommendations Presented.
The University of Texas-Pan American National Survey of Student Engagement 2013 Presented by: November 2013 Office of Institutional Research & Effectiveness.
National Survey of Student Engagement Executive Snapshot 2007.
The University of Texas-Pan American National Survey of Student Engagement 2014 Presented by: October 2014 Office of Institutional Research & Effectiveness.
Pam Arroway Allison BrckaLorenz Kevin Guidry
The University of Texas-Pan American
2017 National Survey of Student Engagement (NSSE)
Derek Herrmann & Ryan Smith University Assessment Services
2013 NSSE Results.
Presentation transcript:

Updating the National Survey of Student Engagement: Analyses of the NSSE 2.0 Pilots Allison BrckaLorenz Bob Gonyea Angie Miller

Goals and Purposes
– To continue our core purpose of assessing student engagement in effective educational practices to inform improvement efforts
– To stay current with movements and trends in higher education
– To improve the clarity, consistency, and applicability of the survey
– To improve the properties of existing measures
– To incorporate new measures relevant to effective teaching and learning

Pilot Instruments
– 2011: new items about quantitative reasoning, effective teaching practices, collaborative learning, technology, global awareness, diverse perspectives, learning strategies, and reading comprehension
– 2012: 24 items from the original NSSE instrument were deleted and 36 new items were added; of the items that remained, a third did not change, a third had minor changes, and a third had major changes

Pilot Administrations
Institutions were selected to cover a range of Carnegie types, sizes, selectivity levels, minority-serving statuses, religious affiliations, urban settings, geographic regions, and levels of online instruction.
– 2011: 19 institutions; 20,000 students; average institutional response rate of 35%
– 2012: 55 institutions; 50,000 students; average institutional response rate of 28%
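The response rates above are averages across institutions, which can differ noticeably from a pooled student-level rate. A minimal sketch with hypothetical counts (illustrative only; not the actual pilot figures):

```python
# Hypothetical per-institution counts (illustrative only; not the
# actual NSSE pilot figures).
institutions = {
    "Campus A": {"invited": 4000, "responded": 800},   # 20% response rate
    "Campus B": {"invited": 1000, "responded": 500},   # 50% response rate
}

# Unweighted average of institutional response rates
rates = [i["responded"] / i["invited"] for i in institutions.values()]
institutional_avg = sum(rates) / len(rates)

# Pooled student-level rate, which weights larger campuses more heavily
pooled = sum(i["responded"] for i in institutions.values()) / sum(
    i["invited"] for i in institutions.values()
)

print(f"institutional average: {institutional_avg:.0%}")  # 35%
print(f"pooled student-level rate: {pooled:.0%}")         # 26%
```

The gap between the two statistics is why reports of this kind usually state explicitly that the figure is an institutional average.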

Pilot Samples
– Two-thirds women
– Mostly under 24 years old
– Half earning mostly “A” grades
– Two-thirds White
– Nearly all enrolled full time
– Half first-generation
– More men in business and engineering; more women in education, social sciences, and other professions
– 57% of seniors were transfer students in 2011, compared to 45% in 2012

Methods: Qualitative
In 2011 and 2012, qualitative information came from:
– 120 students in cognitive interviews
– 79 students in 10 focus groups at 12 different campuses
– Phone interviews for specific questions
– Write-in responses from students completing the pilots
– Feedback from outside sources and institutional users
Related session: Using Cognitive Interviews to Improve Survey Instruments, Tuesday 1:55

Methods: Individual Items
Item descriptives included frequencies, means, standard deviations, standard errors, skewness, kurtosis, and percent missing, calculated by class level, gender, and major. Comparisons were made between the two pilots, between each pilot and the institution’s last standard administration, and within a co-administration at 7 institutions in 2012.
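A minimal sketch of how such item descriptives might be computed with pandas; the column names and responses are hypothetical, and this is not the actual pilot analysis code:

```python
import pandas as pd

# Hypothetical responses to one survey item on a 1-4 scale (None = missing);
# the real pilot data are not public.
df = pd.DataFrame({
    "class_level": ["FY", "FY", "FY", "SR", "SR", "SR"],
    "item": [1, 4, 2, 3, None, 4],
})

def descriptives(s: pd.Series) -> dict:
    """Summary statistics for one item (frequencies would be s.value_counts())."""
    return {
        "mean": s.mean(),
        "sd": s.std(),                     # sample standard deviation
        "se": s.sem(),                     # standard error of the mean
        "skew": s.skew(),
        "kurtosis": s.kurt(),              # excess kurtosis
        "pct_missing": s.isna().mean() * 100,
    }

# Overall, then broken down by group as described above
overall = descriptives(df["item"])
by_level = {level: descriptives(g) for level, g in df.groupby("class_level")["item"]}
print(overall)
```

The same breakdown would be repeated for gender and major, and the grouped results compared across pilot years and administrations.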

Methods: Content Areas
Standard NSSE:
– Level of Academic Challenge
– Active and Collaborative Learning
– Student-Faculty Interaction
– Enriching Educational Experiences
– Supportive Campus Environment
– Deep Approaches to Learning
– Self-Reported Student Gains
Updated NSSE:
– Academic Challenge
– Deep Approaches to Learning
– Collaborative Learning
– Experiences with Faculty
– Diverse Interactions
– High-Impact Practices
– Campus Environment
– Self-Reported Gains

Methods: Indicators
– Exploratory factor analysis
– Confirmatory factor analysis
– Aggregate descriptives
– Validity differences by groups (2011)
– Concurrent validity (2011)
– Predictive validity (2011)
– Reliability
– Item response theory
– Generalizability theory (2012)
Related session: The Dependability of the New NSSE: A Generalizability Study, Monday 2:15
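Of the reliability checks listed above, internal consistency is commonly summarized with Cronbach's alpha, α = k/(k−1) · (1 − Σ item variances / variance of total scores). A minimal sketch with made-up responses (not the pilot analysis code):

```python
import statistics

def cronbach_alpha(items):
    """Cronbach's alpha for a set of item-response columns.

    items: list of lists, one inner list per item, all the same length
    (one entry per respondent). Sample variances are used throughout.
    """
    k = len(items)
    n = len(items[0])
    item_vars = [statistics.variance(col) for col in items]
    totals = [sum(col[i] for col in items) for i in range(n)]
    total_var = statistics.variance(totals)
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Three items that largely agree across five respondents -> high alpha
items = [
    [1, 2, 3, 4, 4],
    [1, 2, 3, 3, 4],
    [2, 2, 3, 4, 4],
]
print(round(cronbach_alpha(items), 3))  # 0.969
```

In practice listwise or pairwise handling of missing responses, and per-group alphas (by class level, mode of enrollment, and so on), would be layered on top of this core formula.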

Results: Content Areas & Indicators
– Academic Challenge: Quantitative Reasoning; Learning Strategies
– Deep Approaches to Learning: Higher-Order Learning; Reflective and Integrative Learning
– Collaborative Learning: Collaborative Learning
– Experiences with Faculty: Student-Faculty Interaction; Good Teaching Practices
– Diverse Interactions: Interactions with Diverse Others
– Campus Environment: Quality of Interactions; Campus Support
– Student-Reported Gains: Student-Reported Gains
– High-Impact Practices: individual items
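Indicators like these are typically reported as scale scores built from their component items. The sketch below assumes a simple rescale-and-average scheme; the rescaling values, minimum-item rule, and all other details are hypothetical, and NSSE's production scoring may differ:

```python
# Hypothetical: rescale 1-4 responses onto a wider range and average the
# items a student answered, requiring a minimum number of answered items.
RESCALE = {1: 0, 2: 20, 3: 40, 4: 60}

def indicator_score(responses, min_items=2):
    """Score one student's responses to an indicator's items (None = missing)."""
    answered = [RESCALE[r] for r in responses if r is not None]
    if len(answered) < min_items:
        return None  # too much missing data to score this student
    return sum(answered) / len(answered)

print(indicator_score([1, 3, None, 4]))  # mean of 0, 40, and 60
print(indicator_score([None, None, 2]))  # None: only one item answered
```

Institution-level results would then aggregate these student scores within each indicator.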

Academic Challenge: Quantitative Reasoning, Learning Strategies
Evaluated (as was each indicator below) with item descriptives, item version comparisons, qualitative information, exploratory and confirmatory factor analysis, aggregate descriptives, internal consistency reliability, item response theory, generalizability theory, and a check of appropriateness for online students.
– Covers writing, reading, quantitative reasoning, use of learning strategies, perception of challenging coursework, and time spent preparing for class
– Indices of writing and challenge may be added in the future
– Generalizability issues emphasize the importance of looking within institutions

Deep Approaches to Learning: Higher-Order Learning, Reflective and Integrative Learning
– Covers integrating diverse perspectives, reflection on one’s understandings, and higher-order tasks such as application or evaluation
– This content area is likely to merge with Academic Challenge in the future

Collaborative Learning
– Covers working with peers, helping peers, and receiving help from peers
– The 2011 pilot showed large differences for online students
– The 2012 results showed that these items are appropriate for online students, even though online students collaborate less with peers

Experiences with Faculty: Student-Faculty Interaction, Good Teaching Practices
– Covers instructors’ use of clear teaching behaviors, faculty mentoring, working with faculty outside of class, and in-class interactions with faculty
– Online students report fewer experiences with faculty, but the items are still appropriate for online learners
– Part- and full-time students had some issues answering “In how many of your courses,” so these items will be reframed in 2013

Diverse Interactions
– Covers having serious discussions with people who are different from you
– Items were rewritten for clarity in 2013
Related session on the qualitative issues: Using Cognitive Interviews to Improve Survey Instruments, Tuesday 1:55

High-Impact Practices
Evaluated with item descriptives, item version comparisons, and qualitative information.
Covers students’ participation in, or plans to participate in, a variety of high-impact educational experiences:
– Learning community
– Internship
– Study abroad
– Research with faculty
– Culminating senior experiences
– Service learning
– Formal leadership experiences

Campus Environment: Quality of Interactions, Campus Support
– Covers perceptions of the quality of interactions with various people on campus and of the different ways the institution supports success or encourages beneficial activities
– Small differences for online students, but the items are still appropriate

Self-Reported Gains
– Covers students’ general perception of their learning in a variety of areas
– This diverse grouping of items should not be interpreted as a unidimensional construct
– An item from the 2011 pilot about becoming an active and informed citizen was removed in 2012 but added back to the 2013 survey

Looking Ahead
– Updated survey content with both new and modified items
– New groupings of items to serve as indicators of engagement
– New items within optional modules: Academic Advising, Civic Engagement, Development of Transferable Skills, Experiences with Diverse Perspectives, Learning with Technology, Experiences with Writing

Questions?
Paper, presentation, and more information about NSSE are available at nsse.iub.edu.
Special thanks to our research team: Jim Cole, Yiran Dong, Kevin Fosnacht, Kevin Guidry, Heather Haeger, Amber D. Lambert, Thomas Nelson Laird, Wen Qi, Amy Ribera, Louis Rocconi, Shimon Sarraf, Rick Shoup, and Malika Tukibayeva.