
1 Updating the National Survey of Student Engagement: Analyses of the NSSE 2.0 Pilots
Allison BrckaLorenz, Bob Gonyea, Angie Miller

2 Goals and Purposes
– To continue in our core purpose of assessing student engagement in effective educational practices to inform improvement efforts
– To stay current with movements and trends in higher education
– To improve the clarity, consistency, and applicability of the survey
– To improve the properties of existing measures
– To incorporate new measures relevant to effective teaching and learning

3 Pilot Instruments
– 2011: new items about quantitative reasoning, effective teaching practices, collaborative learning, technology, global awareness, diverse perspectives, learning strategies, and reading comprehension
– 2012: 24 items from the original NSSE instrument were deleted and 36 were new; of the items that stayed, a third did not change, a third had minor changes, and a third had major changes

4 Pilot Administrations
Institutions were selected to cover a range of Carnegie types, sizes, selectivity, minority-serving status, religious affiliation, urban status, geographic regions, and online instruction
– 2011: 19 institutions; 20,000 students; average institutional response rate of 35%
– 2012: 55 institutions; 50,000 students; average institutional response rate of 28%

5 Pilot Samples
– Two-thirds women
– Mostly under 24 years old
– Half earning mostly “A” grades
– Two-thirds White
– Nearly all enrolled full-time
– Half first-generation
– More men in business and engineering; more women in education, social sciences, and other professions
– 57% of seniors were transfers in 2011, compared to 45% in 2012

6 Methods: Qualitative
Qualitative information from 2011 and 2012: 120 students in cognitive interviews; 79 students in 10 focus groups at 12 different campuses; phone interviews for specific questions; write-in responses from students completing the pilots; feedback from outside sources and institutional users
– See Using Cognitive Interviews to Improve Survey Instruments, Tuesday 1:55

7 Methods: Individual Items
Item descriptives included frequencies, means, standard deviations, standard errors, skewness, kurtosis, and percent missing
– Calculated by class level, gender, and major
Comparisons between the two pilots, between each pilot and the institution’s last standard administration, and with a co-administration at 7 institutions in 2012 (a sketch of the descriptive calculations appears below)
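As a hedged illustration of the item-level work above, the sketch below uses pandas to compute the listed descriptives; the DataFrame `responses`, the item column names, and the grouping column `class_level` are assumptions for illustration, not the actual NSSE pilot data layout.

```python
# Minimal sketch of per-item descriptives for Likert-coded survey items.
# `responses`, the item column names, and "class_level" are hypothetical.
import pandas as pd

def item_descriptives(df: pd.DataFrame, items: list[str]) -> pd.DataFrame:
    """Mean, SD, SE, skewness, kurtosis, and percent missing for each item."""
    rows = {}
    for item in items:
        col = df[item]
        rows[item] = {
            "n": col.count(),
            "mean": col.mean(),
            "sd": col.std(),
            "se": col.sem(),
            "skewness": col.skew(),
            "kurtosis": col.kurt(),               # excess kurtosis
            "pct_missing": col.isna().mean() * 100,
        }
    return pd.DataFrame(rows).T

# Frequencies per item, then descriptives overall and by subgroup:
# responses["quant_reasoning_1"].value_counts(dropna=False)
# item_descriptives(responses, ["quant_reasoning_1", "learning_strat_2"])
# responses.groupby("class_level").apply(
#     lambda g: item_descriptives(g, ["quant_reasoning_1", "learning_strat_2"]))
```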

8 Methods: Content Areas
Standard NSSE: Level of Academic Challenge; Active and Collaborative Learning; Student-Faculty Interaction; Enriching Educational Experiences; Supportive Campus Environment; Deep Approaches to Learning; Self-Reported Student Gains
Updated NSSE: Academic Challenge; Deep Approaches to Learning; Collaborative Learning; Experiences with Faculty; Diverse Interactions; High-Impact Practices; Campus Environment; Self-Reported Gains

9 Methods: Indicators
– Exploratory factor analysis
– Confirmatory factor analysis
– Aggregate descriptives
– Validity differences by groups (2011)
– Concurrent validity (2011)
– Predictive validity (2011)
– Reliability
– Item response theory
– Generalizability theory (2012); see The Dependability of the New NSSE: A Generalizability Study, Monday 2:15
A sketch of the reliability and factor-analysis steps appears after this list.
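As a hedged illustration of the indicator-level analyses, the sketch below computes Cronbach's alpha by hand for a set of items assumed to form a single indicator and notes how an exploratory factor analysis could be run on the same matrix; the DataFrame `items` and the use of the factor_analyzer package are assumptions, not the project's actual code.

```python
# Hedged sketch: internal consistency for items assumed to form one indicator.
# `items` is a hypothetical (n_students x k_items) DataFrame with complete data;
# real pilot data would need listwise deletion or imputation first.
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Exploratory factor analysis on the same item matrix, assuming the
# factor_analyzer package is available:
# from factor_analyzer import FactorAnalyzer
# fa = FactorAnalyzer(n_factors=2, rotation="oblimin")
# fa.fit(items)
# print(fa.loadings_)
```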

10 Results: Content Areas & Indicators
– Academic Challenge: Quantitative Reasoning; Learning Strategies
– Deep Approaches to Learning: Higher Order Learning; Reflective and Integrative Learning
– Collaborative Learning: Collaborative Learning
– Experiences with Faculty: Student-Faculty Interaction; Good Teaching Practices
– Diverse Interactions: Interactions with Diverse Others
– Campus Environment: Quality of Interactions; Campus Support
– Student-Reported Gains: Student-Reported Gains
– High-Impact Practices: Individual items

11 Academic Challenge: Quantitative Reasoning (QR), Learning Strategies (LS)
Analyses: Item Descriptives; Item Version Comparisons; Qualitative Information; Exploratory Factor Analysis; Confirmatory Factor Analysis; Aggregate Descriptives; Internal Consistency Reliability; Item Response Theory; Generalizability Theory; Appropriate for Online Students?
Content: writing, reading, quantitative reasoning, use of learning strategies, perception of challenging coursework, time spent preparing for class
Possible future indices of writing and challenge
Generalizability issues emphasize the importance of looking within

12 Deep Approaches to Learning: Higher Order Learning (HOL), Reflective and Integrative Learning (RIL)
Analyses: Item Descriptives; Item Version Comparisons; Qualitative Information; Exploratory Factor Analysis; Confirmatory Factor Analysis; Aggregate Descriptives; Internal Consistency Reliability; Item Response Theory; Generalizability Theory; Appropriate for Online Students?
Content: integrating diverse perspectives, reflection on understandings, higher-order tasks such as application or evaluation
Content area likely to merge with Academic Challenge in the future

13 Collaborative Learning (CL)
Analyses: Item Descriptives; Item Version Comparisons; Qualitative Information; Exploratory Factor Analysis; Confirmatory Factor Analysis; Aggregate Descriptives; Internal Consistency Reliability; Item Response Theory; Generalizability Theory; Appropriate for Online Students?
Content: working with peers, helping peers, receiving help from peers
Results from the 2011 pilot showed large differences for online students
2012 results showed that these items are appropriate for online students, even though they collaborate less with peers

14 Experiences with Faculty: Student-Faculty Interaction (SFI), Good Teaching Practices (GTP)
Analyses: Item Descriptives; Item Version Comparisons; Qualitative Information; Exploratory Factor Analysis; Confirmatory Factor Analysis; Aggregate Descriptives; Internal Consistency Reliability; Item Response Theory; Generalizability Theory; Appropriate for Online Students?
Content: instructors’ use of clear teaching behaviors, faculty mentoring, working with faculty outside of class, in-class interactions with faculty
Online students report fewer experiences with faculty, but the items are still appropriate for online learners
Some issues with part-time and full-time students answering “In how many of your courses,” so items will be reframed in 2013

15 Diverse Interactions (DI)
Analyses: Item Descriptives; Item Version Comparisons; Qualitative Information; Exploratory Factor Analysis; Confirmatory Factor Analysis; Aggregate Descriptives; Internal Consistency Reliability; Item Response Theory; Generalizability Theory; Appropriate for Online Students?
Content: having serious discussions with people who are different from you
Qualitative issues: see Using Cognitive Interviews to Improve Survey Instruments, Tuesday 1:55
Items rewritten for clarity in 2013

16 High-Impact Practices (HIP)
Analyses: Item Descriptives; Item Version Comparisons; Qualitative Information
Students’ participation in, or plans to participate in, a variety of high-impact educational experiences:
– Learning community
– Internship
– Study abroad
– Research with faculty
– Culminating senior experiences
– Service learning
– Formal leadership experiences

17 Campus Environment: Quality of Interactions (QOI), Campus Support (CS)
Analyses: Item Descriptives; Item Version Comparisons; Qualitative Information; Exploratory Factor Analysis; Confirmatory Factor Analysis; Aggregate Descriptives; Internal Consistency Reliability; Item Response Theory; Generalizability Theory; Appropriate for Online Students?
Content: perceptions of the quality of interactions with various people on campus; perceptions of the different ways their institution supports success or encourages beneficial activities
Small differences for online students, but items are still appropriate

18 Self-Reported Gains (SRG)
Analyses: Item Descriptives; Item Version Comparisons; Qualitative Information; Exploratory Factor Analysis; Confirmatory Factor Analysis; Aggregate Descriptives; Internal Consistency Reliability; Item Response Theory; Generalizability Theory; Appropriate for Online Students?
Content: students’ general perception of their learning in a variety of areas
This diverse grouping of items should not be interpreted as a unidimensional construct
An item from the 2011 pilot about becoming an active and informed citizen was removed in 2012 but added to the 2013 survey

19 Looking Ahead
– Updated survey content with both new and modified items
– New groupings of items to serve as indicators of engagement
– New items within optional modules: Academic Advising, Civic Engagement, Development of Transferable Skills, Experiences with Diverse Perspectives, Learning with Technology, Experiences with Writing

20 Questions?
Paper, presentation, and more information about NSSE at nsse.iub.edu
Contact: abrckalo@indiana.edu, rgonyea@indiana.edu, anglmill@indiana.edu
Special thanks to our research team: Jim Cole, Yiran Dong, Kevin Fosnacht, Kevin Guidry, Heather Haeger, Amber D. Lambert, Thomas Nelson Laird, Wen Qi, Amy Ribera, Louis Rocconi, Shimon Sarraf, Rick Shoup, Malika Tukibayeva

