Are Those Rose-Colored Glasses You Are Wearing?: Student and Alumni Survey Responses. Amber D. Lambert, Ph.D. & Angie L. Miller, Ph.D. Association for the Study of Higher Education.


Are Those Rose-Colored Glasses You Are Wearing?: Student and Alumni Survey Responses Amber D. Lambert, Ph.D. Angie L. Miller, Ph.D. Association for the Study of Higher Education 38th Annual Conference November 16th, 2013

Introduction & Literature Review
Surveys are a common means of assessment in higher education (Kuh & Ikenberry, 2009)
Student surveys are conducted on a variety of topics, from student engagement to use of campus resources to faculty evaluations (Kuh & Ewell, 2010)
Alumni surveys are used to gather information about satisfaction, acquired skills, and career attainment (Cabrera, Weerts, & Zulick, 2005)

Introduction & Literature Review
Institutions claim to prepare students with a multitude of transferable skills, in addition to pure content knowledge from the major (Tait & Godfrey, 1999), such as effective communication and analytical & creative thinking
AAC&U has recently addressed many of these types of skills as essential learning outcomes for higher education
Mastery of these skills should lead to success in the workplace (Stasz, 2001)

Research Questions
Question 1: Are there differences in how students and alumni perceive aspects of their institutional experiences and the skills and competencies that they acquire at their institutions?
Question 2: Do alumni evaluate their institutions with rose-colored glasses, or do they evaluate their education more harshly once they gain a more practical knowledge of the working world?
Question 3: Finally, if differences between students and alumni do exist, whose account should be given precedence?

Methodology
This study uses data from the 2011 Strategic National Arts Alumni Project (SNAAP) and the 2012 National Survey of Student Engagement (NSSE)
What is SNAAP?
An online annual survey of arts graduates
Investigates educational experiences and career paths
Provides findings to educators and policymakers to improve arts training, inform cultural policy, and support artists

Who does SNAAP Survey?
Graduates of:
Arts schools, departments, or programs in colleges and universities
Independent arts colleges
Arts high schools
Both graduate and undergraduate degree recipients
All arts disciplines

SNAAP Questionnaire Topics
1. Formal education and degrees
2. Institutional experience and satisfaction
3. Postgraduate resources for artists
4. Career
5. Arts engagement
6. Income and debt
7. Demographics

SNAAP 2011 Administration Information
Administered in Fall 2011
66 participating institutions: 58 postsecondary and 8 high schools
Over 36,000 total respondents

What is NSSE?
National Survey of Student Engagement
NSSE gives a snapshot of college student experiences in and outside of the classroom by surveying first-year and senior students
NSSE items represent good practices related to desirable college outcomes
Indirect, process measures of student learning and development

NSSE 2012 Administration Information
Administered in Spring 2012 at participating U.S. institutions
Over 285,000 total respondents
Each year, experimental item sets are appended at the end of the core survey

Methodology: Sample
For this study, participants came from 6 institutions that participated in both the SNAAP 2011 and NSSE 2012 administrations
Senior NSSE respondents from arts majors in corresponding SNAAP participating programs (n = 222)
Alumni of undergraduate SNAAP programs from graduating cohorts (n = 593)

Variables: SNAAP items

Variables: NSSE items

Data Analysis
Analysis of covariance (ANCOVA) was conducted to determine whether differences in reported satisfaction and skill development exist between graduating seniors and alumni
Control variables included gender, race, U.S. citizenship status, and first-generation status
Adjusted means and statistical significance
Cohen's d as measure of effect size
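The effect-size measure named above can be illustrated with a short sketch. This is a minimal, self-contained example of computing Cohen's d (the standardized mean difference with a pooled standard deviation) for two groups of 4-point ratings; the data below are invented for illustration and are not the study's, and the study's actual analysis compared ANCOVA-adjusted means rather than the raw group means used here.

```python
import math

def cohens_d(group1, group2):
    """Cohen's d: difference in group means divided by the pooled SD."""
    n1, n2 = len(group1), len(group2)
    m1 = sum(group1) / n1
    m2 = sum(group2) / n2
    # Bessel-corrected sample variances for each group
    v1 = sum((x - m1) ** 2 for x in group1) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in group2) / (n2 - 1)
    # Pooled standard deviation weights each variance by its degrees of freedom
    pooled_sd = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

# Hypothetical 4-point satisfaction ratings (illustrative only)
students = [3, 4, 3, 2, 4, 3]
alumni = [3, 3, 2, 2, 3, 2]

# Negative d means the first group (alumni) rated the item lower,
# matching the sign convention in the results tables
print(round(cohens_d(alumni, students), 2))  # → -1.01
```

Under this sign convention, the negative d values in the results that follow indicate alumni rating an aspect lower than seniors, and positive values indicate alumni rating it higher.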

Results
Adjusted means comparison for overall rating of institutional experience (4-point scale from "Poor" to "Excellent") suggests that alumni give higher general appraisals
Overall experience: sig. *, effect size (d) = .17
*p<.05; **p<.01; ***p<.001

Results (cont.)
Adjusted means comparisons for satisfaction with aspects of time at institution (4-point scale after removing the "Not Relevant" option) suggest that alumni give lower specific appraisals for certain aspects
Academic advising: sig. *, effect size (d) = -.16
Career advising: sig. ***, effect size (d) = -.35
Opportunities for internships: sig. **, effect size (d) = -.27
*p<.05; **p<.01; ***p<.001

Results (cont.) Adjusted means comparisons for amount of institutional contribution to acquired skills and competencies (4-point scale from “Not at all” to “Very much”) show a similar pattern, with alumni giving lower specific appraisals for certain skills

Research skills: sig. **, effect size (d) = -.23
Clear writing: sig. ***, effect size (d) = -.30
Persuasive speaking: sig. *, effect size (d) = -.21
Project management: sig. *, effect size (d) = -.21
Financial & business: sig. ***, effect size (d) = -.38
Entrepreneurial: sig. **, effect size (d) = -.27
Leadership skills: sig. *, effect size (d) = -.21
Networking: sig. ***, effect size (d) = -.28
*p<.05; **p<.01; ***p<.001

Discussion Alumni may be viewing their institutional experience as a whole through rose-colored glasses when they think about “the good old days” However, when considering more nuanced aspects of their educational experiences, alumni perceptions may have a more lackluster pallor

Discussion
Post-graduation experiences in the workplace may better enable alumni to reflect on certain aspects of their time at the institution
Alumni were less satisfied than graduating seniors in the areas of academic advising, career advising, and opportunities for internships or degree-related work
It may be the case that, as students, respondents do not realize the need for better advising or internships until they enter the workforce

Discussion
Alumni may also learn that they needed to develop some skills more once they have gained work experience
Writing, speaking, networking, and leadership are important aspects of communication that may be experienced differently in an applied setting, such as the workplace, in comparison to a classroom situation
Some task-based procedural skills like research, project management, finance, and entrepreneurship may also be more completely understood and valued once an individual transitions from student to employee
It is also possible that once alumni enter the workforce, they reference their skill levels in comparison with colleagues who are quite advanced in these skills after years (or decades) of actual use

Limitations
Effect sizes are small in magnitude
May not represent ALL students and alumni; data only available for those participating in both SNAAP and NSSE (and those receiving the experimental NSSE items)
Relies on self-reported data; however, most studies looking at student self-reports in higher education suggest that self-reports and actual abilities are positively related (Anaya, 1999; Hayek et al., 2002; Laing et al., 1988; Pace, 1985; Pike, 1995)

Conclusions
Important institutional information can be gained through surveying both students and alumni
Students may be better able to provide information about affective components of their experience, while alumni may be better judges of the specific skills needed in the workplace
Being closer in time to the experience may have an advantage in terms of memory accuracy, but temporal distance may have the advantage of reflective insight

References
Anaya, G. (1999). College impact on student learning: Comparing the use of self-reported gains, standardized test scores, and college grades. Research in Higher Education, 40.
Cabrera, A. F., Weerts, D. J., & Zulick, B. J. (2005). Making an impact with alumni surveys. New Directions for Institutional Research, 2005. doi: /ir.144
Hayek, J. C., Carini, R. M., O'Day, P. T., & Kuh, G. D. (2002). Triumph or tragedy: Comparing student engagement levels of members of Greek-letter organizations and other students. Journal of College Student Development, 43(5).
Kuh, G. D., & Ewell, P. T. (2010). The state of learning outcomes assessment in the United States. Higher Education Management and Policy, 22(1).
Kuh, G. D., & Ikenberry, S. O. (2009). More than you think, less than we need: Learning outcomes assessment in American higher education. Urbana, IL: University of Illinois and Indiana University, National Institute of Learning Outcomes Assessment.
Laing, J., Swayer, R., & Noble, J. (1989). Accuracy of self-reported activities and accomplishments of college-bound seniors. Journal of College Student Development, 29(4).
Pace, C. R. (1985). The credibility of student self-reports. Los Angeles: The Center for the Study of Evaluation, Graduate School of Education, University of California at Los Angeles.
Pike, G. R. (1995). The relationship between self-reports of college experiences and achievement test scores. Research in Higher Education, 36(1).
Stasz, C. (2001). Assessing skills for work: Two perspectives. Oxford Economic Papers, 3.
Tait, H., & Godfrey, H. (1999). Defining and assessing competence in generic skills. Quality in Higher Education, 5(3).

Questions or Comments?
Contact Information:
Amber D. Lambert
Angie L. Miller
Strategic National Arts Alumni Project
National Survey of Student Engagement