
Slide 1: 2007 NASPA Assessment & Retention Conference
Different Perspectives on the Assessment Mandate: The Results of a Survey
Neil Pagano, Associate Dean, Columbia College Chicago
npagano@colum.edu

Slide 2: 139th Belmont Stakes

Slide 3: Question Posted to the Assess Listserv (2004)
"Is there any evidence that a higher education assessment/evaluation of student learning program has indeed produced positive (or negative) change in the quantity or quality of what students actually learn? There seems to be a lot of anecdotal information about how assessment/evaluation programs were created and implemented, but not any actual empirical support. Considering the logistical and personnel-related ramifications of such an undertaking, any success in getting a program off the ground and moving is certainly noteworthy and to be commended. However, I am trying to prepare a report on the actual effectiveness of assessment/evaluation programs. Is there a program that is doing what it is purported to do: improving student learning? If so, is there any (weak, so-so, or solid) empirical evidence to this effect? Any good studies?"

Slide 4: Two Responses
- "Assessment done well is effective and assessment done poorly is not effective."
- Three possible explanations:
  1. Assessment is still relatively new
  2. Assessment is "decentralized or course-embedded"
  3. "Faculty already do a darned good job teaching, and their assessment results simply document that."

Slide 5: Prior Research
Peterson, Einarson, Augustine, & Vaughan (1999), Institutional Support for Student Assessment: Methodology and Results of a National Survey
Survey (ISSA): Purposes, Methods, Structures, & Impact
– Preparing for self-study or accreditation (1st in importance)
– Improving the achievement of undergraduate students (2nd in importance)

Slide 6: Conclusions
"Institutions do not routinely use student assessment data in internal decision-making or monitor its impact on important areas of institutional and student performance. Given the extensive claims made for the value of student assessment and the substantial human and financial resources invested in student assessment activities, institutions need to give greater priority to examining how student assessment data is used, and how it impacts the performance of individual students and the institution itself."

Slide 7: Follow-Up Study
Peterson, Vaughan, & Perorazio (2002), Student Assessment in Higher Education: A Comparative Study of Seven Institutions
– "Exemplary" institutions selected for "benchmarking"
– Ten domains, including Initiating Conditions, Institutional Approach, Culture, and Utilization
– Only one institution (Wake Forest University) used assessment results "extensively"

Slide 8: Research Questions
1. What are the reasons for undertaking assessment?
2. What assessment methods are used, and which are valued?
3. How effective have these assessment efforts been?
4. What variables (institution type, control, respondent position) impact responses to Questions 1, 2, and 3?

Slide 9: Survey: Four Sections
I. Purpose
II. Methods Used
III. Methods Valued
IV. Effect of Assessment Efforts

Slide 10: Survey Distribution and Responses
– Two listservs: Assess (University of Kentucky) and Communities of Practice
– Snowball sampling for further coverage
– 331 total completed responses

Slide 11: Limitations
– Sampling method not random; the purposive method was likely to recruit the "choir"
– Mixture of respondents, some from the same institution
– Basic statistical analysis only: ANOVA, t-tests, and chi-square
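For readers unfamiliar with these three tests, here is a minimal sketch of how such analyses might be run in Python with pandas and scipy. This is an illustration only, not the author's actual analysis: the file name and every column name (institution_type, control, position, purpose_accreditation, values_capstone_courses) are hypothetical placeholders, not the survey's real variables.

```python
import pandas as pd
from scipy import stats

# Hypothetical data file; the column names used below are placeholders.
df = pd.read_csv("survey_responses.csv")

# One-way ANOVA: does the mean rating of a purpose differ across institution types?
groups = [g["purpose_accreditation"].dropna()
          for _, g in df.groupby("institution_type")]
f_stat, p_anova = stats.f_oneway(*groups)

# Independent-samples t-test: e.g., public vs. private control on the same rating.
public = df.loc[df["control"] == "public", "purpose_accreditation"].dropna()
private = df.loc[df["control"] == "private", "purpose_accreditation"].dropna()
t_stat, p_ttest = stats.ttest_ind(public, private)

# Chi-square test of independence: is valuing a method related to position?
table = pd.crosstab(df["position"], df["values_capstone_courses"])
chi2, p_chi2, dof, expected = stats.chi2_contingency(table)

print(f"ANOVA p = {p_anova:.3f}; t-test p = {p_ttest:.3f}; chi-square p = {p_chi2:.3f}")
```

A p-value below .05 is the conventional significance threshold assumed in the chi-square results reported on the slides that follow.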

Slide 12: Survey Respondents by Institutional Type and Position

Slide 13: Purposes of Assessment by Institution Type – ANOVA

Slide 14: Purposes of Assessment by Position – ANOVA

Slide 15: Purposes: A Comparison to the ISSA

Slide 16: Assessment Methods Used
Scale: 1 = not used; 2 = used in some areas; 3 = used in most areas; 4 = used in all areas

Slide 17: Assessment Methods Used by Institution – ANOVA

Slide 18: Assessment Methods Used by Institution – ANOVA (cont.)

Slide 19: Assessment Methods Valued

Slide 20: Methods Used vs. Methods Valued

Slide 21: Methods Valued by Institution Type – Chi-Square
Associate institutions placed less value on:
– Alumni surveys (p = .008)
– Capstone courses (p = .002)
Baccalaureate institutions placed less value on employer surveys (p = .008)

Slide 22: Methods Valued by Position – Chi-Square
Faculty placed relatively less value on 9 of the 12 methods. Statistically significant differences in:
– Departmental exams (p = .006)
– Student papers (p = .001)
– Student portfolios (p = .016)
– Capstone courses (p < .001)
– Commercial exams (p = .046)
– Student interviews/focus groups (p = .023)
Faculty placed more value on anecdotal evidence (p = .132, not statistically significant)

Slide 23: Perspectives on the Effects of the Assessment Mandate: Four Survey Items
1. Our institutional assessment efforts have been effective.
2. Our institutional assessment efforts have identified areas where we need to make curricular/programmatic changes.
3. We have made curricular/programmatic changes as a result of our assessment.
4. It is important that every institution have an assessment plan.
No differences by institution type or control.

Slide 24: Perspective on Effect of Assessment, by Position – ANOVA

Slide 25: Closing Comments
Institution type matters
– Different institutions have different priorities and purposes
Position matters
– Faculty, assessment leaders, and administrators differ on purposes, methods valued, and the ultimate effect of the mandate

Slide 26: Closing Comments (cont.)
Accreditation is an important lever
– Effects of revised expectations
Need to know more
– US higher education post-Spellings Commission
– What is "assessment"?

Slide 27: Different Perspectives on the Assessment Mandate: The Results of a Survey
Neil Pagano, Associate Dean, Columbia College Chicago
npagano@colum.edu

