
1 Three Decades Later: A Day With Peter Ewell
Peter T. Ewell, National Center for Higher Education Management Systems (NCHEMS)
WASC Assessment Institute, June 16, 2011

2 What We Will Discuss Today
- History of Assessment and Some of Its Major Philosophical Dilemmas
- The Policy Environment for Assessment: Who Wants What and Why?
- The Accreditation Connection and Implications for the WASC Region

3 Topic 1: History of Assessment and Some of Its Major Philosophical Dilemmas

4 Foundations and Forerunners of Assessment in the United States
- Scholarly Research on College Student Development and Student Flow (Retention and Graduation)
- Standardized Cognitive Testing: A Uniquely American Tradition
- Evaluation and “Scientific” Management in Higher Education
- Mastery Learning Approaches

5 Origins of the “Assessment Movement” in the 1980s
- Undergraduate Reform Reports of 1985-86
- Internal Stimulus: Call for More Coherent Teaching/Learning Approaches and Information to Improve Them
- External Stimulus: Stakeholder Demands for Information on “Return on Investment”
- Tensions in Motive and Message Ever Since

6 Episodes and Debates
- What to Call It? “Ineffability,” “Measurement,” and “Evidence”
- The “Value-Added” Debate
- The “TQM” Episode
- Goals 2000 and an Attempt at National Assessment
- FERPA and the Federal Student Unit Record System Proposal

7 Why Didn’t Assessment Go Away?
- Pressure to Produce Evidence of Student Learning Outcomes Never Let Up
- By Early 1990s, Accreditors Replace States as Primary External Stimulus to Get Started
- Intermittent Federal Interest in Assessment as an Element of National Accountability
- But Resulting Faculty Ambivalence About a Process Seen as “External” and “Administrative”

8 Looking Back: What’s Been Accomplished?
- Assessment Is for the Most Part Perceived as Inevitable and Legitimate
- Vast Majority of Institutions Have Statements of Learning Outcomes (General and Programmatic) and Most Are “Doing Assessment”
- A “Semi-Profession” of Folks Involved in Assessment, Conferences/Workshops, and an Implementation Literature
- Steadily Growing Sophistication with Respect to Methods of Gathering Evidence

9 Looking Back: What Hasn’t Happened?
- Authentic Integration of Assessment into Faculty Cultures and Behaviors
- Assessment Activities Still Largely “Added On” to the Curriculum Instead of Being Embedded in It
- Systematic and Widespread Use of Assessment Results for Institutional and Curricular Improvement
- National Standards or Frameworks for Benchmarking Student Achievement and Assessment Practices Across Institutions

10 A Basic Dichotomy of Approaches
- Accountability-Based: Assessments Intended to “Check Up” on the System in the Aggregate
- Scholarship and Continuous Improvement: Assessments Intended to Assure Standards and Provide Feedback on Collective Performance for Improvement

11 Some Details of the Dichotomy (Continuous Improvement vs. Accountability)
Strategic dimensions:
- Purpose: Formative (improvement) vs. Summative (judgment)
- Orientation: Internal vs. External
- Motivation: Engagement vs. Compliance
Implementation:
- Instrumentation: Multiple/triangulation vs. Standardized
- Nature of evidence: Quantitative and qualitative vs. Quantitative
- Reference points: Over time, comparative, established goal vs. Comparative or fixed standard
- Communication of results: Multiple internal channels vs. Public communication, media
- Use of results: Multiple feedback loops vs. Reporting
Source: Ewell, Peter T. (2007). Assessment and Accountability in America Today: Background and Context. In Victor M. H. Borden and Gary R. Pike (Eds.), Assessing and Accounting for Student Learning: Beyond the Spellings Commission. San Francisco: Jossey-Bass.

12 A Basic Dichotomy of Approaches
- Accountability-Based: Assessments Intended to “Check Up” on the System in the Aggregate
- Scholarship and Continuous Improvement: Assessments Intended to Assure Standards and Provide Feedback on Collective Performance for Improvement
→ Can Be Applied at Any Level of Analysis

13 A Taxonomy of Approaches to Assessment (Assessment Level × Assessment Focus or Purpose)
Individual level:
- Learning/Teaching (Formative): Instruction (Individual Tests, Portfolios); Placement (Diagnostic Tests, Advanced Placement Tests, Vocational Tests)
- Accountability (Summative): “Gate-keeping” (Rising Junior Exams, Comprehensive Exams, Certification Exams, Capstone Performances)
Group level:
- Learning/Teaching (Formative): Program Enhancement (individual assessment results may be aggregated to serve program evaluation purposes)
- Accountability (Summative): Campus and Program Evaluation (Productivity Reviews, Performance Indicators)

14 A Cross-Cutting Dichotomy
- Exo-Skeletal: Assessments Added onto Instruction as a Distinct Set of Entities
- Embedded: Assessments Built into Instruction through the Regular Process of Grading and Assignments

15 Looking at the Dichotomies and Your Own Assessment Program
- Where Would You Place Your Own Program on the Continuum of Each of the Two Dichotomies?
  - External vs. Internal Focus or Direction
  - Exo-skeletal vs. Embedded Assessment Approaches
- In an Ideal World, Where Would You Like It to Be?
- List a Couple of Steps that You Might Take in the Next Three Months to Move Your Program in the Desired Direction

16 Topic 2: The Policy Environment for Assessment: Who Wants What and Why?

17 Assessment and Public Policy
- The Current Policy Imperative for Higher Education
- The Policy Players: Their Motives and Basic Approaches to Assessment
- Enduring Policy Issues and How They Are Typically Addressed

18 The Policy Imperative for Higher Education
- Global Competitiveness in Collegiate Attainment

19 [Chart] Differences in College Attainment (Associate & Higher) Between Younger & Older Adults, U.S. & OECD Countries, 2008
Source: Organisation for Economic Co-operation and Development (OECD), Education at a Glance 2010

20 The Policy Imperative for Higher Education
- Global Competitiveness in Collegiate Attainment
- The “New Majority” and Demographic Achievement Gaps

21 [Chart] Source: U.S. Census Bureau’s 2005 American Community Survey; OECD

22 The Policy Imperative for Higher Education
- Global Competitiveness in Collegiate Attainment
- The “New Majority” and Demographic Achievement Gaps
- Questionable Levels of Graduate Achievement

23 [Chart]

24 The Policy Imperative for Higher Education
- Global Competitiveness in Collegiate Attainment
- The “New Majority” and Demographic Achievement Gaps
- Questionable Levels of Graduate Achievement
- In an Environment of Continuing Fiscal Strain

25 [Chart] Long-term Federal Debt

26 The Policy Imperative for Higher Education
- Global Competitiveness in Collegiate Attainment
- The “New Majority” and Demographic Achievement Gaps
- Questionable Levels of Graduate Achievement
- In an Environment of Continuing Fiscal Strain
→ These Are Now Urgent, and It’s Not Just Our Conversation Any More

27 Assessment Policy: The Players
- States
- The Federal Government
- Institutional Accrediting Organizations
- “Third-Party” Actors

28 State Policy Approaches to Assessment
- Regulating Student Flow (SD “Rising Junior”)
- Assessing Institutional Performance (WV CLA)
- Performance Funding (TN Schedule)
- “Institution-Centered” Mandate (VA)
- Aligning Standards (“Tuning USA”)
- Technical Assistance (WV Accreditation Review)
- Assessing “Educational Capital”

29 Federal Interest in Assessment
- Centered on Institutional “Stewardship” of Financial Aid Funds
- Action Mostly Indirect Through Accreditation
- Focus on Institutional Reporting for Consumer Information and Protection
- Occasional Interest in National Testing
  - Goals 2000
  - “NAEP for College”

30 The Spellings Commission and Its Accountability Aftermath
- Buildup: The Reports of 2005-2007
- The Assault on Accreditation Around Learning Outcomes
  - Spellings Testimony and Discussion
  - “NegReg” and NACIQI
- The Academy Responds
  - The VSA and Its Cousins
  - The “New Leadership Alliance for Student Learning and Accountability”

31 Some “Third-Party” Actors in the Quality Review Arena
- U.S. News and World Report
- Measuring Up and the Pew Forum on Undergraduate Learning
- National Survey of Student Engagement (NSSE) and Community College Survey of Student Engagement (CCSSE)
- National Student Clearinghouse
- ETS Culture of Evidence Reviews

32 Enduring Issues of Assessment Policy
- Balancing Accountability and Improvement
- Standardized or Non-Comparable Measures
- Absolute Achievement or “Value-Added”
- Balancing Incentives and Consequences
- Accounting for Differences in Context that May Affect Performance
- Nature and Extent of Public Reporting

33 Looking at Your Own Approach in the Light of Two of These Issues…
- Do You Look at “Value Added”? If You Do, Why Do You Do It? If You Don’t, Should You?
- In Doing Program-Level Assessment, How Do You Handle Differing Circumstances Across Departments (e.g. Size, Student Preparation Levels, Demographics, etc.)?

34 Topic 3: The Accreditation Connection and Implications for the WASC Region

35 Accreditation and Assessment
- The Evolution of Accreditation: Regional and Specialized
- Taking Stock of Accreditation: Strengths and Challenges
- Recent Changes in Accreditation
- Potential Implications of These Changes for Assessment Practice and How to Address Them

36 Evolution of Regional Accreditation
- Original Stimulus: “What Is a College?”
- Original Focus on Resources and Processes Judged by Peer Evaluators
- Mission-Centered Standards and Review (the “Golden Age”)
- Second GI Bill and the Adoption of the Federal “Gatekeeper” Role
- Mandatory Federal Focus on Student Learning Outcomes

37 Evolution of Specialized Accreditation
- Original Stimulus: Flexner Report and the Rise of Professional Licensure and Identity
- Steady Proliferation in Numbers to the Current Total of 61 Specialized Accreditors
- Historically More Attention to Student Academic Performance than Regionals, Usually Through Performance on Licensure Examinations
- Specific Attributes of a Graduate to Be Assessed (e.g. ABET and AACSB International)

38 Taking Stock of Regional Accreditation
Strengths:
- Widely Accepted “Signal” of Quality
- Opportunity for Self-Improvement
- Sharing Practices Through Mutual Visitation
Challenges:
- Providing Information to the Public
- Consistency Across Reviews
- “All or Nothing” Outcomes
- Perceived Inefficiency and Institutional Burden

39 Accreditation and Assessment: The Current Situation with the Regionals
- Steady Increase in Prominence of Assessment, But Reluctance to Actually Sanction Institutions
- Institutions Free to Choose Learning Goals and Ways to Gather Assessment Evidence
- Focus on the Assessment Process, Not the Actual Results of Assessment
- Focus on Institutional Transparency in Reporting the Results of Assessment
- Trying Constantly to Increase Institutional Capacity

40 Recent Changes in Accreditation Practice (Which WASC Pioneered)
Forces Driving Change in Accreditation:
- Pressure from the DOE and Congress
- Demands from Institutions to “Add Value”
- Need to Respond to New Ways of Teaching
Resulting “New Looks” in Accreditation Practice:
- Focus on Outcomes and Explicit Standards of Performance
- Presenting Evidence [e.g. “Institutional Portfolios”]
- Review Approaches [e.g. “Academic Audits”]

41 Potential New Developments for Assessment in the Accreditation Context
- Focus Review on the Quality of Actual Learning Outcomes, Not Just the Adequacy of the Assessment Process
- Look for External Benchmarks and Points of Comparison for Assessment Results: How Do You Know They Are Good Enough?
- Focus More on the Use and Application of Assessment Results
- Find Better Ways to Communicate Assessment Results (and Improvements) to Stakeholders

42 Focus on Assessment Results
- What Are Particular Areas of Strength or Weakness?
  - Across Particular Student Populations?
  - Across Particular Dimensions of Performance?
- Connect Datasets to See What Experiences Drive Particular Outcomes (e.g. NSSE to CLA, Experience Inventory to Portfolio, etc.); a minimal linking sketch follows this list
- Be Prepared for a Widely Participatory Conversation About These Things
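The dataset-linking point above can be illustrated with a minimal sketch, assuming two hypothetical extracts (nsse_extract.csv and cla_extract.csv) that share a common student identifier; the file names and the columns student_id, engagement_score, and cla_score are placeholders rather than actual NSSE or CLA file layouts.

    # Minimal sketch: join an engagement extract to an outcomes extract on a
    # shared student ID so experiences can be related to results.
    # File and column names are hypothetical placeholders.
    import pandas as pd

    nsse = pd.read_csv("nsse_extract.csv")  # hypothetical NSSE institutional extract
    cla = pd.read_csv("cla_extract.csv")    # hypothetical CLA score file

    # Inner join keeps only students who appear in both extracts.
    linked = nsse.merge(cla, on="student_id", how="inner")

    # First-pass look at whether the engagement measure tracks the outcome measure.
    print(linked[["engagement_score", "cla_score"]].corr())

In practice the same join-then-compare pattern applies to any pairing mentioned on the slide (survey to test score, experience inventory to portfolio rating), provided a common student identifier links the records.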

43 Types of External Benchmarks
- Nationally Normed Assessments and Surveys (and Remember that Peer Comparisons Are Best)
- Industry Standards on Licensure Examinations
- Consortia of Institutions with Similar Assessment Processes (e.g. Using Electronic Portfolios or the AAC&U VALUE Rubrics)
- External Reviews of Assessments and Assessment Results (e.g. External Examiners, Program Reviewers)

44 Using Assessment Results
- Expectation Exercises
- Package Results Around Real Problems
- Link Assessment to Regularly Occurring Processes and Decisions (e.g. Program Review, Budget Hearings)
- Create Collective Opportunities to Review Assessment Results and Reflect on What They Mean
- Avoid the “Perfect Data Fallacy”

45 Communicating Assessment Results
- Websites and “Institutional Portfolios” [NILOA Website Review Tool]
- Dashboards and Interactive Data Sites
- Present Proposed Solutions and Improvements Together with Assessment Results
- Create a Format that Shows Long-Term Trends and Is Consistent Over Time
- Keep It Simple

46 Looking at Your Own Assessment Program in the Light of Accreditation
- How Does What WASC Requires with Respect to Assessment Compare with What Other Regions Require? [Draw on Your Homework]
- What Are the Implications for Your Own Assessment Program?
- How Far Should WASC Move Along These Possible Paths? How Far Can It?

