Presentation on theme: "The Bigger Picture: The Next Level of Assessment Practice" -- Presentation transcript:

1 The Bigger Picture: The Next Level of Assessment Practice
Barbara D. Wright, Associate Director, Western Association of Schools and Colleges
bwright@wascsenior.org
March 2, 2007, AAC&U Conference, Miami, FL

2 Our roadmap...
- What are your goals for this workshop? Questions?
- Some key points about assessment
- The hierarchy of specificity: an exercise
- Break
- Supporting structures
- Case studies
- Your plans
- Wrap-up, workshop evaluation

3 The Assessment Loop
- Goals, questions
- Gathering evidence
- Interpretation
- Use

4 What exactly is assessment? It's a systematic process of 1) setting goals for or asking questions about student learning, 2) gathering evidence, 3) interpreting it, and 4) using it to improve the effects of college on students' learning and development – at any level of analysis, from the individual student to the course, program, or institution.

5 Other (subordinate) steps in the assessment process...
- Planning
- Mapping goals onto curriculum
- Adding outcomes to syllabi
- Offering faculty development
- Reporting
- Communicating
- Adding assessment to program review
- Assessing the assessment

6 Mapping outcomes onto curriculum and pedagogy -- it can reveal...
- where the skill is taught
- how it is taught
- how consistently it is reinforced
- where there are intervention points
(But don't obsess on syllabi or course descriptions. Ultimately, they're just inputs, not outcomes.)

7 A faculty lament... "We already test and assign grades. We flunk the ones who don't measure up. Why do we have to do assessment?"

8 Testing and Grading vs. Assessment
Testing and grading:
- Evaluation first, feedback second
- Quality assurance
- Individuals
- Private
- Follow-up random, serendipitous
- Follow-up not supported
- Focus on the student, the course
Assessment:
- Feedback first, evaluation second
- Quality improvement
- Samples
- Collective, collegial
- Follow-up systematic, expected
- Follow-up supported, rewarded
- Focus on the program, the institution

9 Levels of Assessment
- Individual student learning within courses
- Individual student learning across courses
- Courses
- Programs
- The institution
From: Levels of Assessment, Miller and Leskes, AAC&U, 2005

10 From Levels of Assessment, Miller and Leskes, AAC&U, 2005: "Evidence of student learning should be used for multiple levels of assessment … The best evidence … comes from direct observation of student work rather than from an input inventory (e.g., list of courses completed) or summary of self reports … Course-embedded assignments provide the most valid evidence for all levels of analysis."

11 From Levels of Assessment, Miller and Leskes, AAC&U, 2005: "The ways of sampling, aggregating, and grouping the evidence for analysis … depend on the original questions posed. The questions will also determine how the data are interpreted to produce action."
Sample questions at the institutional level: Just how information-literate are our graduates? Do they make steady progress throughout their college career?

12 From Levels of Assessment, Miller and Leskes, AAC&U, 2005: "Faculty members and staff accomplish aggregation by describing standards, translating them into consistent scoring scales, and anonymously applying the resulting rubrics to the evidence at hand. Such a process does not assign a grade to an individual student but rather attempts to understand better the learning process and how to improve."
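
To make the aggregation idea concrete, here is a minimal sketch in Python, using an invented information-literacy rubric and made-up scores, of how anonymized ratings on sampled student work might be rolled up into a program-level summary rather than individual grades. The criterion names, scale, and benchmark are illustrative assumptions, not part of the original presentation.

```python
from statistics import mean

# Hypothetical 4-point rubric criteria for an information-literacy outcome.
CRITERIA = ["frames the question", "evaluates sources",
            "integrates evidence", "cites ethically"]

# Anonymized ratings: each dict is one sampled piece of student work,
# scored 1-4 on every criterion by faculty readers (values are made up).
ratings = [
    {"frames the question": 3, "evaluates sources": 2, "integrates evidence": 3, "cites ethically": 4},
    {"frames the question": 4, "evaluates sources": 3, "integrates evidence": 2, "cites ethically": 3},
    {"frames the question": 2, "evaluates sources": 2, "integrates evidence": 3, "cites ethically": 3},
]

def aggregate(ratings, benchmark=3.0):
    """Summarize rubric scores per criterion for program-level interpretation.

    Reports the mean score and the share of sampled work at or above a
    benchmark level; no individual student is identified or graded.
    """
    summary = {}
    for criterion in CRITERIA:
        scores = [r[criterion] for r in ratings]
        summary[criterion] = {
            "mean": round(mean(scores), 2),
            "pct_at_benchmark": round(100 * sum(s >= benchmark for s in scores) / len(scores)),
        }
    return summary

for criterion, stats in aggregate(ratings).items():
    print(f"{criterion}: mean {stats['mean']}, {stats['pct_at_benchmark']}% at benchmark")
```

Findings of this kind are what the later slides hand to an inclusive community of interpretation, which decides what, if anything, to change.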

13 Some complex learning goals --
- Communication
- Critical thinking
- Information literacy
- Quantitative problem-solving
- Team and leadership skills
- Intercultural competence
- Ability to transfer knowledge, skills
- Exercise of civic, social responsibility

14 Methods for complex outcomes...
- are open-ended
- pose authentic, compelling tasks
- stimulate student engagement, creativity
- require integration of knowledge, skills, dispositions
- demonstrate cumulative learning
- are educative for students and educators alike
- provide meaningful info for improvement

15 Methods for complex outcomes … at any level of analysis --
- Portfolios
- Capstones
- Performances
- Common assignments
- Secondary readings
- Course management programs
- Local tests
- Student self-assessment

16 Four dimensions of learning --
- What students learn (cognitive learning, skills, dispositions)
- How well? (thoroughness, complexity, subtlety, agility, transferability)
- What happens over time? (cumulative, developmental effects)
- Is this good enough? (federal concern)

17 The rubric – it defines what we're looking for
- offers a set of scoring guidelines
- tells where to look
- tells what to look for (criteria)
- provides descriptors of each level of quality
In other words, it's a tool for determining what and how well.

18 The hierarchy of specificity
- Institution-wide goals
- College-wide goals
- Department- & program-wide goals
- Course-level goals

19 The hierarchy of specificity
- Oral & written communication
- Professional communication
- Ability to write for business
- Ability to write a business plan

20 Now you try it --
- Communication
- Critical thinking
- Information literacy
- Quantitative problem-solving
- Team and leadership skills
- Intercultural competence
- Ability to transfer knowledge, skills
- Exercise of civic, social responsibility
- ?

21 Think horizontally as well as vertically...
- Oral & written communication
- Professional communication
- Ability to write for business
- Ability to write a business plan
Internship * Student government * Business courses * Gen Ed

22 Now you try it... across the college experience
- Communication
- Critical thinking
- Information literacy
- Quantitative problem-solving
- Team and leadership skills
- Intercultural competence
- Ability to transfer knowledge, skills
- Exercise of civic, social responsibility
- ?

23 Remember – when you've got data, you're only halfway there
- Data are not information; information is not knowledge
- An inclusive community of interpretation is needed to make meaning of the findings
- Interpretation should lead to plans for improvement, shared commitment
- Reports, plans, and recommendations do not equal action
- Action requires resources
- Faculty can't do it all on their own

24 Who needs to be involved?
- Faculty
- Student affairs, library, academic support, IR, director of TLC, etc.
- Students
- Chairs, deans, VPAA
- Advisory board members, external experts, faculty
- ?

25 Institutionalizing assessment – 2 aspects:
- The PLAN for assessment (i.e., shared definition, purpose, values, vocabulary, communication, use of findings)
- The STRUCTURES and RESOURCES that make the plan doable

26 Three alternatives for institutionalizing assessment: Centralized
Positives:
- Ability to focus on desired priorities, level of analysis
- Administrative control
- Efficiency
- Economies of scale
Negatives:
- Central administration grows
- There are administrative, PR costs
- Connection to grassroots lacking
- Faculty are alienated or disinterested: "Assessment is an administrator's job"

27 Three alternatives for institutionalizing assessment: Decentralized
Positives:
- Minimal additional costs
- Close to classroom, sites where learning occurs
- High faculty ownership, involvement
- Discipline-appropriate approaches
Negatives:
- Perception: "Sure, just dump it on the faculty"
- Communication challenging
- Duplication, inefficiency are likely
- There are opportunity costs
- There is no uniform approach
- Higher levels of analysis problematic

28 Three alternatives for introducing and institutionalizing assessment: A middle course
Identify elements that can be handled at the institutional level, e.g.
- Overall coordination
- Data storage, reporting (IR)
- Articulation of policy, parameters, levels
- Implementation (e.g., budget for software, consultants, conferences)
- Provision of rewards (individual, program)
- Communication (faculty, publics, accreditors, state, etc.)

29 Three alternatives for introducing and institutionalizing assessment: A middle course
Delegate elements best handled on the ground in individual programs, e.g.
- Definition of course-level outcomes
- Development of methods
- Interpretation of findings
- Recommendations for improvement
- Reflection on what has been learned
- Contributions to higher-level analysis

30 Assessment at higher levels requires --
- A formal structure
- Assessment acknowledged in policy, contracts, other documents
- Dedicated resources: budget, personnel, location, accountability, etc.

31 How to institutionalize --
- Make assessment a freestanding function
- Attach to an existing function, e.g.:
  - Accreditation
  - Academic program review
  - Annual reporting process
  - Center for Teaching Excellence
  - Institutional Research

32 Make assessment freestanding -- Positives and Negatives
Positives:
- Maximum flexibility
- Minimum threat, upset
- A way to start
Negatives:
- Little impact
- Little sustainability
- Requires formalization eventually, e.g., Office of Assessment
- May be difficult to connect to higher levels of analysis

33 Attach to accreditation -- Positives and Negatives
Positives:
- Maximum motivation
- Likely compliance
- Resources available
- Staff, faculty assigned
- Clear cause/effect
- May be easier to connect
Negatives:
- Resentment of external pressure
- Us/them dynamic
- Episodic, not ongoing
- Reporting, gaming, not improving
- Little faculty involvement
- Little connection to the classroom, learning
- Main focus: inputs

34 Attach to Center for Teaching Excellence -- Positives and Negatives
Positives:
- Strong impact possible
- Ongoing
- Close connection to faculty, classroom, learning
- Maximum responsiveness to the use phase
Negatives:
- Impact depends on how broadly assessment is done
- No enforcement
- Little/no reporting, communicating
- Rewards, recognition vary, may be lip service

35 Attach to program review -- Positives and Negatives
Positives:
- Some impact (depending on stakes)
- Some compliance
- Some resources available
- Staff, faculty assigned
- Cause/effect varies
Negatives:
- Impact depends on how well PR is done
- Episodic, not ongoing
- Inputs, not outcomes
- Reporting, not improving
- Generally low faculty involvement
- Weak connection to the classroom, learning

36 Attach to annual report -- Positives and Negatives
Positives:
- Some impact (depending on stakes)
- Ongoing
- Some compliance
- Habit, expectation
- Closer connection to classroom, learning
- Cause/effect possible
- Allows flexibility
Negatives:
- Impact depends on how seriously, how well AR is done
- No resources
- Reporting, not improving, unless specified
- Chair writes; faculty involvement varies

37 How can we increase weighting of learning & assessment in PR? E.g., from:
- Optional part
- One small part of total PR process
- Assessment vague, left to program
- Various PR elements of equal value (or no value indicated)
- Little faculty involvement
to:
- Required
- Core of the process (emphasized in instructions)
- Assessment expectations defined
- Points assigned to PR elements; student learning gets 50% or more
- Broad involvement

38 Don't confuse program-level assessment and program review
- Program-level assessment means we look at learning at the program level (not the individual student or course level) and ask what the key learning experiences of a program add up to.
- Program review looks for program-level assessment of student learning but goes beyond it, also examining other components of the program, e.g., mission, faculty, facilities, demand, etc.

39 New trends for PR/assessment (cf. WASC accreditation process)
- Create a program portfolio
- Keep program data continuously updated
- Do assessment on an annual cycle
- Enter assessment findings, uses, by semester or annually
- For periodic PR, review the portfolio and write a reflective essay on student AND faculty learning

40 Budget items
- Released time, support staff time
- Equipment, supplies (e.g., purchased instruments, software, computers, servers, books, photocopying)
- Development (e.g., consultants, workshops)
- Incentives and rewards (e.g., faculty mini-grants, travel, stipends, merit)
- Communications

41 So what's your plan?
- What learning goal is a priority for higher-level analysis?
- What's your question?
- What's already in place?
- What do you need?
- Who needs to be involved?
- What's the timeline?

