
1 NHU Assessment Workshop. Barbara Wright, Associate Director, WASC. August 22, 2008, San Jose, CA

2 Our Roadmap: Meaning & goals of assessment ( minutes); Best practices (30-45 minutes); Break (15 minutes); Challenges of scoring with rubrics (30 minutes); Closing the loop (30-45 minutes)

3 I. Meaning and Goals

4 The Assessment Loop: 1. Goals, questions; 2. Gathering evidence; 3. Interpretation; 4. Use

5 So what is assessment? A systematic process of 1) setting goals for or asking questions about student learning, 2) gathering evidence, 3) interpreting it, and 4) using it to improve the effects of college on students’ learning and development.

6 Other (subordinate) steps in the assessment process: planning; mapping goals onto curriculum; adding outcomes to syllabi; offering faculty development; reporting; communicating; adding assessment to program review; assessing the assessment.

7 Mapping outcomes onto curriculum and pedagogy can reveal: where the skill is taught; how it is taught; how consistently it is reinforced; where there are intervention points. (But don’t obsess over syllabi or course descriptions. Ultimately, they’re just inputs, not outcomes.)

8 More definitions: inputs, outputs; measurement, evidence, documentation; direct, indirect; cross-sectional, longitudinal; reliability, validity; authentic, academic; grades, standards; outcomes, baselines; accountability, improvement; summative, formative; assessment, evaluation; criteria, rubrics; pump, filter.

9 Our understanding of assessment has evolved. From: isolated facts, skills; memorization, reproduction; comparing performance against other students. To: a full range of knowledge, skills, dispositions; problem solving, investigating, reasoning, applying, communicating; comparing performance to established criteria.

10 Shifts in assessment, cont. From: scoring right or wrong answers; a single way to demonstrate knowledge, e.g., a multiple-choice or short-answer test; simplified evidence. To: looking at the whole reasoning process; multiple methods and opportunities, e.g., open-ended tasks, projects, observations; complex evidence.

11 Shifts in assessment, cont. From: a secret, exclusive, and fixed process; reporting only group means, normed scores; scientific; a filter. To: an open, public, and participatory process; disaggregation, analysis, feedback; educative; a pump.

12 Shifts in assessment, cont. From: “teacher-proof” assessment; students as objects of measurement; episodic, conclusive; reliability. To: respect and support for faculty judgment; students as participants and beneficiaries of feedback; continual, integrative, developmental; validity.

13 II. Best Practices

14 Some typical outcomes... Traditional, e.g.: communication; critical thinking; quantitative analysis; knowledge in a range of disciplines.

15 More outcomes... Newer, e.g.: inquiry and synthesizing skills; ability to work in diverse teams; information literacy; lifelong learning; self-assessment; civic engagement.

16 How can we possibly measure these things? We: look for qualitative as well as quantitative evidence; balance reliability with validity; align our methods with our outcomes; remember the whole cycle and the role of human judgment; come as close as we can; do no harm.

17 Choice of assessment method matters. Students value and learn what we teach and test. How we teach and test matters as much as what we teach and test. What and how we assess also matters. We get more of what we test or assess, less of what we don’t.

18 Descriptive data include: retention, graduation rates, time to degree; percentage going to grad school; percentage employed in their field of study; SAT scores, class rank of incoming students; percentage completing a capstone, internship, semester abroad, etc. Problem: information about learning is inferential at best.

19 Indirect methods include: surveys; interviews; focus groups; inventories; “ethnographic research.” Problem: these are reports about perceptions of learning, not evidence of learning itself.

20 Direct methods include: portfolios; capstones; performances; common assignments, secondary readings; course management programs; local tests, comps in the major; commercial tests; student self-assessment. Advantage: you’re looking directly at evidence of what students know and can do.

21 Good assessment, like good teaching and learning, avoids the disconnects: across-the-curriculum skills vs. isolated course assignments; higher-order thinking vs. a focus on teaching and testing of facts; intellectual curiosity and questions vs. acceptance and reproduction of answers.

22 The disconnects, cont.: complexity and nuance vs. “right” answers, or total relativism; going beyond the textbook and the formula vs. fear of overstepping bounds; confidence, courage, and joy in learning vs. insecurity, risk-aversion, and anger.

23 Higher-order thinking (adapted from L. Resnick, 1987): It’s nonalgorithmic, i.e., the path of action is not fully specified in advance. It’s complex, i.e., the total path is not “visible” from any single vantage point. It often yields multiple solutions, each with costs and benefits. It requires nuanced judgment and interpretation. It involves application of multiple criteria, which may conflict with one another.

24 Higher-order thinking, cont.: It often involves uncertainty; not everything about the task is known or can be. It requires self-regulation; someone else is not giving directions. It involves making meaning, discerning patterns in apparent disorder. It is effortful: the elaborations and judgments required entail considerable mental work and are likely to take time.

25 Bloom’s taxonomy of cognitive processes (1956): Knowledge – recall of information (tell, list, locate, name). Comprehension – understanding of information (explain, interpret, distinguish, restate, compare). Application – use of knowledge, rule, or method in a new situation (solve, construct, complete, classify).

26 Bloom’s taxonomy, cont. Analysis – breaking information into components to understand and explore relationships (compare/contrast, distinguish, separate, classify). Synthesis – putting ideas together in a new or unique way (create, invent, compose, predict, design, imagine, devise, formulate). Evaluation – judging the value of products or ideas based on criteria (judge, justify, verify, debate, recommend, prioritize, rate).

27 Stages of Intellectual Development (William Perry, 1970): Dualism (good vs. bad, right vs. wrong, us vs. them; knowledge and agency reside in external authority). Multiplicity (acceptance of diverse opinions, but without pattern or system; “everyone has a right to their own opinion”). Relativism (opinions and values rest on evidence and logic; some things are unknowable, and people may reasonably disagree; knowledge depends on context). Commitment (choices and decisions made in awareness of relativism but based on an internalized, coherent set of values; agency resides within the person).

28 “Surface” vs. “deep” learning (adapted from Noel Entwistle, 2000; see also John Biggs). Surface: unrelated bits of knowledge; memorization, formulas; difficulty “making sense.” Deep: relationships, connections; patterns, principles, active integration; logic, evidence, conclusions.

29 “Surface” vs. “deep” learning, cont. Surface: study without reflection or strategy; little meaning or value in course and tasks; feelings of pressure, worry, anger. Deep: understanding, metacognition; active interest, engagement; pleasure, satisfaction.

30 Methods for complex outcomes: are open-ended; focus on essentials, principles; pose authentic, engaging tasks; require meaning-making, judgment; require active expression; are scored for understanding, not just regurgitation; show development over time.

31 Methods for complex outcomes: portfolios; capstones; performances; common assignments, secondary readings; course management programs; local tests; student self-assessment.

32 III. Challenges of Scoring with Rubrics

33 For analysis and interpretation of evidence, you need: outcomes and standards; rubrics or another rating tool; an inclusive community of judgment; time for the task; a structure and institution-wide expectation that this task will be done.

34 Four dimensions of learning: what students learn (cognitive as well as affective, social, civic, professional, spiritual, and other dimensions); how well (thoroughness, complexity, subtlety, agility, transferability); what happens over time (cumulative, developmental effects); is this good enough? (the ED question).

35 Thinking about standards... Absolute standards: the knowledge/skill level of champions, award winners, top experts. Contextual standards: appropriate expectations for, e.g., a 10-year-old, a college student, an experienced professional. Developmental standards: amount of growth or progress over time, e.g., 2 years of college, 5 years. (Institutional, regional, national standards?)

36 Next steps: unpack goals and define subgoals and performance indicators (“criteria”); describe skill levels (“rubrics”); define appropriate standards; define appropriate development; be inductive as well as deductive; train raters, run a pilot, then expand; keep refining.

37 What does inter-rater reliability training look like? Develop a draft rubric; select student work of varying quality; gather the raters; have raters use the rubric to rate, then explain their ratings; establish ground rules (e.g., no inferences); select “anchor papers”; re-calibrate as needed; revise the rubric as needed.
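(Aside, not from the deck: the workshop does not prescribe a statistic for checking calibration, but two common choices are simple percent agreement and Cohen’s kappa. The sketch below shows the arithmetic in plain Python; the 1-4 rubric scale and the two raters’ scores are entirely hypothetical.)

  from collections import Counter

  def percent_agreement(a, b):
      # Share of pieces of student work where the two raters gave the same score.
      return sum(x == y for x, y in zip(a, b)) / len(a)

  def cohens_kappa(a, b):
      # Agreement corrected for the agreement expected by chance alone.
      n = len(a)
      observed = percent_agreement(a, b)
      ca, cb = Counter(a), Counter(b)
      expected = sum(ca[s] * cb[s] for s in set(a) | set(b)) / (n * n)
      return (observed - expected) / (1 - expected)

  # Hypothetical scores on a 1-4 rubric from two raters scoring the same ten anchor papers.
  rater_1 = [3, 2, 4, 3, 1, 2, 3, 4, 2, 3]
  rater_2 = [3, 2, 3, 3, 1, 2, 4, 4, 2, 3]

  print(f"percent agreement: {percent_agreement(rater_1, rater_2):.2f}")  # 0.80
  print(f"Cohen's kappa:     {cohens_kappa(rater_1, rater_2):.2f}")       # 0.71

If agreement stays low after discussion of the anchor papers, that is usually a signal to tighten the rubric language rather than to keep re-scoring.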

38 IV. Closing the Loop

39 What is assessment? A systematic process of 1) setting goals for or asking questions about student learning, 2) gathering evidence, 3) interpreting it, and 4) using it to improve the effects of college on students’ learning and development. In other words, it’s the whole process, not just steps 1 and 2.

40 Using findings requires: choice of methods (step 2) that are likely to lead to actionable information; an inclusive “community of judgment” that analyzes and interprets the data (step 3); a judgment about whether learning results are “good enough” or require improvement; agreement on actions to follow, with necessary support and resources; action (“consequential validity”) and oversight (step 4); communication and synergy throughout.

41 Four dimensions of learning: what students learn (cognitive as well as affective, social, civic, professional, spiritual, and other dimensions); how well (thoroughness, complexity, subtlety, agility, transferability); what happens over time (cumulative, developmental effects); is this good enough? (the ED question).

42 Getting to “good enough”: program/department judgments; benchmarking; collaboration and standard-setting with peer institutions; input from advisory boards, professional associations; use of external examiners.

43 Planning, step 1: defining goals. Do they include what’s important, not just what’s easy to measure, e.g., knowledge, skills, values and dispositions? Are they formulated in two or three dimensions: what, how well, what happens over time? And do they lead to compelling questions?

44 Step 2: choosing methods... Are they adequate to complex outcomes? Are some or all local? Do they have content validity? Are they cumulative, integrative? Are they cost-effective? Will the evidence answer those compelling questions? Do they have consequential validity?

45 Step 3: interpretation, or making meaning out of data... Is it included as an explicit step? Is the “community of interpretation” inclusive? Is it characterized by courage and candor? Is this a process of inquiry and understanding, not blaming?

46 Interpretation, cont.: What did we find? What do the findings mean? Who can shed light on them? What do we want to do about them? Is the plan feasible? What will it cost? Will it do no harm?

47 Step 4: using findings... Does the notion of “use” go beyond collecting data, writing reports, making recommendations, planning for changes, and refining the assessment process? And does it “close the loop,” i.e., implement recommended changes and revisit to determine the effect of those changes?

48 Using findings, cont.: do we have credible recommendations? Communication and broad buy-in (faculty, administrators, students)? Budget? Mechanisms? Oversight? A schedule for revisiting, then moving on? Record-keeping?

49 Benefits to students: clarity about what’s expected; accurate feedback on their own performance; greater engagement, commitment; enhanced confidence; enhanced retention, completion.

50 Benefits to faculty: greater collegiality, support; enhanced efficacy, satisfaction; shared responsibility for student learning and success; stronger programs; meaningful scholarship.

51 Benefits to the institution: great narratives for PR; happier employers; happier alums; more successful fundraising and grant-getting; successful reaccreditation.

