
Building Rubrics For Large-scale, Campus-wide Assessment Afternoon Session Thomas W. Zane Diane L. Johnson


1 Building Rubrics For Large-scale, Campus-wide Assessment Afternoon Session Thomas W. Zane (tom.zane@slcc.edu), Diane L. Johnson (djohnson@new.edu), Jodi Robison (jrobison@new.edu)

2 Afternoon Session Agenda 1. Build a critical thinking rubric designed for adjuncts who teach a GE course. 2. Build a written communications literacy rubric designed for a group of senior portfolio reviewers. 3. Build a rubric based on the generic criteria method. 4. Q&A and Wrap-up.

3 Afternoon Session Assumptions You have been asked to take the lead on a large-scale assessment project at your institution. You face two big issues: What are the steps for building the scoring rubrics? How could you guide faculty through the process?

4 Case #1: Critical Thinking Our institution has decided to measure critical thinking across the curriculum. Together we form the committee charged with building a way to measure this college-wide outcome.

5 Critical Thinking Rubric – Early Decisions Purpose – Measure critical thinking ability based on student submissions in MANY types of courses. Target of Measurement – Ability level as perceived from written submissions. Define Critical Thinking – Critical thinking is the conscious and deliberate use of thinking skills and strategies to guide what to think, believe, or do.

6 Critical Thinking Rubric – Search for Criteria 1. Found thousands of articles and rubrics. 2. Identified primary sources. 3. Collected criteria. 4. Defined the criteria.

7 Critical Thinking – Define the Criteria
1. Interpretation – The primary definition of interpretation is the act of making sense of various inputs. Interpretation requires that we clarify the purpose, issue, problem/question, meaning, etc.
2. Analysis – Analysis means to break down, examine, or otherwise explore the issues, available information, arguments, etc. With analysis, we must manipulate, process, or otherwise make active changes to the inputs to make better sense of them.
3. Evaluation – To evaluate means to determine the merit, value, efficacy, advantages, worth, authenticity, validity, impact, or significance of something (e.g., the evidence, claims, assumptions, biases, perspectives, etc.).
4. Inference – This broad term covers reasoning coupled with the use of evidence and standards that together are necessary for synthesizing, coming to a conclusion, making decisions, identifying alternatives, generalizing, planning, predicting, etc.
5. Explanation (Communication) – Communicate the outcomes of thinking, such as stating results, justifying procedures, explaining meaning, presenting arguments, etc. This is considered critical thinking because of the mental processes involved in designing a well-written (or spoken) message.
6. Self-regulation (Metacognition) – During all of the above (and sometimes following the thinking as well), reflect, self-examine, pose questions about thinking, self-correct, etc.

8 Critical Thinking Rubric – Design the Scale CT is a human ability that increases in cognitive demand, so build a scale that reflects this! Four points (standard for our online systems). Define the scale points.

9 Scale Definitions Determine what each score level means to you. Define the level as best you can.

10 Critical Thinking – Select the Specific Aspects for Each Row Each of the six areas of critical thinking is still too broad. We need to break them down into smaller aspects. We went back to the Internet to find rows in rubrics that fit our definitions.

11 Created a sample of rows for faculty to use. Used the same six categories to allow for aggregation of data across the campus. Split each of the six into different sorts of approaches to fit various disciplines. Promised to come to the department if a row could not be found. (None have requested this.)
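Because every department's rows map back to the same six categories, scores can be rolled up campus-wide even though the row wording differs by discipline. Here is a minimal sketch of that aggregation in Python; the course names, scores, and function are hypothetical illustrations, not the presenters' actual system:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical 1-4 scores for two submissions from different departments.
# The row wording differs by course, but every row maps to one of the six
# shared critical thinking categories, so the scores can be combined.
scored_submissions = [
    {"course": "BIOL 1010", "Interpretation": 3, "Analysis": 2, "Evaluation": 3,
     "Inference": 2, "Explanation": 3, "Self-regulation": 2},
    {"course": "HIST 2700", "Interpretation": 4, "Analysis": 3, "Evaluation": 2,
     "Inference": 3, "Explanation": 4, "Self-regulation": 3},
]

CATEGORIES = ["Interpretation", "Analysis", "Evaluation",
              "Inference", "Explanation", "Self-regulation"]

def campus_wide_averages(submissions):
    """Average each shared category across all submissions, campus-wide."""
    by_category = defaultdict(list)
    for sub in submissions:
        for cat in CATEGORIES:
            if cat in sub:
                by_category[cat].append(sub[cat])
    return {cat: round(mean(scores), 2) for cat, scores in by_category.items()}

print(campus_wide_averages(scored_submissions))
# e.g. {'Interpretation': 3.5, 'Analysis': 2.5, ...}
```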

12 Sample (starting point) Rows
Classification – (1) Lists information. (2) Incorporates information. (3) Classifies information. (4) Examines reasoning for information classifications.
Consider Biases – (1) Does not detect any biases. (2) Identifies biases. (3) Contests biases. (4) Overcomes biases.
Decision Making Based On Support – (1) Does not make a decision. (2) Makes a decision but does not provide backing. (3) Makes and supports a good decision. (4) Justifies why a given decision is the best.
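For readers who keep rubrics electronically, a starting-point row like those above can be represented as a small data structure: a criterion name plus one descriptor per scale point. This is only an illustrative sketch; the class and method names are ours, not from the workshop materials:

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class RubricRow:
    """One rubric row: a criterion plus one descriptor per scale point (1-4)."""
    criterion: str
    descriptors: Tuple[str, str, str, str]

    def descriptor_for(self, score: int) -> str:
        """Return the feedback text that matches a 1-4 score."""
        if not 1 <= score <= 4:
            raise ValueError("Scores on this scale run from 1 to 4.")
        return self.descriptors[score - 1]

# The "Consider Biases" starting-point row from the sample above.
consider_biases = RubricRow(
    criterion="Consider Biases",
    descriptors=("Does not detect any biases.", "Identifies biases.",
                 "Contests biases.", "Overcomes biases."),
)

print(consider_biases.descriptor_for(3))  # -> Contests biases.
```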

13 Build Your Rubric Now assume you are a department assessment coordinator. Assume our signature assignment is to write a paper about off-roading equipment. (The assignment is in the workbook.) Review Appendix F to select a row for each of the six categories.

14 Localize the Descriptors Add assignment-specific language to the rubric descriptors to support more natural feedback to students.
Original example – Clarifying Questions: (1) Does not ask questions. (2) Identifies some questions. (3) Asks good questions. (4) Analyzes insightful questions.
Adapted version – Clarifying Questions: (1) Does not ask questions about the budget problem. (2) Identifies some basic common or obvious questions about the budget problem. (3) Asks relevant questions that guide further research into the budget problem. (4) Analyzes insightful questions showing a deep understanding of how the questions can guide the research.
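One way to localize descriptors without losing the shared generic row is to treat the original wording as a template and substitute the assignment context into it. The placeholder convention below is a hypothetical illustration, not part of the presenters' materials:

```python
# Generic "Clarifying Questions" descriptors with a {context} placeholder
# where assignment-specific language can be dropped in (illustrative only).
CLARIFYING_QUESTIONS = [
    "Does not ask questions about {context}.",
    "Identifies some basic common or obvious questions about {context}.",
    "Asks relevant questions that guide further research into {context}.",
    "Analyzes insightful questions showing a deep understanding of how the "
    "questions can guide the research.",
]

def localize(descriptors, context):
    """Fill assignment-specific wording into the generic descriptors."""
    return [d.format(context=context) for d in descriptors]

for level, text in enumerate(localize(CLARIFYING_QUESTIONS, "the budget problem"), start=1):
    print(f"{level}. {text}")
```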

15 Case #2: Written Communications Use the same assumptions. You need a college-wide measure, but you need buy-in from ALL departments. Let's build a new rubric using the same methods.

16 Written Communications Literacy – Early Decisions Purpose – measure written communications quality from student submissions. Target of Measurement – writing qualities (trait-based). Define Written Communications Literacy.

17 Written Communications – Define the Construct A search for rubrics on this trait yielded over 100K hits. So, we dropped back and punted. We used the AAC&U VALUE rubric as a starting point. In addition, we consulted the commonly used 6+1 Traits Rubric. From these, we found criteria and potential wording for some rubric rows.

18 Scale Definitions We worked with our English teachers to narrow down what the scores should mean. 1. Major writing errors necessitating major revision or rewrite 2. Minor quality errors that could be resolved with minor revisions 3. Competent writing that would pass as is 4. Excellent writing that went beyond minimum standards

19 Define the Criteria Content Development; Genre and Discipline-Specific Conventions; Claims; Credible Evidence; Analysis; Control of Syntax and Mechanics; Overall Impact.

20 Written Communications Rubric Development Procedure 1. Print/Open a copy of the Starting Point Rubric for Written Communication. 2. Review the section of the document that corresponds to each row in the rubric template. 3. Select ONE approach for measuring that criterion (row) from the Descriptor Categories in column one of the examples tables. 4. Place the description into the score-level 3 cell on that row. 5. Write a descriptor for the remaining column cells along that row.

21 Review Quality of Our Rubric Turn back to section 7 in the morning session notes. Review our rubric in light of the checklist.

22 Case #3: Build From Generic Criteria Take a look in your workbooks at the generic criteria listing. We argue that these criteria could lead you to most of the criteria you might want to use in future assignment-based rubrics.

23 Examples (Workbook) There are two examples in the workbook. Bolded criteria were selected as the most important. Which do you feel are the most important?

24 Now It Is Your Turn Select a favorite assignment (or one of the examples). Select 3-4 important criteria. Complete a rubric with 3-4 rows including: Criteria definitions. Score scale definitions. Descriptors.

25 Wrap Up Questions? What Worked? What Needs Work? Reminder: If you want feedback on your first attempts at a rubric, send them to tom.zane@slcc.edu

