
1 Scoring Rubrics
Margaret Kasimatis, PhD
VP for Academic Planning & Effectiveness

2 What is a Scoring Rubric?
A scheme for evaluating student work along certain dimensions:
–Specific skills or aspects of a general learning outcome
–Concrete descriptors of levels of performance
Can be applied to a variety of student products or performances (e.g., written work, presentations, etc.)
Good for measuring higher-order skills or outcomes not easily measured by tests (e.g., oral communication, integration)

3 EXAMPLE OF ONE DIMENSION OF AN ORAL PRESENTATION SCORING RUBRIC
Delivery – This scale focuses on the transmission of the message. It is concerned with volume, rate, and articulation.
Superior: Speaker uses delivery to emphasize and enhance the meaning of the message. Speaker delivers the message in a lively, enthusiastic fashion. Volume varies to add emphasis and interest. Pronunciation and enunciation are very clear.
Adequate: Rate is not too fast or too slow. Pauses are not too long or at inappropriate spots. Pronunciation and enunciation are clear.
Minimal: Volume is too low or too loud. Rate is too fast or too slow. Pauses are too long or at inappropriate spots. Pronunciation and enunciation are unclear. Speaker exhibits several disfluencies.
Inadequate: Delivery interferes with communication of the message; e.g., volume is so low, rate is so fast, or pronunciation and enunciation are so unclear that you cannot understand most of the message.
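As an illustration only (not part of the original slides), a rubric dimension like the one above can be captured as a small data structure so that every rater works from the same level names and descriptors. The field and variable names below are hypothetical; this is a minimal sketch in Python.

```python
# Hypothetical encoding of the "Delivery" dimension above.
# Descriptors are abbreviated; in practice they would be copied verbatim
# from the rubric so raters apply identical criteria.
DELIVERY_DIMENSION = {
    "name": "Delivery",
    "focus": "Transmission of the message: volume, rate, and articulation.",
    # Levels ordered from best to worst, each mapped to its descriptor.
    "levels": {
        "Superior": "Delivery emphasizes and enhances the message; lively, "
                    "varied volume; very clear pronunciation and enunciation.",
        "Adequate": "Rate and pauses are appropriate; pronunciation and "
                    "enunciation are clear.",
        "Minimal": "Volume, rate, or pauses are problematic; pronunciation or "
                   "enunciation is unclear; several disfluencies.",
        "Inadequate": "Delivery interferes with communication of the message.",
    },
}

def descriptor_for(dimension: dict, level: str) -> str:
    """Return the concrete descriptor a rater should apply for a given level."""
    return dimension["levels"][level]

print(descriptor_for(DELIVERY_DIMENSION, "Adequate"))
```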

4 What is a Scoring Rubric? How is this different from grading?
–Multidimensional vs. holistic
–Concerned more with skills than the "right answer"
–Results are aggregated across students

5 Constructing Rubrics
Select a learning outcome or competency
Identify specific dimensions/skills
Develop concrete descriptors of levels of performance
–Keep in mind the nature of the work product
–Ask yourself, "What level of performance should a graduating senior in my department have?"
–The best you can expect is the highest rating
–An unacceptable product is the lowest rating
–The worst acceptable product is the lowest acceptable rating

6 EXAMPLE LEARNING OUTCOME: "Can…analyze and interpret data"
Student Product: Lab Report
Skill #1: Can describe the trend indicated by the results of statistical analyses.
Superior (4): Presents the appropriate statistics, and thoroughly and accurately describes the trend indicated by those results.
Good (3): Presents the appropriate statistics and accurately describes the trend, but does not elaborate.
Adequate (2): States the statistics and indicates some trend, but fails to indicate direction or is not entirely accurate in describing the trend.
Inadequate (1): Simply states the statistics without reference to a trend, or incorrectly states the trend.
Skill #2: Can draw appropriate conclusions about the causal nature of relationships between variables.
Superior (4): Draws the appropriate conclusion, and thoroughly and accurately explains why the conclusion is drawn.
Good (3): Draws the appropriate conclusion, but only briefly explains why the conclusion is drawn.
Adequate (2): Draws the appropriate conclusion, but either does not explain or is not entirely accurate in the explanation.
Inadequate (1): Either draws no conclusion or draws an inappropriate conclusion.
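A hedged sketch, not taken from the slides, of how the two lab-report skills above with their 1-4 ratings could be recorded per dimension rather than as a single holistic grade. All names and scores here are hypothetical.

```python
# Valid score levels for each rubric skill (4 = Superior ... 1 = Inadequate).
LAB_REPORT_RUBRIC = {
    "Skill 1: describe trend from statistics": {4, 3, 2, 1},
    "Skill 2: draw causal conclusions": {4, 3, 2, 1},
}

def record_scores(ratings: dict[str, int]) -> dict[str, int]:
    """Check that every rubric skill received one of its defined score levels."""
    for skill, valid_scores in LAB_REPORT_RUBRIC.items():
        if ratings.get(skill) not in valid_scores:
            raise ValueError(f"Missing or invalid score for: {skill}")
    return ratings

# One rater's scores for one (hypothetical) student's lab report.
student_a = record_scores({
    "Skill 1: describe trend from statistics": 3,  # Good
    "Skill 2: draw causal conclusions": 2,         # Adequate
})
print(student_a)
```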

7 EXAMPLE LEARNING OUTCOME: "Understanding the impact of engineering solutions in a global and societal context"
Student Product: Final Paper in Integrative Experience
Demonstrates an understanding of the impact of science or technology on society
Superior (4): Impacts of science or technology that are mentioned are significant or substantive, and the explanation of those impacts is complete.
Average (3): Impacts mentioned are fairly obvious or only somewhat significant, but they are fully explained; or, the impacts are significant, but the explanation is incomplete.
Minimal (2): Impacts mentioned are somewhat significant, and the explanation is superficial.
Inadequate (1): Impacts mentioned are either very obvious or not important, and there is no explanation of them.

8 Applying Rubrics
Ideal case:
–Train a pair of raters (from outside the course or department) to use the rubric, and test inter-rater reliability (one simple check is sketched below)
–Each rater independently scores the work
–This can be done "live" (e.g., with oral presentations), or copies of student products can be made and retained for later scoring
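The slides do not specify which reliability statistic to use, so the following is only a minimal sketch assuming two raters have scored the same set of student products on a 1-4 rubric dimension. It computes simple percent agreement, exact and within one point, which is one common check; the rater scores shown are hypothetical.

```python
def agreement(rater_a: list[int], rater_b: list[int]) -> tuple[float, float]:
    """Return (exact agreement, agreement within one point) as proportions."""
    if len(rater_a) != len(rater_b) or not rater_a:
        raise ValueError("Raters must score the same, non-empty set of products")
    exact = sum(a == b for a, b in zip(rater_a, rater_b)) / len(rater_a)
    adjacent = sum(abs(a - b) <= 1 for a, b in zip(rater_a, rater_b)) / len(rater_a)
    return exact, adjacent

# Hypothetical scores from a pilot of ten oral presentations.
rater_1 = [4, 3, 3, 2, 4, 1, 3, 2, 4, 3]
rater_2 = [4, 3, 2, 2, 4, 2, 3, 2, 3, 3]
print(agreement(rater_1, rater_2))  # (0.7, 1.0)
```

If agreement is low, raters typically discuss discrepant cases and refine the level descriptors before scoring the full set of products.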

9 Using Rubrics for Program Assessment
Where in the curriculum is the outcome addressed, and at what level? (Refer to the curriculum map.)
–Ideally, look for courses that "Introduce" the outcome as well as those that "Emphasize" it (to test improvement across the program)
–Or look at required senior-level courses
Identify student work products/performances that should demonstrate the outcome
–E.g., written assignments, research reports or posters, oral presentations, creative performances or products
–If you want to test improvement, look for similar products in lower- and upper-division courses
–A cautionary note: avoid comparing apples and oranges (i.e., dissimilar work products)

10 Using Rubrics for Program Assessment
To analyze and interpret the data (a sketch of this aggregation follows the list):
–Average scores across raters (if you used two raters)
–Aggregate those scores across students for each rubric dimension
–Present the data in a user-friendly way and discuss what it means
–It helps to already have a criterion/standard in mind
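A sketch of the aggregation steps above, using hypothetical data and a hypothetical criterion: average the two raters' scores for each student, then average across students for each rubric dimension, and compare the result to the pre-set standard.

```python
from statistics import mean

# scores[dimension][student] = [rater1_score, rater2_score]  (hypothetical data)
scores = {
    "Describe trend":    {"s1": [3, 4], "s2": [2, 2], "s3": [4, 3]},
    "Causal conclusion": {"s1": [2, 2], "s2": [3, 2], "s3": [3, 3]},
}
CRITERION = 2.5  # e.g., "an average of at least 2.5 on each dimension"

for dimension, by_student in scores.items():
    # Average across raters for each student, then across students.
    per_student = [mean(rater_scores) for rater_scores in by_student.values()]
    dimension_mean = mean(per_student)
    status = "meets" if dimension_mean >= CRITERION else "below"
    print(f"{dimension}: mean = {dimension_mean:.2f} ({status} criterion)")
```

The per-dimension means, rather than a single overall score, are what reveal which skills students handle well and which need attention.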

11 (No transcript text is available for this slide.)

12 Advantages of Scoring Rubrics
Direct evidence of student learning
Good for measuring higher-order skills or evaluating complex tasks
Summaries of results can reveal patterns of student strengths and areas of concern
Can be unobtrusive to students
Can generate great discussions of student learning among faculty, especially regarding expectations

13 QUESTIONS?

