Descriptive Rubrics Ursula Waln, Director of Student Learning Assessment Central New Mexico Community College.


Rubric: Just Another Word for Scoring Guide
A rubric is any scoring guide that lists specific criteria, such as a checklist or a rating scale.
- Checklists are used for objective evaluation (did it or did not do it).
- Rating scales are used for subjective evaluation (gradations of quality).
- Descriptive rubrics are rating scales that contain descriptions of what constitutes each level of performance.
- Maybe call them descriptive scoring guides if you don't like the word rubric.
- Most people who talk about rubrics are referring to descriptive rubrics, not checklists or rating scales.
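As a rough sketch, the three kinds of scoring guides distinguished above can be contrasted as simple data shapes (all criteria, labels, and values below are made-up examples, not taken from the slides):

```python
# Checklist: objective evaluation -- each criterion is simply done or not done.
checklist = {"cited sources": True, "met page count": False}

# Rating scale: subjective gradations of quality, numbers only.
rating_scale = {"clarity": 3, "organization": 4}  # e.g., on a 1-5 scale

# Descriptive rubric: a rating scale whose levels each carry a description
# of what performance at that level looks like.
descriptive_rubric = {
    "clarity": {
        3: "Ideas are mostly clear but occasionally underdeveloped.",
        4: "Ideas are expressed clearly and fully developed.",
    },
}

# A checklist yields a yes/no per criterion; a descriptive rubric yields a
# level whose meaning the rater and the student can both read.
print(checklist["cited sources"])           # True
print(descriptive_rubric["clarity"][4])     # the description for level 4
```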

The Purpose of Descriptive Rubrics
Descriptive rubrics are used to lend objectivity to evaluations that are inherently subjective, e.g.:
- Grading of artwork, papers, performances, projects, speeches, etc.
- Assessing overall student progress toward specific learning outcomes (course and/or program level)
- Monitoring developmental levels of individuals as they progress through a program ("developmental rubrics")
- Conducting employee performance evaluations
- Assessing group progress toward a goal
When used by multiple evaluators, descriptive rubrics can minimize differences in rater thresholds (especially if normed).

Why Use Descriptive Rubrics in Class?
In giving assignments, descriptive rubrics can help clarify the instructor's expectations and grading criteria for students.
- Students can ask more informed questions about the assignment.
- A clear sense of what is expected can inspire students to achieve more.
- The rubric helps explain to students why they received the grade they did.
Descriptive rubrics help instructors remain fair and consistent in their scoring of student work (more so than rating scales).
- Scoring is easier and faster when descriptions clearly distinguish levels.
- The effects of scoring fatigue (e.g., grading more generously toward the bottom of a stack due to disappointed expectations) are minimized.

Why Use Descriptive Rubrics for Assessment?
Clearly identifying benchmark levels of performance and describing what learning looks like at each level establishes a solid framework for interpreting multiple measures of performance.
Student performance on different types of assignments and at different points in the learning process can be interpreted for analysis using a descriptive rubric as a central reference.
With rubrics that describe what goal achievement looks like, instructors can more readily identify and assess the strength of connections between:
- Course assignments and course goals
- Course assignments and program goals

Two Common Types of Descriptive Rubrics
Holistic
- Each level of performance has just one comprehensive description.
- Descriptions may be organized in columns or rows.
- Useful for quick and general assessment and feedback.
Analytic
- Each level of performance has descriptions for each of the performance criteria.
- Descriptions are organized in a matrix.
- Useful for detailed assessment and feedback.

Example of a Holistic Rubric
Proficient (10 points): Ideas are expressed clearly and succinctly. Arguments are developed logically and with sensitivity to audience and context. Original and interesting concepts and/or unique perspectives are introduced.
Intermediate (6 points): Ideas are clearly expressed but not fully developed or supported by logic and may lack originality, interest, and/or consideration of alternative points of view.
Emerging (3 points): Expression of ideas is either undeveloped or significantly hindered by errors in logic, grammatical and/or mechanical errors, and/or over-reliance on jargon and/or idioms.
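Because a holistic rubric assigns the whole work a single level, its point structure reduces to a simple lookup. A minimal sketch, using the levels and points from the example above (the function name is illustrative):

```python
# Holistic rubric: one comprehensive description (and point value) per level.
HOLISTIC_POINTS = {
    "Proficient": 10,
    "Intermediate": 6,
    "Emerging": 3,
}

def score_holistic(level: str) -> int:
    """Return the point value for the single level assigned to the work."""
    return HOLISTIC_POINTS[level]

print(score_holistic("Intermediate"))  # 6
```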

Example of an Analytic Rubric
Delve, Mintz, and Stewart's (1990) Service Learning Model
Developmental variables are described across five phases: (1) Exploration, (2) Clarification, (3) Realization, (4) Activation, (5) Internalization.

Intervention
- Mode: (1) Group; (2) Group, beginning to identify with group; (3) Group that shares focus, or independently; (4) Individual; (5) Individual
- Setting: (1) Minimal community interaction, prefers on-campus activities; (2) Trying many types of contact; (3) Direct contact with community; (4) Direct contact with community, intense focus on issue or cause; (5) Frequent and committed involvement

Commitment
- Frequency: (1) One time; (2) Several activities or sites; (3) Consistently at one site; (4) Consistently at one site or with one issue; (5) Consistently at one site or focused on particular issues
- Duration: (1) Short term; (2) Long-term commitment to group; (3) Long-term commitment to activity, site, or issue; (4) Lifelong commitment to issue (beginnings of civic responsibility); (5) Lifelong commitment to social justice

Behavior
- Needs: (1) Participate in incentive activities; (2) Identify with group camaraderie; (3) Commit to activity, site, or issue; (4) Advocate for issue(s); (5) Promote values in self and others
- Outcomes: (1) Feeling good; (2) Belonging to a group; (3) Understanding activity, site, or issue; (4) Changing lifestyle; (5) Living one's values

Balance
- Challenges: (1) Becoming involved, concern about new environments; (2) Choosing from multiple opportunities/group process; (3) Confronting diversity and breaking from group; (4) Questioning authority/adjusting to peer pressure; (5) Living consistently with values
- Supports: (1) Activities are non-threatening and structured; (2) Group setting, identification, and activities are structured; (3) Reflective: supervisors, coordinators, faculty, and other volunteers; (4) Reflective: partners, clients, and other volunteers; (5) Community: have achieved a considerable inner support system

Another Example of an Analytic Rubric
AAC&U Ethical Reasoning VALUE Rubric
Levels: Capstone (4), Milestones (3 and 2), Benchmark (1)

Ethical Self-Awareness
- 4: Student discusses in detail/analyzes both core beliefs and the origins of the core beliefs and discussion has greater depth and clarity.
- 3: Student discusses in detail/analyzes both core beliefs and the origins of the core beliefs.
- 2: Student states both core beliefs and the origins of the core beliefs.
- 1: Student states either their core beliefs or articulates the origins of the core beliefs but not both.

Understanding Different Ethical Perspectives/Concepts
- 4: Student names the theory or theories, can present the gist of said theory or theories, and accurately explains the details of the theory or theories used.
- 3: Student can name the major theory or theories she/he uses, can present the gist of said theory or theories, and attempts to explain the details of the theory or theories used, but has some inaccuracies.
- 2: Student can name the major theory she/he uses, and is only able to present the gist of the named theory.
- 1: Student only names the major theory she/he uses.

Ethical Issue Recognition
- 4: Student can recognize ethical issues when presented in a complex, multilayered (gray) context AND can recognize cross-relationships among the issues.
- 3: Student can recognize ethical issues when issues are presented in a complex, multilayered (gray) context OR can grasp cross-relationships among the issues.
- 2: Student can recognize basic and obvious ethical issues and grasp (incompletely) the complexities or interrelationships among the issues.
- 1: Student can recognize basic and obvious ethical issues but fails to grasp complexity or interrelationships.

Application of Ethical Perspectives/Concepts
- 4: Student can independently apply ethical perspectives/concepts to an ethical question, accurately, and is able to consider full implications of the application.
- 3: Student can independently apply ethical perspectives/concepts to an ethical question, accurately, but does not consider the specific implications of the application.
- 2: Student can apply ethical perspectives/concepts to an ethical question, independently (to a new example), and the application is inaccurate.
- 1: Student can apply ethical perspectives/concepts to an ethical question with support (using examples, in a class, in a group, or a fixed-choice setting) but is unable to apply ethical perspectives/concepts independently (to a new example).

Evaluation of Different Ethical Perspectives/Concepts
- 4: Student states a position and can state the objections to, assumptions and implications of, and can reasonably defend against the objections to, assumptions and implications of different ethical perspectives/concepts, and the student's defense is adequate and effective.
- 3: Student states a position and can state the objections to, assumptions and implications of, and respond to the objections to, assumptions and implications of different ethical perspectives/concepts, but the student's response is inadequate.
- 2: Student states a position and can state the objections to, assumptions and implications of different ethical perspectives/concepts but does not respond to them (and ultimately objections, assumptions, and implications are compartmentalized by student and do not affect student's position).
- 1: Student states a position but cannot state the objections to and assumptions and limitations of the different perspectives/concepts.
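An analytic rubric is a matrix: every criterion is rated separately on the shared scale. One way results from such a rubric could be tallied is to sum the per-criterion ratings; a minimal sketch using the five VALUE-rubric criteria above (the summing scheme itself is an assumption for illustration, not something the VALUE rubrics prescribe):

```python
# Analytic rubric: each criterion gets its own rating on the shared scale
# (4 = Capstone, 3/2 = Milestones, 1 = Benchmark).
CRITERIA = [
    "Ethical Self-Awareness",
    "Understanding Different Ethical Perspectives/Concepts",
    "Ethical Issue Recognition",
    "Application of Ethical Perspectives/Concepts",
    "Evaluation of Different Ethical Perspectives/Concepts",
]

def total_score(ratings):
    """Sum per-criterion ratings; every criterion must be rated 1-4."""
    if set(ratings) != set(CRITERIA):
        raise ValueError("rate every criterion exactly once")
    if not all(1 <= r <= 4 for r in ratings.values()):
        raise ValueError("ratings must be between 1 (Benchmark) and 4 (Capstone)")
    return sum(ratings.values())

sample = {c: 3 for c in CRITERIA}       # Milestone 3 on every criterion
print(total_score(sample))              # 15
```

The per-criterion breakdown, not the total, is what makes the analytic form useful for detailed feedback.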

There are No Rules for Developing Rubrics
Form typically follows function, so how one sets up a descriptive rubric is usually determined by how one plans to use it.
- Performance levels are usually column headings but can function just as well as row headings.
- Performance levels can be arranged in ascending or descending order, and one can include as many levels as one wants.
- Descriptions can focus only on positive manifestations or include references to missing or negative characteristics.
- Some use grid lines while others do not.

Descriptive rubrics can help pull together results from multiple measures (artifact analyses, interviews, observations, surveys, written tests) for a more comprehensive picture of student learning.

Using the Model
To pull together multiple measures for an overall assessment of student learning:
Take a random sample from each assignment and re-score those using the rubric (instead of the grading criteria), or rate the students as a group based on overall performance on each assignment.
Then, combine the results, weighting their relative importance based on:
- The stage in the learning process at which the results were obtained
- How well you think students understood the assignment or testing process
- How closely the learning measured relates to the instructional objectives
- Factors that could have biased the results
- Your own observations, knowledge of the situations, and professional judgment
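The combining step above amounts to a weighted average of rubric scores across measures. A minimal sketch (the measure names, scores, and weights here are hypothetical; the slide leaves the actual weighting to the instructor's professional judgment):

```python
# Each measure contributes a mean rubric score; weights reflect the factors
# listed above (stage of learning, relevance to objectives, possible bias...).
measures = [
    {"name": "early quiz",      "mean_rubric_score": 2.1, "weight": 0.2},
    {"name": "midterm project", "mean_rubric_score": 2.8, "weight": 0.3},
    {"name": "capstone paper",  "mean_rubric_score": 3.4, "weight": 0.5},
]

total_weight = sum(m["weight"] for m in measures)
overall = sum(m["mean_rubric_score"] * m["weight"] for m in measures) / total_weight

print(round(overall, 2))  # 2.96
```

Later, higher-weighted measures (here the capstone paper) pull the overall estimate toward end-of-course performance, which matches the slide's advice to weight by stage in the learning process.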

“Remember that when you do assessment, whether in the department, the general education program, or at the institutional level, you are not trying to achieve the perfect research design; you are trying to gather enough data to provide a reasonable basis for action. You are looking for something to work on.”

Walvoord, B. E. (2010). Assessment clear and simple: A practical guide for institutions, departments, and general education. San Francisco, CA: Jossey-Bass.