Using Rubrics for Evaluating Student Learning

Purpose
- To review the development of rubrics for the purpose of assessment
- To share an example of how a rubric can be developed for use in assessment
- To show how rubric assessment can be quantified and shared
- To encourage use of this method as a form of direct assessment of student learning

Why Rubric Assessment? Rubrics…
- Provide standardized information about student learning on specifically defined student learning objectives
- Emphasize department/program control
- Can link to other programs' objectives (e.g., general education program objectives)
- Are a form of direct assessment of student learning
- Can be directly linked to program objectives
- Produce results that are more easily reported

Defining the Assignment To Be Assessed
- Should be a critical assignment that addresses several student learning/program objectives
- Faculty should agree on the utility of the assessment
- Should allow for simple measurement (e.g., descriptive statistics)
- Should permit additional comparisons (e.g., between freshman and senior performance, evening and day courses, online and traditional courses)

Match Assignments With Program Objectives
- Rubric elements should match the student learning objectives
- Faculty should play a significant role in developing and implementing the rubric
- Critical question: Is the rubric comprehensive enough to assess several objectives?
- Caution: Do not "shoehorn" a rubric by including categories that are difficult to assess in the first place. It is fine if not all objectives are measured by one instrument.
- When developing a rubric, make sure the program is assessing as much as it reasonably can, given the effort it takes to develop and then implement the rubric on a regular cycle. A fair amount of work is involved, but it is worth it!

Brief Case Study
Case: A program has a number of learning objectives it needs to assess. It has chosen a research project that students must complete during their senior year. The assignment requires students to put forward a point of view, support that argument by critically evaluating competing points of view, use an effective research method, and apply content knowledge appropriate to their subject area. The faculty feel they are clear on the purposes of the assignment and on what each purpose means, so they collaboratively create a rubric to evaluate these projects.

Example
- Categories are chosen according to the student learning/program objectives of the specific department or program.
- Each category is assigned a different value (or weight) in order to arrive at a final score; this is not necessary if the purpose of the rubric is to examine each category individually. (A sketch of the weighting follows.)
- Columns are chosen according to degree of performance. In this hypothetical case, the faculty chose "target," "satisfactory," and "unsatisfactory." When describing what these degrees mean, it is important to discuss what is ideal, what is satisfactory, and what is unsatisfactory.
- When filling in each cell of the rubric, it may be best to define the extremes first, target and unsatisfactory, and then fill in the satisfactory cell (Harder and Harper, ISU Assessment Workshop, April 2004).
- When filling in each cell, take care to be very specific about expectations. Faculty and evaluators may want to discuss the meaning of key words, avoiding vague wording and sentences that contradict one another. If the wording within a cell is unclear, two or three faculty members can review it to clarify the terms.
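
To make the weighting concrete, here is a minimal Python sketch of a weighted rubric. The category names, weights, and the 0-2 rating scale are illustrative assumptions, not the original example's actual values.

```python
# A minimal sketch of a weighted rubric. The categories, weights, and the
# three-level scale are illustrative assumptions, not a program's own.
RUBRIC = {
    # category: (weight, ordered performance levels)
    "Advocacy":            (0.20, ("unsatisfactory", "satisfactory", "target")),
    "Critical evaluation": (0.25, ("unsatisfactory", "satisfactory", "target")),
    "Research methods":    (0.30, ("unsatisfactory", "satisfactory", "target")),
    "Content knowledge":   (0.25, ("unsatisfactory", "satisfactory", "target")),
}

def weighted_score(ratings):
    """Combine per-category ratings (0, 1, or 2 here) into one weighted 0-1 total.

    Skip the weighting entirely if the goal is to examine each category
    on its own, as the slide notes.
    """
    total = 0.0
    for category, (weight, levels) in RUBRIC.items():
        total += weight * ratings[category] / (len(levels) - 1)
    return total

# One hypothetical senior paper, rated 0-2 in each category:
paper = {"Advocacy": 2, "Critical evaluation": 1,
         "Research methods": 1, "Content knowledge": 2}
print(f"Weighted score: {weighted_score(paper):.2f}")  # a total between 0 and 1
```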

Choosing How and When To Use Rubrics
When using a rubric to assess department/program goals, consider the following actions:
- Require the use of the rubric in all course sections in which the assignment is required. Capstone courses can be used in this respect; this requires cooperation among all faculty teaching the course. A department may instead use selected courses (not all sections), although it is important to know which courses are excluded in order to anticipate any bias that might result.
- Choose a random sample: collect samples from all students, store them in a central location, and select a random sample from this collection.
- Define the periodicity (every year? every two years? every three?).
- Select evaluators (faculty or qualified outside evaluators).
- Pre-test the rubric: test it for inconsistencies by evaluating a couple of assignments first, and later test for inter-rater reliability. (A brief sketch of the sampling and pre-testing steps follows.)
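
As a rough illustration of the sampling and pre-testing steps, the following Python sketch draws a random sample and computes simple percent agreement between two raters. The paper names, sample size, and ratings are all hypothetical.

```python
import random

# Hypothetical sketch of the sampling and pre-testing steps. `all_papers`
# stands in for the centrally stored collection of student assignments.
all_papers = [f"paper_{i:03d}" for i in range(120)]

random.seed(42)                          # fixed seed so the draw is reproducible
sample = random.sample(all_papers, 30)   # the 30 papers the evaluators will rate

# Pre-test: two evaluators rate the same few papers on one rubric category.
# Simple percent agreement flags wording that may need tightening; a fuller
# study would use a formal inter-rater statistic such as Cohen's kappa.
rater_a = [2, 1, 2, 0, 1]    # illustrative 0-2 ratings on five pilot papers
rater_b = [2, 1, 1, 0, 1]

agreement = sum(a == b for a, b in zip(rater_a, rater_b)) / len(rater_a)
print(f"Sampled {len(sample)} papers; pilot agreement: {agreement:.0%}")  # 80%
```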

Use of Rubric Data
- Final totals can be used to observe what grades students have received.
- Sub-category totals are most useful because they assess student learning on each objective (especially if each row in the rubric corresponds to a program/student learning objective).

Example
In the following example, a random sample of thirty final papers was chosen and graded according to the rubric developed in a prior slide. Totals were entered into an Excel spreadsheet. Proportions were then calculated by dividing the total points earned in each category by the maximum points possible for that category across the 30 graded assignments. This yields a value between 0 and 1, allowing for comparison among the categories.
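
A minimal Python sketch of that spreadsheet calculation, assuming each of the 30 sampled papers is rated 0-2 per category; the category names and the randomly generated ratings are illustrative stand-ins for real data.

```python
import random

# Stand-in for the spreadsheet: 30 papers rated 0-2 in each rubric category.
random.seed(0)
categories = ["Advocacy", "Critical evaluation", "Research methods",
              "Content knowledge", "Statistical knowledge"]
MAX_POINTS, N_PAPERS = 2, 30

scores = {c: [random.randint(0, MAX_POINTS) for _ in range(N_PAPERS)]
          for c in categories}

# Proportion of possible points earned per category, on a common 0-1 scale:
for category, ratings in scores.items():
    print(f"{category:22s} {sum(ratings) / (N_PAPERS * MAX_POINTS):.2f}")

# Overall average across all categories, for comparison with each sub-category:
overall = (sum(sum(r) for r in scores.values())
           / (len(categories) * N_PAPERS * MAX_POINTS))
print(f"{'Overall':22s} {overall:.2f}")
```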

Rubric Results
Rubric categories are included on a graph (a table is also fine, of course). Categories can be compared by placing each category's proportion of possible points on a common 0-1 scale. The differences among the categories can then be discussed at a faculty meeting. For example, in this group of histograms, faculty might conclude that students appear to perform well on content knowledge, advocacy, and critical evaluation, but may need to improve with respect to statistical knowledge and use of research methodologies. As always, it is also helpful to use additional forms of assessment to "triangulate" this finding. A total average is included to compare with each sub-category, in case there is interest.
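
A chart of this kind can be produced with a few lines of matplotlib; the category names and proportions below are illustrative stand-ins for the values computed in the previous example.

```python
import matplotlib.pyplot as plt

# Illustrative per-category proportions, plus the overall average for reference.
categories  = ["Content\nknowledge", "Advocacy", "Critical\nevaluation",
               "Statistical\nknowledge", "Research\nmethods"]
proportions = [0.85, 0.82, 0.78, 0.55, 0.52]
overall = sum(proportions) / len(proportions)

plt.bar(categories, proportions)
plt.axhline(overall, linestyle="--", label=f"Overall average ({overall:.2f})")
plt.ylabel("Proportion of possible points")
plt.ylim(0, 1)
plt.title("Rubric results by category")
plt.legend()
plt.tight_layout()
plt.show()
```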

Demonstrating Value Added
- It may be a good idea to compare freshman (or entering-student) performance with that of seniors.
- Statistical tests can show whether there is improvement between students at entry into a program and senior students. (A sketch of one such test follows.)
- This assumes that entering and exiting students complete similar assignments, which requires a good deal of coordination.
- It is an excellent method of demonstrating the impact of a program on student learning.
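
As one way to run such a test, here is a minimal sketch of the entry-versus-senior comparison using a two-sample t-test, assuming both groups completed comparable assignments scored on the same 0-1 rubric scale. The scores below are illustrative, not real data.

```python
from scipy import stats

# Illustrative rubric totals (0-1 scale) for entering students and seniors.
entering_scores = [0.42, 0.55, 0.48, 0.60, 0.51, 0.45, 0.58, 0.50]
senior_scores   = [0.66, 0.72, 0.61, 0.80, 0.68, 0.75, 0.70, 0.64]

# Welch's two-sample t-test (does not assume equal variances):
t_stat, p_value = stats.ttest_ind(senior_scores, entering_scores, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
# A small p-value is consistent with improvement from entry to exit, though
# triangulating with other forms of assessment is still advisable.
```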

Conclusion
- Rubric-based assessment requires a good deal of coordination among faculty.
- It requires specific identification of student learning goals and of how they are to be measured.
- Despite the high coordination costs, it is a very valuable method of direct assessment.
- Once implemented, it is comparatively easy to repeat in later cycles, assuming consistent leadership and implementation.
- As always, an assessment has little value until its findings are discussed with faculty and the results are incorporated into some outcome, such as the program review process, enhancement of program offerings, or changes in teaching emphases.

Contact Information
Dr. Sean McKitrick, Assistant Provost for Curriculum, Instruction, and Assessment
(607)