Aligning VALUE Rubrics to Institutional Purposes: Sharing Practical Approaches that Promote Consistent Scoring and Program Improvement Linda Siefert


Aligning VALUE Rubrics to Institutional Purposes: Sharing Practical Approaches that Promote Consistent Scoring and Program Improvement Linda Siefert Anne Pemberton University of North Carolina Wilmington

Discussion Topics
Who is using VALUE rubrics or other meta-rubrics?
How are they being used?
What feedback is being gathered from the rubric users?
What procedures are being used to improve consistency of scoring?
What changes have been made at the institutions using the rubrics?

POLL
Are you using AAC&U VALUE Rubrics at your institution?
  For General Education assessment
  For assessment in the majors
Are you using other meta-rubrics at your institution?
  For General Education assessment
  For assessment in the majors

Which VALUE Rubrics do you use?
Inquiry and analysis
Critical thinking
Creative thinking
Written communication
Oral communication
Reading
Quantitative literacy
Information literacy
Teamwork
Problem solving
Civic knowledge and engagement
Intercultural knowledge
Ethical reasoning
Foundations and skills for lifelong learning
Integrative and applied learning

Implementation Procedures (1)
How we are using the rubrics:
A number of VALUE Rubrics are aligned to our UNCW Learning Goals. These rubrics are used to score student work products from general education courses and senior capstone courses.
Courses with student learning outcomes aligned to the Learning Goals are selected to be representative of those taken by most students. Sections are selected by stratified random sampling, and students within sections are selected randomly.
A workshop is held before the semester begins to acquaint or reacquaint instructors with the rubric(s). Instructors select an assignment that they believe matches most or all dimensions of the rubric.
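The sampling procedure above can be sketched in code. This is a minimal illustration, not UNCW's actual tooling: sections are grouped into strata (here, by course), a fixed number of sections is drawn from each stratum, and students are then drawn randomly within each chosen section. All course names and sizes are hypothetical.

```python
import random

def stratified_sample(sections, per_stratum, students_per_section, seed=0):
    """Sample sections per stratum, then students within each section."""
    rng = random.Random(seed)  # fixed seed makes the draw reproducible
    # Group sections by their stratum (here, the course they belong to).
    strata = {}
    for sec in sections:
        strata.setdefault(sec["course"], []).append(sec)
    chosen = []
    for course, secs in strata.items():
        for sec in rng.sample(secs, min(per_stratum, len(secs))):
            students = rng.sample(
                sec["students"], min(students_per_section, len(sec["students"]))
            )
            chosen.append(
                {"course": course, "section": sec["id"], "students": students}
            )
    return chosen

# Hypothetical sections and rosters:
sections = [
    {"course": "ENG 101", "id": "001", "students": [f"e{i}" for i in range(25)]},
    {"course": "ENG 101", "id": "002", "students": [f"e{i}" for i in range(25, 50)]},
    {"course": "PHY 101", "id": "001", "students": [f"p{i}" for i in range(30)]},
]
print(stratified_sample(sections, per_stratum=1, students_per_section=5))
```

In practice the strata might instead be course level, college, or delivery mode; the structure of the draw is the same.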

Implementation Procedures (2)
Scoring Workshop
A two-hour workshop is held prior to scoring. Scorers report that the training is adequate, though a few say they were not as prepared on the day of scoring as they thought they were.
Scoring is performed at an all-day or half-day session. Scoring pairs score the first work product from each packet together to recalibrate, and additional work products are double scored to measure interrater reliability (IRR).

Implementation Procedures (3)
How are you using VALUE or other meta-rubrics?

Feedback from Scorers
Feedback we've received:
The Written Communication rubric fits assignments well, requiring few assumptions.
Inquiry is approached differently across disciplines; most of these differences fall under Design Process, and most scoring pairs needed to make assumptions about the process inferred from the assignment. At the basic studies level, Topic Selection was often determined to be not applicable.
Critical Thinking has been the most difficult rubric to apply; some of the difficulty comes from the rubric, some from the assignments. A number of scorers said that the assignments needed to be better matched to the rubrics, and several commented that faculty need to provide more in-depth instructions for assignments.
Feedback you've received:

Feedback from Instructors
Feedback we've received:
All instructors to date have said that the assignment selection process was not difficult, which does not match scorer feedback about some of the assignments. One instructor said that more training would be beneficial; this is an area we will be working on.
Feedback you've received:

Interrater Reliability
We measure interrater reliability using percent agreement, Spearman's rho, and Krippendorff's alpha.
During the first round, we met our benchmark on 3 of 15 dimensions and were close on 3 more.
Meta-rubrics are more difficult to apply consistently than assignment-specific rubrics.
How are you measuring, and what are your findings?

Changes to VALUE Rubrics
We have made one change: in the Evidence dimension of Critical Thinking, we divided interpretation from questioning the viewpoints of experts.
What changes have you made to any of the VALUE rubrics?
  To fit institutional mission and use
  To improve consistent use

Changes Made to Instruction
We have started a UNCW Learning Goals series through the Center for Teaching Excellence to begin conversations about each learning goal. Faculty are beginning to grapple with the difference between thinking and critical thinking.
What changes are being made at your institutions?