Assessing Gen-Ed and Institutional Learning Goals: Innovative Efforts Barbara E. Walvoord, Ph.D. Professor Emerita, University of Notre Dame Consultant.


Assessing Gen-Ed and Institutional Learning Goals: Innovative Efforts
Barbara E. Walvoord, Ph.D.
Professor Emerita, University of Notre Dame
Consultant in Assessment, Teaching and Learning, and Writing Across the Curriculum
45 Huckleberry Lane, Easthampton, MA

Outline: Six Areas of Challenge and Innovation for Gen-Ed Assessment
Definitions; Basic, No-Frills Plan
1. Teaching and Faculty Engagement
2. High-Impact Practices
3. Systems for Information and Action
4. Using Information: The "Stomach"
5. Rubrics and Evaluation of Student Work
6. Keeping it Simple

Definitions: What Is Gen-Ed?
Gen-Ed GOALS: Everyone works on them.
– May be a subset or a more specific version of institution-wide goals
Gen-Ed CURRICULUM: Every course that students can use to fulfill their general-education requirements
Community colleges: Your Associate's transfer degree is someone else's gen-ed

The Basic, No-Frills System of Gen-Ed Assessment
1. Goals
2. Information
– Direct (sample of students' work and/or standardized test)
– Indirect (student survey, student retention/success, etc.)
3. Action
– Forums for discussion
– A system for information to flow into decisions

Outline: Six Areas of Challenge and Innovation for Gen-Ed Assessment
Definitions; Basic, No-Frills Plan
1. Teaching and Faculty Engagement
2. High-Impact Practices
3. Systems for Information and Action
4. Using Information: The "Stomach"
5. Rubrics and Evaluation of Student Work
6. Keeping it Simple

Bottom Line: TEACHING!
In the presence of a reasonable curriculum, what makes a difference for student learning is TEACHING, broadly defined: how the instructor manages student-instructor and student-student interaction and arranges the educative experiences, in class, out of class, and online.

Corollary for Bottom Line
If you do not engage the faculty, you will not change teaching. Thus, your gen-ed assessment system should be planned at every step for maximum faculty engagement and impact on teaching.
Innovation: Replace "faculty on board" with "collaborative steering."

Gen-Ed Revision -- TEACHING
Many institutions spend too much time changing the titles, number, and stated learning goals of required courses.
Innovation: Gen-ed reform focusing on faculty development, not (or not only) courses.
Innovation: Gen-ed courses are required/encouraged to use research-based pedagogical approaches, e.g., active learning.
Innovation: Gen-ed reform institutes research-based "high-impact practices."

Questions for Discussion
– How do your institutions try to affect teaching and faculty engagement?
– What was the focus of your most recent gen-ed reform? Did it affect teaching?
– What are the implications of requiring certain types of pedagogy in gen-ed courses, e.g., active learning?
– Where do "writing-intensive" and similar courses fit?

Outline: Six Areas of Challenge and Innovation for Gen-Ed Assessment
Definitions; Basic, No-Frills Plan
1. Teaching and Faculty Engagement
2. High-Impact Practices
3. Systems for Information and Action
4. Using Information: The "Stomach"
5. Rubrics and Evaluation of Student Work
6. Keeping it Simple

Research-Based Seven Principles for Good Practice in Undergraduate Education
Good practice …
1. Encourages contact between students and faculty
2. Develops reciprocity and cooperation among students
3. Encourages active learning
4. Gives prompt feedback

Research-Based Seven Principles for Good Practice in Undergraduate Education
Good practice …
5. Emphasizes time on task
6. Communicates high expectations
7. Respects diverse talents and ways of learning
Chickering and Gamson, 1987, widely available online.

Innovation: Research-Based High-Impact Educational Practices
– First-Year Seminars/Experiences
– Common Intellectual Experiences
– Service Learning/Community-Based Learning
– Learning Communities
– Writing-Intensive Courses

Research-Based High-Impact Educational Practices, cont.
– Collaborative Assignments/Projects
– Undergraduate Research
– Diversity/Global Learning
– Internships
– Capstone Courses and Projects
Kuh; see AACU.org.

High-Impact Practices for Community Colleges
– Academic goal setting and planning
– Orientation
– Accelerated or fast-track developmental education
– First-year experience
– Student success course
– Learning community

High-Impact Practices for Community Colleges, cont.
– Experiential learning beyond the classroom
– Tutoring
– Supplemental instruction
– Assessment and placement
– Registration before classes begin
– Class attendance
– Alert and intervention

Seven Principles, High-Impact Practices, and YOUR Assessment
The Seven Principles and High-Impact Practices can guide you. They are powerful because they change factors that affect learning: engagement, interactions among instructors and students, and the arrangement of educative experiences.

BUT
High-impact practices are not assessment of learning.
AND, to get the benefits, you have to do them well.
– Use published research about how to make practices most effective.
– Use assessment in your own setting to inform your practices.

Questions for Discussion
– How do your institutions implement and assess high-impact practices?
– How do you use assessment in connection with high-impact practices?

Outline: Six Areas of Challenge and Innovation for Gen-Ed Assessment
Definitions; Basic, No-Frills Plan
1. Teaching and Faculty Engagement
2. High-Impact Practices
3. Systems for Information and Action
4. Using Information: The "Stomach"
5. Rubrics and Evaluation of Student Work
6. Keeping it Simple

SYSTEMS for Information and Action
You need a SYSTEM for gen-ed, not just a set of isolated actions.
NEXT: The diagram shows a system by which assessment information flows through the institution to inform action at every level. The diagram is in your handout.

Innovation: Institutional System for Gen-Ed Assessment
[Diagram: the bottom boxes show common sources of assessment information — student work analyzed by instructors, standardized tests, and IR surveys. Information flows from instructors, departments/groups, and scorers into the "STOMACH" (Assessment Committee, IR, etc., which aggregates/analyzes data and recommends), and from there to administration, faculty committees, student affairs, and academic support.]
Start reading at the bottom boxes, which show common types of assessment information.

Institutional System for Gen-Ed Assessment
[Diagram repeated.]
Black arrows with numbers show pathways for assessment information.

Institutional System for Gen-Ed Assessment
[Diagram repeated.]
Fat green arrows show feedback loops where resources and policies flow back to influence student learning.

Institutional System for Gen-Ed Assessment
[Diagram repeated.]
You do not need all possible sources of information. Keep it simple. Gather only what you can use.

Institutional System for Gen-Ed Assessment
[Diagram repeated.]
Make your own version of this diagram, with your own offices and details.

Questions for Discussion What is your institution’s system?

Outline: Six Areas of Challenge and Innovation for Gen-Ed Assessment
Definitions; Basic, No-Frills Plan
1. Teaching and Faculty Engagement
2. High-Impact Practices
3. Systems for Information and Action
4. Using Information: The "Stomach"
5. Rubrics and Evaluation of Student Work
6. Keeping it Simple

Innovation: A Better "Stomach"
[Diagram repeated, with the "STOMACH" expanded: Assessment Committee, Deans, IR, etc. aggregate/analyze data and recommend.]

Models for the "Stomach"
– Disbursement Model: "Stomach" members work to ensure use of data at every level.
– Requirement Model: Provost and others require assessment data for budget and policy decisions.
– Retreat Model: A retreat (leaders, or the entire campus) to discuss a 5-8-page summary of relevant data, how to use it in their own areas, and what the institution should work on.

Questions for Discussion
– What people/offices make up your "stomach"? What functions do they perform?
– Which model(s) do you use?
– How well is your "stomach" working to ensure that information about learning is aggregated, analyzed, distributed, and used for action?

Outline: Six Areas of Challenge and Innovation for Gen-Ed Assessment
Definitions; Basic, No-Frills Plan
1. Teaching and Faculty Engagement
2. High-Impact Practices
3. Systems for Information and Action
4. Using Information: The "Stomach"
5. Rubrics and Evaluation of Student Work
6. Keeping it Simple

Rubrics and Evaluation of Student Work: Paths 1-4
[Diagram repeated, highlighting the four numbered pathways by which evaluations of student work reach the "stomach."]

Advantages and Problems
[Diagram: the four paths, from instructor analysis of student work on the left to standardized tests on the right.]
The further to the left, the more faculty involvement. The further to the right, the more inter-rater reliability for institution-wide quantitative data.

Innovations: Try to solve the problems by…
1 & 2: Providing institution-wide information, externally trusted
3 & 4: Ensuring validity, faculty involvement

Path 1: Instructor-Group Analysis: PROS and CONS
– Can involve many instructors.
– Instructors piggy-back onto grading. No one else reads the student work.
– Rich discussion, collaboration. Engages faculty with their OWN work.
– Modest software requirements, because reports, not scores, are aggregated.

Path 1: Instructor-Group Analysis: Pros and Cons, cont.
Challenges if common rubric:
– Validity: what is being measured?
– Inter-rater reliability
Challenges if own rubrics:
– Comparability at institutional level

Innovations for Path 1: Instructor-Group Analysis
Innovation: Own rubrics, but within categories (e.g., critical thinking). Scores are aggregated for categories (Prince George's CC).
Innovation: Reports, not scores, are aggregated. Report: what we found, what we did, what we recommend the institution should work on.

Innovations for Path 1, cont.
Innovation: A sample of depts/groups documents improvement in learning. Sample results are extrapolated to the institution.
Innovation: Triangulate with a survey or standardized test.

Path 1: Instructor - Group: Examples
Raymond Walters College (2-year) of the University of Cincinnati. Each program/department holds an end-of-year meeting in which faculty each present one assignment that assesses "critical thinking," a rubric, scores, and the instructor's action. Departments/programs take action, and also report in a common format to the Academic Assessment Committee, which makes recommendations to the Chief Academic Officer about common needs and institution-wide actions. All record-keeping is done in Word.
Walvoord, Bardes, and Denton in Banta, ed., 2007.

Path 1: Instructor - Group: Examples
"Medium-sized public university." Selected faculty report to gen-ed "area committees," which aggregate reports and recommend action to the Gen Ed Council, which informs departments about their gen-ed courses.
Gerretson, H., & Golson, E. (2005). Synopsis of the use of course-embedded assessment in a medium sized public university's general education program. Journal of General Education, 54(2).

Path 1: Instructor - Group: Examples
Juniata College. Center for Teaching holds numerous faculty workshops and discussion groups where faculty conduct and share assessment and improvement of student learning. Strong influence of the Scholarship of Teaching and Learning (SOTL).

Path 1: Instructor-Group: Examples, cont.
La Guardia Community College. Extensive workshops and faculty seminars support a strong e-portfolio system.

Path 1: Instructor-Group: Examples, cont. Washington State University. Extensive faculty workshops involve faculty in developing and using/adapting common rubrics for critical thinking. Some faculty conduct classroom research to show improvement in student learning when faculty use the rubrics and teaching strategies developed in the workshops. These studies can be aggregated. Kelly-Riley in Banta, ed., 2007.

Path 2: Instructor Reports Directly to Stomach
[Diagram repeated, highlighting path 2: individual instructors report directly to the "stomach," bypassing departments/groups.]

Path 2: Instructor Reports: OPTIONS
– Instructor uses own OR common assignment.
– Instructor uses own OR common rubric.
– Instructor submits rubric scores AND/OR a report: what I found, what I am doing, what the institution should work on.
– Student responses may OR may not be included.

Path 2: Instructor Reports: PROS and CONS
– Bypasses department or group. Saves instructor meeting time.
– Leaves instructor isolated, without community discussion.
– Inter-rater reliability problems.
– Requires software to aggregate individual instructor scores/reports.
– May not include the assignment or student response; thus scores are interpreted in isolation.

Path 2: Instructor Reports: Examples
Prince George's Community College. Each instructor uses a course-specific rubric to enter scores into a database. Each cell of the rubric is assigned a point value, so the same rubric can be used to calculate the student's grade. In the software program, each row of the rubric is connected to a course outcome, which is connected to program and gen-ed outcomes. Thus rubric scores can be aggregated to provide scores for each outcome.

Path 2: Instructor Reports: Examples North Carolina State University. Gen-ed instructors report to the Assessment Office how they have assessed student work that addresses common gen-ed goals, and how they have used information for changes. Reports can be aggregated to determine, for example, what goals faculty find most difficult for students, and what faculty are working on. Assessment Office also conducts a few focused studies, e.g. common math exam questions and common rubric scores for first-year writing. DuPont in Bresciani, ed., 2007

Path 3: Institution-Wide Sample/Portfolios Scored
[Diagram repeated, highlighting path 3: a sample of student work goes to trained scorers, whose scores flow to the "stomach."]

Path 3: Institution-Wide Samples/Portfolios: OPTIONS
– Scoring team may be large OR small.
– Scorers may be trained/normed more OR less rigorously.
– Scorers may submit scores AND/OR recommendations.

Path 3: Institution-Wide Samples: PROS and CONS
– Allows a wide sample of student work.
– Allows careful norming for inter-rater reliability.
– Scorers may be isolated from the rest of the faculty.
– Action requires getting faculty/depts to act on the scorer report.
– Allows most faculty and depts not to participate in data analysis.
– Requires software and effort to collect student work.

Path 3: Institution-Wide Samples/Portfolios: Examples
Community College of Baltimore County. Discipline teams design assignments approved by faculty that are incorporated into all sections of designated courses each semester. Detailed assignments require students to demonstrate their learning in a variety of ways, e.g., writing, graphic, and oral presentations, and/or creating a website. A random sample of students' work is then scored by trained raters using a rubric.
Nunley, Bers, and Manning. Learning Outcomes Assessment in Community Colleges. NILOA Occasional Paper #10. July 2011, p. 8. Learningoutcomesassessment.org/documents/CommunityCollege.pdf

Path 3: Institution-Wide Samples/Portfolios: Examples
Keene State College. Faculty identify one assignment that can be used to assess each of the common outcomes for the "Integrative Studies Program." Students are required to submit work in Blackboard. Common rubrics for each outcome are created by faculty teams and shared with instructors whose student work is being analyzed. A random sample of the work is graded by 3-person faculty teams who are trained and normed. Scores and recommendations from the scoring teams are shared across the campus.
Rancourt, A. "Assessing Academic/Intellectual Skills in Keene State College's Integrative Studies Program." Journal of Assessment and Institutional Effectiveness, 2010, 1(1), 1-57.

Path 4: Standardized Test
[Diagram repeated, highlighting path 4: standardized test results flow directly to the "stomach."]

Path 4: Standardized Tests: OPTIONS
Innovation: Hybrid options that combine some traits of standardized tests with more campus input (next slide).

Innovations: Hybrids Between Standardization and Campus-Based

Item | Provides prompt | Provides rubric | Scores the papers
CLA | X | X | X
CAT (TN Tech; Stein & Haynes, Change, March/April) | X | X | They train/norm your faculty scorers
CLAQWA (claqwa.com; Banta et al., Occasional Paper #2 on learningoutcomesassessment.org) | | X | Online resources help your faculty scorers
AAC&U VALUE (aacu.org) | | X | Case studies

Questions for Discussion
– How does your campus evaluate student work? Method 1, 2, 3, or 4? Hybrid methods?
– How do you address the disadvantages of your method?

Outline: Six Areas of Challenge and Innovation for Gen-Ed Assessment
Definitions; Basic, No-Frills Plan
1. Teaching and Faculty Engagement
2. High-Impact Practices
3. Systems for Information and Action
4. Using Information: The "Stomach"
5. Rubrics and Evaluation of Student Work
6. Keeping it Simple

Keep It Simple: One Option
[Diagram: instructors analyze their students' work and report through departments/groups to the "stomach" (path 1); IR surveys feed in as well (path 5). The "stomach" aggregates/analyzes data and recommends to administration and faculty committees.]
You do not need all types of information. Gather only what you can use.

Keep It Simple: Another Option
[Diagram: same as the previous option, plus institution-wide scorers of a sample of student work (path 3).]
You do not need all types of information. Gather only what you can use.

Keep it Simple: Another Option
[Diagram: a sample of student work goes to trained scorers (path 3); IR surveys feed in as well (path 5). The "stomach" aggregates/analyzes data and recommends to administration and faculty committees.]

Questions for Discussion
– What information have you collected that you have used well enough to be worth the trouble of collecting it?
– What is the simplest system that would serve your needs?

Summary of Questions for Discussion
Teaching and Faculty Engagement:
1. How does your institution try to affect teaching and faculty engagement?
2. What was the focus of your most recent gen-ed reform? Did it affect teaching?
3. What are the implications of requiring certain types of pedagogy in gen-ed courses, e.g., active learning?
4. Where do "writing-intensive" and similar courses fit?
High-Impact Practices:
5. How does your institution implement and assess high-impact practices?
System:
6. What is your system for assessment?
"Stomach":
7. What people/offices make up your "stomach"? What are their functions?
8. Which model do you use? A hybrid?
9. How well is your "stomach" working to ensure that information is used for action?
Rubrics and Student Work:
10. How does your campus evaluate student work? Which method? A hybrid method?
11. How do you address the disadvantages of your method?
Keep It Simple:
12. What information have you collected that has been worth the trouble?
13. What is the simplest system that would serve your needs?

How to Find Examples
– Association of American Colleges and Universities. Case studies of institutions using VALUE rubrics. Aacu.org/value/casestudies. New book with case studies: Rhodes, T., & Finley, A. Using the VALUE Rubrics for Improvement of Learning and Authentic Assessment. AAC&U.
– Banta, T. W., ed. Assessing Student Achievement in General Education. Assessment Update Collections. San Francisco: Jossey-Bass.
– Banta, T. W., ed. Community College Assessment. Assessment Update Collections. San Francisco: Jossey-Bass.
– Banta, T. W., Jones, E. A., and Black, K. E. Designing Effective Assessment: Principles and Profiles of Good Practice. San Francisco: Jossey-Bass.
– Bresciani, M. J., ed. Assessing Student Learning in General Education: Good Practice Case Studies. Bolton, MA: Anker.
– National Institute for Learning Outcomes Assessment (NILOA). Learningoutcomesassessment.org. Look for "Occasional Papers" and "Examples of Good Assessment Practice."
– Websites of standardized tests (see table above).
– Assessment journal case studies. Use the ERIC database. List of assessment journals at Learningoutcomesassessment.org/AssessmentBriefs.htm.