From the IR Office to the Classroom: The Role of Assessment in Student Learning
Dr. John W. Quinley and Dr. Brett Parker

From the IR Office to the Classroom
- Expansion of focus
- History of student outcomes assessment
- Rubric review project

Expansion from IR & Indirect to Classroom & Direct

Shift
“We talk a good fight about wanting to have excellent schools when in fact we’re content to have average ones.”
David Gardner, “A Nation at Risk” Commission

Shift
Institutional Effectiveness Movement (indirect)
- Accreditation agencies
- National professional associations
- National task forces
- State efforts

Shift
Student Learning Outcomes Movement (direct)
What the student:
- Knows
- Can do
- Values

Indirect (IR Office) vs. Direct (Faculty)
Where
- Indirect: College objectives; outside the classroom
- Direct: Embedded in the curriculum; in the classroom
What
- Indirect: Surveys & focus groups; transcript analysis
- Direct: Papers; presentations; portfolios

Indirect (IR Office) vs. Direct (Faculty)
Audience
- Indirect: Administration; regulatory agencies; the public
- Direct: Academic administration; faculty; regulatory agencies; students
Purpose
- Indirect: Broad policy; accountability to regulatory agencies; public relations
- Direct: Curriculum modification; instructional delivery; accountability to regulatory agencies; students

Shift
Push
- External mandates
- Administrative mandates
- Models, systematic approaches

Model
- Develops frameworks
- Ties measures to the core
- Reports and analyzes findings
- Uses findings
- Integrates plans

Structure (example)
- Program outcome: Articulate the creative process & its influence on project development
- Who will be assessed? Students in ART 121, 131; students in ART 141, 241 (sample of 5 students)
- When will they be assessed? At the end of the course
- What is the assessment approach? All assignments; oral critique; artist’s statement; written critique
- What is the measurement? Rubric score

Faculty: the Key to Success
- Involvement and meaning
- Decentralized responsibility with support
- Sustainability

“One of the distinguishing characteristics of successful assessment programs is the extent to which they engage faculty and others in the process.” (Palomba & Banta, 1999)

“In order to maintain buy-in and relevance to purposes, it is important to decentralize the day-to-day assessment work while providing central support to the process. Everyone should be held responsible for his or her role.” (Keeton, 1998)

“The weight of trying to assess too many learning outcomes…may unduly tax faculty and professional staff who will need to…integrate the process of learning about student learning into institutional rhythms and practices.” (Maki, 2002)

While all areas should be assessed, “an institution that tries to define all areas equally, for whatever reason, is more likely to get bogged down in minutiae and overwork, increasing the likelihood of missing the improvement of student learning.” (Morante, 2003)

Shift
Pull
- Discussion
- Encouragement, coaching
- Faculty-led initiatives
- Professional development

Engaging Faculty
1. On a scale of 1-5, where is your college on push? On pull?
2. Think of an experience with push or pull that you would be willing to share with the group.

History of Student Outcomes Assessment

History
The Isothermal Experience
- Learning College
- Assessment Taskforce
- Learning outcome statements
- Criteria and rubrics
- Curriculum mapping
- Faculty quality improvement forms

History
Learning College: To Improve Life Through Learning
- Creates substantive change
- Engages learners
- Provides options
- Collaborates in learning
- Defines instructors’ roles by student needs
- Supports learning by everyone
- Succeeds only when learning is documented
(O’Banion, 1997, A Learning College Primer)

History
Assessment Taskforce
- Annual goals since 1998
- Procedure, responsibility, purpose, & timeline
- Taskforce & college-wide meetings
- Hosts assessment authorities
- Professional development

History
Communicate Effectively Through Writing… (rubric criteria, scored 1-4)
- Adheres to rules in mechanics and style
- Varies sentence structure
- Uses standard English
- Uses language which is clear, concise, and appropriate

History
Curriculum Mapping
- General education statements: criteria
- Individual class: extent of emphasis (0, 1, 2)
- List major assignments/assessments

General Education Competency and Rubric Review

General Education Competency Statements
- Communication
- Problems
- Interpersonal
- Quantitative
- Computer
- Culture

Originated from the QEP development process:
1. Form teams
2. Review using a provided list of questions, survey current usage, examine the literature
3. Revise
4. Test revisions with students and faculty
5. Present revisions to the steering team & administration
6. Present to the college-wide assessment meeting

Review Questions
- Content
- Scale
- Clarity and reliability
- Usability

Revisions introduced at a Rubrics Faire
- Reviewed rubrics, highlighting any changes
- Provided examples of rubric use
- Sought input for additional revisions

QEP Update
- Notes from the chairs
- Faire
- Literature
- Assessment in support areas
- History and culture
- CCSSE

Results of the process
- Changes were made to all but one rubric
- In almost all cases, the content areas within a rubric were not changed

- Considerable revision aimed at clarity, often leading to a reduction of detail

- Consistency across rubrics
  - Language
  - Formatting

- In all but one case, the 1-4 scale was maintained, although changes were made to the descriptions of one scale

In summary, the process has…
- Shifted from indirect to direct measures
- Included both push & pull processes
- Arisen from sustainable, faculty-driven efforts
- Involved students in various systems of assessment
- Engaged students in self-assessment

- Resulted in meaningful data from many audiences
- Led to:
  - Improved programs
  - Improved instruction
  - Improved student learning

Plans for next year include…
- A focus on two general education outcomes each year
- Workshops for faculty to improve understanding and use of information literacy
- A speaker on interpersonal skills

From the IR Office to the Classroom: The Role of Assessment in Student Learning
Dr. John W. Quinley and Dr. Brett Parker
Any questions or comments?