Learning without Borders: Internationalizing the Gator Nation
M. David Miller, Director, Quality Enhancement Plan
Timothy S. Brophy, Director, Institutional Assessment
University of Florida

Reaffirmation 2014
 Off-site review is completed
 On-site review is February 2014
 QEP is ready to be submitted in January 2014

The institution has developed an acceptable Quality Enhancement Plan that includes an institutional process for identifying key issues emerging from institutional assessment and focuses on learning outcomes and/or the environment supporting student learning and accomplishing the mission of the institution. (Quality Enhancement Plan)

The institution has developed a Quality Enhancement Plan that (1) demonstrates institutional capability for the initiation, implementation, and completion of the QEP; (2) includes broad-based involvement of institutional constituencies in the development and proposed implementation of the QEP; and (3) identifies goals and a plan to assess their achievement.

Purpose
Uses and interpretations
Student Learning Outcomes
QEP Assessment

Step 1. Establish topic and student learning outcomes (SLOs)
Step 2. Identify direct and indirect assessments to measure SLOs (and outputs to measure initiatives)
Step 3. If available assessments are not sufficient, develop your own assessments
Step 4. Plan an assessment schedule and follow it

Direct – measures of student learning
Indirect – measures of attitudes, behaviors, beliefs, …
Outputs – counts of the number of students participating in events, number of events, number of classes, etc.

 Direct and indirect assessments must measure progress on the SLOs
 Outputs must measure participation in the initiatives
 Each measure must have a planned use or interpretation (validity)
 Each measure must be reliably measured (with minimum error)

It is easier to adopt an existing instrument than to develop a new one. Easier is not better if you violate your principles! That is, existing instruments must:
 measure the SLOs;
 be valid for their planned uses and interpretations; and
 be reliable for your purposes.

UF Theme: Internationalization
Internationalization is the conscious integration of global awareness and intercultural competence into student learning.

SLO 1 (Content): Students identify, describe, and explain global and intercultural conditions and interdependencies.
SLO 2 (Critical Thinking): Students analyze and interpret global and intercultural issues.
SLO 3 (Communication): Students communicate effectively with members of other cultures.

Internationalization measures identified – commercially available:
◦ Global Perspectives Inventory (GPI)
◦ Global Competence Aptitude Assessment (GCAA)
◦ Intercultural Development Inventory (IDI)
◦ Global Competencies Inventory
◦ Cross-Cultural Adaptability Inventory
◦ Global Awareness Profile
◦ Intercultural Effectiveness Scale

 Internationalization Task Force (ITF)
◦ Representation from all 16 colleges
◦ Student representation
◦ Representation from faculty, staff, students, and administration
 Assessment Committee (5-6 members with expertise in content and assessment)

 Measures the SLOs
 Validity evidence
 Reliability evidence
 Feasibility to use in our context

Most commercial products had good reliability and validity evidence for their stated purposes
None matched our SLOs
Little evidence of feasibility for large-scale use
Final decision: we need to develop our own assessments

 Consider options already being used as part of the reporting system
 Student Experience in the Research University (SERU) – a biennial survey used by multiple universities that includes items on internationalization

 Items measure behaviors (e.g., courses taken, participation in Study Abroad, and other types of international experiences) and attitudes toward other cultures
 We elected to add 10 items designed to measure attitudes related to SLOs 2 and 3

Direct assessments focusing on SLO achievement
◦ Must allow flexibility to measure learning in any discipline
◦ Need to be aligned with the curriculum
Indirect assessments of SLOs 2 and 3 only
◦ Not appropriate for SLO 1, since content is discipline-based

 Writing the assessment items
 Ongoing review for fidelity with the SLOs (validity)
 Piloting to establish reliability and psychometric properties of items

Two Assessments Developed
 International Critical Thinking (IntCRIT) – attitudes and beliefs
 International Communication (IntCOMM) – attitudes and beliefs
 International Content – not feasible given disciplinary differences

1. Development of item specifications based on the two SLOs and a literature review.
2. Writing items based on the item specifications.
◦ Approximately 70 items were written for each SLO.

3. Review of the items by the ITF, the Assessment Committee, and other experts in assessment (validity and match to SLOs).
4. Revision of items based on feedback from expert review.
◦ Revisions were minor changes in wording.

5. Pilot testing with undergraduate students at UF and eliminating items with poor discriminations.
◦ Initial piloting was completed with four forms to minimize the testing burden for students.
◦ Forms overlapped with ten items that experts agreed helped to define the construct.
◦ Each form was pilot tested with undergraduates.
6. Item analysis of pilot data.
◦ The scale reliabilities exceeded .95 for all four forms.
◦ Items were retained that had an item discrimination of .25 or higher.
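The slides do not say how the item statistics were computed. As an illustration only, the two quantities behind the retention rule above are commonly computed as Cronbach's alpha (scale reliability) and the corrected item-total correlation (one standard definition of item discrimination; the slides do not specify which definition was used). A minimal sketch, with function names of my own choosing:

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)
    total_var = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

def item_discriminations(scores):
    """Corrected item-total correlations: each item against the sum of
    the remaining items (a common definition of item discrimination)."""
    totals = scores.sum(axis=1)
    return np.array([
        np.corrcoef(scores[:, j], totals - scores[:, j])[0, 1]
        for j in range(scores.shape[1])
    ])

def retained_items(scores, cutoff=0.25):
    """Apply the slide's retention rule: keep items with discrimination >= .25."""
    return [j for j, d in enumerate(item_discriminations(scores)) if d >= cutoff]
```

On real pilot data the scores matrix would hold each student's rating on each candidate item; items falling below the cutoff would be dropped before the next pilot round.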

7. Pilot testing (N = 70-80) the retained items on a single form for each SLO.
8. Item analysis of pilot data.
◦ Recommended retaining the items with the highest item discriminations that would result in a scale with a reliability of at least .90.
 For IntCRIT, the recommendation was to retain 12 items.
 For IntCOMM, the recommendation was to retain 14 items.
9. Review of the items by the ITF, the Assessment Committee, and other experts in assessment (validity and match to SLOs).
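Step 8's rule (retain the highest-discrimination items that still yield a reliability of at least .90) can be sketched as a greedy selection. This is an illustration of the stated rule under my own assumptions, not the actual procedure used at UF:

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = scores.shape[1]
    return (k / (k - 1)) * (1.0 - scores.var(axis=0, ddof=1).sum()
                            / scores.sum(axis=1).var(ddof=1))

def select_items(scores, target_alpha=0.90):
    """Add items in order of descending corrected item-total correlation
    until the selected scale's alpha reaches the target (or items run out)."""
    totals = scores.sum(axis=1)
    discs = [np.corrcoef(scores[:, j], totals - scores[:, j])[0, 1]
             for j in range(scores.shape[1])]
    order = np.argsort(discs)[::-1]   # best-discriminating items first
    selected = list(order[:2])        # alpha is defined for 2+ items
    while (cronbach_alpha(scores[:, selected]) < target_alpha
           and len(selected) < scores.shape[1]):
        selected.append(order[len(selected)])
    return sorted(int(j) for j in selected)
```

In practice a committee would also weigh content coverage when choosing among items with similar discriminations, which a purely statistical rule like this cannot capture.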

1. I consider different perspectives before making conclusions about the world.
2. I am able to manage when faced with multiple cultural perspectives.
3. I am open to different cultural ways of thinking in any international context.
4. I can make effective decisions when placed in different cultural situations.
5. Knowing about other cultural norms and beliefs is important to me.
6. I am able to think critically to interpret global and intercultural issues.
7. I actively learn about different cultural norms.
8. Understanding different points of view is a priority to me.
9. I can recognize how different cultures solve problems.
10. I can contrast important aspects of different cultures with my own.
11. Knowing about other cultural beliefs is important.
12. I am able to recognize how members of other cultures make decisions.

1. I demonstrate flexibility when interacting with members of another culture.
2. I prefer to socialize with people of my culture.
3. I am confident that I can adapt to different cultural environments.
4. I am able to communicate effectively with members of other cultures.
5. I like working in groups with students from other countries.
6. I feel comfortable in conversations that may involve cultural differences.
7. When working on a group project, I enjoy collaborating with students from other countries.
8. I often ask questions about culture to members of other cultures.
9. I enjoy learning about other cultures.
10. I appreciate members of other cultures teaching me about their culture.
11. I am able to interact effectively with members of other cultures.
12. I appreciate differences between cultures.
13. I feel comfortable discussing international issues.
14. I can clearly articulate my point of view to members of other cultures.

Use with curriculum
 Must be flexible enough to allow faculty to define internationalization in their discipline
 Must provide a standard measure

 Define rubric
 Instructors develop or identify assessments in courses
 Instructors will provide evidence of use and scores to the University, with examples of assessments

 The Association of American Colleges and Universities developed 15 rubrics that can be used across programs and courses
 VALUE (Valid Assessment of Learning in Undergraduate Education) rubrics were developed by faculty and assessment expert teams across the country and are used by more than 2,000 institutions

UF QEP SLO: SLO 1 (Content)
VALUE Rubric: Intercultural Knowledge and Competence
Adaptation: Limit criteria to knowledge, dropping skills and attitudes. Modify descriptions for consistency across levels and ease of use.

UF QEP SLO: SLO 2 (Critical Thinking)
VALUE Rubric: Critical Thinking
Adaptation: Add language to reflect the emphasis on international context for critical thinking. Modify descriptions for consistency across levels and ease of use.

UF QEP SLO: SLO 3 (Communication)
VALUE Rubrics: Written Communication; Oral Communication
Adaptation: Combine rubrics to measure communication in multiple modes, and add language to reflect the emphasis on international context. Modify descriptions for consistency across levels and ease of use.

QEP Content Rubric

Concepts/Principles
 Outstanding (3): Consistently and effectively demonstrates sophisticated understanding of the complexity of factors important to members of another culture in relation to its history, values, politics, communication styles, economy, and beliefs and practices.
 Satisfactory (2): Usually demonstrates understanding of the complexity of factors important to members of another culture in relation to its history, values, politics, communication styles, economy, or beliefs and practices.
 Unsatisfactory (1): Rarely or never understands the complexity of factors important to members of another culture in relation to its history, values, politics, communication styles, economy, or beliefs and practices.
 Not Applicable (0): Not applicable to assignment or course.

Terminology
 Outstanding (3): Consistently recognizes and effectively utilizes important and relevant terminology regarding intercultural and global issues in the appropriate environmental context.
 Satisfactory (2): Usually identifies and implements important and relevant terminology regarding intercultural and global issues in the appropriate environmental context.
 Unsatisfactory (1): Rarely or never understands important and relevant terminology regarding intercultural and global issues in the appropriate environmental context.
 Not Applicable (0): Not applicable to assignment or course.

Methodologies
 Outstanding (3): Consistently comprehends and effectively utilizes diverse and appropriate methodologies for understanding complex intercultural and global issues.
 Satisfactory (2): Usually comprehends and utilizes diverse and appropriate methodologies for understanding intercultural and global issues.
 Unsatisfactory (1): Rarely or never comprehends and utilizes diverse and appropriate methodologies for understanding intercultural and global issues.
 Not Applicable (0): Not applicable to assignment or course.
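The rubric's 0-3 scale treats 0 as Not Applicable rather than as a low score. A minimal sketch of aggregating one student's criterion ratings, assuming (my assumption; the slides do not say) that NA criteria are excluded from the average rather than counted as zeros:

```python
def rubric_mean(ratings):
    """Average rubric rating across criteria, with 0 (Not Applicable)
    excluded from the average. Returns None if every criterion is NA."""
    rated = [r for r in ratings.values() if r > 0]
    return sum(rated) / len(rated) if rated else None

# Hypothetical example using the three criteria of the QEP Content Rubric,
# where Methodologies did not apply to the assignment being scored.
sample = {"Concepts/Principles": 3, "Terminology": 2, "Methodologies": 0}
```

Here `rubric_mean(sample)` averages only the two applicable criteria; counting NA as zero would instead penalize assignments that legitimately do not address a criterion.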

 Review by the Internationalization Task Force, the Assessment Committee, and experts for match to SLOs and validity
 Piloting (in progress) in curriculum and Study Abroad

 Number of participants at specific campus events with an international focus (QEP events)
 Number of International Scholar courses
 Number of students enrolled in International Scholar courses
 Number of Study Abroad courses offered
 Number of students studying abroad

University-wide impact
 Indirect assessments: SERU (biennial, odd years); IntCRIT and IntCOMM (annual, fall, sampling)
 Outputs: monitor participation and implementation (annual)
Impact on those taking courses or Study Abroad
 Direct assessments: rubrics
 Indirect assessments: IntCRIT and IntCOMM
 Outputs: number of students and options

Annual analysis, review, and interpretation of data to inform:
 The implementation of the program
 The degree of SLO achievement
 When adjustments to the program are needed
 The benefit of the program for underrepresented populations

M. David Miller Timothy S. Brophy