CLOSING THE LOOP: A PRACTICAL APPLICATION By Dr. Debra Bryant (NWCCU and Business Department Accreditation Liaison)


What does it mean to “Close the Loop”? “Closing the Loop” is one of the most important stages in the assessment process. It is the action stage. Once a department has (a) decided what it wants its students to learn, (b) determined where and when assessment of that learning should take place, (c) gathered samples of students’ work, and (d) analyzed the data, faculty take the time to evaluate whether students actually learned what they were expected to learn, and then use that information to improve teaching and learning.

Faculty collaboratively:
* discuss the assessment results,
* reach conclusions about their meaning,
* decide what changes are needed, if any,
* determine the implications of those changes, and
* follow through to implement the changes.

An example from the Business Department
(a) Decide what we want students to learn

Mission: The mission of the Udvar-Hazy School of Business is to prepare students for successful employment, advanced learning, and service to community. We are committed to providing an environment that embraces experiential learning, stimulates academic excellence, and incorporates ethical considerations.

Goals: 1. Provide students with core business knowledge and skills that enable attainment of advanced business degrees and success in a rapidly changing, competitive business environment. (Core Theme One – A Culture of Learning)

UHSB Student Learning Outcomes
1. Students will demonstrate a working-level knowledge of the core functional areas of business:
   A. Students will demonstrate a working-level knowledge of core business functions in accounting, economics, finance, information systems, international business, legal and social environment, marketing, and management.
   B. Students will analyze a complex business situation, identify relevant functional business issues, and suggest viable courses of action.

(b) Determine where & when assessment of learning should take place
Direct Measurement: Major Field Test in Business by ETS
When: During the Capstone course
*We neglected to consider the “who”/“by whom” and the exact “when.”

(c) Gather samples of students’ work
Conduct the Major Field Test in Business by ETS during the semester in the Testing Center.

(d) Analyze the data – Fall 2011

Learning Outcome for All Business Majors: 1A. Students will demonstrate a working-level knowledge of core business functions in accounting, economics, finance, information systems, international business, legal and social environment, marketing, and management.
Assessment Method: Direct – Major Field Test in Business by ETS
Benchmark: Not set
Timing/Placement: Annually; Capstone course, MGMT 4800
Results (40 students), percentile ranks: Overall: 37; Accounting: 61; Economics: 36; Management: 28; Quantitative: 10; Finance: 54; Marketing: 28; Legal: 30; Information Systems: 38; International: 47
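This kind of benchmark check is easy to script. Below is a minimal, hypothetical Python sketch (not part of the original presentation): it records the Fall 2011 percentile ranks from the table above and flags every area below a chosen benchmark, using the 50th percentile the department later adopted.

```python
# Hypothetical sketch: flag Major Field Test subject areas whose percentile
# rank falls below a benchmark. Scores are the Fall 2011 results above;
# the 50th-percentile benchmark is the one the department set later.

FALL_2011_PERCENTILES = {
    "Overall": 37, "Accounting": 61, "Economics": 36, "Management": 28,
    "Quantitative": 10, "Finance": 54, "Marketing": 28, "Legal": 30,
    "Information Systems": 38, "International": 47,
}

def areas_below_benchmark(percentiles: dict, benchmark: int = 50) -> dict:
    """Return the subject areas whose percentile rank is below the benchmark."""
    return {area: p for area, p in percentiles.items() if p < benchmark}

# Print the weakest areas first.
for area, p in sorted(areas_below_benchmark(FALL_2011_PERCENTILES).items(),
                      key=lambda kv: kv[1]):
    print(f"{area}: percentile rank {p} (below benchmark)")
```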

(d) Analyze the data
The human response to bad results? Shock! Stupid students? Bad teaching? Defensiveness. A quandary about where to start?

QUESTIONS TO ASK & THINGS TO CONSIDER WHEN LOOKING AT ASSESSMENT RESULTS [adapted from “Closing the Loop,” Allen & Driscoll, 2013 WASC Assessment Leadership Academy]
* What do the data say about your students’ mastery of subject matter, of research skills, or of writing and speaking?
* What do the data say about your students’ preparation for taking the next step in their careers?
* Are there areas where your students are outstanding? Are they consistently weak in some respects?
* Are graduates of your program getting good jobs, accepted into reputable graduate schools, reporting satisfaction with their undergraduate education?
* Do you see indications in student performance that point to weakness in any particular skills, such as research, writing, or critical thinking skills?

MORE QUESTIONS TO ASK & THINGS TO CONSIDER
* Do you see areas where performance is okay, but not outstanding, and where you would like it to improve?
* Do the results live up to the expectations we set?
   o Are our students meeting internal and external standards? How do our students compare to peers?
   o Are our students doing as well as they can?
* Are our expectations (benchmarks) appropriate? Should they be changed?
* Does the curriculum adequately address the learning outcomes?
* Are our teaching & curriculum improving?
* What are the most effective tools to assess student learning? Do they clearly correspond to our program learning outcomes? Do the learning outcomes need to be clarified or revised?

Possible issues to consider with assessment practices:
* Revise the outcome. It isn’t what our students really learn.
* Gather better evidence, e.g., a better assignment or writing prompt. Conclusions are of questionable validity.
* Use a better sample. It was too small or biased; the results are not generalizable.
* Build a better rubric. The criteria aren’t reasonable for our program.
* Calibrate. Questionable reliability? Compare with something standardized.
* Is summative or formative evidence needed?
* Use more direct assessment to actually see what our students can do.
* We collected too much evidence. Make assessment more manageable.
* Involve more faculty, including adjuncts.
* We collected only one line of evidence and don’t have confidence in our conclusion.
* There are many ways to close the loop. What should we do?

When results suggest the need for change, consider improvements to the program in the following areas:
* Pedagogy: e.g., changing course assignments; providing better formative feedback to students; using more active learning strategies to motivate and engage students
* Curriculum: e.g., adding a second required speech course; designating writing-intensive courses; changing prerequisites; resequencing courses for scaffolded learning
* Student support: e.g., improving tutoring services; adding online, self-study materials; developing specialized support by library or writing center staff; improving advising
* Faculty support: e.g., providing a writing-across-the-curriculum workshop; campus support for TAs; professional development for improving pedagogy or curricular design
* Equipment/Supplies/Space: e.g., new or updated computers or software; improvements or expansions of laboratories
* Budgeting and planning: e.g., reallocating funds to support improvement plans based on assessment findings; budgeting for new resources (software, staff)
* Management practices: e.g., establishing new procedures to ensure assessment results are tracked and used for follow-up planning and budgeting

Motivation Ideas for the MAPP Test & Major Field Tests
1. A culture of assessment at the institution can be an immense benefit in motivating students for these types of tests.
2. Faculty enthusiasm is a tremendous influence on students’ perception of the importance of the test.
3. A letter sent to entering freshmen explains the importance of the test and lets them know they will be tested as a freshman and as a senior. It helps to do this along with #1.
4. Test takers are divided into teams, with prizes and recognition awarded to the top teams, as well as to top individual performers.
5. With Major Field Tests, some departments award different levels of credit toward a grade in a capstone course in proportion to the score received. (One institution uses the MFT Comparative Data Guide’s average student score: if a student beats it, he or she receives a full 10% credit toward a grade; if lower, the student receives 5%.) This rule is sketched after this list.
6. $20 gift certificates to the bookstore.
7. Free cap & gown rental.
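The credit rule in item 5 reduces to a simple threshold. A minimal, hypothetical Python sketch (the 10%/5% split comes from the slide; the function and parameter names are ours):

```python
# Hypothetical sketch of the credit rule described above: a student who beats
# the MFT Comparative Data Guide's average score earns the full 10% of the
# capstone grade allotted to the test; otherwise 5%.

def mft_grade_credit(student_score: float, guide_average: float) -> float:
    """Return the share of the capstone grade awarded for the MFT."""
    return 0.10 if student_score > guide_average else 0.05

# Example: a student scoring 162 against a guide average of 155 earns 10%.
assert mft_grade_credit(162, 155) == 0.10
```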

(d) Analyze the data / (e) Close the Loop – Take action
1. Change out the assessment faculty member (to a non-adjunct).
2. Implement motivational activities:
   - Emphasize to the new faculty member the importance and value of the assessment
   - Emphasize to students the value of the assessment (resume, grad school, …)
   - School pride
   - Post the names of high scorers
   - Awards?
3. Set benchmarks (average at least the 50th percentile in each area).

(d) Analyze the data – Spring 2012

LO: 1A
Assessment Results: Percentile rank on the ETS Business Major Field Test: Overall=88; Acct=87; Econ=66; Mgmt=78; Quant=83; Fin=93; Mktg=84; Legal/Social=86; IS=88; International=82. Much higher scores than for Fall 2011. The two lowest areas were the 66th percentile in Economics and the 78th percentile in Management.
Action Taken/Closing the Loop:
1. Embed a grade for the ETS exam and create wall-of-recognition plaques for each semester cohort’s percentile ranking.
2. Raise the benchmark in all areas to above the 75th percentile.
3. Add focus on & reinforcement of economics and management concept building.
4. All core areas of study will gather a list of foundational concepts to be shared with all faculty.

(d) Analyze the data – Fall 2012 & Spring 2013

Assessment Results:
Fall 2012 percentile rank on the ETS Business Major Field Test: Overall=94; Acct=87; Econ=96; Mgmt=96; Quant=96; Fin=88; Mktg=69; Legal/Social=99; IS=81; International=82. Note: Mktg below the 75th percentile.
Spring 2013 percentile rank on the ETS Business Major Field Test: Overall=94; Acct=92; Econ=97; Mgmt=86; Quant=90; Fin=85; Mktg=79; Legal/Social=97; IS=88; International=88. Note: Mktg still lowest, but above the 75th percentile.
Action Taken/Closing the Loop:
1. Move the ETS test to week 12 of the semester so that students are through most of their required coursework.
2. Revise prep information from faculty.

(d) Analyze the data – Fall 2013 & Spring 2014
Just when you think you have it right…

Assessment Results:
Fall 2013 percentile rank on the new ETS Business Major Field Test: Overall=91; Acct=84; Econ=72; Mgmt=66; Quant=77; Fin=71; Mktg=95; Legal/Social=98; IS=88; International=12.
Spring 2014 percentile rank on the new ETS Business Major Field Test: Overall=94; Acct=84; Econ=90; Mgmt=72; Quant=59; Fin=92; Mktg=84; Legal/Social=99; IS=94; International=21.
There are significant differences in results; they are lower for all subject areas except Legal/Social & IS. International Issues is of greatest concern, but Economics, Finance, Mgmt & Quantitative Analysis also show low percentiles.
Note: The newly added process of across-discipline faculty involvement in test preparation was encouraging to students.

(d) Analyze the data – Fall 2013 Action Recommended
New Business MFT with significant test revision.
1. Determine the significance of the new test results. Request a copy of the new revised test sample.
2. Obtain AACSB SLOs for the subjects.
3. Each subject area should consider the significance of its lower percentile ranking. International results are of particular concern.
4. Re-evaluate International Business course content. Research AACSB SLOs for this area.
5. Assess results according to students’ majors. Quantitative Analysis results are of particular concern for Bus. Admin. majors.

(d) Analyze the data – Trend it
Lesson: Don’t panic! Apply the right questions and considerations.

Percentile ranks by semester (– = not shown):

                  Fall 2011  Spring 2012  Fall 2012  Spring 2013  Fall 2013 (New)  Spring 2014  Summer 2014
# Students            40          –           –           –              –              –            –
Overall               37         88          94          94             91             94            –
Accounting            61         87          87          92             84             84            –
Economics             36         66          96          97             72             90            –
Management            28         78          96          86             66             72            –
Quantitative          10         83          96          90             77             59            –
Finance               54         93          88          85             71             92            –
Marketing             28         84          69          79             95             84            –
Legal                 30         86          99          97             98             99            –
Information Sys       38         88          81          88             88             94            –
International         47         82          82          88             12             21            –

Legend: Above 75 / Below 50. Note: Test revised as of Fall 2013.
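The trend view itself can be generated the same way. A minimal, hypothetical sketch that collects the overall percentile ranks reported on the preceding slides and flags each semester against the slide’s legend (above the 75th percentile vs. below the 50th):

```python
# Hypothetical sketch of the trend view above: flag each semester's overall
# percentile rank against the legend used on the slide (Above 75 / Below 50).
# Values are the overall percentiles reported on the preceding slides.

OVERALL_BY_SEMESTER = {
    "Fall 2011": 37, "Spring 2012": 88, "Fall 2012": 94,
    "Spring 2013": 94, "Fall 2013": 91, "Spring 2014": 94,
}

def legend_flag(p: int) -> str:
    """Classify a percentile rank against the slide's legend."""
    if p > 75:
        return "above 75"
    if p < 50:
        return "below 50"
    return ""

for semester, p in OVERALL_BY_SEMESTER.items():
    print(f"{semester:<12} {p:>3}  {legend_flag(p)}")
```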

Don’t let assessment results dictate decisions; rather, let them advise faculty as they use professional judgment to make suitable decisions. Assessment results are meant to inform planning and influence decision making; therefore, reporting results to the various stakeholders (e.g., students, administration, accrediting agencies, alumni) is an integral part of “closing the loop.”