Kimberlee Pottberg. Part 1: Why we use WEAVEonline. Part 2: How to enter components.

Similar presentations
EVALUATOR TIPS FOR REVIEW AND COMMENT WRITING The following slides were excerpted from an evaluator training session presented as part of the June 2011.

Goals-Based Evaluation (GBE)
Support Services Review Support Services Review (SSR) is a representative, responsive form of assessment and self-evaluation to ensure continuous quality.
A Self Study Process for WCEA Catholic High Schools
Del Mar College Planning and Assessment Process Office of Institutional Research and Effectiveness January 10, 2005.
Campus Improvement Plans
PREPARING FOR SACS Neal E. Armstrong Vice Provost for Faculty Affairs July 13, 2004.
Institutional Effectiveness Operational Update Presentation made to the Indiana State University Board of Trustees October 5, 2001.
ASSESSMENT WORKSHOP: SESSION 1 ADMINISTRATIVE SUPPORT SERVICES ACADEMIC AND STUDENT SUPPORT SERVICES PRESENTED BY THE DIVISION OF INSTITUTIONAL EFFECTIVENESS.
Support Program Assessment November 18-19, 2014 Ryan J. McLawhon, Ed.D. Director Institutional Assessment Elizabeth C. Bledsoe,
Session 6 Common Problems Associated with Writing Learning Outcomes Comments from your reports.
HELPFUL TIPS FOR UNIT PLANNING Office of Institutional Effectiveness.
Student Services Assessment Lee Gordon Assistant Vice President for Student Services Purdue University.
Reaffirmation of Accreditation: Institutional Effectiveness Southern Association of Colleges and Schools February 2008 Stephen F. Austin State University.
2012 West Texas Assessment Conference CREATING A CULTURE OF ASSESSMENT IN NON-INSTRUCTIONAL AREAS KARA LARKAN-SKINNER, DIRECTOR OF IR AND IE & KRISTIN.
Assessment 101 Elizabeth Bledsoe.
Graduate Program Review Where We Are, Where We Are Headed and Why Duane K. Larick, Associate Graduate Dean Presentation to Directors of Graduate Programs.
Mia Alexander-Snow, PhD Director, Office for Planning and Institutional Effectiveness Program Review Orientation 1.
ASSESSMENT: Are you having an impact? How do you know? Are the programs and services you work on getting better? How do you know?
1. Methods of evaluation are thorough, feasible, and appropriate 2. Use of objective measures to produce quantitative and qualitative data 3. Methods.
Departmental Assessment Process. The institution identifies expected outcomes, assesses the extent to which it achieves these outcomes, and provides.
TaskStream Training Presented by the Committee on Learning Assessment 2015.
Academic Assessment at UTB Steve Wilson Director of Academic Assessment.
University Assessment Committee Comprehensive Standard (CS) The institution identifies expected outcomes, assesses the extent to which it achieves.
Learning Outcomes Assessment in WEAVEonline
Developing Administrative and Educational Support Outcomes and Methods of Assessment Lisa Garza Director, University Planning and Assessment Beth Wuest.
SACS Coordinators Meeting Non-Academic Units Wednesday, January 9, 2013 Timothy Brophy – Director, Institutional Assessment Cheryl Gater – Director, SACS.
Institutional Effectiveness & Strategic Planning: IE & SP Committees have developed a new system that integrates these two.
ASSESSMENT OF STUDENT SUPPORT SERVICES Kimberly Gargiulo, Coordinator of Assessment Office of Institutional Research and Assessment.
ASSESSMENT. Assessment is the systematic and on-going process of collecting and reviewing evidence about the College's academic and administrative programs.
Introduction to WEAVE and Assessment November 18-19, 2014 Ryan J. McLawhon, Ed.D. Director Institutional Assessment Elizabeth C.
Academic Program Assessment November 18-19, 2014 Ryan J. McLawhon, Ed.D. Director Institutional Assessment Elizabeth C. Bledsoe,
What could we learn from learning outcomes assessment programs in the U.S. public research universities? Samuel S. Peng Center for Educational Research.
Florida Tech’s University Assessment Committee For A Continuing Culture of Assessment.
Meeting the ‘Great Divide’: Establishing a Unified Culture for Planning and Assessment Cathy A. Fleuriet Ana Lisa Garza Presented at the 2006 Conference.
March Madness Professional Development Goals/Data Workshop.
Columbia University School of Engineering and Applied Science Review and Planning Process Fall 1998.
Administrative and Educational Support Outcomes: Reporting Results, Taking Action, and Improving Services Lisa Garza Director, University Planning and.
By Monica Y. Peters, Ph.D. Coordinator of Institutional Effectiveness/QEP Office of Quality Enhancement.
The University of Kentucky Program Review Process for Administrative Units April 18 & 20, 2006 JoLynn Noe, Assistant Director Office of Assessment
SACS Coordinators Meeting Wednesday, June 6, 2012 Timothy Brophy – Director, Institutional Assessment Cheryl Gater – Director, SACS Accreditation.
SUBMITTED TO THE HIGHER LEARNING COMMISSION OF THE NORTH CENTRAL ASSOCIATION OF COLLEGES AND SCHOOLS MAY 2010 Progress Report on Outcomes Assessment.
Systems Accreditation Berkeley County School District School Facilitator Training October 7, 2014 Dr. Rodney Thompson Superintendent.
Integrated Planning = Reacting, Reflecting, Recharging.
WEAVEONLINE TRAINING By Sabrina Qureshi. OBJECTIVES  Become familiar with WEAVEonline software  Understand how WEAVEonline is used as an assessment.
Yes, It’s Time! 10 years after the most recent visit ( ) (probably spring semester). SMSU proposes dates; HLC replies. Much to be.
2008 Spring Semester Workshop AN INTRODUCTION TO STRATEGIC PLANNING AND ASSESSMENT WORKSHOP T. Gilmour Reeve, Ph.D. Director of Strategic Planning.
1 Learning Outcomes Assessment: An Overview of the Process at Texas State Beth Wuest Director, Academic Development and Assessment Lisa Garza Director,
Institutional Effectiveness at CPCC DENISE H WELLS.
A Commitment to Continuous Improvement in Teaching and Learning Michaela Rome, Ph.D. NYU Assistant Vice Provost for Assessment.
UK Leadership Team Meeting, July 6, SACS Reaffirmation Project: July Assessment Update Presented by: Dr. Mia Alexander-Snow, Director, Planning &
Demonstrating Institutional Effectiveness Documenting Using SPOL.
Selection Criteria and Invitational Priorities School Leadership Program U.S. Department of Education 2005.
Help Me Understand the Basics of Non-academic Assessment A BETHUNE-COOKMAN UNIVERSITY OFFICE OF ASSESSMENT WORKSHOP Presented by Cory A. Potter Executive.
Annual Program Assessment With a Five- Year Peer Review John Henik Associate Vice President, Academic Affairs Dave Horsfield
1 Establishing a New Gallaudet Program Review Process Pat Hulsebosch Office of Academic Quality CUE – 9/3/08: CGE – 9/16/08.
CBU CALIFORNIA BAPTIST UNIVERSITY Assessment, Accreditation, and Curriculum Office CBU - OIRPA.
Academic Program Review
Texas Woman’s University Academic Degree Program Assessment
CALIFORNIA BAPTIST UNIVERSITY Office of Educational Effectiveness
SACSCOC Fifth-Year Readiness Audit
Assessment & Evaluation workshop
Institutional Program Review 2017 Update
Institutional Effectiveness USF System Office of Decision Support
Governance and leadership roles for equality and diversity in Colleges
Roles and Responsibilities of an External Examiner
Welcome! Session Recording
CEA Final Report Much to celebrate!.
Fort Valley State University
Presentation transcript:

Kimberlee Pottberg

Part 1: Why we use WEAVEonline
Part 2: How to enter components

SACS Comprehensive Standard (Institutional Effectiveness): The institution identifies expected outcomes, assesses the extent to which it achieves these outcomes, and provides evidence of improvement based on analysis of the results in each of the following areas:
- educational programs, to include student learning outcomes
- administrative support services
- educational support services
- research within its educational mission, if appropriate
- community/public service within its educational mission, if appropriate

The clause the standard turns on bears repeating: "…and provides evidence of improvement based on analysis of the results."

The assessment cycle: Develop Program Mission & Outcomes → Design an Assessment Plan → Implement the Plan & Gather Information → Interpret/Evaluate Information → Modify & Improve, and then the cycle repeats. Adapted from Trudy Banta, IUPUI.

Cycle deadlines:
- Update assessment plans (mission, outcomes, and measures with achievement targets): done by 11/1/2013
- Findings entered: done by 8/1/2014
- Action plan(s) entered: done by 9/1/2014
- Cycle closes: 10/1/2014

Mission: The mission statement links the functions of your unit to the overall mission of Texas A&M.

Goals: additional objectives which may be tied to specific portions of a program's mission. Goals are not considered in the progress reports sent to each Assessment Liaison, but may be used by individual offices if they find them useful.

Outcomes come in two forms:
- Learning Outcomes: learning statements that define the information or skills stakeholders/students will acquire from the program.
- Program Outcomes:
  - Process statements relate to what the unit intends to accomplish. Examples include the level or volume of activity, the efficiency with which you conduct the processes, and compliance with external standards of "good practice in the field" or regulations.
  - Satisfaction statements describe how those you serve rate their satisfaction with your unit's processes or services.

Measures: define and identify the sources of evidence used to determine the degree to which an outcome is met.

Targets: the result, target, benchmark, or value representing success or the achievement of a given outcome.

Findings: a concise summary of the results gathered from a given assessment measure. Note: the language of a finding should parallel the corresponding achievement target, and results should be described in enough detail to show whether the target was met, partially met, or not met.

Action Plans: after reflecting on the findings, you and your colleagues should determine appropriate actions to improve the program. This will lead to at least one action plan. Actions outlined in the action plan should be specific and relate directly to the outcome and the results of assessment.

Analysis Questions: responses to a set of provided questions, giving an update on ongoing action plans as well as an opportunity to discuss the significance of new action plans.
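
Taken together, these components form a simple hierarchy: an assessment plan has a mission and outcomes, each outcome carries one or more measures, and each measure pairs a target with a finding entered later in the cycle. The sketch below models that hierarchy with hypothetical Python dataclasses; the class and field names are illustrative assumptions, not WEAVEonline's actual data model or API.

```python
# Minimal sketch of the component hierarchy described above.
# Hypothetical names only -- WEAVEonline does not expose this data model.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Measure:
    description: str               # source of evidence for the outcome
    target: str                    # benchmark or value representing success
    finding: Optional[str] = None  # concise summary of results, entered later
    status: Optional[str] = None   # "Met", "Partially Met", or "Not Met"

@dataclass
class Outcome:
    name: str                      # a learning, process, or satisfaction statement
    kind: str                      # "learning" or "program"
    measures: List[Measure] = field(default_factory=list)

@dataclass
class AssessmentPlan:
    mission: str                   # links the unit's functions to the university mission
    outcomes: List[Outcome] = field(default_factory=list)
    action_plans: List[str] = field(default_factory=list)
```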

Example: Office of Institutional Assessment Assessment Report (Continuous Improvement)

To fulfill the action plan addressing the unmet target of 80% of conference respondents indicating satisfaction with the variety of poster sessions offered, the Office of Institutional Assessment (OIA), along with the Assessment Conference Committee (ACC), sought more variety in the posters for the 2012 Assessment Conference. As a result, the percentage of respondents satisfied with the variety of posters increased from 74% to 78%. Although the 85% target was still not met during the cycle, this result shows improvement toward the target.

To complete the other action plan, OIA enhanced the Assessment Review Guidelines to include more practical and applicable "good practices" for assessment liaisons to pass along to their programs as formative assessment. Additionally, the Assessment Review Rubric was modified to be more exhaustive in its evaluation of assessment reports. As a result, less variance was observed in the quality of assessment reports. Lastly, the Vice Provost of Academic Affairs supplied each dean with a college-specific, personalized memo addressing the strengths and weaknesses of assessment reports in each college. This process was well received and will continue as a service to colleges from the Office of the Vice Provost.

Sample report entry:
- Outcome/Objective: O 5: Provide Excellent Concurrent and Poster Sessions. Provide excellent concurrent and poster sessions for participants at the Annual Assessment Conference.
- Measure: M 8: Overall Assessment Conference Survey.
- Target: 85% or more of the Annual Assessment Conference attendees will report satisfaction with the Concurrent and Poster Sessions.
- Finding (status: Partially Met): Following the end of the 13th Annual Texas A&M Assessment Conference, an online conference evaluation survey was sent to all attendees. Information gained from this survey was organized into the 13th Annual Conference Survey Report and distributed to the Assessment Conference Committee for review. Results from the survey questions relating to Concurrent and Poster Sessions are below:
  - Question 16: "How satisfied were you with the quantity of Concurrent Sessions?" % were "Very Satisfied" or "Satisfied"
  - Question 17: "How satisfied were you with the variety of Concurrent Sessions?" % were "Very Satisfied" or "Satisfied"
  - Question 19: "How satisfied were you with the quantity of Poster Sessions?" % were "Very Satisfied" or "Satisfied"
  - Question 20: "How satisfied were you with the variety of Poster Sessions?" % were "Very Satisfied" or "Satisfied"
  Although this improved on the prior finding of 73%, only 77% of respondents indicated that they were satisfied with the variety of poster sessions offered.
- Action Plan: In response, the Office of Institutional Assessment will seek posters from each track to provide a greater variety of posters during the 14th Annual Texas A&M Assessment Conference.

Use of Results: Although the satisfaction results from the conference survey related to the variety of poster sessions increased from 74% to 78%, the 85% target was still not met. In response, OIA and the ACC will ensure that each of the conference "tracks" has coverage in the poster session. OIA and the ACC have traditionally ensured track coverage in concurrent session offerings but have never paid close attention to track coverage in the poster session offerings. This strategy includes contacting specific authors of concurrent session proposals in underrepresented tracks and inviting them to consider a poster presentation, perhaps in addition to the concurrent session.
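
The "Partially Met" status in the sample entry follows from a simple comparison: the finding (77-78% satisfied) fell short of the 85% target but improved on the prior cycle's result. Below is a minimal sketch of that rule; the short-of-target-but-improving convention is an assumption drawn from the sample report, not documented WEAVEonline behavior.

```python
# Sketch of the Met / Partially Met / Not Met logic implied by the example.
# The rule (below target but improving counts as "Partially Met") is an
# assumption, not WEAVEonline's documented status logic.
def classify_status(finding_pct: float, target_pct: float, prior_pct: float) -> str:
    if finding_pct >= target_pct:
        return "Met"
    if finding_pct > prior_pct:  # e.g. 78% against an 85% target, up from 74%
        return "Partially Met"
    return "Not Met"

print(classify_status(78.0, 85.0, 74.0))  # -> Partially Met
```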

Useful reports:
- DAR (Detailed Assessment Report): all data from all cycles that has been entered into WEAVEonline.
- DES (Data Entry Status Report): color-coded report of the status of data entry; each item will be "Not Reported," "In-Progress," or "Final."
- Audit Reports: five varieties, each of which shows which portions of the assessment plan are missing specific information.
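
An audit report is, in effect, a completeness check over the plan hierarchy. The sketch below shows the kind of check such a report performs, using plain dicts with assumed field names rather than WEAVEonline's actual schema or report logic.

```python
# Illustrative completeness check in the spirit of an audit report.
# Field names ("mission", "outcomes", "measures", "finding") are assumptions.
def audit(plan: dict) -> list[str]:
    gaps = []
    if not plan.get("mission"):
        gaps.append("mission statement missing")
    if not plan.get("outcomes"):
        gaps.append("no outcomes defined")
    for outcome in plan.get("outcomes", []):
        if not outcome.get("measures"):
            gaps.append(f"outcome '{outcome['name']}' has no measures")
        for measure in outcome.get("measures", []):
            if not measure.get("finding"):
                gaps.append(f"no finding entered for '{measure['description']}'")
    return gaps

# Example: a plan whose only measure has no finding entered yet.
plan = {"mission": "Support assessment across campus",
        "outcomes": [{"name": "O 5: Provide Excellent Concurrent and Poster Sessions",
                      "measures": [{"description": "M 8: Overall Assessment Conference Survey"}]}]}
print(audit(plan))  # -> ["no finding entered for 'M 8: Overall Assessment Conference Survey'"]
```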

Reflection questions:
- What was the most valuable thing you learned?
- What is one question that you still have?
- What do you think is the next step your program needs to take in order to implement course-embedded assessment?

14th Annual Texas A&M Assessment Conference: February 16-18, 2014, College Station, TX. For more information on the conference and registration, visit the conference website.