ASSESSMENT OF STUDENT SUPPORT SERVICES
Kimberly Gargiulo, Coordinator of Assessment
Office of Institutional Research and Assessment

Cycle of Assessment
I. Institutional & Unit Mission/Goal Reference
II. Administrative Goals
III. Means of Assessment and Criteria for Success
IV. Summary of Data Collected
V. Use of Results to Implement Change or Improvement

5-Column Model (adapted from Institutional Effectiveness Associates)
1. Institutional Mission / Unit Mission
2. Administrative Goals
3. Means of Assessment & Criteria for Success
4. Results
5. Use of Results

Cycle of Assessment
I. Institutional & Unit Mission/Goal Reference
II. Administrative Goals
III. Means of Assessment and Criteria for Success
IV. Summary of Data Collected
V. Use of Results to Implement Change or Improvement

Administrative Goals
- Linked to unit mission
- Realistic
- Limited in number (3-5 per assessment period)
- Measurable

Interconnectedness: Administrative Goals
- Related to the mission and goals of the college
- Support the unit's mission

Deciding on Goals: Short & Long Lists
Short list:
- What is currently being assessed
- Can be assessed within one assessment period
- Linked to the unit's mission
Long list:
- Other important goals (with input from staff)
- Refer back to it every year
- Cycle items into the short list

Outcome, Process, or Satisfaction?
- Outcome: "Students who meet with advisors will graduate faster..."
- Process: "The AATC will increase the number of students served..."
- Satisfaction: "Students will be satisfied with our services..."

Principles of good goals:
- Singular
- Measurable
- Observable (use action words)
- Reasonable

Observable: Use Action Words
Students WILL:
- "be satisfied"
- "be aware"
- "demonstrate"
- "understand"

Working Example: Freshmen will be satisfied with AATC advising services.

Cycle of Assessment
I. Institutional & Unit Mission/Goal Reference
II. Administrative Goals
III. Means of Assessment and Criteria for Success
IV. Summary of Data Collected
V. Use of Results to Implement Change or Improvement

Means of Assessment & Criteria for Success
How do you know students met the goals, and to what extent?

Examples of Assessment: Direct Evidence
- Surveys
- Focus groups
- Student interviews
- Suggestion boxes
- One-minute papers

SURVEYS
- Measure attitudes about a service
- Measure satisfaction
- Can also measure knowledge (quiz)
- Identify weak areas
- Provide quick feedback
- Open-ended or multiple choice
- Created locally or standardized

FOCUS GROUPS
- Questions asked of a group
- 7-12 participants
- Yield qualitative data
- Can be video- or audio-recorded, or notes can be taken during the session
- Facilitator needed

Indirect Evidence Examples
- GPA of students who attend AATC workshops or advisement
- Graduation rates
- Retention rates
- Transfer rates
Compare participants versus non-participants (see the sketch below).
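As a concrete illustration of that participant/non-participant comparison, here is a minimal sketch; it is not from the presentation, and the GPA values are made up.

```python
# Minimal sketch: comparing an indirect measure (GPA) between workshop
# participants and non-participants. All values are hypothetical.

participants = [3.2, 2.9, 3.5, 3.1, 2.8]
non_participants = [2.7, 3.0, 2.5, 2.9, 2.6]

def mean(values):
    return sum(values) / len(values)

print(f"Participants mean GPA: {mean(participants):.2f}")
print(f"Non-participants mean GPA: {mean(non_participants):.2f}")
# A simple mean comparison; a real analysis would also control for
# self-selection and check statistical significance.
```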

Criteria for Success
- Overall: the primary, total rating
- Component: secondary and more detailed; identifies scores that would elicit further review

Establishing Criteria for Success
- Set a reasonable expectation of responses
- Express criteria in specific terms:
  - (overall) "The average rating of the workshop will be at least 4.0."
  - (overall) "At least 75% of respondents will be satisfied or very satisfied with their advisement session."
  - (component) "No more than 5% of respondents will be very dissatisfied with their advisement session."
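Criteria written this specifically can be checked mechanically. The sketch below, which is not part of the original presentation, expresses the two satisfaction criteria as explicit tests over survey counts; the response counts and category labels are assumptions for illustration.

```python
# Minimal sketch: the example criteria expressed as explicit checks.
# The response counts below are hypothetical, for illustration only.

responses = {
    "very satisfied": 18,
    "satisfied": 15,
    "neutral": 4,
    "dissatisfied": 2,
    "very dissatisfied": 1,
}

total = sum(responses.values())  # number of returned surveys
pct_satisfied = 100 * (responses["satisfied"] + responses["very satisfied"]) / total
pct_very_dissatisfied = 100 * responses["very dissatisfied"] / total

# Overall criterion: at least 75% satisfied or very satisfied.
print(f"Overall criterion met: {pct_satisfied >= 75} ({pct_satisfied:.1f}%)")
# Component criterion: no more than 5% very dissatisfied.
print(f"Component criterion met: {pct_very_dissatisfied <= 5} ({pct_very_dissatisfied:.1f}%)")
```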

REMEMBER
- Means of assessment must be directly linked to goals: make sure the instrument will give you the answers you need.
- Criteria for success: check that the wording in the tool matches the wording in the criteria (does the criterion reference the same rating scale the instrument actually uses?).

Working Example: Means of Assessment
Goal: Freshmen will be satisfied with AATC advising services.
Means of assessment:
1) A survey administered to freshmen immediately following their advisement session will ask students about their level of satisfaction with the session.

Working Example: Criteria for Success
Assessment tool:
1) Survey
Criteria for success:
1) At least 80% of the respondents will be satisfied or very satisfied with their advisement session; no more than 5% of respondents will state they left the session still unclear about graduation requirements.

Cycle of Assessment
I. Institutional & Unit Mission/Goal Reference
II. Administrative Goals
III. Means of Assessment and Criteria for Success
IV. Summary of Data Collected
V. Use of Results to Implement Change or Improvement

Summary of Data Collected
- What were your results?
- Did you meet, exceed, or fall below your target?
- What do the results tell you? Hypothesize.

Results Should:
- Show to what extent goals were accomplished
- Be linked to the assessment means used
- Be detailed enough to show the assessment took place
- Justify the "use of results"

Results Should Not:
- Be statistically unlikely
- Match the criteria for success exactly (results that mirror the target exactly suggest the data were forced to fit)
- Go unused!

Working Example: Summary of Data Collected
Criteria for success:
1) At least 80% of the respondents will be satisfied or very satisfied with their advisement session; no more than 5% of respondents will state they left the session still unclear about graduation requirements.
Summary of data collected:
1) 82% of respondents stated they were at least satisfied with their advisement session.
2) 10% of respondents answered that they left the session unclear about graduation requirements.
3) The average rating of the session was 3.5 out of 5.0.
4) Response rate: 30%.
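For readers working from raw survey files, figures like these can be derived in a few lines. The sketch below is a hypothetical illustration, assuming each returned survey records a 1-5 satisfaction rating and a yes/no "still unclear about graduation requirements" item; the records shown are made up.

```python
# Minimal sketch: deriving the summary figures from raw survey records.
# Each record is (satisfaction rating 1-5, left_unclear True/False).
# All values are hypothetical.

surveys_distributed = 100
records = [
    (5, False), (4, False), (4, True), (3, False), (2, True),
    # ...one tuple per returned survey
]

returned = len(records)
ratings = [rating for rating, _ in records]

pct_at_least_satisfied = 100 * sum(r >= 4 for r in ratings) / returned  # 4 = satisfied, 5 = very satisfied
pct_unclear = 100 * sum(unclear for _, unclear in records) / returned
average_rating = sum(ratings) / returned
response_rate = 100 * returned / surveys_distributed

print(f"At least satisfied: {pct_at_least_satisfied:.0f}%")
print(f"Left unclear about graduation requirements: {pct_unclear:.0f}%")
print(f"Average rating: {average_rating:.1f} out of 5.0")
print(f"Response rate: {response_rate:.0f}%")
```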

Cycle of Assessment
I. Institutional & Unit Mission/Goal Reference
II. Administrative Goals
III. Means of Assessment and Criteria for Success
IV. Summary of Data Collected
V. Use of Results to Implement Change or Improvement

Using Results
- Is the use of results substantive enough? Detailed enough?
- Do you need to modify your services, goals, targets, or assessment tools?
- Have the results been communicated to the appropriate parties?

Working Example: Using Results to Implement Change
Summary of data collected:
1) 82% of respondents stated they were at least satisfied with their advisement session.
2) 10% of respondents answered that they left the session unclear about graduation requirements.
3) The average rating of the session was 3.5 out of 5.0.
4) Response rate: 30%.
Use of results:
1) Target of 80% was met; add a fill-in question asking respondents to explain why they answered as they did.
2) Target of no more than 5% was not met; hold training for advisors on graduation requirements for incoming freshmen.
3) Hold brainstorming sessions with staff on how to improve advising sessions (a new goal to cycle in?).
4) Review the survey's mode of administration and consider ways to increase the response rate.
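One way to keep "use of results" honest is to record, for every criterion, the target, the observed result, and the action taken, so nothing drops out of the loop. The sketch below shows one hypothetical structure for such a record; it is not the Office's actual reporting format, and its contents simply mirror the working example.

```python
# Minimal sketch: one row per criterion, pairing the target, the observed
# result, and the planned action. Contents mirror the working example.

report = [
    {"criterion": "At least 80% satisfied or very satisfied",
     "result": "82% at least satisfied", "met": True,
     "action": "Add a fill-in question asking why respondents answered as they did."},
    {"criterion": "No more than 5% still unclear about graduation requirements",
     "result": "10% left unclear", "met": False,
     "action": "Hold advisor training on graduation requirements for incoming freshmen."},
]

for row in report:
    status = "MET" if row["met"] else "NOT MET"
    print(f"[{status}] {row['criterion']} -> {row['result']}. Action: {row['action']}")
```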

If Results are Positive:
- Celebrate your successes!
- Let colleagues know!
- Cycle in new goals?

If Results are Not So Positive:
- What needs to be modified?
- Work with colleagues
- Don't get discouraged!

Next Steps:
- Continue the assessment cycle
- Share with your department
- Create an assessment process
- Visit the Office of Institutional Research & Assessment with any questions!

EXERCISE
1) Write two goals for the AATC.
2) Develop the means of assessment and criteria for success for these goals.

Assessment Time!
Please take a moment to fill out a brief survey. Thank you!

Questions?
Kimberly Gargiulo, Coordinator of Assessment
Office of Institutional Research and Assessment