WesternU Assessment Kick-off Meeting: The why's, who's, what's, how's, and when's of assessment (2013-2014). Institutional Research & Effectiveness. Neil M. Patel, Ph.D., and Juan Ramirez, Ph.D.

Presentation transcript:

WesternU Assessment Kick-off Meeting: The why’s, who’s, what’s, how’s, and when’s of assessment Institutional Research & Effectiveness Neil M. Patel, Ph.D. Juan Ramirez, Ph.D.

Meeting Roadmap
The goals are to understand:
– Why assessment needs to take place
– Who should be involved in assessment
– What needs to be assessed
– How to assess the learning outcomes
– When assessment reports are due

Assessment Overview
Why assess?
– Accountability
– To measure learning
– To identify challenges related to instruction, curriculum, or assignments
– To improve learning
Methods must be in place to properly assess. Information should be shared widely and used to inform decision-making.
Key Players: Deans, Faculty, Curriculum committees, Assessment committees, Assessment Specialists, Preceptors

What needs to be assessed?
The Institutional Learning Outcomes, mapped by phase and year:
– Evidence-based practice
– Interpersonal communication skills
– Critical thinking
– Collaboration skills
– Breadth and depth of knowledge in the discipline/Clinical competence
– Ethical and moral decision-making skills
– Life-long learning
– Humanistic practice

What needs to be assessed? (cont.): We cannot assess everything!
Direct assessment of Signature Assignments:
– Signature assignments have the potential to help us know whether student learning reflects "the ways of thinking and doing of disciplinary experts"
– Course-embedded assessment
– Aligned with LOs
– Authentic in terms of process/content, "real world application"
Indirect assessment, i.e., student perceptions:
– First year survey
– Graduating survey
– Alumni surveys
– Student evaluations of courses

ILO ASSESSMENT TEMPLATE

Assessment Template Timeline
Section I: Progress Report
Section II: Learning Outcome Alignment
Section III.1: Methodology
Section IV.1: Results
Section V.1: Discussion & Implications
Section III.2: Methodology
Section IV.2: Results
Section V.2: Discussion & Implications

Section I: Progress Report Goal: To document what occurred as a result of assessment.

Section II: Learning Outcome Alignment
Goal: To determine which PLOs align with the ILO and, over time, which PLOs are not being assessed.

Section III: Methodology
It will be necessary to copy and paste Sections III-V if more than two assessments are completed. Every ILO report needs to include one direct and one indirect assessment; multiple assessments may be necessary to cover ALL PLOs.

Section III: Methodology

Note: The Participation section is for participation in the assessment process, not for participation in the development of the student work.

Section IV: Results
Analytical approach: should align with the assessment goal.
– To determine how many students are achieving at a specific level/score: frequency distribution
– To determine if differences in scores exist between two or more groups: chi-square, t-test, or ANOVA
– To determine if scores from one assignment predict scores on another assignment: regression
Sample size: number of students assessed
Statistical results: frequency table, p value, etc.
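To make the first approach concrete, here is a minimal Python sketch of a frequency distribution; the scores below are hypothetical, not the program's data. For the group-comparison and prediction goals, scipy.stats offers chi2_contingency, ttest_ind, f_oneway, and linregress.

from collections import Counter

# Hypothetical averaged Yes/No (1 = yes, 0 = no) Critical Thinking scores for 20 students
ct_scores = [1, 0, 1, 1, 0, 1, 1, 1, 0, 1, 1, 0, 1, 1, 0, 1, 1, 0, 1, 0]

# Frequency distribution: how many students achieve at each level/score
freq = Counter(ct_scores)
total = len(ct_scores)
for score in sorted(freq):
    label = "Yes" if score == 1 else "No"
    print(f"{label}: {freq[score]} ({freq[score] / total:.1%})")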

Section V: Discussion & Implications

EXAMPLE

Example Scenario: Following a discussion among the faculty, the Curriculum Committee, the Program Assessment Committee, and the Dean, it was decided that Critical Thinking will be assessed using 4th year preceptor evaluations. Question: What do we need to do?

Example: 4th year preceptor evaluations to assess Critical Thinking
Things to consider:
– Which PLO(s) are assessed?
– How is the assessment scored?
– Who has the data?
– What are the quantifiable assessment goals (standards of success)?
– How do we analyze the data?

Example: 4th year preceptor evaluations to assess Critical Thinking
Assessment: The preceptor evaluation of students occurs at various time points within the 4th year rotations. For the purpose of assessment, the program has decided to use each student's entire set of 4th year preceptor evaluations (eight evaluations in total). The preceptors are asked to indicate, using a Yes/No format, whether a student has been observed demonstrating certain skills or displaying certain knowledge elements; there are 20 total items in the evaluation form. These elements are commonly displayed within the profession. The data are sent directly to the 4th year Director. To assess Critical Thinking, a single item within the checklist is used: "The student utilizes and displays critical thinking."

Example: 4th year preceptor evaluations to assess Critical Thinking
Assessment Goal: 90% of students will demonstrate critical thinking skills.
Why did we come up with 90%?
– A peer or aspirational college has a similar standard
– The professional community suggests such a standard
– Our own data have set the standard
The assessment goal is different from grading:
– For grading, passing = 70%; 14/20; "Yes" = 1 point
– It is possible for all students to pass under grading yet score 0 on the Critical Thinking item.
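To show how the goal differs from grading, a minimal hypothetical sketch (the student names and ratings are invented for illustration): a student counts toward the 90% target only if every one of the eight rotation evaluations is "Yes" (average score = 1).

# Hypothetical per-student ratings across eight rotations (1 = yes, 0 = no)
rotations = {
    "student_01": [1, 1, 1, 1, 1, 1, 1, 1],
    "student_02": [1, 1, 0, 1, 1, 1, 1, 1],  # 7/8 clears a 70% grading bar but misses the goal
    "student_03": [1, 1, 1, 1, 1, 1, 1, 1],
}

met = sum(1 for scores in rotations.values() if sum(scores) == len(scores))
pct = met / len(rotations)
print(f"{pct:.0%} of students demonstrated critical thinking in all eight rotations")
print("Goal met" if pct >= 0.90 else "Goal not met (target: 90%)")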

Averaged data of 4th year preceptor evaluations assessing Critical Thinking per student
[Table: per-student averaged CT scores; columns: Student, CT Score (0 = no, 1 = yes)]

Example: Section III.1 Methodology
Name of assessment: 4th year preceptor evaluation
Evidence: Indicate if this is a direct or indirect assessment. Direct
Evidence: PLO(s) assessed (Section II). List the PLOs that will be assessed by this particular assessment. PLO 2: Upon graduation, students should be able to think critically when in the clinic.
Evidence: Description: Please write a narrative that explains the student work completely, so that someone who knows nothing about the program will understand what it consists of, and include how the assessment addresses the PLO(s). Preceptors indicate, using a Yes/No format, whether students are observed demonstrating certain skills or displaying certain knowledge elements; there are 20 total items in the evaluation form. Eight rotations during the 4th year were used, and scores were averaged for each student.
Data Collection Method: How is the assessment scored? State the type of scoring mechanism used. Yes/No scoring guide.
Data Collection Method: Does the data isolate the PLO? (Yes or No) Yes

Example: Section III.1 Methodology (cont.)
Data Collection Method: Provide the scoring mechanism as an attachment, as well as any other important documents for this assessment. State the title of the attachment(s) and what each one includes. If applicable, please highlight what specifically is being utilized for assessment within the attachment. Single item: "The student utilizes and displays critical thinking."
Please state the quantifiable assessment goal: Assessment is completed to determine how well students are achieving the PLOs. For example, a goal may be written to determine how many students are achieving at a specific level/score. There can be more than one goal for each assessment, e.g., whether students are reaching a particular score and whether current students are performing differently from previous students. 90% of students will demonstrate critical thinking skills in all eight rotations (average score = 1).
Participation: Describe the assessment process and who participated, listing the role each person played. This section is meant to keep track of program participation from faculty, committees, deans, Institutional Research, etc. Faculty, the Curriculum Committee, the Assessment Committee, and the Dean selected the assignment; 4th year preceptors evaluated students; the 4th year program director collected data; the Assessment Committee analyzed data.

Example: Section IV.1 Results
Assessment 1 Name: Please state the name of the chosen assignment, survey, exam, etc. 4th year preceptor evaluation
Assessment 1 Goal (Section III.1): 90% of students will demonstrate critical thinking skills in all eight rotations (average score = 1)
Analytical Approach: Frequency distribution
Sample Size: N = 20
Statistical Results: Present the statistical results in a figure or table that aligns with the goal.
Frequency (Percent): No 7 (35.0%); Yes 13 (65.0%); Total 20 (100.0%)

Example: Section V.1 Discussion & Implications
Assessment 1 Name: Please state the name of the chosen assignment, survey, exam, etc. 4th year preceptor evaluation
Assessment 1 Goal (Section III.1): 90% of students will demonstrate critical thinking skills
Discussion: Was the goal reached? (Yes or no; if no, why) No; only 65% of students demonstrated critical thinking skills.
Discussion: How do the results relate back to the PLO? How are students performing (refer to results) in relation to the PLO? What do the results mean? What were the limitations? 65% of the students were able to demonstrate critical thinking skills in the clinic. Since these data are collected during the 4th year, it seems clear the program is not reaching the PLO. Although the results show the program is not meeting the goal, the program is limited by its data: it cannot currently determine who these students are.
Implications: How are the results being used? Please describe what changes are being made, or whether things will remain the same, in regard to the PLO being assessed. With whom were the results discussed, or have they been circulated? Is there an action plan for closing the loop? Please describe. The program is determining: 1. whether preceptors know what to look for when evaluating students; 2. whether there are predictors of student success for this assignment; 3. whether previous 4th year evaluations lead to a different conclusion; 4. whether the assessment is rigorous.

"You can see a lot by just looking." ---Yogi Berra
[Table: per-student averaged CT scores with gender; columns: Student, CT Score (0 = no, 1 = yes), Gender (1 = male, 2 = female)]
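Disaggregating like this can also be tested formally. A minimal sketch of a chi-square test of CT outcome by gender (the counts are hypothetical, chosen only to be consistent with the 7 No / 13 Yes totals above; scipy assumed available):

from scipy.stats import chi2_contingency

# Hypothetical 2x2 contingency table of CT outcome by gender
#             No  Yes
table = [
    [5, 4],   # male
    [2, 9],   # female
]

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p:.3f}")
# With cell counts this small, Fisher's exact test (scipy.stats.fisher_exact) may be preferable.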

GROUP ACTIVITY

Timeline
TIMELINE FOR PROGRAMS
Section I: Progress Report (draft), Section II: Institutional Learning Outcome & Program Learning Outcome Alignment (draft), and Section III: Methodology, Assessment Goals, & Participation (draft): due May 9, 2014
Section IV: Results (draft): due June 6, 2014
FINAL Assessment Report: due July 31, 2014
TIMELINE FOR REVIEW
Assessment Committee Review of Reports: Aug. 2014
Distribution of Feedback: Oct. 2014
Meetings of Understanding: Dec. 2014 - Jan. 2015
Report to Provost: Feb. 2015
Deans' Council Presentation: March 2015

CAPE Workshops Spring 2014
– Measurable Student Learning Outcomes: Tuesday, January 14 at 12pm
– Curricular Mapping: Tuesday, February 11 at 12pm
– Operationalizing and assessing WesternU ILOs: Tuesday, March 4 at 12pm
– Developing Valid and Reliable Rubrics: Tuesday, April 8 at 12pm
– Basic Techniques in Presenting Data: Tuesday, May 6 at 12pm
– Closing the Loop: Tuesday, June 10 at 12pm

Questions? Concerns? Institutional Learning Outcomes Assessment information can be found on the IRE website: assessment-home