WesternU Assessment Kick-off Meeting: The why’s, who’s, what’s, how’s, and when’s of assessment 2013-2014 Institutional Research & Effectiveness Neil M.


1 WesternU Assessment Kick-off Meeting: The why’s, who’s, what’s, how’s, and when’s of assessment Institutional Research & Effectiveness Neil M. Patel, Ph.D. Juan Ramirez, Ph.D.

2 Meeting Roadmap The goals are to understand:
– Why assessment needs to take place
– Who should be involved in assessment
– What needs to be assessed
– How to assess the learning outcomes
– When assessment reports are due

3 Assessment Overview Why assess?
– Accountability
– To measure learning
– To identify challenges related to instruction, curriculum, or assignments
– To improve learning
Methods must be in place to properly assess. Information should be shared widely and used to inform decision-making.
Key players: Deans, faculty, curriculum committees, assessment committees, assessment specialists, preceptors

4 What needs to be assessed? Institutional Learning Outcomes (by phase and year):
– Evidence-based practice
– Interpersonal communication skills
– Critical thinking
– Collaboration skills
– Breadth and depth of knowledge in the discipline / Clinical competence
– Ethical and moral decision-making skills
– Life-long learning
– Humanistic practice

5

6 What needs to be assessed? (cont.): We cannot assess everything!
Direct assessment of Signature Assignments:
– Signature assignments have the potential to help us know whether student learning reflects “the ways of thinking and doing of disciplinary experts”
– Course-embedded assessment
– Aligned with LOs
– Authentic in terms of process/content (“real-world application”)
Indirect assessment (i.e., student perceptions):
– First-year survey
– Graduating survey
– Alumni surveys
– Student evaluations of courses

7 ILO ASSESSMENT TEMPLATE

8 Assessment Template Timeline
Section I: Progress Report
Section II: Learning Outcome Alignment
Section III.1: Methodology
Section IV.1: Results
Section V.1: Discussion & Implications
Section III.2: Methodology
Section IV.2: Results
Section V.2: Discussion & Implications


10 Section I: Progress Report Goal: To document what occurred as a result of assessment.

11 Section II: Learning Outcome Alignment Goal: To determine which PLOs align with the ILO and, over time, which PLOs are not being assessed.

12 Section III: Methodology Every ILO report needs to include one direct and one indirect assessment; multiple assessments may be necessary to cover ALL PLOs. Copy and paste Sections III-V if more than two assessments are completed.

13 Section III: Methodology

14

15 Note: The Participation section documents participation in the assessment process, not participation in the development of the student work.

16 Section IV: Results
Analytical approach (should align with the assessment goal):
– To determine how many students are achieving at a specific level/score: frequency distribution
– To determine if differences in scores exist between two or more groups: chi-square, t-test, or ANOVA
– To determine if scores from one assignment predict scores on another assignment: regression
Sample size: number of students assessed
Statistical results: frequency table, p value, etc.
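The first approach above, a frequency distribution, can be sketched in a few lines of Python using only the standard library. This is a minimal illustration with hypothetical scores, not the program's data:

```python
from collections import Counter

def frequency_distribution(scores):
    """Return {score: (count, percent)} for each distinct score."""
    counts = Counter(scores)
    total = len(scores)
    return {score: (n, 100.0 * n / total) for score, n in sorted(counts.items())}

# Hypothetical CT scores: 1 = skill demonstrated, 0 = not demonstrated
ct_scores = [1, 0, 1, 1, 0, 1, 1, 1, 0, 1]
print(frequency_distribution(ct_scores))
# {0: (3, 30.0), 1: (7, 70.0)}
```

The same counts feed directly into a frequency table like the one shown later in the example report.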

17 Section V: Discussion & Implications

18 EXAMPLE

19 Example Scenario: Following a discussion among the faculty, the Curriculum Committee, the Program Assessment Committee, and the Dean, it was decided that Critical Thinking would be assessed using 4th year preceptor evaluations. Question: What do we need to do?

20 Example: 4th year preceptor evaluations to assess Critical Thinking
Things to consider:
– Which PLO(s) are assessed?
– How is the assessment scored?
– Who has the data?
– What is/are the quantifiable assessment goal(s)? (Standards of success)
– How do we analyze the data?

21 Example: 4th year preceptor evaluations to assess Critical Thinking Assessment: The preceptor evaluation of students occurs at various time points within the 4th year rotations. For the purpose of assessment, the program has decided to use each student's entire set of 4th year preceptor evaluations (eight evaluations in total). The preceptors are asked to indicate, using a Yes/No format, whether a student has been observed demonstrating certain skills or displaying certain knowledge elements; there are 20 items in the evaluation form. These elements are commonly displayed within the profession. The data are sent directly to the 4th year Director. To assess Critical Thinking, a single item within the checklist is used: “The student utilizes and displays critical thinking.”

22 Example: 4th year preceptor evaluations to assess Critical Thinking
Assessment Goal: 90% of students will demonstrate critical thinking skills.
Why did we come up with 90%?
– A peer or aspirational college has a similar standard
– The professional community suggests such a standard
– Our own data have set the standard
The assessment goal is different from grading:
– For grading, passing = 70% (14/20 items; “Yes” = 1 point)
– It is possible for all students to score 0 on the Critical Thinking item.
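The grading-versus-assessment distinction can be made concrete with a short sketch. The checklist below is hypothetical (the item names are invented for illustration): a student can pass the evaluation for grading purposes while still scoring 0 on the single Critical Thinking item used for assessment.

```python
def passes_grading(checklist, passing_items=14):
    """Grading view: pass if at least 14 of the 20 Yes/No items (1 point each) are Yes."""
    return sum(checklist.values()) >= passing_items

def meets_ct_item(checklist):
    """Assessment view: look only at the single Critical Thinking item."""
    return checklist["critical_thinking"] == 1

# Hypothetical student: 15 Yes items overall, but No on critical thinking
checklist = {f"item_{i}": 1 for i in range(15)}          # 15 items marked Yes
checklist.update({f"item_{i}": 0 for i in range(15, 19)})  # 4 items marked No
checklist["critical_thinking"] = 0                        # 20th item: No

print(passes_grading(checklist))  # True  (15/20 >= 14/20)
print(meets_ct_item(checklist))   # False (CT item not demonstrated)
```

This is why a cohort can have a high pass rate for grading yet still fall short of the 90% assessment goal.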

23 Averaged data of 4th year preceptor evaluations assessing Critical Thinking, per student (CT Score: 0 = no, 1 = yes; table columns: Student, CT Score)

24 Example: Section III.1 Methodology
Name of assessment: 4th year preceptor evaluation
Evidence: Indicate if this is a direct or indirect assessment. Direct
Evidence: PLO(s) assessed (Section II). List the PLOs that will be assessed by this particular assessment. PLO 2: Upon graduation, students should be able to think critically when in the clinic.
Evidence: Description. Please write a narrative that explains the student work completely, so that someone who knows nothing about the program will understand what it consists of, and include how the assessment addresses the PLO(s). Preceptors indicate, using a Yes/No format, whether students are observed demonstrating certain skills or displaying certain knowledge elements; there are 20 items in the evaluation form. Eight rotations during the 4th year were used, and scores were averaged for each student.
Data Collection Method: How is the assessment scored? State the type of scoring mechanism used. Yes/No scoring guide.
Data Collection Method: Does the data isolate the PLO? (Yes or No) Yes

25 Example: Section III.1 Methodology
Data Collection Method: Provide the scoring mechanism as an attachment, as well as any other important documents for this assessment. State the title of the attachment(s) and what each one includes; if applicable, please highlight what specifically is being utilized for assessment within the attachment. Single item: “The student utilizes and displays critical thinking.”
Please state the quantifiable assessment goal: Assessment is completed to determine how well students are achieving the PLOs. For example, a goal may be written to determine how many students are achieving at a specific level/score. There can be more than one goal for each assessment (for example, whether students are reaching a particular score, and whether current students are performing differently from previous students). 90% of students will demonstrate critical thinking skills in all eight rotations (avg score = 1).
Participation: Describe the assessment process and who participated, listing the role each person played. This section is meant to keep track of program participation from faculty, committees, deans, Institutional Research, etc. Faculty, the Curriculum Committee, the Assessment Committee, and the Dean selected the assignment; 4th year preceptors evaluated students; the 4th year program director collected the data; the Assessment Committee analyzed the data.

26 Example: Section IV.1 Results
Assessment 1 Name: Please state the name of the chosen assignment, survey, exam, etc. 4th year preceptor evaluation
Assessment 1 Goal (Section III.1): 90% of students will demonstrate critical thinking skills in all eight rotations (avg score = 1)
Analytical Approach: Frequency distribution
Sample Size: N = 20
Statistical Results: Present the statistical results in a figure or table that aligns with the goal.
– No: 7 (35.0%)
– Yes: 13 (65.0%)
– Total: 20 (100.0%)
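The reported table (7 No, 13 Yes, N = 20) can be reproduced from per-student averages. The averages below are hypothetical values constructed only to match the reported totals; the slide's actual per-student data are not reproduced in this transcript.

```python
# Hypothetical per-student averages over eight rotations; 13 of 20
# students averaged 1.0, matching the reported 65% "Yes"
avg_ct = [1.0] * 13 + [0.875, 0.75, 0.75, 0.625, 0.5, 0.5, 0.25]

n = len(avg_ct)
# Goal counts a student as "Yes" only with an average score of 1
# (i.e., CT demonstrated in all eight rotations)
yes = sum(1 for s in avg_ct if s == 1.0)
no = n - yes

print(f"No:  {no} ({100 * no / n:.1f}%)")    # No:  7 (35.0%)
print(f"Yes: {yes} ({100 * yes / n:.1f}%)")  # Yes: 13 (65.0%)
print("Goal met (90%)?", yes / n >= 0.90)    # Goal met (90%)? False
```

Note how strict the "avg score = 1" criterion is: a student observed demonstrating critical thinking in seven of eight rotations still counts as "No."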

27 Example: Section V.1 Discussion & Implications
Assessment 1 Name: Please state the name of the chosen assignment, survey, exam, etc. 4th year preceptor evaluation
Assessment 1 Goal (Section III.1): 90% of students will demonstrate critical thinking skills
Discussion: Was the goal reached? (Yes or no; if no, why?) No; only 65% of students demonstrated critical thinking skills.
Discussion: How do the results relate back to the PLO? How are students performing (refer to the results) in relation to the PLO? What do the results mean? What were the limitations? 65% of the students were able to demonstrate critical thinking skills in the clinic. Since these data are collected during the 4th year, the program is not reaching the PLO. Although the results show the program is not meeting its goal, the available data are limited: the program cannot currently identify which students did not demonstrate the skill.
Implications: How are the results being used? Please describe what changes are being made, or whether things will remain the same, with regard to the PLO being assessed. With whom were the results discussed, or have they been circulated? Is there an action plan for closing the loop? The program is determining (1) whether preceptors know what to look for when evaluating students, (2) whether there are predictors of student success for this assignment, (3) whether previous 4th year evaluations lead to a different conclusion, and (4) whether the assessment is rigorous.

28 “You can see a lot by just looking.” (Yogi Berra)
Per-student data (CT Score: 0 = no, 1 = yes; Gender: 1 = male, 2 = female; table columns: Student, CT Score, Gender)
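Breaking the CT scores out by gender invites the second analytical approach from Section IV: a chi-square test of whether the groups differ. The 2x2 counts below are hypothetical (the slide's per-student table did not survive extraction), and the test statistic is computed by hand with the standard library; in practice a library routine such as scipy.stats.chi2_contingency would also report the p value.

```python
def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table.
    table[i][j] = observed count for row i (group), column j (outcome)."""
    row = [sum(r) for r in table]            # row totals
    col = [sum(c) for c in zip(*table)]      # column totals
    total = sum(row)
    stat = 0.0
    for i in range(2):
        for j in range(2):
            expected = row[i] * col[j] / total
            stat += (table[i][j] - expected) ** 2 / expected
    return stat

# Hypothetical counts: rows = male/female, columns = CT no/yes
observed = [[5, 5],   # male:   5 no, 5 yes
            [2, 8]]   # female: 2 no, 8 yes
stat = chi_square_2x2(observed)
print(round(stat, 3))                                  # 1.978
# Chi-square critical value with 1 df at alpha = .05 is 3.841
print("significant at .05?", stat > 3.841)             # significant at .05? False
```

With these made-up counts the apparent gender gap is not statistically significant at N = 20, which is exactly the kind of thing "just looking" can miss.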

29 GROUP ACTIVITY

30 Timeline
TIMELINE FOR PROGRAMS
– Section I: Progress Report (draft)
– Section II: Institutional Learning Outcome & Program Learning Outcome Alignment (draft)
– Section III: Methodology, Assessment Goals, & Participation (draft): May 9, 2014
– Section IV: Results (draft): June 6, 2014
– FINAL Assessment Report Due: July 31, 2014
TIMELINE FOR REVIEW
– Assessment Committee Review of Reports: Aug 2014
– Distribution of Feedback: Oct 2014
– Meetings of Understanding: Dec 2014 - Jan 2015
– Report to Provost: Feb 2015
– Deans’ Council Presentation: March 2015

31 CAPE Workshops Spring 2014
– Measurable Student Learning Outcomes: Tuesday, January 14 at 12pm
– Curricular Mapping: Tuesday, February 11 at 12pm
– Operationalizing and assessing WesternU ILOs: Tuesday, March 4 at 12pm
– Developing Valid and Reliable Rubrics: Tuesday, April 8 at 12pm
– Basic Techniques in Presenting Data: Tuesday, May 6 at 12pm
– Closing the Loop: Tuesday, June 10 at 12pm

32 Questions? Concerns? Institutional Learning Outcomes Assessment information can be found on the IRE website: assessment-home

