Introduction to WEAVE and Assessment, November 18-19, 2014. Ryan J. McLawhon, Ed.D., Director, Institutional Assessment; Elizabeth C. Bledsoe, M.A., Program Coordinator, Institutional Assessment.



Presentation transcript:

1 Introduction to WEAVE and Assessment
November 18-19, 2014
Ryan J. McLawhon, Ed.D., Director, Institutional Assessment, Ryan.McLawhon@tamu.edu
Elizabeth C. Bledsoe, M.A., Program Coordinator, Institutional Assessment, ebledsoe@tamu.edu
Kimberlee Pottberg, Sr. Admin Coordinator, Institutional Assessment, K-pottberg@tamu.edu
assessment@tamu.edu | 979.862.2918 | assessment.tamu.edu

2 Agenda
– Why we use WEAVEonline
– How to enter components
– Assessment Overview

3 SACS Expectations
SACS Comprehensive Standard 3.3.1
3.3 Institutional Effectiveness
3.3.1 The institution identifies expected outcomes, assesses the extent to which it achieves these outcomes, and provides evidence of improvement based on analysis of the results in each of the following areas: (Institutional Effectiveness)
3.3.1.1 educational programs, to include student learning outcomes
3.3.1.2 administrative support services
3.3.1.3 educational support services
3.3.1.4 research within its educational mission, if appropriate
3.3.1.5 community/public service within its educational mission, if appropriate

4 SACS Expectations
SACS Comprehensive Standard 3.3.1 (repeated from the previous slide), with emphasis: "…and provides evidence of improvement based on analysis of the results…"

5 Program-level Assessment
Degree Programs
– "to include student learning outcomes"
– Faculty judgment of student work
– Curricular and pedagogical improvements
Support Offices
– Efficiency of key functions
– Improvement of programs and services
– Student learning outcomes within mission

6 The Assessment Process
Develop Program Mission & Outcomes → Design an Assessment Plan → Implement the Plan & Gather Information → Interpret/Evaluate Information → Modify & Improve
Adapted from: Trudy Banta, IUPUI

7 Assessment Timeline
Update Assessment Plans (mission, outcomes, and measures with achievement targets): done by 1/12/2015
Findings Entered: done by 8/1/2015
Action Plan(s) Entered: done by 9/1/2015
Cycle Closes: 10/1/2015

8 Assessment 101

9 Develop Program Mission & Outcomes

10 The mission statement links the functions of your unit to the overall mission of Texas A&M. Mission Statement

11 Goals
– Additional objectives which may be tied to specific portions of a program's mission.
– Not considered in the progress reports sent to each Assessment Liaison, but may be used by individual offices if found to be useful.

12 Outcomes
Learning Outcomes
– Learning statements: define the information or skills stakeholders/students will acquire from the program
Program Outcomes
– Process statements: relate to what the unit intends to accomplish. Examples include:
  – Level or volume of activity
  – Efficiency with which you conduct the processes
  – Compliance with external standards of "good practice in the field" or regulations
– Satisfaction statements: describe how those you serve rate their satisfaction with your unit's processes or services

13 When writing Learning Outcomes, the focus must be on the students and what they will think, know, be able to do, or value as a result of participation in the educational environment. Learning Outcomes

14 Cognitive Learning
Knowledge - to recall or remember facts without necessarily understanding them: articulate, define, indicate, name, order, recognize, relate, recall, reproduce, list, tell, describe, identify, show, label, tabulate, quote
Comprehension - to understand and interpret learned information: classify, describe, discuss, explain, express, interpret, contrast, associate, differentiate, extend, translate, review, suggest, restate
Application - to put ideas and concepts to work in solving problems: apply, compute, give examples, investigate, experiment, solve, choose, predict, translate, employ, operate, practice, schedule
Analysis - to break information into its components to see interrelationships: analyze, appraise, calculate, categorize, compare, contrast, criticize, differentiate, distinguish, examine, investigate, interpret
Synthesis - to use creativity to compose and design something original: arrange, assemble, collect, compose, construct, create, design, formulate, manage, organize, plan, prepare, propose, set up
Evaluation - to judge the value of information based on established criteria: appraise, assess, defend, judge, predict, rate, support, evaluate, recommend, convince, conclude, compare, summarize
Affective Learning: appreciate, accept, attempt, challenge, defend, dispute, join, judge, praise, question, share, support

15 Design an Assessment Plan

16 Measures
– Define and identify the tool used to determine the extent to which an outcome is met.
Two types: Direct and Indirect

17 Direct Measures
Direct measures are those designed to directly measure what a stakeholder knows or is able to do (i.e., they require a stakeholder to actually demonstrate the skill or knowledge).
Examples:
– A rubric used to assess a collection of work samples (student work)
– Observation of behavior

18 Indirect Measures
Indirect measures focus on:
– stakeholders' perception of their level of learning
– stakeholders' perception of the benefit of, or satisfaction with, programming, an intervention, or services
Examples:
– Surveys
– Exit Interviews

19 Targets
– The result, target, benchmark, or value representing success or the achievement of a given outcome.

20 Implement the Plan & Gather Information

21 Findings
– A concise summary of the results gathered from a given assessment measure.
– Note: The language of this statement should parallel the corresponding achievement target. Results should be described in enough detail to show whether you have met, partially met, or not met the achievement target.
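One way to see the relationship between a finding and its achievement target is as a simple comparison. The sketch below is purely illustrative (it is not part of WEAVEonline, and the function name and the "partially met" margin are hypothetical): it classifies a percentage-based finding against a percentage target.

```python
# Illustrative sketch only: classifying a finding against an achievement
# target. The 10-point "partially met" margin is an assumed convention,
# not a WEAVEonline rule; in practice the status is a judgment call.

def target_status(result_pct: float, target_pct: float,
                  partial_margin: float = 10.0) -> str:
    """Return 'Met', 'Partially Met', or 'Not Met' for a percentage target."""
    if result_pct >= target_pct:
        return "Met"
    if result_pct >= target_pct - partial_margin:
        return "Partially Met"
    return "Not Met"

# Example: a 78% satisfaction finding against an 85% target
print(target_status(78.0, 85.0))  # Partially Met
```

The point of the sketch is that the finding statement should make this comparison easy: it should report the result in the same units and terms as the target.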

22 Interpret/Evaluate Information

23 Analysis Questions
– Responses to a set of provided questions, giving an update on ongoing action plans as well as an opportunity to discuss the significance of new action plans.

24 Modify & Improve

25 After reflecting on the findings, you and your colleagues should determine appropriate action to improve the program. This will lead to at least one action plan. Actions outlined in the action plan should be specific and relate directly to the outcome and the results of assessment. Action Plans

26 Using WEAVEonline

27 Office of Institutional Assessment 2012-2013 Assessment Report

Continuous Improvement
To fulfill the 2011-12 action plan to address the unmet target of 80% of conference respondents indicating satisfaction with the variety of poster sessions offered, the Office of Institutional Assessment (OIA), along with the Assessment Conference Committee (ACC), sought more variety in the posters for the 2012 Assessment Conference. As a result, the percentage of respondents satisfied with the variety of posters increased from 74% to 78%. Although the 85% target was still not met during the 2012-13 cycle, this result shows improvement toward the target.

To complete the other 2011-12 action plan, OIA enhanced the Assessment Review Guidelines to include more practical and applicable "good practices" for assessment liaisons to pass along to their programs as formative assessment. Additionally, the Assessment Review Rubric was modified to be more exhaustive in its evaluation of assessment reports. As a result, less variance was observed in the quality of assessment reports. Lastly, the Vice Provost of Academic Affairs supplied each dean with a college-specific, personalized memo addressing the strengths and weaknesses of assessment reports in each college. This process was well received and will continue as a service to colleges from the Office of the Vice Provost.

Outcome/Objective: O 5: Provide Excellent Concurrent and Poster Sessions. Provide excellent concurrent and poster sessions for participants at the Annual Assessment Conference.
Measure: M 8: Overall Assessment Conference Survey.
Target: 85%, or more, of the Annual Assessment Conference attendees will report satisfaction with the Concurrent and Poster Sessions.
Finding (Status: Partially Met): Following the end of the 13th Annual Texas A&M Assessment Conference, an online conference evaluation survey was sent out to all attendees. Information gained from this survey was organized into the 13th Annual Conference Survey Report, and was distributed to the Assessment Conference Committee for review. Results from the survey questions relating to Concurrent and Poster Sessions are below:
Concurrent Sessions
– Question 16: "How satisfied were you with the quantity of Concurrent Sessions?" 90.58% were "Very Satisfied" or "Satisfied"
– Question 17: "How satisfied were you with the variety of Concurrent Sessions?" 83.71% were "Very Satisfied" or "Satisfied"
Poster Sessions
– Question 19: "How satisfied were you with the quantity of Poster Sessions?" 77.78% were "Very Satisfied" or "Satisfied"
– Question 20: "How satisfied were you with the variety of Poster Sessions?" 77.06% were "Very Satisfied" or "Satisfied"
Although we improved from the 2011-2012 finding of 73%, our findings from the 2012-2013 Assessment Cycle show that only 77% of respondents indicated that they were satisfied with the variety of poster sessions offered. In response, the Office of Institutional Assessment will seek posters from each track to provide a greater variety of posters during the 14th Annual Texas A&M Assessment Conference.
Use of Results: Although the satisfaction results from the conference survey related to the variety of poster sessions increased from 74% to 78%, the 85% target was still not met. In response, the Office of Institutional Assessment (OIA) and the Assessment Conference Committee (ACC) will ensure that each of the conference "tracks" has coverage in the poster session. OIA and the ACC have traditionally ensured track coverage in concurrent session offerings but have never paid close attention to track coverage in the poster session offerings. This strategy includes contacting specific authors of concurrent session proposals in underrepresented tracks and inviting them to consider a poster presentation, perhaps in addition to the concurrent session.
Another way of thinking about it…
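The survey percentages above are straightforward arithmetic: the share of respondents choosing "Very Satisfied" or "Satisfied" out of all responses to a question. The sketch below illustrates the calculation; the response counts are hypothetical (chosen only so the math is easy to follow), not the actual conference data.

```python
# Illustrative only: computing a satisfaction percentage as reported in a
# conference survey finding. These response counts are made up, not the
# real 13th Annual Assessment Conference data.
responses = (["Very Satisfied"] * 50 + ["Satisfied"] * 34 +
             ["Neutral"] * 15 + ["Dissatisfied"] * 10)

# Count responses in the two "satisfied" categories, then take the share.
satisfied = sum(r in ("Very Satisfied", "Satisfied") for r in responses)
pct = 100 * satisfied / len(responses)
print(f"{pct:.2f}% were 'Very Satisfied' or 'Satisfied'")  # 77.06%
```

Reporting the percentage alongside the raw counts, as here, makes it easier to judge a finding against its target in later cycles.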

28 Take-Home Messages
– Assess what is important
– Use your findings to inform actions
– You do not have to assess everything every year

29 OIA Consultations
– WEAVEonline support and training
– Assessment plan design, clean-up, and re-design
– And we can come to you!
New Website: assessment.tamu.edu

30 Questions?

31 http://assessment.tamu.edu/conference




