Introduction to WEAVE and Assessment (November 18-19, 2014)

Presentation transcript:

Introduction to WEAVE and Assessment
November 18-19, 2014
Ryan J. McLawhon, Ed.D., Director, Institutional Assessment
Elizabeth C. Bledsoe, M.A., Program Coordinator, Institutional Assessment
Kimberlee Pottberg, Sr. Admin Coordinator, Institutional Assessment
assessment.tamu.edu

Agenda
- Why we use WEAVEonline
- How to enter components
- Assessment Overview

SACS Expectations
SACS Comprehensive Standard: Institutional Effectiveness
The institution identifies expected outcomes, assesses the extent to which it achieves these outcomes, and provides evidence of improvement based on analysis of the results in each of the following areas:
- educational programs, to include student learning outcomes
- administrative support services
- educational support services
- research within its educational mission, if appropriate
- community/public service within its educational mission, if appropriate

Note the emphasis within the standard: "…and provides evidence of improvement based on analysis of the results…"

Program-level Assessment
Degree Programs:
- "to include student learning outcomes"
- Faculty judgment of student work
- Curricular and pedagogical improvements
Support Offices:
- Efficiency of key functions
- Improvement of programs and services
- Student learning outcomes within mission

The Assessment Process (adapted from Trudy Banta, IUPUI)
1. Develop Program Mission & Outcomes
2. Design an Assessment Plan
3. Implement the Plan & Gather Information
4. Interpret/Evaluate Information
5. Modify & Improve

Assessment Timeline
- Update assessment plans (mission, outcomes, and measures with achievement targets): done by 1/12/2015
- Findings entered: done by 8/1/2015
- Action plan(s) entered: done by 9/1/2015
- Cycle closes: 10/1/2015

Assessment 101

Develop Program Mission & Outcomes

Mission Statement
The mission statement links the functions of your unit to the overall mission of Texas A&M.

Goals
- Additional objectives that may be tied to specific portions of a program's mission.
- Not considered in the progress reports sent to each Assessment Liaison, but may be used by individual offices if found useful.

Outcomes
Learning Outcomes
- Learning statements: define the information or skills stakeholders/students will acquire from the program.
Program Outcomes
- Process statements: relate to what the unit intends to accomplish. Examples include the level or volume of activity, the efficiency with which you conduct the processes, and compliance with external standards of "good practice in the field" or regulations.
- Satisfaction statements: describe how those you serve rate their satisfaction with your unit's processes or services.

Learning Outcomes
When writing learning outcomes, the focus must be on the students and what they will think, know, be able to do, or value as a result of participation in the educational environment. For example: "Students will be able to analyze primary sources to construct a historical argument."

Cognitive Learning
- Knowledge: to recall or remember facts without necessarily understanding them. Verbs: articulate, define, indicate, name, order, recognize, relate, recall, reproduce, list, tell, describe, identify, show, label, tabulate, quote
- Comprehension: to understand and interpret learned information. Verbs: classify, describe, discuss, explain, express, interpret, contrast, associate, differentiate, extend, translate, review, suggest, restate
- Application: to put ideas and concepts to work in solving problems. Verbs: apply, compute, give examples, investigate, experiment, solve, choose, predict, translate, employ, operate, practice, schedule
- Analysis: to break information into its components to see interrelationships. Verbs: analyze, appraise, calculate, categorize, compare, contrast, criticize, differentiate, distinguish, examine, investigate, interpret
- Synthesis: to use creativity to compose and design something original. Verbs: arrange, assemble, collect, compose, construct, create, design, formulate, manage, organize, plan, prepare, propose, set up
- Evaluation: to judge the value of information based on established criteria. Verbs: appraise, assess, defend, judge, predict, rate, support, evaluate, recommend, convince, conclude, compare, summarize

Affective Learning
Verbs: appreciate, accept, attempt, challenge, defend, dispute, join, judge, praise, question, share, support

Design an Assessment Plan

Measures
Measures define and identify the tool used to determine the extent to which an outcome is met. There are two types: direct and indirect.

Direct Measures
Direct measures are designed to directly measure what a stakeholder knows or is able to do (i.e., they require a stakeholder to actually demonstrate the skill or knowledge). Examples:
- A rubric used to assess a collection of work samples (student work)
- Observation of behavior

Indirect Measures
Indirect measures focus on:
- stakeholders' perception of their level of learning
- stakeholders' perception of the benefit of, or satisfaction with, programming, an intervention, or services
Examples:
- Surveys
- Exit interviews

Targets
A target is the result, benchmark, or value representing success or the achievement of a given outcome. For example: "85% or more of respondents will report satisfaction with the program."

Implement the Plan & Gather Information

Findings
- A concise summary of the results gathered from a given assessment measure.
- Note: The language of a finding should parallel the corresponding achievement target, and results should be described in enough detail to show that you have met, partially met, or not met the target.

Interpret/Evaluate Information

Analysis Questions
Responses to a set of provided questions that give an update on ongoing action plans as well as an opportunity to discuss the significance of new action plans.

Modify & Improve

Action Plans
After reflecting on the findings, you and your colleagues should determine appropriate action to improve the program. This will lead to at least one action plan. Actions outlined in the action plan should be specific and relate directly to the outcome and the results of assessment.

Using WEAVEonline

Office of Institutional Assessment Assessment Report: Continuous Improvement

To fulfill the action plan to address the unmet target of 80% of conference respondents indicating satisfaction with the variety of poster sessions offered, the Office of Institutional Assessment (OIA), along with the Assessment Conference Committee (ACC), sought more variety in the posters for the 2012 Assessment Conference. As a result, the percentage of respondents satisfied with the variety of posters increased from 74% to 78%. Although the 85% target was still not met during the cycle, this result shows improvement toward the target.

To complete the other action plan, OIA enhanced the Assessment Review Guidelines to include more practical and applicable "good practices" for assessment liaisons to pass along to their programs as formative assessment. Additionally, the Assessment Review Rubric was modified to be more exhaustive in its evaluation of assessment reports. As a result, less variance was observed in the quality of assessment reports. Lastly, the Vice Provost of Academic Affairs supplied each dean with a college-specific, personalized memo addressing the strengths and weaknesses of assessment reports in each college. This process was well received and will continue as a service to colleges from the Office of the Vice Provost.

Outcome/Objective (O 5: Provide Excellent Concurrent and Poster Sessions): Provide excellent concurrent and poster sessions for participants at the Annual Assessment Conference.

Measure (M 8): Overall Assessment Conference Survey.

Target: 85% or more of the Annual Assessment Conference attendees will report satisfaction with the Concurrent and Poster Sessions.

Finding (Status: Partially Met): Following the end of the 13th Annual Texas A&M Assessment Conference, an online conference evaluation survey was sent to all attendees. Information gained from this survey was organized into the 13th Annual Conference Survey Report and distributed to the Assessment Conference Committee for review. Results from the survey questions relating to Concurrent and Poster Sessions are below:

Concurrent Sessions
- Question 16: "How satisfied were you with the quantity of Concurrent Sessions?" % were "Very Satisfied" or "Satisfied"
- Question 17: "How satisfied were you with the variety of Concurrent Sessions?" % were "Very Satisfied" or "Satisfied"

Poster Sessions
- Question 19: "How satisfied were you with the quantity of Poster Sessions?" % were "Very Satisfied" or "Satisfied"
- Question 20: "How satisfied were you with the variety of Poster Sessions?" % were "Very Satisfied" or "Satisfied"

Based on our findings from the Assessment Cycle, 77% of respondents indicated that they were satisfied with the variety of poster sessions offered, an improvement over the earlier finding of 73%.

Action Plan: In response, the Office of Institutional Assessment will seek posters from each track to provide a greater variety of posters during the 14th Annual Texas A&M Assessment Conference.

Use of Results: Although the satisfaction results from the conference survey related to the variety of poster sessions increased from 74% to 78%, the 85% target was still not met. In response, the Office of Institutional Assessment (OIA) and the Assessment Conference Committee (ACC) will ensure that each of the conference "tracks" has coverage in the poster session. OIA and the ACC have traditionally ensured track coverage in concurrent session offerings but have never paid close attention to track coverage in the poster session offerings.
This strategy includes contacting specific authors of concurrent session proposals in underrepresented tracks and inviting them to consider a poster presentation, perhaps in addition to the concurrent session.

Another way of thinking about it…

Take-Home Messages
- Assess what is important.
- Use your findings to inform actions.
- You do not have to assess everything every year.

OIA Consultations
- WEAVEonline support and training
- Assessment plan design, clean-up, and re-design
- And we can come to you!
New website: assessment.tamu.edu

Questions?