FaCET Workshop on Assessment Basics Nathan Lindsay September 18, 2013.



Can dogs talk?

Our Vision for Assessment
To provide sufficient support and guidance to help you realize the dividends for the time and effort invested:
Enhanced learning
Improved programs and degrees
Greater communication about teaching and learning among faculty
To create a culture of learning, where striving to enrich our students’ learning is what is valued

I think that good teaching is more art than science. 1. Strongly agree 2. Agree 3. Neither agree nor disagree 4. Disagree 5. Strongly disagree 6. Not applicable

Some Guiding Assumptions…
Teaching and learning can be improved through systematic inquiry
Assessment is always a work in progress, and it’s OK if things don’t go perfectly
Assessment is about lessons learned in the effort to enhance learning and teaching
Goal of the Assessment Annual Report: to demonstrate concerted effort on the part of faculty to examine student outcomes and make appropriate adjustments to improve programs

I think that the quality of student learning at UMKC is excellent. 1. Strongly agree 2. Agree 3. Neither agree nor disagree 4. Disagree 5. Strongly disagree 6. Don’t know/Not applicable

Four “Big Picture” Questions to Ask About Assessment
How do you define a successful student?
What have you learned about your students’ learning?
Are you satisfied with the results?
If not, what are you going to do about it?

Assessing Our University’s (& Your Department’s) Assessment Efforts
Compliance vs. Commitment
Accreditation vs. Learning
Reporting vs. Interpreting
Number & Amount vs. Quality & Utility
External Questions vs. Internal Questions
Collecting it vs. Using it

Initial Assessment Components for Each Academic Degree
Mission statement
Goals (usually 2-3)
Learning Outcomes (usually 3-7)
Remember SMART: Specific, Measurable, Attainable, Relevant/Results-Oriented, Time-bound

Measurements: The Complete Measurement Process
What instrument? Why? Formative or summative assessment? Direct or indirect measure? If possible, it is best to use multiple measures.
How will you conduct the measurement? Which students? When? Where? How administered? By whom? It is often good to use smaller samples of students and capstone courses.
How will you collect and store the data?
Who analyzes the data? How? When?
Who reports, and to whom? To faculty: how, when, and where? To WEAVE?

Achievement Targets
What kind of performance do you expect from your students on your learning outcomes?
What is the desirable level of performance for your students? Rubrics can clarify this (see the next slides).
What percentage of students do you expect to achieve this?

Using Rubrics
A rubric is “a set of criteria and a scoring scale that is used to assess and evaluate students’ work” (Campbell, Melenyzer, Nettles, & Wyman, 2000).
Addresses performance standards in a clear and concise manner (which students appreciate!)
Clearly articulates to students the areas of improvement needed to meet these standards
Blackboard has a new Rubric feature that makes the process more straightforward
To find examples, Google rubrics for your discipline, or see the Rubistar website

Example of a Rubric
UMKC Foreign Languages and Literatures Assessment Tool for Oral Proficiency Interview, adapted from the “Interpersonal Mode Rubric, Pre-Advanced Learner” (ACTFL, 2003)

Category: Comprehensibility (Who can understand this person’s meaning? How sympathetic must the listener be? Does it need to be the teacher, or could a native speaker understand the speaker? How independent of the teaching situation is the conversation?)
- Exceeds Expectations: Easily understood by native speakers, even those unaccustomed to interacting with language learners. Clear evidence of culturally appropriate language.
- Meets Expectations: Although there may be some confusion about the message, generally understood by those unaccustomed to interacting with language learners.
- Does Not Meet Expectations: Generally understood by those accustomed to interacting with language learners.

Category: Language Control (Accuracy, form, appropriate vocabulary, degree of fluency)
- Exceeds Expectations: High degree of accuracy in present, past, and future time. Accuracy may decrease when attempting to handle abstract topics.
- Meets Expectations: Most accurate with connected discourse in present time. Accuracy decreases when narrating and describing in time frames other than present.
- Does Not Meet Expectations: Most accurate with connected sentence-level discourse in present time. Accuracy decreases as language becomes complex.

How to build a rubric Answer the following questions: Given your broad course goals, what determines the extent of student understanding? What criterion counts as EVIDENCE of student learning? What specific characteristics in student responses, products or performances should be examined as evidence of student learning?

Developing a rubric helps you to clarify the characteristics/components of your Learning Outcomes. For example: Can our students deliver an effective public speech?
volume, poise, conclusion, eye contact, style, appearance, gestures, rate, evidence, sources, examples, organization, transitions, verbal variety, attention getter
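As a rough illustration of how a rubric and an achievement target fit together, the sketch below scores hypothetical speeches against a few of the criteria above and computes the share of students who reach a threshold. The criteria, point scale, threshold, and student data are all invented for illustration; they are not UMKC's actual rubric or results.

```python
# Hypothetical sketch: scoring speeches on a simple rubric and checking
# an achievement target. All names and numbers here are invented.

RUBRIC = {               # criterion -> maximum points (hypothetical scale)
    "organization": 3,
    "eye contact": 3,
    "evidence": 3,
    "verbal variety": 3,
}
TARGET = 0.75            # e.g., "75% of students will meet expectations"
MEETS = 9                # total score that counts as "meets expectations" (of 12)

def total_score(scores):
    """Sum one student's scores, capping each criterion at its maximum."""
    return sum(min(scores.get(c, 0), cap) for c, cap in RUBRIC.items())

def percent_meeting(all_scores):
    """Fraction of students whose total reaches the MEETS threshold."""
    met = sum(1 for s in all_scores if total_score(s) >= MEETS)
    return met / len(all_scores)

# Invented scores for four students
students = [
    {"organization": 3, "eye contact": 2, "evidence": 3, "verbal variety": 2},
    {"organization": 2, "eye contact": 2, "evidence": 2, "verbal variety": 2},
    {"organization": 3, "eye contact": 3, "evidence": 2, "verbal variety": 3},
    {"organization": 3, "eye contact": 3, "evidence": 1, "verbal variety": 2},
]

rate = percent_meeting(students)
print(f"{rate:.0%} met expectations; target was {TARGET:.0%}")
# -> 75% met expectations; target was 75%
```

The same pattern (explicit criteria, an explicit threshold, and a percentage compared against the target) is what the Findings section later asks you to report in numbers rather than course grades.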

Rubrics Resources at UMKC
Two new pages discussing rubrics are available on UMKC’s Blackboard Support Site:
culty/rubrics.asp
culty/rubrics-bb.asp

Training for Rubrics on Blackboard For assistance with using Rubrics in Blackboard, please contact Molly Mead Instructional Designer, E-Learning Experiences at or

More Rubric Help
AAC&U rubrics
Rubrics from Susan Hatfield (HLC Mentor)
Rubistar

Findings: What Do the Data Tell You? Part I: Specific Findings
Compare new data to achievement targets: did students meet or deviate from expectations?
Important: include specific numbers/percentages when possible. Do not use course grades or pass rates.
Optional: post anonymous data or files in the WEAVE Document Management section

Findings (cont.): What Do the Data Tell You? Part II: General Findings
What lessons did your faculty learn from this evidence about your students?
What broader implications do you draw about your program? (e.g., curriculum, admissions, administration, policies, requirements, pedagogy, assessment procedures, and so on)
Conversations: the more people involved, the better!

Action Plans: Concrete Steps for Change
A list of specific innovations that you would like to introduce in AY to address lessons learned in AY
Again, in curriculum, admissions, administration, policies, requirements, pedagogy, assessment procedures, and so on
Resources? Time period? Point person?
It is best to have documentation of the changes made through these Action Plans (e.g., in syllabi, the course catalogue, meeting minutes)

Submitting the Assessment Annual Report
Part I: Detailed Assessment Report (“Assessment Plan Content”)
All items (mission -> action plans) submitted in WEAVEonline
To log in to WEAVE, go to ine.com/umkc/login.aspx

Using WEAVE for the Assessment Cycle Everything from previous cycles has carried over into the assessment cycle If you are creating entirely new goals, learning outcomes, etc., don’t write these over the top of old items (this will mess up your linked associations in WEAVE). Create new ones. If you need to delete something in WEAVE, please contact me, and I will do it for you

Sharing Assessment Plans: Printing Reports from WEAVE Click on the “Reports” tab Under “Select cycle,” choose your cycle (the cycle should be chosen if you’d like your findings listed) Under “Select a report,” there is a button you can select for “Assessment Data by Section” to make your report a little shorter Under “Select report entities,” choose the areas you would like to report

Printing Reports from WEAVE (cont.) Click on “Next” (on the right side of the page) On the second page, under “Report-Specific Parameters,” click on “Keep user-inserted formatting.” Click on “Run” (on the right side of the page) The Report will come up in a new window, and this can be copied and pasted into a Word document.

Assessment Plan Narrative Part II: Timeline/Account of Activities “Assessment Plan Narrative” In 1-2 pages, tell the story of all the work and careful consideration you and your colleagues accomplished in your assessment work this year (Ex.: meetings, mentoring, experiments, setbacks, lessons learned) Submit this in the Document Management section in WEAVE Please follow the four outlined questions (see next slide)

Four Questions for the Assessment Narrative 1) Process: Please describe the specific activities and efforts used to design, implement, and analyze your assessment plan during this academic year. This narrative might be organized chronologically, listing meetings, mentoring sessions, and experiments at each stage of the developmental process including the names of people involved in various capacities, with each event given one paragraph. 2) Positives: Please describe what was most useful about the assessment process, or what went well. What did you learn about your faculty, students, or program through this experience? 3) Challenges: Please describe the challenges you encountered in terms of the development or implementation of your assessment procedures, as well as the lessons you learned from this experience and your efforts or plans for overcoming them. This section might be organized topically. 4) Support: Please describe your program’s experience during the past year with the support and administrative structures in place at UMKC for Assessment: the Provost’s Office, the University Assessment Committee, FaCET, and so on. If there are ways in which these areas could be improved to better support your efforts in assessment, please make those suggestions here.

Avoiding “Garbage In, Garbage Out” An assessment plan submitted for each degree is not enough Focus on encouraging best practices Enhancing overall quality through: One-on-one mentoring Multiple drafts/iterative process Timely and thorough peer review given for all degrees and programs (requiring many hours!)

Submission: October 1st, 2013
Final reporting complete for the assessment cycle
No edits allowed after October 1st
During the fall semester, the University Assessment Committee and the Asst. VP for Assessment will give feedback on these Annual Reports

After October 1st
Assessment entries for AY begin
The Assessment Cycle runs from June 1, 2013 to May 30, 2014
Implement the Action Plans from last year
Update mission statements, goals, learning outcomes, and measurements based on feedback from the UAC
Items in WEAVE carry over from last year unless changed; enter new findings and action plans

Assessment Resources
University Assessment website: ssment/index.cfm
Academic degree assessment
General education assessment
University Assessment Committee

Assessment Resources
Assessment Handbook
Core principles and processes regarding UMKC assessment
WEAVE guidelines
Assessment glossary
10 FAQs
Appendices
Available at assessment/downloads/handbook-2011.pdf

Assessment Projects from Recent Years
UMKC Assessment Plan (see handout) and General Education Assessment Plan (education-assessment-plan.pdf)
Develop assessment plans for free-standing minors and certificate programs
Use the major field exams, WEPT (now RooWriter), and ETS Proficiency Profile to inform practices across the campus
Conduct pilot assessments for General Education

Goals for , Here’s what we hope to see in the WEAVE reports and narratives:
More faculty/staff involvement within each department
Additional learning outcomes measured (so that all outcomes are measured in a three-year cycle)
Data showing that changes made to curriculum, pedagogy, advising, services, etc. were related to higher student learning outcomes. In other words, if scores from are significantly higher than the previous year, please highlight these.
Again, we need assessment findings and action plans from 100% of departments for our Higher Learning Commission requirements

Ongoing Assessment Initiatives at UMKC
Helping faculty to develop their assessment plans for the new General Education courses
Integrating assessment work more effectively with the Program Evaluation Committee
Having departments post their student learning outcomes on their websites
Encouraging departments to establish departmental-level assessment committees

A Few More Areas of Assessment Progress
Encouraging higher-order thinking as students progress through the curriculum
Using multiple types of assessments
Assessing students’ learning in high-impact experiences (internships, undergraduate research, service learning, study abroad)
Student surveys gauging their learning/satisfaction in the department
Making sure that the curriculum and pedagogy are more directly tied to your learning outcomes (i.e., curriculum mapping)

Lower-level course outcomes
KNOWLEDGE: Cite, Count, Define, Draw, Identify, List, Name, Point, Quote, Read, Recite, Record, Repeat, Select, State, Tabulate, Tell, Trace, Underline
COMPREHENSION: Associate, Classify, Compare, Compute, Contrast, Differentiate, Discuss, Distinguish, Estimate, Explain, Express, Extrapolate, Interpolate, Locate, Predict, Report, Restate, Review, Tell, Translate
APPLICATION: Apply, Calculate, Classify, Demonstrate, Determine, Dramatize, Employ, Examine, Illustrate, Interpret, Locate, Operate, Order, Practice, Report, Restructure, Schedule, Sketch, Solve, Translate, Use, Write
ANALYSIS: Analyze, Appraise, Calculate, Categorize, Classify, Compare, Debate, Diagram, Differentiate, Distinguish, Examine, Experiment, Inspect, Inventory, Question, Separate, Summarize, Test
SYNTHESIS: Arrange, Assemble, Collect, Compose, Construct, Create, Design, Formulate, Integrate, Manage, Organize, Plan, Prepare, Prescribe, Produce, Propose, Specify, Synthesize, Write
EVALUATION: Appraise, Assess, Choose, Compare, Criticize, Determine, Estimate, Evaluate, Grade, Judge, Measure, Rank, Rate, Recommend, Revise, Score, Select, Standardize, Test, Validate

Advanced course / program outcomes draw on the same Bloom’s taxonomy verb lists shown on the previous slide, moving toward the higher levels (Analysis, Synthesis, Evaluation).

Program-Level Student Learning Outcomes (curriculum map)
K = Knowledge/Comprehension; A = Application/Analysis; S = Synthesis/Evaluation
Each row of the grid maps a course to the outcomes it addresses:
1xx: K, K, K
1xx: S, K, K
2xx: A, A, S, A, A, K
3xx: A, K, A, A, K, A
4xx: S, A, K, S
Capstone: S, S
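A curriculum map like the grid above can also be checked mechanically. The sketch below uses invented course numbers and outcome mappings (not the grid's actual data) to find the highest cognitive level each program outcome reaches, and to flag any outcome that never reaches synthesis/evaluation by the capstone.

```python
# Hypothetical curriculum-map check. Course numbers, outcome names,
# and the mappings are invented for illustration.

LEVELS = {"K": 1, "A": 2, "S": 3}  # K < A < S, as in the slide's legend

curriculum_map = {
    "101": {"outcome1": "K", "outcome2": "K"},
    "201": {"outcome1": "A", "outcome3": "K"},
    "301": {"outcome2": "A", "outcome3": "A"},
    "490": {"outcome1": "S", "outcome2": "S", "outcome3": "S"},  # capstone
}

def coverage(cmap):
    """Highest cognitive level each outcome reaches across the program."""
    best = {}
    for course in cmap.values():
        for outcome, level in course.items():
            if outcome not in best or LEVELS[level] > LEVELS[best[outcome]]:
                best[outcome] = level
    return best

def gaps(cmap, required="S"):
    """Outcomes that never reach the required level (default: synthesis)."""
    return [o for o, lvl in coverage(cmap).items()
            if LEVELS[lvl] < LEVELS[required]]

print(coverage(curriculum_map))  # every outcome should reach "S" here
print(gaps(curriculum_map))      # empty list means no coverage gaps
```

A check like this makes the "higher-order thinking as students progress" goal from the earlier slide concrete: if `gaps()` returns anything, some outcome is never assessed above the application level.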

Questions?

Contact Information
For assistance with assessment, please contact Nathan Lindsay, Assistant Vice Provost for Assessment, at or
(After November 1st) Barb Glesner Fines, FaCET Mentor for Assessment, at or