FaCET Workshop on Assessment Basics Nathan Lindsay September 18, 2013.


1 FaCET Workshop on Assessment Basics Nathan Lindsay September 18, 2013

2

3 Can dogs talk?

4 Our Vision for Assessment: To provide sufficient support and guidance to help you realize the dividends for the time and effort invested: enhanced learning, improved programs and degrees, and greater communication about teaching and learning among faculty. To create a culture of learning, where striving to enrich our students’ learning is what is valued.

5 I think that good teaching is more art than science. 1. Strongly agree 2. Agree 3. Neither agree nor disagree 4. Disagree 5. Strongly disagree 6. Not applicable

6 Some Guiding Assumptions… Teaching and learning can be improved through systematic inquiry. Assessment is always a work in progress, and it’s OK if things don’t go perfectly. Assessment is about lessons learned in the effort to enhance learning and teaching. Goal of the Assessment Annual Report: to demonstrate concerted effort on the part of faculty to examine student outcomes and make appropriate adjustments to improve programs.

7 I think that the quality of student learning at UMKC is excellent. 1. Strongly agree 2. Agree 3. Neither agree nor disagree 4. Disagree 5. Strongly disagree 6. Don’t know/Not applicable

8 Four “Big Picture” questions to ask about assessment How do you define a successful student? What have you learned about your students’ learning? Are you satisfied with the results? If not satisfied with the results, what are you going to do about it?

9 Assessing Our University’s (& Your Department’s) Assessment Efforts. Contrasting orientations: Compliance vs. Commitment; Accreditation vs. Learning; Reporting vs. Interpreting; Number & Amount vs. Quality & Utility; External Questions vs. Internal Questions; Collecting it vs. Using it.

10 Initial Assessment Components for Each Academic Degree: Mission statement; Goals (usually 2-3); Learning Outcomes (usually 3-7). Remember SMART: Specific, Measurable, Attainable, Relevant/Results-Oriented, Time-bound.

11 Measurements: the complete measurements process. What instrument, and why? Formative or summative assessment? Direct or indirect measure? (If possible, it’s best to use multiple measures.) How will you conduct the measurement? Which students? When, where, how administered, and by whom? (It is often good to use smaller samples of students; capstone courses work well.) How will you collect and store the data? Who analyzes the data, how, and when? Who reports: to faculty (how, when, where)? To WEAVE?

12 Achievement Targets: What kind of performance do you expect from your students on your learning outcomes? What is the desirable level of performance for your students? Rubrics can clarify this (see the next slides). What percentage of students do you expect to achieve this?

13 Using Rubrics: A rubric is “a set of criteria and a scoring scale that is used to assess and evaluate students’ work” (Campbell, Melenyzer, Nettles, & Wyman, 2000). A rubric addresses performance standards in a clear and concise manner (which students appreciate!) and clearly articulates to students the areas of improvement needed to meet those standards. Blackboard has a new Rubric feature that makes the process straightforward. To find examples, Google rubrics for your discipline, or see the Rubistar website: http://rubistar.4teachers.org/

14 Example of a Rubric: UMKC Foreign Languages and Literatures Assessment Tool for Oral Proficiency Interview, adapted from “Interpersonal Mode Rubric Pre-Advanced Learner” (ACTFL, 2003).
Category: Comprehensibility (Who can understand this person’s meaning? How sympathetic must the listener be? Does it need to be the teacher, or could a native speaker understand the speaker? How independent of the teaching situation is the conversation?)
- Exceeds Expectations: Easily understood by native speakers, even those unaccustomed to interacting with language learners. Clear evidence of culturally appropriate language.
- Meets Expectations: Although there may be some confusion about the message, generally understood by those unaccustomed to interacting with language learners.
- Does Not Meet Expectations: Generally understood by those accustomed to interacting with language learners.
Category: Language Control (accuracy, form, appropriate vocabulary, degree of fluency)
- Exceeds Expectations: High degree of accuracy in present, past, and future time. Accuracy may decrease when attempting to handle abstract topics.
- Meets Expectations: Most accurate with connected discourse in present time. Accuracy decreases when narrating and describing in time frames other than present.
- Does Not Meet Expectations: Most accurate with connected sentence-level discourse in present time. Accuracy decreases as language becomes complex.

15 How to build a rubric Answer the following questions: Given your broad course goals, what determines the extent of student understanding? What criterion counts as EVIDENCE of student learning? What specific characteristics in student responses, products or performances should be examined as evidence of student learning?

16 Developing a rubric helps you to clarify the characteristics/components of your Learning Outcomes. For example: Can our students deliver an effective public speech? Components might include volume, poise, conclusion, eye contact, style, appearance, gestures, rate, evidence, sources, examples, organization, transitions, verbal variety, and attention getter.

17 Rubrics Resources at UMKC: Two new pages discussing rubrics are available on UMKC’s Blackboard Support Site. http://www.umkc.edu/ia/its/support/blackboard/faculty/rubrics.asp and http://www.umkc.edu/ia/its/support/blackboard/faculty/rubrics-bb.asp

18 Training for Rubrics on Blackboard: For assistance with using Rubrics in Blackboard, please contact Molly Mead, Instructional Designer, E-Learning Experiences, at 235-6595 or meadmo@umkc.edu

19 More rubric help AACU Rubrics http://www.aacu.org/value/rubrics Rubrics from Susan Hatfield (HLC Mentor) www.winona.edu/air/rubrics.htm Rubistar http://rubistar.4teachers.org/

20 Findings: What do the data tell you? Part I: specific findings. Compare new data to achievement targets. Did students meet or deviate from expectations? Important: include specific numbers/percentages when possible. Do not use course grades or pass rates. Optional: post anonymous data or files in the WEAVE Document Management section.

21 Findings (cont.): What do the data tell you? Part II: general findings. What lessons did your faculty learn from this evidence about your students? What broader implications do you draw about your program (e.g., curriculum, admissions, administration, policies, requirements, pedagogy, assessment procedures, and so on)? Conversations matter: the more people involved, the better!

22 Action Plans: Concrete steps for change. A list of specific innovations that you would like to introduce in AY 2013-14 to address lessons learned in AY 2012-13; again, in curriculum, admissions, administration, policies, requirements, pedagogy, assessment procedures, and so on. Resources? Time period? Point person? It is best to have documentation of the changes made through these Action Plans (e.g., in syllabi, the course catalogue, meeting minutes).

23 Submitting the Assessment Annual Report. Part I: Detailed Assessment Report (“Assessment Plan Content”). All items (mission -> action plans) are submitted in WEAVEonline. To log in to WEAVE, go to https://app.weaveonline.com/umkc/login.aspx

24 Using WEAVE for the 2012-2013 Assessment Cycle Everything from previous cycles has carried over into the 2012-2013 assessment cycle If you are creating entirely new goals, learning outcomes, etc., don’t write these over the top of old items (this will mess up your linked associations in WEAVE). Create new ones. If you need to delete something in WEAVE, please contact me, and I will do it for you

25 Sharing Assessment Plans: Printing Reports from WEAVE Click on the “Reports” tab Under “Select cycle,” choose your cycle (the 2012-2013 cycle should be chosen if you’d like your findings listed) Under “Select a report,” there is a button you can select for “Assessment Data by Section” to make your report a little shorter Under “Select report entities,” choose the areas you would like to report

26 Printing Reports from WEAVE (cont.) Click on “Next” (on the right side of the page) On the second page, under “Report-Specific Parameters,” click on “Keep user-inserted formatting.” Click on “Run” (on the right side of the page) The Report will come up in a new window, and this can be copied and pasted into a Word document.

27 Assessment Plan Narrative Part II: Timeline/Account of Activities “Assessment Plan Narrative” In 1-2 pages, tell the story of all the work and careful consideration you and your colleagues accomplished in your assessment work this year (Ex.: meetings, mentoring, experiments, setbacks, lessons learned) Submit this in the Document Management section in WEAVE Please follow the four outlined questions (see next slide)

28 Four Questions for the Assessment Narrative 1) Process: Please describe the specific activities and efforts used to design, implement, and analyze your assessment plan during this academic year. This narrative might be organized chronologically, listing meetings, mentoring sessions, and experiments at each stage of the developmental process including the names of people involved in various capacities, with each event given one paragraph. 2) Positives: Please describe what was most useful about the assessment process, or what went well. What did you learn about your faculty, students, or program through this experience? 3) Challenges: Please describe the challenges you encountered in terms of the development or implementation of your assessment procedures, as well as the lessons you learned from this experience and your efforts or plans for overcoming them. This section might be organized topically. 4) Support: Please describe your program’s experience during the past year with the support and administrative structures in place at UMKC for Assessment: the Provost’s Office, the University Assessment Committee, FaCET, and so on. If there are ways in which these areas could be improved to better support your efforts in assessment, please make those suggestions here.

29 Avoiding “Garbage In, Garbage Out” An assessment plan submitted for each degree is not enough Focus on encouraging best practices Enhancing overall quality through: One-on-one mentoring Multiple drafts/iterative process Timely and thorough peer review given for all degrees and programs (requiring many hours!)

30 Submission: October 1st, 2013. Final reporting complete for the 2012-2013 assessment cycle. No edits allowed after October 1st. During the fall semester, the University Assessment Committee and the Asst. VP for Assessment will give feedback on these Annual Reports.

31 After October 1st: Assessment entries for AY 2013-14 begin. The assessment cycle runs from June 1, 2013 to May 30, 2014. Implement the Action Plans from last year. Update mission statements, goals, learning outcomes, and measurements based on feedback from the UAC. Items in WEAVE carry over from last year unless changed. Enter new findings and action plans.

32 Assessment Resources: University Assessment website: http://www.umkc.edu/assessment/index.cfm — academic degree assessment, general education assessment, and the University Assessment Committee.

33 Assessment Resources: Assessment Handbook, covering core principles and processes regarding UMKC assessment, WEAVE guidelines, an assessment glossary, 10 FAQs, and appendices. Available at http://www.umkc.edu/provost/academic-assessment/downloads/handbook-2011.pdf

34 Assessment Projects from Recent Years: UMKC Assessment Plan (see handout) and General Education Assessment Plan (http://www.umkc.edu/assessment/downloads/general-education-assessment-plan-6-28-12.pdf). Develop assessment plans for free-standing minors and certificate programs. Use the major field exams, WEPT (now RooWriter), and ETS Proficiency Profile to inform practices across the campus. Conduct pilot assessments for General Education.

35 Goals for 2011-2012, 2012-2013 Here’s what we hope to see in the WEAVE reports and narratives More faculty/staff involvement within each department Additional learning outcomes measured (so that all outcomes are measured in a three-year cycle) Data showing that changes made to curriculum, pedagogy, advising, services, etc. were related to higher student learning outcomes. In other words, if scores from 2012-2013 are significantly higher than the previous year, please highlight these. Again, we need to have assessment findings and action plans from 100% of departments for our Higher Learning Commission requirements

36 Ongoing Assessment Initiatives at UMKC Helping faculty to develop their assessment plans for the new General Education courses Integrating assessment work more effectively with the Program Evaluation Committee Having departments post their student learning outcomes on their websites Encouraging departments to establish departmental level assessment committees

37 A Few More Areas of Assessment Progress: Encouraging higher-order thinking as students progress through the curriculum. Using multiple types of assessments. Assessing students’ learning in high-impact experiences (internships, undergraduate research, service learning, study abroad). Student surveys gauging students’ learning/satisfaction in the department. Making sure that the curriculum and pedagogy are more directly tied to your learning outcomes (i.e., curriculum mapping).

38 KNOWLEDGE COMPREHENSION APPLICATION ANALYSIS SYNTHESIS EVALUATION Cite Count Define Draw Identify List Name Point Quote Read Recite Record Repeat Select State Tabulate Tell Trace Underline Associate Classify Compare Compute Contrast Differentiate Discuss Distinguish Estimate Explain Express Extrapolate Interpolate Locate Predict Report Restate Review Tell Translate Apply Calculate Classify Demonstrate Determine Dramatize Employ Examine Illustrate Interpret Locate Operate Order Practice Report Restructure Schedule Sketch Solve Translate Use Write Analyze Appraise Calculate Categorize Classify Compare Debate Diagram Differentiate Distinguish Examine Experiment Inspect Inventory Question Separate Summarize Test Arrange Assemble Collect Compose Construct Create Design Formulate Integrate Manage Organize Plan Prepare Prescribe Produce Propose Specify Synthesize Write Appraise Assess Choose Compare Criticize Determine Estimate Evaluate Grade Judge Measure Rank Rate Recommend Revise Score Select Standardize Test Validate Lower level course outcomes

39 KNOWLEDGE COMPREHENSION APPLICATION ANALYSIS SYNTHESIS EVALUATION Cite Count Define Draw Identify List Name Point Quote Read Recite Record Repeat Select State Tabulate Tell Trace Underline Associate Classify Compare Compute Contrast Differentiate Discuss Distinguish Estimate Explain Express Extrapolate Interpolate Locate Predict Report Restate Review Tell Translate Apply Calculate Classify Demonstrate Determine Dramatize Employ Examine Illustrate Interpret Locate Operate Order Practice Report Restructure Schedule Sketch Solve Translate Use Write Analyze Appraise Calculate Categorize Classify Compare Debate Diagram Differentiate Distinguish Examine Experiment Inspect Inventory Question Separate Summarize Test Arrange Assemble Collect Compose Construct Create Design Formulate Integrate Manage Organize Plan Prepare Prescribe Produce Propose Specify Synthesize Write Appraise Assess Choose Compare Criticize Determine Estimate Evaluate Grade Judge Measure Rank Rate Recommend Revise Score Select Standardize Test Validate Advanced Course / Program outcomes

40 [Curriculum map] Program-Level Student Learning Outcomes (1-7) mapped against course levels. K = Knowledge/Comprehension; A = Application/Analysis; S = Synthesis/Evaluation. 1xx courses: K K K and S K K; 2xx: A A S A A K; 3xx: A K A A K A; 4xx: S A K S; Capstone: S S.

41 Questions?

42 Contact Information: For assistance with assessment, please contact Nathan Lindsay, Assistant Vice Provost for Assessment, at 235-6084 or lindsayn@umkc.edu (after November 1st: Barb Glesner Fines, FaCET Mentor for Assessment, at 235-2380 or glesnerb@umkc.edu).

