What Works - Course and Program Assessment in STEM PKAL Network – 7 October 2013 Dr. Dianne Raubenheimer, Director of Research, Planning and Assessment, Meredith College
By the end of this session, participants will be able to:
- Identify some purposes of assessment
- Explain selected assessment terminology
- Articulate ways to align outcomes, teaching, learning and assessment
- Consider strategies for linking course and program assessment
Ice Breaker: Write down some thoughts. A show of hands:
- Gathering information
- Measuring goals and outcomes
- Using diverse measures and sources
- Analyzing and interpreting information
- Reporting results
- Engaging stakeholders
- Making improvements
Assessment is the systematic collection, review, and use of information about educational programs undertaken for the purpose of improving student learning and development (Palomba & Banta, 1999).
Task: Individually, write down the different types of assessment tasks you use in your courses, using one small post-it for each type. As a group, sort the post-its into two columns: direct and indirect assessment methods.
- Direct assessments provide direct examination or observation of student knowledge or skills against measurable learning outcomes.
- Indirect assessments of student learning ascertain the perceived value of learning experiences; they gather opinions or perceptions about student knowledge or skills.
Task: Look at the different assessment strategies you listed. In your groups, discuss your thoughts about the purposes of assessment. You may be able to cluster some of the methods with particular purposes.
Purposes of assessment:
- Assessment of student learning (knowledge, skills and attitudes)
- Improvement of student learning
- Improvement of teaching
- Program development, monitoring and improvement
- SoTL, SoA, SoP
- Accreditation
- Public accountability / performance scorecard
These purposes range from internal/intrinsic to external/extrinsic.
An example from NC State: Using an observation rubric, we observed 15 instructors teaching in different disciplines, all of whom indicated they were using innovative practices with technology.
- The instructor noted the learning outcomes for the lesson being taught
- The observer used the SOLO taxonomy to score the SOLO level of the instruction
- The observer used the SOLO taxonomy to score the SOLO level at which students were engaged in the class
- We then gathered student work from the faculty that tested the outcome
- We examined the assessment task and student work using the SOLO taxonomy
(Raubenheimer et al., 2007)
SOLO categories, representations, and types of outcome (Biggs, 2004):
- Unanticipated Extension (Extended Abstract): Create, Synthesise, Hypothesise, Validate, Predict, Debate, Theorise
- Logically Related (Relational): Apply, Outline, Distinguish, Analyse, Classify, Contrast, Summarise, Categorise
- Multiple Points (Multistructural): Explain, Define, List, Solve, Describe, Interpret
- Single Point (Unistructural): State, Recognise, Recall, Quote, Note, Name
Scored using the SOLO taxonomy:

SOLO level                Level of outcome taught   Level of related student assessment task
Single point              3                         7
Multiple point            2                         3
Logically related         8                         2
Unanticipated extension   0                         1
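Comparisons like the one above can be tallied programmatically once each observation is recorded as a (taught level, assessed level) pair. A minimal sketch, assuming a simple list-of-tuples representation; the ordering of levels follows the SOLO hierarchy, but the observation records themselves are illustrative, not data from the study:

```python
# Hypothetical sketch: flag lessons whose related assessment task sits at a
# lower SOLO level than the outcome that was taught (levels per Biggs, 2004).
SOLO_LEVELS = ["single", "multiple", "relational", "extended"]

# (SOLO level taught, SOLO level of the related assessment task) per lesson.
# These records are illustrative placeholders, not the study's raw data.
observations = [
    ("relational", "single"),
    ("relational", "multiple"),
    ("single", "single"),
    ("multiple", "single"),
]

def alignment(taught: str, assessed: str) -> int:
    """Negative: the task demands less than was taught; zero: aligned."""
    return SOLO_LEVELS.index(assessed) - SOLO_LEVELS.index(taught)

gaps = [alignment(taught, assessed) for taught, assessed in observations]
misaligned = sum(1 for g in gaps if g < 0)
print(f"{misaligned} of {len(observations)} tasks assess below the taught level")
```

A summary like this makes the pattern in the table concrete: instruction often targets higher SOLO levels than the assessment tasks actually demand.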
In your groups: Discuss ways in which you align teaching and assessment of student learning outcomes. Select three individuals to act as chair, scribe, and reporter for the group.
On your own: Use the Program Level-Assessment of Student Learning self-assessment worksheet to assess your program at your institution. In your groups: Discuss ways in which you align teaching and assessment of student learning outcomes in your courses (columns 1-3). Select three individuals to act as chair, scribe, and reporter for the group.
Linking course and program assessment:
- Ensure the program outcomes reflect what you want students to have learned/attained/achieved by the time they graduate
- Generate appropriate program outcomes
- Develop program outcomes - course assessment matrices
- Select appropriate assessment tasks/methods
- Develop a system for keeping track of program data
- Gather, review, and use data for ongoing, systematic program review and improvement
Develop an implementation matrix associated with program outcomes (H = high, M = medium, L = low):

            Course 1   Course 2   Course 3   Course 4   Course 5   Etc.
Outcome 1   H          M          H
Outcome 2   M          H          M          L
Outcome 3   H          L          M
Outcome 4   H          L          L          H
Etc.
Develop an implementation and assessment matrix associated with program outcomes (I = implement; A = assess):

            Course 1   Course 2   Course 3   Course 4   Course 5   Etc.
Outcome 1   I & A      I
Outcome 2   I          I & A      I          I
Outcome 3   I & A      I
Outcome 4   I & A      I          I
Etc.
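Matrices like these are often kept in a spreadsheet, but a small script makes gap-checking easy, for example finding outcomes that are implemented somewhere yet assessed nowhere. A minimal sketch, assuming a dict-of-dicts representation; the outcome and course names are illustrative placeholders:

```python
# Implementation ("I") and assessment ("I & A") curriculum matrix, sketched
# as a dict of dicts. Names and cell values are illustrative assumptions.
matrix = {
    "Outcome 1": {"Course 1": "I & A", "Course 2": "I"},
    "Outcome 2": {"Course 1": "I", "Course 2": "I & A", "Course 4": "I"},
    "Outcome 3": {"Course 3": "I"},  # implemented, but never assessed
}

def unassessed(matrix: dict) -> list:
    """Outcomes with no course whose cell contains an 'A' (assess) mark."""
    return [outcome for outcome, cells in matrix.items()
            if not any("A" in mark for mark in cells.values())]

print(unassessed(matrix))  # prints ['Outcome 3']
```

The same structure supports other checks, such as confirming that every outcome is implemented in at least two courses before it is assessed.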
Articulate sub-outcomes on the matrix. Example from a BME program (courses: BIO 183, BME 201, BME 203, BME 204, BME 210, BME 252, BME 301, BME 302; A = assessed):
- Outcome (a) - Graduating senior survey
  - Sub-outcome a1: assessed in three courses
  - Sub-outcome a2: assessed in two courses
  - Sub-outcome a3: assessed in three courses
  - Sub-outcome a4: assessed in two courses; FE exam
- Outcome (b) - Graduating senior survey
  - Sub-outcome b1: marked in one course; UG research
  - Sub-outcome b2: assessed in three courses; UG research
  - Sub-outcome b3: assessed in two courses
  - Sub-outcome b4: assessed in two courses
- Outcome (e) - Graduating senior survey
  - Sub-outcome e1: assessed in four courses; Design Day
  - Sub-outcome e2: assessed in two courses
  - Sub-outcome e3: assessed in one course
  - Sub-outcome e4: assessed in one course
  - Sub-outcome e5: assessed in one course
Determine assessment methods and expected performance criteria (level of performance). Example:

Method                                  Goal                      Actual
Graduating senior survey                P: 30%+; NI: <10%         E: 16%, P: 78%, C: 7%, NI: 0%
Faculty survey                          P: 30%+; NI: <10%         E: 11%, P: 67%, C: 9%, NI: 3%
WPS 332 test & homework scores          Avg. scores B or higher   Avg. score B+
WPS 360 math-intensive test questions   Avg. scores B or higher   Avg. score B+
WPS 475 math skills test                Avg. scores B or higher   Avg. score B+
Develop a system for recording and tracking results
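Such a record can be as simple as a running CSV file of results per assessment cycle, with goals stored alongside actuals so shortfalls surface automatically. A minimal sketch; the field names, course names, and percentage criteria below are illustrative assumptions, not values from the presentation:

```python
import csv
import io

# One row per outcome, method, and assessment cycle. All values illustrative.
ROWS = """cycle,outcome,method,criterion_pct,actual_pct
2012-13,Outcome 1,WPS 475 math skills test,70,82
2012-13,Outcome 2,Senior survey,70,64
2013-14,Outcome 2,Senior survey,70,73
"""

def below_criterion(csv_text: str) -> list:
    """Return (cycle, outcome) pairs where the performance goal was not met."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [(row["cycle"], row["outcome"]) for row in reader
            if float(row["actual_pct"]) < float(row["criterion_pct"])]

print(below_criterion(ROWS))  # prints [('2012-13', 'Outcome 2')]
```

Keeping cycles in one file also supports the longitudinal view that program review needs: here, Outcome 2 missed its goal in 2012-13 and recovered the following cycle.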
Best practices indicate that assessment is systematic and systemic. It occurs at multiple levels in an institution, vertically and horizontally, involving processes that are connected and integrated.
References:
Arapahoe Community College. Assessment. http://www.arapahoe.edu/about-acc/presidents-office/assessment
Biggs, J. (2004). Teaching for quality learning at university. Berkshire, UK: Open University Press.
Heer, R. (2009). A model of learning objectives. Center for Excellence in Learning and Teaching, Iowa State University.
Maki, P.L. (2004). Assessing for learning: Building a sustainable commitment across the institution. Sterling, VA: Stylus.
Mertler, C.A. (2003). Classroom assessment: A practical guide for educators. Los Angeles: Pyrczak.
Ozturk, H. & Raubenheimer, C.D. (2011). PAT: An online program assessment tool. Paper presented at the 2011 American Society for Engineering Education International Conference, Vancouver, Canada.
Palomba, C.A. & Banta, T.W. (1999). Assessment essentials: Planning, implementing and improving assessment in higher education. San Francisco, CA: Jossey-Bass.
Raubenheimer, C.D., Spurlin, J., Martin, S., & Mehlenbacher, B. (2007). Faculty in technology-rich contexts: Connecting teaching, learning, and assessment in the classroom. UNC TLT Conference Proceedings.
Rogers, G. (2004). Self-assessment: Quality assurance of program level-assessment of student learning. Rose-Hulman Institute of Technology.
Spurlin, J.E., Rajala, S.A., & Lavelle, J.P. (2008). Designing better engineering education through assessment: A practical resource for faculty and department chairs on using assessment and ABET criteria to improve student learning. Sterling, VA: Stylus Publishing.