Understanding and Using Assessment Results Linda Suskie, Vice President Middle States Commission on Higher Education 3624 Market Street, Philadelphia PA.


Understanding and Using Assessment Results Linda Suskie, Vice President Middle States Commission on Higher Education 3624 Market Street, Philadelphia PA Web: SUNY AIRPO Conference Albany, June 2010

Today… 1. Why are you assessing? 2. Understanding your audiences 3. Setting standards or benchmarks 4. Summarizing results 5. Sharing assessment results 6. Using assessment results to improve things

What Are You Assessing? Are your goals expressed as clear student learning outcomes? –Identify, analyze, and evaluate arguments as they occur in their own and others’ work. –Develop well-reasoned arguments.

Why Are You Assessing? 1. Assign grades & give feedback to individual students. 2. Improve what we’re doing. –Our curricula –Our teaching –Our support programs & infrastructure 3. Make sure quality isn’t slipping. 4. Tell the story of our success to key audiences (accountability).

Who Uses Your Results? Faculty Institutional leaders Administrators & staff in support programs Government policymakers? Prospective students & their families?

What Decisions Do They Make? What do they want to know? Why? What do they need to see to make those decisions?

Setting Benchmarks or Standards 1. Choosing the type of benchmark or standard 2. Setting performance standards –How good is good enough? 3. Setting standards for students’ collective performance –How many students should do well?

Michael earned 55 points on the midterm. Did he do well on the midterm?

To decide if Michael “did well,” we must compare his 55 against something else. –Benchmark –Standard –Target –Frame of reference –Lens –Criterion –“Brightline” The “something else” depends on what we want the test to tell us.

Suppose 35 is passing and 80 is a perfect score. Local standards Question answered: –Are our students meeting our standards? Challenge: –Establishing sound performance standards

Suppose 35 is passing and 80 is a perfect score on a published exam. External standards Question answered: –Are our students meeting external standards? Challenge: –Do the standards match what we think is important?

Suppose the class average is 65. Peer benchmark Question answered: –How do our students compare to peers? Challenge: –Identifying appropriate peers & collecting info from them

Suppose Michael scored 25 a year ago. Value-added benchmark Question answered: –Are our students improving? Challenges: –Transfers in or out –Motivating students on pre-test –Is growth due to us? –Imprecise assessments mask growth –Is this question relevant?

Suppose class average is 65 now and 40 three years ago. Historical trends benchmark Questions answered: –Are our teaching & curricula improving? –Are we getting better? Challenge: –Using the same assessment

Suppose Michael scored a 65 for knowledge and a 45 for real-world applications. Strengths and weaknesses benchmark Question answered: –What are our students’ relative strengths and areas for improvement? Challenge: –Getting “sub-scores” that are truly comparable

Suppose Michael’s 55 cost $400 and Melissa’s 55 cost $300. Productivity benchmark Question answered: –Are we getting the most for our investment? Challenges: –Calculating cost and benefit accurately –Keeping the focus on effectiveness as well as efficiency

Which benchmark or standard should you use? Each has advantages and disadvantages. Each gives a somewhat incomplete picture. Multiple perspectives give the most balanced picture of student learning.

2. Setting Performance Standards: How Good is Good Enough? Is Michael’s 55 good enough? –Why is 35 passing? –Why is being above average good enough? What level is minimally adequate? Why? What level is exemplary? Why?

Do some research. Have others set standards? –Disciplinary associations –Online searches –Colleagues in peer programs & colleges

Involve others. Employers Students Faculty in your program Faculty in more advanced programs

Use rubrics to articulate local standards. Rubrics have no rules! Minimally acceptable performance may be a minimum standard for –Every trait –The sum or average of all traits –Certain traits
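The three ways of defining a minimum standard listed above can be made concrete. A minimal sketch in Python; the trait names, scores, and cut score are hypothetical:

```python
# Hypothetical rubric scores for one student, on a 1-4 scale per trait.
scores = {"thesis": 3, "evidence": 2, "organization": 4, "mechanics": 3}
MINIMUM = 3  # assumed cut score for "minimally acceptable"

# Standard on every trait: each trait must meet the minimum.
passes_every_trait = all(s >= MINIMUM for s in scores.values())

# Standard on the average of all traits.
passes_average = sum(scores.values()) / len(scores) >= MINIMUM

# Standard on certain essential traits only.
essential = ["thesis", "evidence"]
passes_essential = all(scores[t] >= MINIMUM for t in essential)

print(passes_every_trait, passes_average, passes_essential)
```

Note that the same student can pass under one definition and fail under another, which is why the choice of rule needs to be made explicit before scoring.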

3. Setting Targets for Students’ Collective Performance Are you satisfied with your students’ overall performance?

Again involve others. Employers Students Faculty in your program Faculty in more advanced programs

“90% will score 65 or better” NOT “The average will be 72.” Express targets as percentages, not means.
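The reason to prefer percentages is that an average can hide how many students fall short of the target. A hypothetical illustration in Python (the scores and target are invented):

```python
# Hypothetical exam scores for ten students.
scores = [90, 88, 85, 80, 78, 75, 70, 40, 38, 36]
TARGET = 65  # assumed target score

mean = sum(scores) / len(scores)
pct_meeting = 100 * sum(s >= TARGET for s in scores) / len(scores)

# The mean looks comfortably above the target even though
# a substantial share of students scored well below it.
print(f"Average: {mean:.0f}")
print(f"Met target: {pct_meeting:.0f}%")
```

Here the average clears the target while three of ten students are far below it; "90% will score 65 or better" would catch that, "the average will be 72" would not.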

–Is this competency essential? Calculating dosages Making effective oral presentations –What is the cost of perfection? Vary your targets depending on the circumstances.

Isn’t poor performance the student’s fault? Sometimes, but usually only a few Suskie’s “50% rule”

“100% at least minimally adequate” “35% exemplary” Consider multiple targets.

Two Final Suggestions for Setting Standards and Targets 1. Use samples of student work to inform your discussions. –Would this paper embarrass us? –Would this paper make us proud? –Why? 2. View this as an iterative process.

Understanding Assessment Results

Tally the Results Percentages are better than averages. Round results. Sort results from highest to lowest.
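The tallying advice above can be sketched as a small script: count ratings, report rounded percentages, and sort from highest to lowest. The rubric levels and counts are hypothetical:

```python
from collections import Counter

# Hypothetical rubric ratings for 40 student papers.
ratings = (["Exemplary"] * 9 + ["Proficient"] * 18 +
           ["Developing"] * 10 + ["Inadequate"] * 3)

counts = Counter(ratings)
total = len(ratings)

# Report rounded percentages, sorted from highest to lowest count.
for level, n in counts.most_common():
    print(f"{level}: {round(100 * n / total)}%")
```

A table like this (percentages, rounded, sorted) is far easier for an audience to scan than a list of raw scores or a single mean.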

Discuss the Results Were enough students at least minimally adequate? Were enough students outstanding? Where did students do best? Worst? Where would you like students to do better?

Sharing Assessment Results

The current quest for accountability creates a precious opportunity for educators to tell the full range of stories about learning and teaching. Lee Shulman

Everyone has a different story to tell. Tell the story of your effectiveness to your audiences in your own way. Accurately Truthfully

Share only what your audience will find useful. Most information is useless. Give yourself permission to dismiss it.

Keep it fast and easy to find, read & understand. Short, simple Charts, graphs, and lists PowerPoint presentations No jargon

Focus on big news. Drill down only if needed.

What Questions Should You Address in Your Story?

1. How do you define a successful student? What knowledge, skills, competencies, and attributes does a successful student have? Why do you and your colleagues think these are important?

2. What have you learned about your students’ learning? Do your students meet your definition of success? Systematic, not anecdotal Include “direct” evidence

3. Are you satisfied with your results? Why or why not? Distinguish meaningful from trivial differences.

Celebrate Successes!

Publicize Successes Prospective students Alumni Foundations Government policymakers Employers

4. If you’re not satisfied, what are you doing about it? When, where, and how are you doing it?

Using Assessment Results to Improve Things

Improve Assessments? Are they poorly written and misinterpreted? Do they match your key learning goals? Are they too difficult for most responsible students? Are the benefits worth time & money invested?

Improve Learning Goals? Do you have too many goals? Do your goals need to be clarified? Are your goals inappropriate or overly ambitious?

Improve Curricula? Does the curriculum adequately address each learning goal? –Including placement and developmental education.

Improve Teaching Methods? How do students learn best?

Linking to Planning & Budgeting Should improvements be a top priority? –Update goals & plans accordingly Do improvements need funding?

MSCHE’s Fundamental Expectations for Assessment 1. Read the directions. 2. Keep it useful…and used. 3. Tie assessments to important goals. 4. Include some “direct” evidence. 5. Use multiple measures. 6. Keep doing something everywhere, every year.

Volunteer for Middle States Evaluation Teams! Go to the Middle States website and click on “Evaluators.” Consider joining as an Evaluation Team Associate.