
TEN STEPS FOR ASSESSMENT

Step 1: Identify core standards (a.k.a. goals, outcomes, objectives)
- These should reflect the competencies you want to show that your students have achieved. You may have other questions about the processes of learning that you will add to your assessment system, but providing evidence of learning is the core goal.
- Standards usually have multiple levels (the sub-levels are commonly referred to as indicators). The farther down you drill into a standard, the greater the level of specificity about the knowledge, skills, and dispositions involved.

Step 2: List the key assignments
- Aim for 6-8 key assessments for each major element of a standard (the first level), with good distribution across the program and attention to development over time. This will provide a data set rich enough to drill down into sub-classes of competency achievement (see the sketch after this list).
- To be most effective and least invasive, these should be work samples that are already part of the normal curriculum. Since the goal is to determine whether the day-to-day program of learning is working, measuring behaviors that are not the norm for the organization largely defeats the purpose of getting "a read" on your current program.
- Second, the assignments should be program-embedded, so that they are assigned at the normal time and assessed only once.
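The 6-8 target is easy to sanity-check once the assignments are listed. Below is a minimal sketch in plain Python (not a Chalk & Wire feature; the assignment-to-standard map is hypothetical) that flags under- or over-assessed standards:

```python
from collections import Counter

# Hypothetical map of key assignments to the first-level standards they evidence.
key_assignments = {
    "Research Paper":   ["Standard A", "Standard B"],
    "Lab Portfolio":    ["Standard A"],
    "Capstone Project": ["Standard A", "Standard B"],
    "Field Journal":    ["Standard B"],
}

# Count how many key assessments touch each standard.
coverage = Counter(std for stds in key_assignments.values() for std in stds)

for standard, count in sorted(coverage.items()):
    status = "OK" if 6 <= count <= 8 else "revisit: aim for 6-8 key assessments"
    print(f"{standard}: {count} key assessments ({status})")
```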

Step 3: Analyze the assessment instruments for effectiveness and utility
1. Does the assessment instrument align with the standard?
2. Are the instruments aligned to a common scoring range?
3. Is the assessment written with the student in mind? Will students have no trouble visualizing what they need to show in order to get a high score?
4. Does the instrument describe observable and measurable elements of the student work?
5. Remember that a rubric cannot assess a "standard" directly. It can only assess, accurately, validly, and reliably, the observable work that you believe to be clear evidence of competency for one or more elements of one or more standards.

Step 4: Determine the unit/institutional performance labels and scores
- Before rubrics are defined, a common scale should be determined. It is not essential that all instruments follow the same scale, but consistency matters for aggregate reporting.
- We recommend a 4- or 5-point rubric:
  - A 2-point scale obscures important distinctions in performance.
  - A 3-point scale has both advantages and disadvantages (developmental vs. capstone).
  - Six or more levels slow down assessment and add little real information.
- If there is no consistent scoring scale, aggregate data is misleading: if one department chooses a higher maximum score than the others, its students will appear to be performing better.
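To see why mixed scales mislead, consider two departments whose students perform identically in relative terms. The sketch below (hypothetical scores, plain Python) normalizes each mean by its scale maximum before comparing:

```python
# Two departments score equivalent work on different scales.
dept_scores = {
    "History": {"max": 4, "scores": [3, 3, 4, 2]},      # mean 3.0 on a 4-point scale
    "Biology": {"max": 6, "scores": [4.5, 4.5, 6, 3]},  # mean 4.5 on a 6-point scale
}

for dept, data in dept_scores.items():
    raw_mean = sum(data["scores"]) / len(data["scores"])
    normalized = raw_mean / data["max"]
    print(f"{dept}: raw mean {raw_mean:.2f}, normalized {normalized:.2f}")

# Raw means (3.0 vs. 4.5) suggest Biology outperforms History, but the
# normalized means (0.75 vs. 0.75) show the performances are identical.
```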

Step 5: Develop "Tables of Contents" (TOCs)
- TOCs are an organized, logical outline of the key assignments, grouped in ways that will make sense to learners.
- The organization of the TOC has no impact on the data collection.
- Add value to the TOC by incorporating reflective practice. Reflection gives students the ability to discuss intelligently what they are learning and helps them take personal ownership of their learning; in short, they will know the value of what they have done. Without reflective practice, as far as many students are concerned, the only reason they handed in the assignment was that they were told to, in order to get a grade that would help them pass the course.

Step 6: Create assessment groups for the system (departments)
- This step gives you the ability to filter which TOCs, assessment instruments, and assessors students will see while they work and submit material for assessment. It also provides another way to isolate data in your reporting.
- You can assign one or more sub-administrators to a department.
- Sub-administrators can also run all forms of reporting, but only with data derived from TOCs and rubrics linked to their department.
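In reporting terms, a sub-administrator's view is simply the full data set filtered by department. A minimal sketch (generic Python; the record fields are hypothetical, not the Chalk & Wire data model):

```python
# Each assessment record carries the department its TOC and rubric belong to.
records = [
    {"student": "S1", "department": "English",   "rubric": "Essay",  "score": 4},
    {"student": "S2", "department": "Education", "rubric": "Lesson", "score": 3},
    {"student": "S3", "department": "English",   "rubric": "Essay",  "score": 2},
]

def department_report(records, department):
    """Return only the records a sub-administrator for `department` may see."""
    return [r for r in records if r["department"] == department]

print(department_report(records, "English"))  # two records; Education data is hidden
```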

Step 7: Enter your TOCs and rubrics into the system

STANDARD SETS (institutional outcomes, learning objectives, principles, goals)
  1. Standard A
     1.1 Indicator a1
     1.2 Indicator a2
  2. Standard B
     2.1 Indicator b1
     2.2 Indicator b2

TABLE OF CONTENTS (SECTIONS)
  Program X
    Assignment 1
    Assignment 2

RUBRICS
  Rubric #1: Criterion #1, Criterion #2, Criterion #3
  Rubric #2: Criterion #1, Criterion #2, Criterion #3
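These three structures are easy to picture as plain data. A sketch (ordinary Python dictionaries, not the product's actual storage format):

```python
# The three structures from the slide, expressed as plain data.
standard_set = {
    "Standard A": ["Indicator a1", "Indicator a2"],
    "Standard B": ["Indicator b1", "Indicator b2"],
}

table_of_contents = {
    "Program X": ["Assignment 1", "Assignment 2"],
}

rubrics = {
    "Rubric #1": ["Criterion #1", "Criterion #2", "Criterion #3"],
    "Rubric #2": ["Criterion #1", "Criterion #2", "Criterion #3"],
}

for rubric, criteria in rubrics.items():
    print(f"{rubric}: {len(criteria)} criteria")
```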

Don't link all criteria to all outcomes
- The link from a criterion to an outcome is the path a viewer follows from a claimed ability to the work that demonstrates that ability.
- For example, if the outcome is "respects ethnic diversity" and the assignment is a physics lab, there is really no relevant criterion, and the outcome should not be linked.
- Linking everything to everything will result in "mushy" results where all the numbers are average and no real information is provided.

Example of linking:

  Rubric 1 criteria: Orbital Dynamics, Spatial Geometry, Documentation, Background Research
  Rubric 2 criteria: Understanding of Period, Bibliography, Essay Construction, Cultural Sensitivity
  Standards: Course Content, Research Skills, Writing, Respect for Diversity
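Expressed as data, the linking is a sparse map from rubric criteria to standards. The specific pairings below are a plausible reading of the example, not a definitive one:

```python
# One plausible reading of the example above: each criterion links to the one
# standard it evidences, and nothing is linked "everything to everything".
links = {
    ("Rubric 1", "Orbital Dynamics"):        "Course Content",
    ("Rubric 1", "Spatial Geometry"):        "Course Content",
    ("Rubric 1", "Documentation"):           "Writing",
    ("Rubric 1", "Background Research"):     "Research Skills",
    ("Rubric 2", "Understanding of Period"): "Course Content",
    ("Rubric 2", "Bibliography"):            "Research Skills",
    ("Rubric 2", "Essay Construction"):      "Writing",
    ("Rubric 2", "Cultural Sensitivity"):    "Respect for Diversity",
}

# A physics-lab criterion would simply have no entry pointing at
# "Respect for Diversity" -- absent links are as informative as present ones.
for (rubric, criterion), standard in links.items():
    print(f"{rubric} / {criterion} -> {standard}")
```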

More about Rubrics (four main types)
- Holistic rubrics
- Analytical rubrics
- Rubrics with a "Not applicable" criterion level
- Additive rubrics

Note: "Not met/Met" rubrics are fine, but they tell you nearly nothing about performance. Avoid including their data in any reporting in which ranges of performance are considered.

More about Rubrics

a) Holistic rubrics: These rubrics have no separate criteria. They usually offer a broad, sometimes multi-faceted, description of the desired output, measured on a scale representing the degree to which the learner met the goal(s) stated in that description. Assignments that use holistic rubrics can be quicker to assess, and such assessment is no more or less accurate and valid than assessment with rubrics that break the performance out into specifically assessed criteria (analytical rubrics; see below). However, unless they also include a description of the "look-fors" for each possible score in the range, they cannot precisely inform student development. They are good tools for final summative evaluation, where the learner will not be permitted to improve the result by redoing the work.

More about Rubrics (cont.)

b) Analytical rubrics: These rubrics have multiple criteria and usually specify the attributes of each level of performance for each criterion. They are harder to write but invaluable to student progress, as they already contain key feedback for improvement. At first this may seem to make assessment slower, but it need not if the performance-level descriptions are robust, reducing the need for the assessor to write many repetitive, general comments. They are also very powerful tools for program improvement, as they provide granular statistics about every dimension of a performance. They should be tested informally prior to use to ensure that the language used is commonly understood, and care should be taken to write them so that students will understand the level descriptors easily. Chalk & Wire also has tools for tracking the reliability of rubrics once they are in use, as well as tools for conducting formal inter-rater reliability studies if you feel such a study is warranted (we highly recommend such studies).

Even more about rubrics

c) Additive rubrics: These rubrics are a variant of the analytical rubric in which criterion scores are added and totaled, rather than averaged, to get the final result.

d) Rubrics with "Not applicable" as an option: These are created by adding another level of performance to a rubric, leaving that level's score box blank for each criterion, and selecting the "Not applicable" check box for that level. The level description should read something like "This criterion was not applicable at this time and therefore not assessed." Where this level is applied, the score for that criterion will be ignored in all calculations.
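The scoring differences among these rubric types reduce to simple arithmetic. A minimal sketch (hypothetical criterion scores; `None` stands in for a "Not applicable" level):

```python
def analytic_score(scores):
    """Analytical rubric: average the criterion scores, skipping N/A (None)."""
    applicable = [s for s in scores if s is not None]
    return sum(applicable) / len(applicable)

def additive_score(scores):
    """Additive rubric: total the criterion scores, skipping N/A (None)."""
    return sum(s for s in scores if s is not None)

criterion_scores = [4, 3, None, 5]   # third criterion marked "Not applicable"

print(analytic_score(criterion_scores))  # 4.0 -- mean of 4, 3, and 5
print(additive_score(criterion_scores))  # 12  -- total of 4, 3, and 5
```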

Step 8: Link rubrics to the correct “main” or “sub-sections” of the TOC(s).

Step 9: Link standards to the elements of assessment instruments and the TOCs

STANDARD SETS
  1. Standard A
     1.1 Indicator a1
     1.2 Indicator a2
  2. Standard B
     2.1 Indicator b1
     2.2 Indicator b2

RUBRIC LINKING
  Rubric #1: Criterion #1, Criterion #2, Criterion #3
  Rubric #2: Criterion #1, Criterion #2, Criterion #3

TABLE OF CONTENTS (SECTIONS)
  Program X
    Assignment 1
    Assignment 2

Step 10: Enter demographic variables
- If a question is relevant to all students, it should be assigned to "all departments" so it is only asked once. Examples: gender, race, citizenship, age.
- Demographic questions may be altered after assessment is under way, as long as students are continuing to submit work.
- Altered questions will be presented to students again, and reporting will present the new breakdown, regardless of when the assessment was done.
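At reporting time, demographic answers become grouping keys for assessment data. A minimal sketch (hypothetical records, plain Python) of breaking out scores by one demographic question:

```python
from collections import defaultdict

# Hypothetical submissions: an assessment score plus one demographic answer.
submissions = [
    {"score": 4, "citizenship": "Domestic"},
    {"score": 3, "citizenship": "International"},
    {"score": 5, "citizenship": "Domestic"},
    {"score": 4, "citizenship": "International"},
]

groups = defaultdict(list)
for sub in submissions:
    groups[sub["citizenship"]].append(sub["score"])

for answer, scores in sorted(groups.items()):
    print(f"{answer}: mean score {sum(scores) / len(scores):.2f} (n={len(scores)})")
```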

Types of Data
- Anything that describes the circumstances of a learner (race, ethnicity, gender, location, etc.) is considered demographic data.
- Data that is the product of a learning event is assessment data.

Review/Overview
- Tables of Contents: Sections
  - Structure a portfolio of assignments for a specific group of students.
- Rubrics: Criteria
  - The assessment framework: giving feedback to students on areas for improvement, and assembling work of a specific level for accreditation or institutional research.
- Standards: Outcomes
  - The entry point for accreditation, linking the internal criteria of the institution to the external or internal outcomes expected by the school or accreditors.
  - A campus-wide method of assessing where student performance matches institutional goals.
- Demographics
  - Questions about the origins and education of students, used to break out performance data in reporting.