Quality Enhancement Plan (QEP) Team and Faculty Champions

Presentation transcript:

Assessment Rubric for Critical Thinking
Rubric Validation Process: Second Workshop
Quality Enhancement Plan (QEP) Team and Faculty Champions
December 12, 2008

Authentic Assessments
Authentic assessments serve the dual purposes of encouraging students to think critically and providing assessment data for measuring improved student learning. These assessment techniques fall into three general categories: criterion-referenced rubrics, student reports (reflections or self-assessments), and student portfolios.

Rubrics
What is a rubric? Scoring guidelines, consisting of specific, pre-established performance criteria, used in evaluating student work on performance assessments.
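To make the definition concrete, a rubric can be represented as a mapping from each performance criterion to its pre-established level descriptors. The sketch below is illustrative only: it borrows two of the ARC's element names, but the descriptors and the 1-3 scale are made up for the example and are not the actual ARC levels.

```python
# Illustrative sketch: a rubric as a mapping from criteria to level
# descriptors. Element names follow the ARC; the descriptors and the
# 1-3 scale are hypothetical placeholders, not the actual ARC levels.
RUBRIC = {
    "Communication": {
        3: "Defines the problem clearly and completely in the student's own words.",
        2: "Restates the problem with limited paraphrasing.",
        1: "Does not define the problem.",
    },
    "Analysis": {
        3: "Compares and contrasts all available solutions within the scenario.",
        2: "Mentions alternative solutions without comparing them.",
        1: "Considers only one solution.",
    },
}

def total_score(ratings: dict[str, int]) -> int:
    """Sum a rater's level choices across all rubric criteria."""
    return sum(ratings[criterion] for criterion in RUBRIC)

print(total_score({"Communication": 3, "Analysis": 2}))  # 5
```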

Criterion-referenced Rubrics
Complex, higher-order objectives can be measured only by having students create a unique product, whether written or oral, which may take the form of in-class essays, speeches, term papers, videos, computer programs, blueprints, or artwork (Carey, 2000).

Rubrics
SPC currently uses rubrics in programs such as:
- College of Education
- College of Nursing
- Paralegal

Assessment Rubric for Critical Thinking
- A global rubric template developed to provide a snapshot view of how student learning is being affected by the critical thinking initiative.
- Designed to be flexible enough to address a number of student project modalities, including written and oral communications.
- Evaluates the student's use of critical thinking skills in the development of the paper, as opposed to specifically evaluating the quality of the student's writing skills.

Assessment Rubric for Critical Thinking
Rubric development is an iterative process; the ARC will be improved and strengthened as it is used more widely.

Assessment Rubric for Critical Thinking
The ARC was designed by the QEP staff and the Faculty Champions to:
- Enhance the QEP
- Align with the College's definition of critical thinking
- Be flexible enough for use across multiple disciplines

Rubric Development Process
1. Re-examine the learning objectives to be addressed by the task.
2. Identify specific observable attributes students should demonstrate.
3. Describe the characteristics of each identified attribute.
4. Write narrative descriptions for each level of the continuum.
5. Collect samples of student work.
6. Score student work and identify samples that exemplify the various levels.
7. Revise the rubric as needed.
8. Repeat as needed.

Assessment Rubric for Critical Thinking
(The ARC rubric itself is displayed on the next three slides; its contents are not captured in this transcript.)

ARC Assignment Profile
The ARC Assignment Profile is designed to provide consistency and accuracy in the evaluation of the ARC at the institutional level, as well as guidelines for the use of the assessment at the course level. The ARC is essentially a tool for evaluating critical thinking, and for any tool to be effective it must be used in the correct situation, or 'job.' The purpose of the ARC Assignment Profile is to outline the most appropriate course assignment.

ARC Assignment Profile
1. Participating faculty should have one assignment during the course that can be evaluated using the ARC scoring rubric. The course assignment could be a graded homework assignment or a major assessment for the course.

ARC Assignment Profile
2. The course assignment for the ARC should include all of the elements of the rubric and should be aligned with the task outlined for each element. Assignments that evaluate only some of the elements, or that are not aligned with the specific ARC tasks, will be considered incomplete and will not be used in the institutional analysis.

ARC Assignment Profile
3. Faculty may add discipline-specific rubric elements (such as grammar and punctuation in a composition class), but must maintain the ARC elements as listed.

ARC Assignment Profile
4. Students should be provided a copy of the assignment rubric (the ARC plus any additional discipline-specific elements). The specific elements and tasks include:
- Communication: Define the problem in your own words.
- Analysis: Compare and contrast the available solutions within the scenario.
- Problem Solving: Select one of the available solutions and defend it as your final solution.
- Evaluation: Identify the weaknesses of your final solution.
- Synthesis: Suggest ways to improve or strengthen your final solution (you may use information not contained within the scenario).
- Reflection: Reflect on your own thought process after completing the assignment. "What did you learn from this process?" "What would you do differently next time to improve?"

ARC Assignment Profile
5. The evaluating scenario (selected or created) should be stated in such a manner as to allow the student to address each of the tasks. The QEP team is willing to assist with the creation of the scenario or to identify possible sources of existing scenarios that could be used.

ARC Assignment Profile
6. At the end of the semester, please send the completed student assignments to Janice Thiel, QEP Director, TE 1-111 (X3110). Completed student assignments should include a copy of the scenario, the assignment provided to the student (with the rubric), the student's work, and the final graded rubric.

ARC Assignment Profile

Competency (KSA) | Problem with Multiple Solutions | Premise with Multiple Perspectives
Communication    | Define                          | Define
Analysis         | Compare & contrast solutions    | Compare & contrast alternative perspectives
Problem Solving  | Select & defend final solution  | Select & defend final perspective
Evaluation       | Identify weaknesses             | Identify weaknesses
Synthesis        | Suggest improvements            | Suggest improvements
Reflection       | Reflect on thought process      | Reflect on thought process

Sample Scenario (Deer)
Three teenagers were seriously injured in a car accident when swerving to avoid a deer on a two-lane road near a small, rural town in Florida. The residents of the town have seen more and more deer enter the town's populated areas over recent years. Local law enforcement has been called numerous times this year to remove the animals from backyards and neighborhood streets, and one deer even caused considerable damage as it entered a restaurant in town. The mayor has been charged by the city leaders to keep the town residents safe.

Sample Scenario (Deer)
Local crops have also been damaged by the animals. Some long-time residents have requested that the hunting season and catch limits be extended in order to reduce the deer population. One city leader even proposed that the city purchase electronic devices to deter the deer from entering populated areas. Health concerns have recently been elevated, as three deer carcasses were found at the edge of town and local law enforcement suspects that the animals had been poisoned.

Sample Scenario (Deer)
Possible solutions:
- Extend the hunting season and catch limits in order to reduce the deer population (requested by some long-time residents).
- Purchase electronic devices to deter the deer from entering populated areas (proposed by one city leader).
- Poison the animals (local law enforcement suspects the three deer carcasses found at the edge of town were poisoned, which has elevated health concerns).

Rubric Development Process
1. Re-examine the learning objectives to be addressed by the task.
2. Identify specific observable attributes students should demonstrate.
3. Describe the characteristics of each identified attribute.
4. Write narrative descriptions for each level of the continuum.
5. Collect samples of student work.
6. Score student work and identify samples that exemplify the various levels.
7. Revise the rubric as needed.
8. Repeat as needed.

ARC Scoring Workshop Process
1. After the completion of this PowerPoint presentation, the workshop will begin with introductions from the participants.
2. Workshop participants will be provided the ARC as well as scoring worksheets, and additional instruction will be provided on the scoring process.
3. A sample test item will be presented on the screen, and various responses will be discussed and scored based on the scoring rubric given for that specific item.
4. Each scorer will then review the response provided for the first item on his/her first assessment and score it based on the scoring rubric. This process will be repeated for each of the five items on the assessment.

ARC Scoring Workshop Process
5. Scorers who encounter a response that does not clearly follow the rubric will discuss the response with the group for clarification.
6. Each scorer will then pass the scored assessment to his/her scoring partner, and the same assessments will be scored by the second scorer.
7. In the event that the two scores differ significantly, the facilitator will provide the assessment to a third scorer, and a third score will be recorded (a sketch of this double-scoring rule appears after the process steps).
8. When all scoring for an assessment is completed, the assessment will be provided to the facilitator.

ARC Scoring Workshop Process
9. Finally, steps 1 through 8 will be repeated for each assessment as time allows.
10. Workshop participants will complete the ARC Validity and Reliability Form at the end of the workshop.
Interrater reliability will also be calculated from the ARC ratings after the completion of the workshop. Rubric results will be reevaluated after each administration, and additional refinements and modifications may be made to the instrument: assessment development and validation is intended to be an ongoing, dynamic process designed to provide the best possible indicator of a student's skills.
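Steps 6 and 7 above amount to a simple double-scoring rule, sketched below in Python. The routine and its threshold are illustrative assumptions; the slides do not define what counts as a "significant" difference between two scores.

```python
def record_scores(score1: int, score2: int, request_third_score,
                  threshold: int = 2) -> list[int]:
    """Apply the workshop's double-scoring rule to one assessment.

    Two scorers rate the assessment independently; if their scores differ
    significantly, the facilitator routes it to a third scorer. The
    threshold of 2 rubric points is a hypothetical placeholder.
    """
    scores = [score1, score2]
    if abs(score1 - score2) >= threshold:
        scores.append(request_third_score())  # facilitator assigns a third rater
    return scores

# Example: scores of 4 and 1 differ significantly, so a third score is recorded.
print(record_scores(4, 1, request_third_score=lambda: 3))  # [4, 1, 3]
```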

Validity and Reliability

Validity
Does the rubric measure what it is supposed to measure? "Validation is the process of accumulating evidence that supports the appropriateness of inferences that are made of student responses…" (AERA, APA, & NCME, 1999)

Validity
- Consequences: the effects of the assessment
- Content Coverage: comprehensiveness of assessment content
- Content Quality: consistency with current content conceptualization
- Transfer and Generalizability: whether the assessment is representative of a larger domain
- Cognitive Complexity: whether the level of knowledge assessed is appropriate
- Meaningfulness: the relevance of the assessment in the minds of students
- Fairness: fairness to members of all groups
- Cost and Efficiency: the practicality or feasibility of the assessment

Validity
Consequences (the effects of the assessment):
- Is the assessment likely to produce results that will be used to improve instructional programs or otherwise improve student learning?
Content Coverage (comprehensiveness of assessment content):
- Does the assessment comprehensively cover the content and processes assessed?
- Is the content covered in sufficient breadth and depth?
- Does the assessment represent important (not trivial) components of the content?
- Together, will the assessments provide sufficient evidence about the content?

Validity
Content Quality (consistency with current content conceptualization):
- Is the assessment consistent with the best available conceptualization of the knowledge or skill assessed?
- Does the assessment represent current, rather than outdated, perspectives?
Transfer and Generalizability (whether the assessment is representative of a larger domain):
- Can the assessment results be generalized to the broader domain (knowledge, skill, or learning outcome) they are intended to represent?

Validity
Cognitive Complexity (whether the level of knowledge assessed is appropriate):
- Do the assessment tasks or questions represent the cognitive complexity of the knowledge or skill they are intended to assess? (For example, if an outcome includes higher-order or critical thinking skills, such as problem solving or synthesis, does the assessment measure them?)
- Does the assessment actually require students to use higher-level knowledge or skills, or can students simply respond from memory without having to think?
Meaningfulness (the relevance of the assessment in the minds of students):
- Are assessment items or tasks meaningful to students?
- Is the assessment relevant to problems students will encounter again in school, work, or daily living?
- Does the assessment provide students with worthwhile or meaningful experiences?

Validity
Fairness (fairness to members of all groups):
- Is the assessment biased against students who are members of various racial, ethnic, and gender groups, or students with disabilities?
- Does it contain stereotypes of any groups?
- Do students of similar ability, regardless of group membership, score the same?
Cost and Efficiency (the practicality or feasibility of the assessment):
- Is the assessment a reasonable burden on teachers, instructional time, and finances?
- Is the resulting information worth the required costs in money, time, and effort?

Reliability
Consistency of the assessment scores. Types of reliability:
- Interrater reliability: the degree to which scores vary from instructor to instructor
- Intrarater reliability: the degree to which a single instructor's scores vary from paper to paper
A test can be reliable and not valid, but never valid and not reliable.
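Interrater reliability is often quantified with a chance-corrected agreement statistic such as Cohen's kappa. The slides do not say which statistic the QEP team uses, so the Python sketch below is one common option, not the ARC's prescribed method.

```python
from collections import Counter

def cohens_kappa(rater_a: list[int], rater_b: list[int]) -> float:
    """Cohen's kappa: agreement between two raters, corrected for the
    agreement expected by chance. 1.0 is perfect agreement; 0.0 is
    chance-level agreement."""
    n = len(rater_a)
    assert n == len(rater_b) and n > 0
    # Proportion of papers on which the two raters agree exactly.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Agreement expected by chance, from each rater's score distribution.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(
        (counts_a[c] / n) * (counts_b[c] / n)
        for c in set(rater_a) | set(rater_b)
    )
    return (observed - expected) / (1 - expected)

# Example: two raters scoring six papers on a 1-4 rubric scale.
print(cohens_kappa([4, 3, 3, 2, 4, 1], [4, 3, 2, 2, 4, 1]))  # ≈ 0.78
```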

Reliability Concerns
- Are the score categories well defined?
- Are the differences between the score categories clear?
- Would two independent raters arrive at the same score for a given student response based on the scoring rubric?

Improving Scoring Consistency
- Provide the rubric to students prior to the assessment
- Score papers anonymously
- Use anchor papers defining levels of proficiency for reference
- Use multiple scorers
- Track interrater reliability statistics during training and grading

Next Steps
- The new Faculty Champions will administer coursework using the ARC rubric within their programs during the spring semester
- Faculty Champions will use the ARC Assignment Profile to ensure consistency
- The process will be repeated (steps 5-7 of the rubric development process)
