Incorporating Authentic Assessment in the Classroom


1 Incorporating Authentic Assessment in the Classroom
Narrowing the Gulf Annual Conference 2011 March/April 2011

2 Narrowing the Gulf Annual Conference 2011
St. Petersburg College
Presenters: Dr. Carol Weideman, Mathematics Professor; Dr. James Coraggio, Director, Academic Effectiveness and Assessment
March 31 and April 1, 2011

3 Assessment Basics
Why do we assess?
To see how well we are doing
To confirm what we already know
To share our progress with others
To see where we can improve and change
In some cases, to demonstrate what does not work

4 Assessment Basics
Why do we assess?

5 Purpose of an Assessment
“Clearly delineate between those that know the content and those that do not.”
The goal is to determine whether the student knows the content, not whether the student is a good test-taker. Likewise, confusing and tricky questions should be avoided so that students who know the material are not led to incorrect responses.

6 Types of Assessments
Objective assessments
Authentic assessments

7 Objective Assessments
Measure several types (and levels) of learning
Cover wide content in a short period of time
Offer variations for flexibility
Easy to administer, score, and analyze
Scored more reliably and quickly

8 Types of Objective Tests
Written-response
  Completion (fill-in-the-blank)
  Short answer
Selected-response
  Alternative response (two options)
  Matching
  Keyed (like matching)
  Multiple choice

9 Small Group Assignment
Multiple Choice Assessment
Assignment objectives:
Solving equations using the addition and multiplication principles
Solving applied problems by translating to equations

10 Issues with Objective Assessments
Limited depth of content
Unable to reveal student misconceptions
Limited ability to test critical-thinking skills
Students are ‘test-wise’

11 Test-wise Students
Are familiar with item formats
Use informed and educated guessing
Avoid common mistakes
Have testing experience
Use time effectively
Apply various strategies to solve different problem types

12 Authentic Experiences
Is the course aligned with the expectations for the student in the ‘real world’?
Authentic Learning
Authentic Assessment

13 Class Without Authentic Experiences
Didactic instruction in which students are presented with factual information from a textbook
Assessment is primarily multiple-choice items on which students are expected to regurgitate factual information

14 Class With Authentic Experiences
Interactive learning environment in which students learn not only facts but also the relationships between those facts and the application of that information
Authentic assessment in which students model the applications of the discipline through simulations, projects, etc.

15 Authentic Assessments
Authentic assessments serve the dual purposes of encouraging students to think critically and providing assessment data for measuring improved student learning. These assessment techniques fall into three general categories: criterion-referenced rubrics, student reports (reflections or self-assessments), and student portfolios.

16 Authentic Assessments
Authentic assessments include:
Criterion-referenced rubrics. Complex, higher-order objectives can be measured only by having students create a unique product, whether written or oral: in-class essays, speeches, term papers, videos, computer programs, blueprints, or artwork (Carey, 2000).
Student reflection. Written reflection is espoused to have several important benefits: it can deepen the quality of critical thinking, increase active involvement in learning, and increase the student’s personal ownership of the new learning (Moon, 1999).
Student portfolios. Collections of students’ work over a course or a program; these can be an effective method of demonstrating student progress in the area of critical thinking (Carey, 2000).

17 Small Group Assignment
Authentic Assessment: Building a Fence Worksheet
Assignment objectives:
Solving equations using the addition and multiplication principles
Solving applied problems by translating to equations

18 Small Group Assignment
You want to fence in part of your backyard. The dimensions of the fenced yard are shown in the diagram below. The fence will have a 10 ft gate and a vertical support pole every 10 ft.
Gate: $100
Vertical support poles: $5 each
Chain-link fencing: $2 per foot

19 Small Group Assignment
A. Write and graph an equation for the cost of building a fence that has the given cost per foot and one gate. Let y = total cost of the fence and x = cost per foot.
B. Using the given information and the equation from Part A, how much will it cost to build the fence?
C. Suppose chain-link fencing is sold only in 50 ft rolls. You don’t want to waste or throw away any fencing. How much wider does the yard need to be to use all the fencing in the rolls?
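The cost computation in Part B can be sketched as a small program using the prices from the slide. This is a minimal sketch, not the worksheet's answer key: the yard diagram is not included in this transcript, so the 100 ft fence length below is a hypothetical stand-in, and "one support pole per 10 ft of fencing" is a modeling assumption.

```python
# A minimal sketch of the fence-cost computation, using the prices
# from the slide. The yard diagram is missing from this transcript,
# so the 100 ft fence length is a hypothetical stand-in, and
# "one pole per 10 ft of fencing" is a modeling assumption.

GATE_COST = 100    # one 10 ft gate: $100
POLE_COST = 5      # each vertical support pole: $5
COST_PER_FOOT = 2  # chain-link fencing: $2 per foot

def total_cost(fence_feet: int) -> int:
    """Total cost of the fence: one gate, plus poles, plus fencing."""
    poles = fence_feet // 10  # assumed: a support pole every 10 ft
    return GATE_COST + POLE_COST * poles + COST_PER_FOOT * fence_feet

print(total_cost(100))  # 100 + 5*10 + 2*100 = 350
```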

20 Rubrics
What is a rubric? Scoring guidelines, consisting of specific pre-established performance criteria, used to evaluate student work on performance assessments.

21 Rubrics
SPC currently uses rubrics in such programs as:
College of Education
College of Nursing
Paralegal Studies Program

22 Rubric Development Process
Re-examine the learning objectives to be addressed by the task
Identify the specific observable attributes your students should demonstrate
Describe the characteristics of each attribute
Write narrative descriptions for each level of the continuum
Collect samples of student work
Score the student work and identify samples that exemplify the various levels
Revise the rubric as needed
Repeat as needed

23 Assignment Profile
Designed to provide consistency and accuracy, as well as guidelines for use
A rubric is an evaluation ‘tool,’ but for a tool to be effective it must be applied to the correct situation or ‘job’: it would be inefficient to use a machete to conduct heart surgery
The rubric must be aligned to the most appropriate course assignment
The instructor, not the rubric, is the assessment instrument

24 Full Group Assignment
Rubric: STA2023 Sampling Project
Assignment objectives:
Identify the sampling strategies commonly employed to collect data
Describe potential biases encountered with the sampling strategies used in various statistical applications
Suggest strategies to avoid potential biases when using sampling to collect data for a statistical application

25 Full Group Assignment
Management at a retail store is concerned about the possibility of drug abuse by people who work there. They decide to check on the extent of the problem by having a random sample of the employees undergo a drug test. The lawyers for the retail store have assured management that there are no legal issues with the proposed drug testing as long as individual test results are not tied to a specific employee. Depending on the extent of illegal drug use identified in the testing, drug counseling may be offered to all employees under the promise of complete confidentiality. You have been hired as the statistician who will design the sampling plan. The budget for the drug testing will cover the cost of 40 drug tests.

26 Full Group Assignment
Management has proposed several different ideas about the best way to obtain the random sample of 40 employees who will be drug tested. There are currently 500 employees at this retail store, in four classifications: supervisors, full-time sales clerks, part-time sales clerks, and maintenance staff. The sampling possibilities are listed below:
1. Select one of the employee classifications and sample all employees in that classification.
2. Choose every fourth person who clocks in for each shift.
3. Randomly select 10 employees from each classification.
4. Each employee has a three-digit employee number. Randomly select 40 employees.
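The four proposals map onto standard sampling designs (census of one stratum, systematic, stratified, and simple random sampling). A minimal sketch of each, with hypothetical head counts per classification, since the scenario gives only the 500-employee total:

```python
import random

# A sketch of the four proposed sampling plans. The per-classification
# head counts are invented for illustration; the scenario states only
# that there are 500 employees in four classifications.
random.seed(1)  # for repeatability
employees = (
    [("supervisor", n) for n in range(50)]
    + [("full-time clerk", n) for n in range(200)]
    + [("part-time clerk", n) for n in range(200)]
    + [("maintenance", n) for n in range(50)]
)

# Plan 1: sample every employee in a single classification.
plan1 = [e for e in employees if e[0] == "maintenance"]

# Plan 2: systematic sampling -- every fourth person who clocks in,
# approximated here as every fourth record, capped at 40.
plan2 = employees[::4][:40]

# Plan 3: stratified sampling -- 10 randomly chosen from each classification.
plan3 = []
for group in ("supervisor", "full-time clerk", "part-time clerk", "maintenance"):
    plan3 += random.sample([e for e in employees if e[0] == group], 10)

# Plan 4: simple random sampling -- 40 of the 500 employees.
plan4 = random.sample(employees, 40)

print(len(plan1), len(plan2), len(plan3), len(plan4))  # 50 40 40 40
```

Note that only Plan 4 gives every employee an equal chance of selection, which is one way to frame the compare-and-contrast question that follows.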

27 Full Group Assignment
Answer the following questions regarding this scenario:
Define this problem in your own words.
Compare and contrast the four proposed sampling plans.
Select the proposed sampling plan that you feel is most appropriate for this situation and defend your choice.
Describe any weaknesses in your selected sampling plan.
Make suggestions on ways to improve or strengthen the sampling plan; you may include information not described in the scenario above.
Reflect on your own thought process after completing the assignment: “What did you learn from this process?” “What would you do differently next time to improve?”

28 Assessment Rubric for Critical Thinking
I. Communication: Define the problem in your own words.
Exemplary (4): Identifies the main idea or problem with numerous supporting details and examples, organized logically and coherently.
Proficient (3): Identifies the main idea or problem with some supporting details and examples in an organized manner.
Developing (2): Identifies the main idea or problem with few details or examples in a somewhat organized manner.
Emerging (1): Identifies the main idea or problem poorly, with few or no details, or states the main idea or problem verbatim from the text.
Not Present (0): Does not identify the main idea or problem.
II. Analysis: Compare and contrast the available solutions.
Exemplary (4): Uses specific inductive or deductive reasoning to make inferences regarding premises; addresses implications and consequences; identifies facts and relevant information correctly.
Proficient (3): Uses logical reasoning to make inferences regarding solutions; addresses implications and consequences; identifies facts and relevant information correctly.
Developing (2): Uses superficial reasoning to make inferences regarding solutions; shows some confusion regarding facts, opinions, and relevant evidence, data, or information.
Emerging (1): Makes unexplained, unsupported, or unreasonable inferences regarding solutions; makes multiple errors in distinguishing fact from fiction or in selecting relevant evidence.
Not Present (0): Does not analyze multiple solutions.
III. Problem Solving: Select and defend your final solution.
Exemplary (4): Thoroughly identifies and addresses key aspects of the problem and insightfully uses facts and relevant evidence from analysis to support and defend potentially valid solutions.
Proficient (3): Identifies and addresses key aspects of the problem and uses facts and relevant evidence from analysis to develop potentially valid conclusions or solutions.
Developing (2): Identifies and addresses some aspects of the problem; develops possible conclusions or solutions using some inappropriate opinions and irrelevant information from analysis.
Emerging (1): Identifies and addresses only one aspect of the problem but develops an untestable hypothesis, or develops invalid conclusions or solutions based on opinion or irrelevant information.
Not Present (0): Does not select and defend a solution.
May 21, 2010 Faculty Champion Meeting

29 Assessment Rubric for Critical Thinking
IV. Evaluation: Identify weaknesses in your final solution.
Exemplary (4): Insightfully interprets data or information; identifies obvious as well as hidden assumptions; establishes credibility of sources on points other than authority alone; avoids fallacies in reasoning; distinguishes appropriate arguments from extraneous elements; provides sufficient logical support.
Proficient (3): Accurately interprets data or information; identifies obvious assumptions; establishes credibility of sources on points other than authority alone; avoids fallacies in reasoning; distinguishes appropriate arguments from extraneous elements; provides sufficient logical support.
Developing (2): Makes some errors in interpreting data or information; makes arguments using weak evidence; provides superficial support for conclusions or solutions.
Emerging (1): Interprets data or information incorrectly; supports conclusions or solutions without evidence or logic; uses data, information, or evidence skewed by invalid assumptions; uses poor sources of information; uses fallacious arguments.
Not Present (0): Does not evaluate data, information, or evidence related to the final solution.
V. Synthesis: Suggest ways to improve or strengthen your final solution.
Exemplary (4): Insightfully relates concepts and ideas from multiple sources; uses new information to enhance the final solution; recognizes missing information; correctly identifies potential effects of new information.
Proficient (3): Accurately relates concepts and ideas from multiple sources; uses new information to enhance the final solution; correctly identifies potential effects of new information.
Developing (2): Inaccurately or incompletely relates concepts and ideas from multiple sources; makes a shallow determination of the effect of new information on the final solution.
Emerging (1): Poorly integrates information from more than one source to support the final solution; incorrectly predicts the effect of new information on the final solution.
Not Present (0): Does not identify new information for the final solution.

30 Assessment Rubric for Critical Thinking
VI. Reflection: Reflect on your own thought process. “What did you learn from this process?” “What would you do differently next time to improve?”
Exemplary (4): Identifies strengths and weaknesses in own thinking: recognizes personal assumptions, values, and perspectives; compares them to others’; and evaluates them in the context of alternate points of view.
Proficient (3): Identifies strengths and weaknesses in own thinking: recognizes personal assumptions, values, and perspectives; compares them to others’, with some comparison of alternate points of view.
Developing (2): Identifies some personal assumptions, values, and perspectives; recognizes some assumptions, values, and perspectives of others; makes shallow comparisons of alternate points of view.
Emerging (1): Does not consider alternate points of view.
Not Present (0): Does not reflect on own thinking.

31 Assessment Basics
Alignment of course objectives
Competency, clarity, bias, level of difficulty
Validity and reliability

32 Assessment Basics
Alignment: everything needs to align, from objectives through assessment

33 Competency
Items should test for the appropriate or adequate level of knowledge, skill, or ability (KSA) for the students. Assessing lower-division students on graduate-level material is an ‘unfair’ expectation. The competent student should do well on an assessment; items should not be written for only the top students in the class.

34 Clarity
Clear, precise items and instructions
Correct grammar, punctuation, and spelling
Each item addresses one single issue
Avoid extraneous material (teaching)
One correct or clearly best answer
Legible copies of the exam

35 Bias
Tests should be free from bias:
No stereotyping
No gender bias
No racial bias
No cultural bias
No religious bias
No political bias

36 Level of Difficulty
Ideally, a test should be aimed at a middle level of difficulty. This cannot always be achieved when the subject matter is based on specific expectations (e.g., a workforce area).

37 Trivial and Trick Questions
Avoid trivia and tricks
Avoid humorous or ludicrous responses
Items should be straightforward; they should cleanly delineate those who know the material from those who do not
Make sure every item has value and contributes to the final score

38 Assessment Basics
Does one size fit all?
Assessments need to be valid
Assessments need to be reliable

39 Validity
Does the assessment measure what it is supposed to measure?
“Validation is the process of accumulating evidence that supports the appropriateness of inferences that are made of student responses…” (AERA, APA, & NCME, 1999)

40 Types of Validity Evidence
Content-related: the extent to which a student’s responses to a given assessment reflect that student’s knowledge of the content area
Construct-related: the extent to which the responses being evaluated are appropriate indicators of the underlying construct
Criterion-related: the extent to which the results of the assessment correlate with a current or future event
Consequential: the consequences or use of the assessment results

41 Reliability
Consistency of the assessment scores
Types of reliability:
Interrater reliability: the extent to which scores vary from instructor to instructor
Intrarater reliability: the extent to which a single instructor’s scores vary from paper to paper
A test can be reliable and not valid, but never valid and not reliable

42 Reliability Concerns
Are the score categories well defined?
Are the differences between the score categories clear?
Would two independent raters arrive at the same score for a given student response based on the scoring rubric?

43 Improving Scoring Consistency
Provide grading rubrics or scoring criteria to students prior to the assessment
Grade papers anonymously
Use anchor papers to define levels of proficiency for reference
Use multiple scorers
Calculate reliability statistics during training and grading
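One reliability statistic that could be computed during rater training is Cohen's kappa, which measures agreement between two independent raters corrected for chance. A minimal sketch, with invented rubric scores (levels 0-4) for illustration:

```python
# A sketch of one interrater reliability statistic: Cohen's kappa
# for two independent raters. The rubric scores below are invented
# for illustration.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Agreement between two raters, corrected for chance agreement."""
    n = len(rater_a)
    # Observed proportion of papers on which the raters agree.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement from each rater's score distribution.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(
        (counts_a[c] / n) * (counts_b[c] / n)
        for c in set(rater_a) | set(rater_b)
    )
    return (observed - expected) / (1 - expected)

# Rubric levels 0-4 assigned by two raters to the same ten papers:
a = [4, 3, 3, 2, 4, 1, 0, 3, 2, 4]
b = [4, 3, 2, 2, 4, 1, 0, 3, 3, 4]
print(round(cohens_kappa(a, b), 2))  # 0.74
```

Values near 1 indicate strong agreement; values near 0 indicate agreement no better than chance, a signal that the score categories may need sharper definitions.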

44 Assessment Basics
Multiple measures
It is always good to implement multiple measures when possible, ideally both direct and indirect measures of competency

45 Indirect Methods
“Indirect measures … help deepen the interpretation of student learning” (Maki, 2004). The SSI is a good example of an indirect measure.

46 Questions/Next Steps

47 Incorporating Authentic Assessment in the Classroom
Narrowing the Gulf Annual Conference 2011 March/April 2011

