
1
Assessment in the Service of Learning: the roles and design of high-stakes tests
Hugh Burkhardt
MARS: Mathematics Assessment Resource Service
Shell Center, University of Nottingham and UC Berkeley
Oakland Schools, October 2012
Mathematics Assessment Project

2
Structure of this talk
A word on the Common Core
The roles of assessment
Tasks and tests
Task difficulty and levels of understanding
SBAC (and PARCC)
Computer-based testing
Testing can be designed to serve learning.

4
Content: getting richer
Practices: much deeper and richer

5
The Practices in CCSS-M:
1. Make sense of problems and persevere in solving them.
2. Reason abstractly and quantitatively.
3. Construct and critique viable arguments.
4. Model with mathematics.
5. Use appropriate tools strategically.
6. Attend to precision.
7. Look for and make use of structure.
8. Look for and express regularity in repeated reasoning.

6
The Roles of Assessment
The traditional view: tests are just measurement, valid and reliable (if a little strange-looking).
Reality: tests have three roles:
1. Measuring a few aspects of math-related performance
2. Defining the goals by which students and teachers are judged
3. Driving the curriculum
This places a huge responsibility on those who test.

7
High-stakes assessment implicitly:
Exemplifies performance objectives. For most teachers, and the public, test tasks are assumed to exemplify the standards – so they effectively replace them.
Determines the pattern of teaching and learning. FACT: most teachers teach to the test – a perfectly reasonable response to the bottom line.
Taking the standards seriously implies designing tests that meet them: tests worth teaching to, that enable all students to show what they can do.

8
WHAT YOU TEST IS WHAT YOU GET

9
Mathematical Practices
"Proficient students expect mathematics to make sense. They take an active stance in solving mathematical problems. When faced with a non-routine problem, they have the courage to plunge in and try something, and they have the procedural and conceptual tools to carry through. They are experimenters and inventors, and can adapt known strategies to new problems. They think strategically." – CCSSM
How far do our current tests assess this? Not far?

10
Tasks and Tests

11
Levels of mathematical expertise
It is useful to distinguish task levels, showing increasing emphasis on mathematical practices.
Novice tasks: short items, each focused on a specific concept or skill, as set out in the standards (cf. ELA spelling, grammar).
Apprentice tasks: rich tasks with scaffolding, structured so that students are guided through a ramp of increasing challenge.
Expert tasks: rich tasks in the form in which they might naturally arise – in the real world or in pure mathematics (cf. ELA writing).

12
Task examples

14
Some Expert Tasks
Tasks that are not predigested. Problems as they might arise:
in the world outside the math classroom
in really doing math

15
Expert Task: Traffic Jam
1. Last Sunday an accident caused a traffic jam 11 miles long on a two-lane highway. How many cars do you think were in the traffic jam? Explain your thinking and show all your calculations. Write down any assumptions you make. (Note: a mile is approximately equal to 5,000 feet.)
2. When the accident was cleared, the cars drove away from the front, one car from each of the lanes every two seconds. Estimate how long it took before the last car moved.
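One back-of-envelope solution to the Traffic Jam task can be sketched as follows. The figure of 25 feet per stopped car (car length plus gap) is my assumption, not part of the task; any reasonable value in the 20–30 foot range gives an estimate of the same order.

```python
# Back-of-envelope estimate for the Traffic Jam task.
# Assumption (mine, not the task's): each stopped car occupies
# about 25 feet of lane, including the gap to the car in front.

MILE_FEET = 5_000          # the task's stated approximation
JAM_MILES = 11
LANES = 2
FEET_PER_CAR = 25          # assumed car length plus gap

jam_feet = JAM_MILES * MILE_FEET
cars_per_lane = jam_feet // FEET_PER_CAR
total_cars = cars_per_lane * LANES
print(total_cars)          # about 4,400 cars under these assumptions

# Part 2: one car leaves each lane every 2 seconds,
# so the jam drains at LANES / 2 cars per second.
drain_rate = LANES / 2     # cars per second
seconds = total_cars / drain_rate
print(seconds / 60)        # roughly 73 minutes before the last car moves
```

The point of the task is not the final number but the explicit, defensible assumptions behind it.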

16
Airplane turnaround How quickly could they do it?

17
Ponzi Pyramid Schemes
Max has just received this email:
From: A. Crook
To: B. Careful
Do you want to get rich quick? Just follow the instructions below carefully and you may never need to work again:
1. Below there are 8 names and addresses. Send $5 to the name at the top of this list.
2. Delete that name and add your own name and address at the bottom of the list.
3. Send this email to 5 new friends.

18
Ponzi continued
If that process goes as planned, how much money would be sent to Max? What could possibly go wrong? Explain your answer clearly. Why are Ponzi schemes like this made illegal?
This task involves:
Formulating the problem mathematically
Understanding exponential growth
Knowing it can't go on forever, and why

21
PYTHAGOREAN TRIPLES

22
The triples (3, 4, 5), (5, 12, 13), (7, 24, 25) and (9, 40, 41) satisfy the condition that the natural numbers (a, b, c) are related by c² = a² + b².
Investigate the relationships between the lengths of the sides of triangles which belong to this set.
Use these relationships to find the numerical values of at least two further Pythagorean triples which belong to this set.
Investigate rules for finding the perimeter and area of triangles which belong to this set when you know the length of the shortest side.
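A sketch of one line of investigation (my working, not part of the task): in each listed triple the hypotenuse exceeds the longer leg by 1, so c - b = 1 and (c - b)(c + b) = a² gives b = (a² - 1)/2 for odd shortest side a.

```python
# Triples in this family: c = b + 1, so c + b = a**2 for odd a.

def triple(a):
    """Return the (a, b, c) triple in this family for odd shortest side a."""
    assert a % 2 == 1 and a >= 3
    b = (a * a - 1) // 2
    c = b + 1
    return a, b, c

for a in (3, 5, 7, 9, 11, 13):
    print(triple(a))
# (11, 60, 61) and (13, 84, 85) are two further members.

def perimeter(a):
    return a * (a + 1)            # a + b + c = a + a**2

def area(a):
    return a * (a * a - 1) // 4   # (1/2) * a * b
```

Both the perimeter and area rules follow directly from the closed forms for b and c, which is the kind of structural observation the task is after.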

23
Which sport will give a graph like this? Describe in detail how your answer fits the graph – as in a radio commentary.
("Which sport?" is a task from the literature, 1982.)

24
Table Tiles
Maria makes square tables, then sticks tiles to the top. Square tables have sides that are multiples of 10 cm. Maria uses quarter tiles at the corners and half tiles along the edges.
How many tiles of each type are needed for a 40 cm × 40 cm square?
Describe a method for quickly calculating how many tiles of each type are needed for larger square table tops.
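One way the counting can be organized in code. The layout is my assumption, since the task's diagram is not reproduced here: square tiles set diagonally, each with a 10 cm diagonal, so a whole tile covers 50 cm², a half tile 25 cm², and a quarter tile 12.5 cm².

```python
# Tile counts for a square table of side 10*n cm, under the assumed
# diagonal-tile layout (quarter tiles at corners, half tiles on edges).

def tile_counts(side_cm):
    assert side_cm % 10 == 0
    n = side_cm // 10
    quarters = 4                         # one at each corner
    halves = 4 * (n - 1)                 # along the four edges
    wholes = n * n + (n - 1) * (n - 1)   # two interleaved diagonal grids
    return quarters, halves, wholes

q, h, w = tile_counts(40)
print(q, h, w)    # 4 quarter tiles, 12 half tiles, 25 whole tiles

# Sanity check: the pieces should exactly cover the table top.
assert q * 12.5 + h * 25 + w * 50 == 40 * 40
```

The area identity holds for every n, which is a useful self-check on the rule students generalize to.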

27
Apprentice tasks
Expert tasks with added scaffolding to:
ease entry
reduce strategic demand
Ramp of difficulty within the task, with increasing:
complexity
abstraction
demand for explanation
Balanced Assessment in Mathematics (BAM) tests are of this kind – complementing state tests (novice).

30
Apprentice tasks: design
Guide students through a ramp of challenge. Patchwork gives:
Multiple examples that ease understanding
Specific numerical cases to explore – counting
A helpful representation – the table
Only then does it:
Ask for a generalization – rule, formula
Present an inverse problem
A step in growing expertise: climbing with a guide.

33
Task Difficulty
The difficulty of a task depends on various factors:
Complexity
Unfamiliarity
Technical demand
Autonomy expected of the student
Expert tasks fully involve the mathematical practices and all four aspects, so must not be too technically demanding.
Apprentice tasks involve the mathematical practices at a modest level, with little student autonomy.
Novice tasks present mainly technical demand, so this can be up to grade, including concepts and skills just learnt.

34
Levels of understanding
Imitation
Retention
Explanation – chains of reasoning (2nd sentence?)
Adaptation – requires non-routine problems
Extension – offer opportunities
Jean Piaget

35
The Practices in CCSS-M:
1. Make sense of problems and persevere in solving them.
2. Reason abstractly and quantitatively.
3. Construct and critique viable arguments.
4. Model with mathematics.
5. Use appropriate tools strategically.
6. Attend to precision.
7. Look for and make use of structure.
8. Look for and express regularity in repeated reasoning.

36
These haven't been a focus of testing… but they will be – maybe.

37
Smarter Balanced Assessment Consortium http://www.k12.wa.us/smarter/ (Just google SBAC)

39
Here are some of the headlines.

40
SMARTER Balanced content spec
Claim #1 – Concepts & Procedures: Students can explain and apply mathematical concepts and interpret and carry out mathematical procedures with precision and fluency.
Claim #2 – Problem Solving: Students can solve a range of complex, well-posed problems in pure and applied mathematics, making productive use of knowledge and problem-solving strategies.
Claim #3 – Communicating Reasoning: Students can clearly and precisely construct viable arguments to support their own reasoning and to critique the reasoning of others.
Claim #4 – Modeling and Data Analysis: Students can analyze complex, real-world scenarios and can construct and use mathematical models to interpret and solve problems.
PARCC so far seems less specific; mainly CCSSM content standards.

42
Total Score for Mathematics
Content and Procedures score: 40%
Problem Solving score: 20%
Communicating Reasoning score: 20%
Mathematical Modeling score: 20%
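The weighting above as a simple calculation. The component scores here are illustrative numbers of my own, not from the talk or the SBAC spec.

```python
# Composite score from the four SBAC claim weights (40/20/20/20).
# Component scores (0-100 scale here) are hypothetical examples.

WEIGHTS = {
    "content_and_procedures": 0.40,
    "problem_solving": 0.20,
    "communicating_reasoning": 0.20,
    "mathematical_modeling": 0.20,
}

def total_score(components):
    """Weighted sum of the four component scores."""
    return sum(WEIGHTS[k] * components[k] for k in WEIGHTS)

example = {
    "content_and_procedures": 80,
    "problem_solving": 60,
    "communicating_reasoning": 70,
    "mathematical_modeling": 50,
}
print(total_score(example))   # 68.0
```

The design point is visible in the arithmetic: 60% of the total now rides on practices-oriented claims rather than content and procedures alone.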

43
So: a large part of the exam will be devoted to things we haven't tested before.
But – there is THE CAT.

44
Computer-based testing
Promises of cheap, instant, adaptive testing. Great strengths and, even after 70 years, weaknesses.
Key questions: for rich tasks, does CBT provide:
Effective handling of the testing process?
Better ways of presenting tasks?
A natural medium for students to work on math?
Effective ways to capture a student's reasoning?
Reliable ways to score a student's response?
Effective ways of collecting and reporting results?

46
Computer-based testing: summary
Best way to manage high-stakes testing. Fine on its own for Novice-level tasks (short items).
Expert and Apprentice tasks essentially involve:
long chains of autonomous student reasoning
sketching and doodling: diagrams, numbers, equations
This needs:
image capture (paper, scan, or ? off tablet screen)
human scoring (on screen) – responses too diverse for computer
CBT can improve testing in various ways; for analysis see the Educational Designer lead article in Issue 5, out soon.

47
SBAC test structure
Three components planned:
CAT: computer-adaptive on-line test
a set of rich constructed-response items
a classroom-based performance task (up to 2 periods)
Task types: extended examples in the content spec.
PARCC also has an end-of-course CAT plus periodic assessments during the year – their nature open to creative input by educators and vendors.

48
SBAC and PARCC: some impressions and comments on plans and challenges
Some seem desperate to stick with computer-based testing.

49
Here's a sample PARCC modeling item.

51
What? Mad Libs on a math test? Think about WYTIWYG!
20 days of test prep – playing math-related video games.

52
Cost-effective human scoring?
Some standard approaches:
on-screen professional scorers
on-screen trained teacher-scorers
get-together training-scoring meetings
Factors to be weighed (cf. ~$2,000 per year):
marginal cost
consistency (reliability), with monitoring
professional development gain
non-productive test-prep class time saved
Needs collaboration at system level: math; assessment; PD.

53
Some comments
Realising these goals takes people outside their comfort zone, particularly in the design of:
o rich tasks that work well with kids in exams
o implementation mechanisms that work smoothly
o processes that will have public credibility
Cost containment requires some integration of curriculum, tests and professional development.
Lots of experience, worldwide and some US, using:
o the literature of rich tasks
o modes of teacher involvement (e.g. SVMI)

54
Pushback is inevitable, from:
fear of time, cost, litigation, … anything new
psychometric tradition and habit:
o testing is just measurement
o focus on statistics, ignoring systematic error, i.e. not measuring what you're interested in
overestimating in-house expertise (principles fine; tasks often lousy)
Good outcomes will depend on close collaboration of assessment folk, math folk, and outside expertise.

55
But the quality of the tests is crucial
"It is now widely recognized that high-stakes assessments establish the ceiling with regard to performance expectations in most classrooms – or, to put it another way, the lower the bar, the lower people will aim. Accordingly, SBAC will seek to ensure that a student's success on its assessments depends on a learning program that reflects the Common Core State Standards for Mathematics in a rich and balanced way. This is the nation's best chance – for the next decade at least – to move the system in the right directions. Because of the high-stakes testing ceiling effect described above, credibly assigned scores on performance tasks will need to be a major part of the score reporting."
– From an earlier draft of the SBAC content specs

56
TESTING HAS BEEN DONE WELL – AND COULD BE AGAIN

57
Structure of this talk
A word on the Common Core
The roles of assessment
Tasks and tests
Task difficulty and levels of understanding
SBAC (and PARCC)
Computer-based testing
Testing can be designed to serve learning.

58
Thank you
Contact: Hugh.Burkhardt@nottingham.ac.uk
Lessons and tasks: http://map.mathshell.org.uk/materials/
Also: ISDDE report on assessment design: http://www.mathshell.org/papers/
