Struggling for meaning in standards-based assessment
Mark Wilson, UC Berkeley

Outline
What do we mean by “standards-based” assessments?
Some current solutions to the problem of assessing standards
An alternative:
– Learning performances
– Learning progressions
– Progress variables

What do we mean by “standards-based” assessments?
What people often think they are getting:
– A useful result for each standard (the “ideal approach”)
– The illusion of “standards-based” assessments
What they are usually getting:
– A single result that is somehow related to all, or a subset of, the standards
– The reality of “standards-based” assessments

How standards-based is “standards-based”?
“Fidelity”: how well do the assessments match the standards?
High fidelity: each standard has its own usable result
Moderate fidelity: each standard is represented by at least one item in the assessments
Low fidelity: the items match only some of the standards

Why can’t each standard be assessed?
Fidelity versus cost, when total cost is fixed
[Chart: fidelity and total cost ($ per item) as functions of the number of items]
That is, in the “ideal approach” we need so many items per standard that we can’t afford it.
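To make the cost constraint concrete, here is a minimal back-of-the-envelope sketch in Python. Every number in it (budget, cost per item, number of standards, items needed per standard) is a hypothetical illustration, not a figure from the talk.

```python
# Fidelity versus cost under a fixed total budget: a hypothetical sketch.

TOTAL_BUDGET = 50_000   # fixed total cost in dollars (invented)
COST_PER_ITEM = 250     # development + scoring cost per item (invented)
N_STANDARDS = 60        # a plausibly sized state framework (invented)

affordable_items = TOTAL_BUDGET // COST_PER_ITEM      # 200 items
items_per_standard = affordable_items / N_STANDARDS   # ~3.3 items

# Suppose a usable per-standard result ("high fidelity") needs about
# 9 items per standard for tolerable reliability:
needed_items = 9 * N_STANDARDS                        # 540 items
needed_budget = needed_items * COST_PER_ITEM          # $135,000

print(f"Affordable: {items_per_standard:.1f} items per standard")
print(f"Budget needed for per-standard reporting: ${needed_budget:,}")
```

Under these invented numbers the budget covers roughly a third of the items that per-standard reporting would require, which is the sense in which the “ideal approach” is unaffordable.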

Common Solutions: “Standards-based”
One item (more or less) per standard
– Not enough for an actual assessment of each standard
– Also used to provide emphasis among standards (i.e., “gold standards”)
Sample standards over time
Assess only a certain subset of the standards
Validate through “alignment review”
Decide to have a much smaller set of standards
– Popham’s “instructionally sensitive assessments”

E.g. #1

E.g. #2

“Standards-based” assessments
Do not have high fidelity to the standards
Are what can be afforded
Still maintain a “threat” effect
– Although the low density of items per standard means that the “threat” on any one standard is low

Thinking about an Alternative
“A mile wide and an inch deep”
– the now-classic criticism of US curricula in Mathematics and Science
Need for standards to be interpretable by educators, policy-makers, etc.
Need to enable a long-term view of student growth
Need to find a more efficient way to use item information than in the “ideal approach”

Learning Performances
Learning performances: a way of elaborating on content standards by specifying what students should be able to do when they achieve a standard
– E.g., students should be able to describe phenomena, use models to explain patterns in data, construct scientific explanations, or test hypotheses
– Reiser (2002); Perkins (1998)

Learning performance example
Benchmark (AAAS, 1993):
– [The student will understand that] individual organisms with certain traits are more likely than others to survive and have offspring
LP expansion (Reiser et al., 2003):
– Students identify and represent mathematically the variation on a trait in a population.
– Students hypothesize the function a trait may serve and explain how some variations of the trait are advantageous in the environment.
– Students predict, supported with evidence, how the variation on the trait will affect the likelihood that individuals in the population will survive an environmental stress.

Learning progressions
Learning progressions: descriptions of the successively more sophisticated ways of thinking about an idea that follow one another as students learn
– Also known as learning trajectories, progressions of developmental competence, and profile strands
More than one path leads to competence
Need to engage in curriculum debate about which learning progressions are most important
– Try to choose them so that we end up with fewer standards per grade level

Learning progression examples
Evolutionary Biology
– Catley, K., Reiser, B., and Lehrer, R. (2005). Tracing a prospective learning progression for developing understanding of evolution.
Atomic-Molecular Theory
– Smith, C., Wiser, M., Anderson, C.W., Krajcik, J., and Coppola, B. (2004). Implications of research on children’s learning for assessment: Matter and atomic molecular theory.
Both available at:
– http://www7.nationalacademies.org/bota/Test_Design_K-12_Science.html

Progress Variables
Progress variable: the assessment expression of a learning progression
The aim is to use what we know about meaningful differences in item difficulty to make the interpretation of the results more efficient
– Borrow interpretative and psychometric strength from easier and more difficult items, so that we don’t need as many items as the “ideal approach” does
Progress variables are a principal component of the BEAR Assessment System (Wilson, 2005; Wilson & Sloane, 2000)
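As a concrete (and invented) illustration of the idea, a progress variable can be sketched in code as a set of ordered levels on one latent scale. The cut points and level labels below are hypothetical, loosely echoing the sink-and-float example used later in the talk.

```python
from bisect import bisect_right

# Hypothetical cut points (in logits) dividing one latent scale into
# ordered levels; the labels loosely paraphrase a sink-and-float
# construct map and are invented for illustration.
CUTS = [-1.0, 0.5, 2.0]
LEVELS = [
    "Level 0: off-track or unrelated ideas",
    "Level 1: reasons from a single property (weight OR size)",
    "Level 2: coordinates weight and size; density emerging",
    "Level 3: explains sinking and floating via relative density",
]

def level_of(theta: float) -> str:
    """Map an estimated student location theta to its level label."""
    return LEVELS[bisect_right(CUTS, theta)]

print(level_of(-0.2))  # -> "Level 1: reasons from a single property ..."
```

Because easier and harder items are calibrated onto the same scale, they all sharpen the estimate of theta, which is what lets the approach get by with fewer items per standard than the “ideal approach”.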

The BEAR Assessment System
4 principles: 4 building blocks
Examples provided by: [source shown as an image on the original slide]

Principle 1: Developmental Perspective
Building Block 1: Construct Map
Developmental perspective
– The assessment system should be based on a developmental perspective of student learning
Progress variable
– A visual metaphor for how the students develop, and for how we think their item responses might change

Example: Why things sink and float

Principle 2: Match between curriculum and assessment
Building Block 2: Items design
Instruction & assessment match
– There must be a match between what is taught and what is assessed
Items design
– A set of principles that allows one to observe the students under a set of standard conditions that span the intended range of the item contexts

Example: Why things sink and float
Please answer the following question. Write as much information as you need to explain your answer. Use evidence, examples, and what you have learned to support your explanations.
Why do things sink and float?

Principle 3: Interpretable by teachers
Building Block 3: Outcome space
Management by teachers
– Teachers must be the managers of the system, and hence must have the tools to use it efficiently and to use the assessment data effectively and appropriately
Outcome space
– The categories of student responses must make sense to teachers
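One minimal way to picture an outcome space in code (an invented illustration, not part of the BEAR materials): ordered, teacher-readable response categories whose keys double as scores, plus the kind of class tally a teacher might run. The category wording is hypothetical, loosely following the sink-and-float question above.

```python
from collections import Counter

# Invented, teacher-readable categories for the sink-and-float question;
# the ordered keys double as scores for the measurement model.
OUTCOME_SPACE = {
    0: "No answer, or restates the phenomenon",
    1: "Cites a single property (e.g., 'heavy things sink')",
    2: "Coordinates weight and size informally",
    3: "Explains via relative density of object and medium",
}

# Hypothetical scored responses from one class:
scores = [2, 1, 3, 2, 0, 2, 1, 1, 2, 3]

tally = Counter(scores)
for category, description in OUTCOME_SPACE.items():
    print(f"{category} ({description}): {tally[category]} students")
```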

Example: Why things sink and float

Principle 4: Evidence of quality
Building Block 4: Measurement model
Evidence of quality
– Reliability and validity evidence; evidence for fairness
Measurement model
– Multidimensional item response models, to provide links over time, both longitudinally within cohorts and across cohorts
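To make the measurement model concrete, here is a minimal sketch of the simplest model in the item response family, the unidimensional Rasch model; the slide itself refers to multidimensional extensions, and all ability and difficulty values below are hypothetical.

```python
import math

def rasch_prob(theta: float, delta: float) -> float:
    """Rasch model: probability that a student at location theta answers
    an item of difficulty delta correctly; both sit on one logit scale."""
    return 1.0 / (1.0 + math.exp(-(theta - delta)))

# Because persons and items share a single scale, easy and hard items all
# carry information about the same student location -- the "borrowed
# strength" mentioned under progress variables. Values are hypothetical.
theta = 0.5
for delta in (-1.5, 0.0, 1.5):
    print(f"P(correct | theta={theta}, delta={delta:+.1f}) = "
          f"{rasch_prob(theta, delta):.2f}")
```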

Example: Evaluate progress of a group
[Chart of the group’s locations on the progress variable; the item and level labels are not legible in the transcript]

Evaluate a student’s locations over time
[Chart of locations from embedded assessments over time]

BEAR Assessment System: Principles
Developmental Perspective: need a framework for communicating meaning
Match between Instruction and Assessment: need methods of gathering data that are acceptable and useful to all participants
Interpretable by Teachers: need a way to value what we see in student work
Evidence of Quality: need a technique for interpreting data that allows meaningful reporting to multiple audiences

In conclusion…
Achieving meaningful measures is tough under any circumstances, but especially so in an accountability situation, where the requirements for accountability and the scale of the evaluation make it very expensive.
Strategies like learning performances, learning progressions, and progress variables are needed to make meaning possible, and affordable.

References
American Association for the Advancement of Science. (1993). Benchmarks for science literacy. New York: Oxford University Press.
Catley, K., Reiser, B., and Lehrer, R. (2005). Tracing a prospective learning progression for developing understanding of evolution. Commissioned paper prepared for the National Research Council’s Committee on Test Design for K-12 Science Achievement, Washington, DC.
Reiser, B.J., Krajcik, J., Moje, E., and Marx, R. (2003). Design strategies for developing science instructional materials. Paper presented at the National Association for Research in Science Teaching Annual Meeting, March, Philadelphia, PA.
Smith, C., Wiser, M., Anderson, C.W., Krajcik, J., and Coppola, B. (2004). Implications of research on children’s learning for assessment: Matter and atomic molecular theory. Commissioned paper prepared for the National Research Council’s Committee on Test Design for K-12 Science Achievement, Washington, DC.
Wilson, M. (2005). Constructing measures: An item response modeling approach. Mahwah, NJ: Lawrence Erlbaum Associates.
Wilson, M., & Bertenthal, M. (Eds.). (2005). Systems for state science assessment. Report of the Committee on Test Design for K-12 Science Achievement. Washington, DC: National Academy Press.
Wilson, M., and Sloane, K. (2000). From principles to practice: An embedded assessment system. Applied Measurement in Education, 13(2), 181–208.