
1 UCLA Graduate School of Education & Information Studies
National Center for Research on Evaluation, Standards, and Student Testing
Accountability and Student Learning
Eva L. Baker
CRESST Conference, Los Angeles, CA, January 23, 2007

2 Accountability and Student Learning
- A dialectic to explore the relationship between accountability and student learning
- The likely impact of accountability policy on U.S. education for the future
- Models of learning
- The limits of current test-based accountability
- Finding, recognizing, and using serious learning to support students’ development, accomplishment, and adaptability

3 Are Accountability Systems Affecting Learning?
- Of some students, at some grade levels, in some content areas
- Of adults, although the utility of their behaviors relative to student learning is questionable
- In time to make a difference?

4 Test-Based Accountability and U.S. Educational Needs
- The current education system does not prepare students to meet and adapt successfully to the challenges of the current and future competitive marketplace
- Large segments of the public perceive that the U.S. K-12 school system is “broken” and cannot be rapidly repaired
- Inadequacy and instability characterize the levels, allocations, and effective uses of educational resources (funding, books, technology, course offerings, facilities, and people)
Source: X PRIZE for education, draft problem statement, January 2007.

5 Test-Based Accountability and U.S. Educational Needs (Cont’d)
- Without the flexibility (and wherewithal) to adapt to the needs of students with different backgrounds, the educational system fails to address existing gaps in learning in any reasonable time period
- Many students lack support and encouragement from influential figures in their lives, thus perpetuating students’ low levels of educational motivation and self-efficacy
Source: X PRIZE for education, draft problem statement, January 2007.

6 Have Extant Accountability Systems Changed the Probabilities?
- Adapt and compete: No
- Fix the system rapidly: No
- Inadequacy and instability of resources: ?
- Gap reduction in reasonable time: No
- Influential figures: No

7 Defining Learning Content
- Constructs: domains, specific tasks
- Ontologies: structured relationships
- Contexts or situations
- Cognitive complexity of tasks
- Acquisition: subtasks, enabling behaviors, prerequisites
- Behavioral models of acquisition: networks
- Display and response modes

8 Some Models of Learning
- Bloom et al.: knowledge, comprehension, application, analysis, synthesis, evaluation
- Gagné: verbal learning, discrimination, procedures, cognitive strategies
- Glaser: expertise, patterns, automaticity, rich content
- Merrill: activation, demonstration, application, integration, problem
- Anderson: declarative, procedural, strategic (or systemic)
- Mayer, Sweller: schema support, working memory conservation
- CRESST: content explanation, problem solving, communication, teamwork, metacognition

9 Domain Independence and Test Design
- All learning frameworks require embedding in content
- If generalization, transfer, and application to the real world are desired, instruction must explicitly address them, and tests must assess them
- Learning structures, rather than content (as data), should be the starting point of test development (if economy of design and support of learning generalization for teaching and measurement are the criteria)
- Note: KWSK emphasized domain-specific learning models, but with limited examples

10 Facts About Testing
- For policymakers and the public, test scores and learning are equivalent, and tests are interchangeable
- Accountability tests serve multiple purposes: evaluation, improvement, criteria for sanctions/rewards, communication, status, job stability, equity
- Yet they are not designed or validated for these purposes*
- Tests and relevant instruction rarely reflect models of complex learning

11 Narrow and Narrower
- Learning models have often been reduced to “multiple choice” or “open-ended” response modes, a faux correspondence to the complexities of learning outcomes
- Current methods of aligning tests apologize for and reduce the focus of instruction and learning, for both content and cognition
- Thus, tested content and skills are frequently practiced in test-like contexts*
- Performance is narrowed and inflated, ignores adaptation, and shows little mindful application, transfer, or integration
- At odds with needs for innovation, creativity, and future competitive requirements

12 Let Tests Be Tests
- Exigencies of time, cost, profit, and tradition have loaded testing and learning into different silos
- Testing starts and ends with content, not learning models
- Test development is burdened with ever-shrinking schedules and opportunities for validation
- Learning structures as the core of test design architecture take more time at the outset, cost less over time, support generalization of teaching and learning, and are smarter, and yet...
- Short-term cost and delivery win, and may always win
- Giving in, but not giving up

13 Let It Be
- For the foreseeable future, testing will not change
- Zip incentives
- No more Sisyphus-type alignment
- The accountability test is it; live with it
- Don’t try to fix it with more, different, or cooler tests

14 Instead, Focus on the Missing Learning in Accountability Systems
In standards, but neither tested nor judged (examples):
- Serious writing
- Research projects
- Innovative solutions
Neither in essential standards nor tested, but essential for success and growth (examples):
- Realms of content and accomplishment: art, music, citizenship, history, exploratory and integrative learning
- Understanding other cultures

15 Learning Remedies for Accountability
- Create explicit opportunities for students to learn beyond currently tested content, on a national scale
- Create multi-sized, bounded instructional parcels with integrated measures
- Design parcels to reflect domain-independent learning models, KWSK models where data exist, and structured knowledge
- Broad instructional choices (games, people; must have a technology component to scale)
- Mixed initiative (some required, some open choice)
- Validated: completion means learning
- Completion of learning tasks as a countable outcome (like AP courses)

16 Collecting Learning
- Commercial or teacher-made
- Like POWERSOURCE© parcels: aligned with an ontology and “scaled” by expert performance so value is known
- Some will require extended, integrated, and team interaction
- Reproducible and equitably accessible over the Web by teachers, students, and parents
- Driven by parent desire, IHE understanding, and, later, modified accountability systems

17 Accountability and Learning Summary
- Accountability tests will be whatever the bureaucracy mandates and can afford
- Formative assessment vs. benchmark tests: limited by resources, expertise, and time
- Instead of the “outcome,” count the learning
- Recorded in unique portfolios for students, as higher education and business change their entry reviews

18 A System of Accountability Tests and Learning Tasks to Improve Long-Term U.S. Capacity
- Provide flexibility and adaptability
- Allow students to learn, explore, and succeed in an accelerated fashion
- Provide a continuing venue for alignment buffs
- Reduce the narrowness of present teaching patterns
- Credit broader and more complex learning than that practiced for test performance

19 Eva L. Baker
Voice: 310.206.1530
Fax: 310.267.0152
Email: baker@cse.ucla.edu
©2006 Regents of the University of California

