CRESST / UCLA
Center for the Study of Evaluation
National Center for Research on Evaluation, Standards, and Student Testing (CRESST)

New Models of Technology Sensitive Evaluation: Giving Up Old Program Evaluation Ideas
Eva L. Baker and Joan L. Herman
SRI, February 25, 2000

Goals of Presentation
• Outline Purposes and Challenges of Technology Evaluation
• Describe Present Limitations of Technology Evaluation
• Suggest Improvements

Purposes of Technology Evaluation
• Soothe Anxiety
• Justify Expenditures
• Judge Impact
• Identify Shortfalls
• Improve Outcomes
• Shore Up Managers’ Images
• Demonstrate Technology Use in Evaluation

Limitations of Current Approaches to the Evaluation of Technology
• Conception of Evaluation
• Designs
• Measures
• Validity of Results
• That About Covers It

Limitations: Concept of Evaluation
• The Scholarly Overlay of “Formative” and “Summative” Evaluation Makes Little Sense in Education in General and No Sense in Technology Implementations
• Focus on “Value Added” Using New Outcomes Instead of Limiting Measures to the Lowest Common Denominator
• Evaluation Should Match the Cycle Time of Technology, e.g., No Five-Year Studies

Limitations: Designs
• Existing Designs Are Usually Messy, Volunteer Studies of Available Classrooms
• Randomized Treatment Allocations Are Possible, but Compensation from Home and Other Environments, as Well as Pressures for Equal Access, Make Them Impractical in the Long Run Without Creative Strategies
• Treatments Need to Be Reconceptualized in Terms of Control and Uniformity

Limitations: Design/Procedures
• Need for Collective Bargaining Agreements for Participation in Evaluation: Data Provision, Types of Information, Ability to Monitor Children and Adults
• Human Subjects and Informed Consent

Limitations: Measures of Technology Effects
• Opinion, Implementation, Smile-Tests
• Student Performance Measures Insensitive to Implementations
  ◦ Mismatch Between Desired Goals and Measures
  ◦ Standardized Measures
  ◦ Mom-and-Pop Measures Lacking Technical and Credible Qualities
  ◦ Masked by “Standards-Based” Rhetoric

Families of Cognitive Demands
• Self-Regulation
• Communication
• Content Understanding
• Problem Solving
• Teamwork and Collaboration
• Learning

Cross-Walk to Guide the Simultaneous Design of Assessment and Instruction
• Cognitive Models (Task Specifications, Scoring Guides) Become Implemented in Different Subject Matters
• Domain-Independent and Domain-Dependent Components
• Used for Design and/or Administration and/or Scoring

Next Generation: Authoring Systems for Multiple Purposes
• Not an Item Bank
• Capture Content, Task, Process
• Instant Scoring and Feedback
• Expert-Based
• Beyond the Screen
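[Editor's note: the two slides above describe a design approach rather than a specific product. As a minimal sketch of how a cross-walk-style task specification might separate domain-independent components (cognitive demand, scoring dimensions) from domain-dependent content, and return instant scoring and feedback, consider the hypothetical Python below. All names here (TaskSpec, score, the rubric dimensions) are illustrative assumptions, not CRESST's actual authoring system.]

```python
# Hypothetical sketch of a cross-walk style task specification: the
# domain-independent part (cognitive demand family, scoring dimensions)
# is reused across subject matters; the domain-dependent part supplies
# the content. Names and structure are illustrative only.
from dataclasses import dataclass

@dataclass
class TaskSpec:
    # Domain-independent components
    cognitive_demand: str               # e.g., "content understanding"
    scoring_dimensions: dict            # dimension -> max points
    # Domain-dependent components
    subject: str
    prompt: str

    def score(self, ratings: dict) -> dict:
        """Apply the domain-independent rubric and return instant feedback."""
        total, feedback = 0, []
        for dim, max_pts in self.scoring_dimensions.items():
            pts = min(ratings.get(dim, 0), max_pts)
            total += pts
            if pts < max_pts:
                feedback.append(f"Strengthen '{dim}' ({pts}/{max_pts}).")
        return {"total": total,
                "max": sum(self.scoring_dimensions.values()),
                "feedback": feedback}

# The same demand family and rubric, implemented in two subject matters.
history_task = TaskSpec(
    cognitive_demand="content understanding",
    scoring_dimensions={"prior knowledge": 4, "principles": 4, "argument": 4},
    subject="history",
    prompt="Explain the causes of the Great Depression.")

science_task = TaskSpec(
    cognitive_demand="content understanding",
    scoring_dimensions={"prior knowledge": 4, "principles": 4, "argument": 4},
    subject="science",
    prompt="Explain why the seasons change.")

print(history_task.score({"prior knowledge": 4, "principles": 3, "argument": 2}))
```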

Limitations: Validity of Results
• Sources: Inappropriate Measures, Unclear Assignment, Treatment Vagaries
• Even in the Best Case: Generalizing to What? By the Time We Complete a Study, the Treatments Are Obsolete

Suggestions for Improvement
• Defining the Objectives of Evaluation
• Fixing Measures
• Addressing the Concept of Evaluation

Distributed Evaluation: Characteristics and Functions
• Conceptually Congruent with Distribution and Decentralization
• Provides Information for Users and Program Managers
• Allows Flexibility in Implementation Measures, e.g., Measures of Engagement, While Raising the Standards for Validity

An Indicators Approach
• Flexibility
• Longitudinal
• Data Capture
• Disaggregation and Aggregation
• Data Export/Import
• Local Engagement and Feedback

An Indicators Approach (continued)
• Report Generation to Appropriate Constituencies
• Updatable
• Operational
• Creates the Right “Management, High Tech” Patina
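[Editor's note: as a purely illustrative reading of the indicators approach on the two slides above, the sketch below shows longitudinal indicator records captured per site, aggregated or disaggregated on demand, and exported for reporting. The record fields and function names are assumptions for illustration, not a specification of any CRESST system.]

```python
# Illustrative sketch of an indicators-style data store: longitudinal
# records captured per site, rolled up or broken out by subgroup,
# and exported for constituency reports. Field names are assumptions.
import csv
import io
from collections import defaultdict
from statistics import mean

# Each record: (site, year, subgroup, indicator, value)
records = [
    ("School A", 1999, "grade 6", "engagement", 0.72),
    ("School A", 1999, "grade 7", "engagement", 0.65),
    ("School A", 2000, "grade 6", "engagement", 0.78),
    ("School B", 1999, "grade 6", "engagement", 0.58),
]

def aggregate(records, indicator, by=("site", "year")):
    """Aggregate (or disaggregate) indicator values to the requested grouping."""
    keys = {"site": 0, "year": 1, "subgroup": 2}
    groups = defaultdict(list)
    for rec in records:
        if rec[3] != indicator:
            continue
        groups[tuple(rec[keys[k]] for k in by)].append(rec[4])
    return {group: mean(values) for group, values in groups.items()}

def export_csv(summary):
    """Export an aggregated summary so other tools can import it."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["group", "mean_value"])
    for group, value in sorted(summary.items()):
        writer.writerow(["/".join(map(str, group)), round(value, 2)])
    return buf.getvalue()

# Roll up by site and year, then break out by site and subgroup.
print(export_csv(aggregate(records, "engagement", by=("site", "year"))))
print(export_csv(aggregate(records, "engagement", by=("site", "subgroup"))))
```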

Quality School Portfolio
• Longitudinal Database
• Standards-Based
• Multi-Purpose
• Multi-User
• Multiple Occasions
• Local Goals
• Automated Reports

Key Attributes of Distributed Evaluation
• Measures: Fixed and Flexible
  ◦ Some Common Performance Measures
  ◦ An “Indicator” Mentality for Outcome Measures from Archival Sources
  ◦ Common and Site-Specific Implementation Measures
  ◦ Fixed and Flexible Data Collection Schedule
  ◦ Feedback Is a Feature in Every Design

More Characteristics
• Local Site Is a Participant in, Rather Than a Recipient of, Evaluation
• Software-Based for Easy Data Entry, Feedback, and Update
• Tailored Reports
• Design Independent
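[Editor's note: to make "tailored reports" concrete, here is a small hypothetical example of rendering different views of the same evaluation results for different constituencies, such as a teacher and a program manager. The audience names, report templates, and result fields are invented for illustration and do not describe any particular CRESST tool.]

```python
# Hypothetical sketch of tailored report generation: one set of
# evaluation results, different summaries for different constituencies.
# Audience names, fields, and templates are illustrative assumptions.
results = {
    "site": "School A",
    "engagement_mean": 0.74,
    "tasks_completed": 310,
    "students": 95,
}

TEMPLATES = {
    "teacher": ("Classroom summary for {site}: {students} students, "
                "{tasks_completed} tasks completed, "
                "average engagement {engagement_mean:.0%}."),
    "program_manager": ("{site}: engagement {engagement_mean:.2f} "
                        "across {students} students; see the attached "
                        "indicator export for trend data."),
}

def tailored_report(audience: str, data: dict) -> str:
    """Render the report view appropriate to the requesting audience."""
    return TEMPLATES[audience].format(**data)

for audience in TEMPLATES:
    print(tailored_report(audience, results))
```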