
Can Democratic Evaluation be Scientific? Ann Ooms Frances Lawrenz University of Minnesota College of Education and Human Development Department of Educational Psychology

Overview of Presentation What is Democratic Evaluation? History of Democratic Evaluation. What is Scientific Evaluation? Two examples of Democratic, Scientific Evaluation efforts. Conclusion.

Democratic Evaluation: defined Democracy is "government by the people," exercised directly or through elected representatives and based on principles of social equality and respect for the individual.

Democratic Evaluation: History MacDonald (1973, 1977): democratic evaluation; Bryk (1983): stakeholder-based evaluation; Fetterman (1994): empowerment-oriented evaluation.

Democratic Evaluation: History Floc'hlay and Plottu (1998): Model for the Operationalization of Democratic Evaluation: empowerment evaluation; participatory evaluation; multi-criteria evaluation; counterpower exercised by those who do not agree.

Democratic Evaluation: History House and Howe (1999): Model of Democratic Deliberative Evaluation: inclusion, dialogue, and deliberation. Impact: better-informed decision-making parties and a thoughtful, deliberative population.

Democratic Evaluation: History Patton (2002): think evaluatively; suggests including a methodological dialogue to maintain methodological quality.

Democratic Evaluation: our definition An evaluation is democratic if: all interests are represented; there are procedures for controlling any imbalances of power; all groups participate seriously and authentically in meaningful ways; groups participate in appropriate ways; and there is reflective deliberation about findings and implications.

Scientific Evaluation: defined The term is defined by the No Child Left Behind Act of 2001 and by the U.S. National Research Council's Committee on Scientific Principles for Education Research.

Scientific Evaluation: our definition Scientific evaluations are evaluations that use experimental and comparison groups, or some other sort of quasi-experimental design, that can demonstrate causality.
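To make the definition concrete, here is a minimal sketch of the kind of analysis such a design supports. It assumes a hypothetical randomized study with invented outcome scores for a treatment group and a comparison group (the variable names and numbers are illustrative only, not data from either evaluation discussed below) and uses SciPy's two-sample t-test.

```python
# Minimal sketch of an experimental/comparison-group analysis.
# All scores are hypothetical; in a real study they would be outcome
# measures from randomly assigned treatment and comparison groups.
from statistics import mean
from scipy import stats  # third-party; provides the two-sample t-test

treatment_scores = [78, 85, 82, 90, 74, 88, 81, 79]   # students in the program
comparison_scores = [72, 80, 75, 83, 70, 78, 76, 74]  # students not in the program

# Estimated effect: difference in mean outcomes between the two groups.
effect = mean(treatment_scores) - mean(comparison_scores)

# Welch's two-sample t-test: is the difference larger than chance alone would suggest?
t_stat, p_value = stats.ttest_ind(treatment_scores, comparison_scores, equal_var=False)

print(f"Mean difference: {effect:.1f} points")
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```

Random assignment is what licenses a causal reading of the mean difference; without it, the same arithmetic yields only a descriptive comparison.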

Democratic, Scientific Evaluation: Two Examples The Collaboratives for Excellence in Teacher Preparation (CETP) Program evaluation (Frances Lawrenz) and the AlphaSmart evaluation (Ann Ooms).

CETP: the Collaboratives for Excellence in Teacher Preparation evaluation (Frances Lawrenz).

CETP: Democratic? There were power imbalances. The evaluation was required by the funder and was conducted for the funder. There was no full participation in the CETP evaluation. There were limited decisions about how results were to be used.

CETP: Scientific? Pre/post assessments of faculty instructional approaches; comparison of existing qualities of institutions, classrooms, teachers, etc. to predetermined standards; a random selection of participants; comparison of students in matched classes.

CETP: Scientific? None of these meets the "gold standard" of randomly assigned experimental and comparison groups, but they do have some elements of experimental design and causality. Also, through the negotiation process, additional "non-scientific" data were included.
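As an illustration of the pre/post, matched-class element listed above, the following is a minimal sketch assuming hypothetical pre- and post-scores for one participating class and one matched comparison class; the class names and numbers are invented and are not CETP data.

```python
# Minimal sketch of a pre/post comparison between matched classes
# (a quasi-experimental design, not the randomized "gold standard").
# All scores are hypothetical illustrations, not CETP results.
from statistics import mean

cetp_class = {"pre": [55, 60, 58, 62, 57], "post": [70, 74, 69, 78, 72]}
matched_class = {"pre": [56, 59, 57, 61, 58], "post": [63, 65, 62, 66, 64]}

def mean_gain(scores):
    """Average post-score minus average pre-score for one class."""
    return mean(scores["post"]) - mean(scores["pre"])

# Comparing gains (rather than post-scores alone) adjusts for where each
# class started, which is the point of pre-testing and matching.
gain_cetp = mean_gain(cetp_class)
gain_matched = mean_gain(matched_class)

print(f"Participating class gain: {gain_cetp:.1f}")
print(f"Matched class gain: {gain_matched:.1f}")
print(f"Difference in gains (rough effect estimate): {gain_cetp - gain_matched:.1f}")
```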

AlphaSmart Evaluation (Ann Ooms).

AlphaSmart: Democratic? Not all stakeholders were represented: for example, there were no principals, parents, or students in the inquiry group. There were minor imbalances of power in the group: the teachers had more power.

AlphaSmart: Democratic? The members of the inquiry group were involved seriously and authentically, and they participated in ways that matched their skills. There was reflective discussion about the findings and about whether or not the AlphaSmarts should be used.

AlphaSmart: Scientific? There was no comparison group. Students were not pre-tested; the study was post-test only. However, there were pre and post interviews and regular meetings with the teachers, which might provide some indication of causality.

AlphaSmart: Scientific? The students' retrospective opinions were gathered. Additionally, the evaluators and the teachers observed the students as they used the AlphaSmarts and were able to form their own opinions of the success or failure of the technology.

Conclusion of two examples As in past work, producing democratic evaluations appears to be quite difficult. The two examples did NOT accomplish a full democratic, scientific evaluation.

Limitations of Scientific Evaluation Scientific Evaluation narrows the meaning of evaluation to the achievement of specific outcomes. Scientific Evaluation de-emphasizes the importance of understanding process and meaning.

Limitations of Scientific Evaluation Scientific Evaluation does not accommodate methodological dialogues. Scientific Evaluation precludes the participation of anyone who holds a different philosophical perspective.

Conclusion Scientific Evaluation seems only possible if everyone involved has the same philosophical perspective. Democratic, Scientific Evaluation: a wonderful but inaccessible dream?

Contact Information Ann Ooms: Frances Lawrenz: