Dr. Kathleen Haynie, Haynie Research and Evaluation, November 12, 2010

DR K-12: Seeks to enable significant advances in student and teacher STEM learning
- Begin with a research question about how to improve preK-12 STEM learning and teaching
- Develop, implement, and study the effects of innovative educational resources, models, or technologies
- NSF makes awards per year in three categories:
  - Exploratory projects: 3 years, up to $450,000
  - Full R&D projects: up to 5 years, up to $3,500,000
  - Scale-up projects: up to 5 years, up to $5,000,000

Cycles of Research and Development (cycle diagram):
- Synthesize and Theorize
- Explore, Hypothesize, and Clarify
- Design, Develop, and Test
- Implement, Study Efficacy, and Improve
- Scale Up and Study Effectiveness

Proposals must include plans for formative and summative evaluation of the development and research work:
- Evaluation questions
- Data to be gathered and analysis plans
- Expertise of evaluators

Evaluation should focus on the validation of, fidelity to, and usefulness of the development and research processes in achieving the targeted outcomes. Objectives include:
(1) ensuring that the project is making satisfactory progress toward its goals;
(2) recommending reasonable, evidence-based adjustments to project plans;
(3) determining the value of the outcomes of the project; and
(4) attesting to the integrity of outcomes reported by the project.

Evaluation foci across project foci, shifting from a formative evaluation focus (early) to a summative evaluation focus (late):
- Proposal: broad-strokes evaluation plan, adequate budget
- Materials development: logic modeling, evaluation plan, evidence of process
- Usability piloting: instrument design, data collection and analysis, formative feedback
- Implementation piloting: evidence of outcomes/impact

Example of DR K-12 Evaluation Timeline

STEM Logic Model

Inputs:
- Need for planning and teaching resources in a STEM area
- Expertise: inquiry-based curricula, innovative design, STEM content, educational research & evaluation
- Partners: primary PI's organization, HRE, teacher-partners

Activities:
- Build collaborative team (curriculum, technology, research)
- Develop materials: define user needs, establish criteria for materials, define features, define content, write materials
- Pilot testing: iterate with materials development, involve teacher-partners, user-testing, implementation study (implementation fidelity = opportunity to learn for students)
- Conduct research and evaluation

Outcomes:
- Successful model of collaboration across areas of expertise
- STEM planning and teaching resources
- Teacher usability: planning needs, instructional value
- Teacher learning
- Research and project evaluation

Impacts:
- Contribute to available open educational resources
- Development model for planning and teaching resources in other content areas
- Process for developing teaching supports in other STEM areas
- Research knowledge and evaluation results: leveraging teaching resources in educational settings
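
A logic model like this is easier to work with as plain data. Below is a minimal Python sketch (mine, not the presenter's) that captures the model so each element can be paired with a draft evaluative question; the strings are abbreviations of the items above.

# Illustrative only: the logic model as plain data, so evaluation
# questions can be drafted systematically for every component.
logic_model = {
    "inputs": [
        "Need for planning and teaching resources in a STEM area",
        "Expertise: curricula, design, STEM content, research/evaluation",
        "Partners: PI's organization, HRE, teacher-partners",
    ],
    "activities": [
        "Build collaborative team",
        "Develop materials",
        "Pilot testing",
        "Conduct research and evaluation",
    ],
    "outcomes": [
        "Model of collaboration",
        "STEM planning and teaching resources",
        "Teacher usability and learning",
    ],
    "impacts": [
        "Open educational resources",
        "Development model for other content areas",
    ],
}

# Walk the model and draft one evaluative question per element.
for component, items in logic_model.items():
    for item in items:
        print(f"[{component}] What evidence will show progress on: {item}?")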

- The logic modeling process lays the foundation for determining evaluative questions and methods.
- NSF asks that the evaluation plan "pose significant questions that can be addressed empirically and that are central to the project's goals and objectives, as well as contributing to understanding that meets current and expected educational demands of the nation on world-class criteria."
- The evaluation plan should be vetted with project team members and, ideally, with project advisors.

General Evaluative Approach Matrix for NSF Projects

Process data:
- Evidence of team collaboration (e.g., observing team meetings, team notes)
- Evidence of decision-making in materials design and development (a sketch of such a decision log follows below):
  - What are the design specifications?
  - Who are the decision-makers?
  - What decisions are made, and with what rationales?
  - What are the opportunity costs of those decisions?

Outcomes data:
- What features, tools, content, functions, etc. result?
- To what extent do they meet the design specifications?
- Evidence from stakeholders as to usability, value, affordances, limitations, and likely impact
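
One hypothetical way to make that decision-making evidence concrete is a structured decision log. A minimal Python sketch follows; the field names and the sample entry are illustrative, not from the talk.

# Illustrative design-decision log whose fields mirror the four
# process-data questions above. Requires Python 3.9+.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class DesignDecision:
    when: date
    decision: str                   # what was decided
    decision_makers: list[str]      # who decided
    rationale: str                  # why it was decided
    opportunity_costs: list[str] = field(default_factory=list)  # what was given up

decision_log = [
    DesignDecision(
        when=date(2010, 11, 12),
        decision="Ship web-based materials before print versions",
        decision_makers=["PI", "lead curriculum developer"],
        rationale="Faster iteration during usability piloting",
        opportunity_costs=["Delayed access for low-connectivity classrooms"],
    ),
]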

Outcomes data:
- How valuable are different features or content sections of the materials?
- Are the resources/materials likely to be engaging to students?
- Are the materials clear/intuitive to teachers?
- How easily are instructional plans/activities carried out?
- What is the likely added value of these resources?
- What refinements to the materials result from user testing?

Data sources:
- Usability studies with the potential user group
  - Observations, interviews, focus groups, tracking of materials use (e.g., mouse-clicks on electronic materials; see the sketch below), surveys
  - Solicit feedback on what is helpful, useful, feasible, and user-friendly for their purposes in actual educational settings
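
For the click-tracking bullet, here is a minimal sketch (assuming pandas is available; the event log, teacher IDs, and feature names are hypothetical) of turning raw interaction logs into a usability summary.

# Illustrative only: summarize click-tracking logs from electronic
# materials as clicks per feature and distinct teachers per feature.
import pandas as pd

# Hypothetical event log: one row per interaction with the materials.
events = pd.DataFrame({
    "teacher_id": ["t01", "t01", "t02", "t02", "t02", "t03"],
    "feature":    ["lesson_planner", "videos", "lesson_planner",
                   "assessments", "videos", "lesson_planner"],
})

# How often is each feature used, and by how many distinct teachers?
usage = events.groupby("feature").agg(
    clicks=("teacher_id", "size"),
    distinct_teachers=("teacher_id", "nunique"),
)
print(usage.sort_values("clicks", ascending=False))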

Data sources:
- Implementation study in actual educational settings
  - (Classroom?) observations; teacher, administrator, and student interviews; focus groups; surveys
  - Solicit feedback on how well the resources function within that setting

Outcomes data:
- What subset of the materials or resources do educators use in actual settings? How much time is spent on each?
- What educational practices are tied to their use?
- What opportunities for student learning are afforded by the materials? How do the materials contribute to school and district goals or the attainment of educational standards?
- How do educational impacts compare with a baseline or with a comparison group? (see the sketch below)
- What refinements to the materials result from piloting?
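
For the baseline/comparison-group question, a hedged sketch of one common summative analysis: a Welch t-test plus Cohen's d. The scores are fabricated placeholders for illustration; a real study would also attend to the nesting of students within classrooms (e.g., with multilevel models).

# Illustrative comparison of a treatment group (used the materials)
# against a comparison group on a student outcome measure.
import numpy as np
from scipy import stats

treatment = np.array([74, 81, 69, 88, 77, 85, 79, 83])   # placeholder scores
comparison = np.array([71, 70, 65, 78, 72, 69, 74, 75])  # placeholder scores

# Welch t-test (does not assume equal variances).
t_stat, p_value = stats.ttest_ind(treatment, comparison, equal_var=False)

# Cohen's d with a pooled standard deviation.
pooled_sd = np.sqrt((treatment.var(ddof=1) + comparison.var(ddof=1)) / 2)
cohens_d = (treatment.mean() - comparison.mean()) / pooled_sd

print(f"t = {t_stat:.2f}, p = {p_value:.3f}, d = {cohens_d:.2f}")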

- From day 1, build strong and trusting relationships with all decision-makers
- Determine who the decision-makers are
- Stick with the evaluative evidence
- Assess your audience's readiness to hear and digest findings
- Provide balanced feedback: what's working and what might be improved

- Start well in advance of May, when most annual reports are due
- Use the evaluation plan as a guide
  - Be sure to focus on the evaluation questions and methods
  - You can include emergent findings, but try to connect them with the original project goals
- Make sure the evaluation write-up is well-vetted with the project team
- Unless it is the final evaluation report, be sure to include formative recommendations that are realistic

Thank you, and go to it!
Dr. Kathleen Haynie, Director of Haynie Research and Evaluation
haynieresearch.com