MassCUE Evaluators Day 1 – January 11, 2012

Greetings!
- Logistics and Locations
  - WiFi SSID and Password
  - Bathrooms, etc.
- Teams
  - Who are you?
  - Team members?
  - Brief overview of your project

Your Facilitators
- Sun Associates
  - Jeff Sun
  - Jeanne Clark
  - www.sun-associates.com/masscue
- Damian Bebell - Boston College
- Shelley Chamberlain - MassCUE

Agenda and Expectations
- Review the agenda
- Expectations
  - All team members will attend all 4 days
  - Some work will occur outside of these sessions
  - Collaborate!
  - Produce a concrete evaluation plan
  - Provide feedback on the process

Your Evaluation Plan Will…
- Be organized around the goal(s) of your project
- Define success through the creation of indicators keyed to your project goal(s)
- Produce data specifically targeted at measuring success relative to your project goal(s)
- Provide feedback and recommendations for improvement

Why Evaluate?
- Determine if your investment in instructional technology is “paying off”
- Measure progress toward meeting your project goals
- Support action planning with data
- Increase eligibility for funding

Schematic View of the Evaluation Process

Goal: What is your project aiming to accomplish?
- The goal should be big-picture and overarching, encompassing the spirit of what you want to accomplish.
  - Further detail on what it takes to meet the goal is covered in your indicators.

Indicators: What does it look like when your goal is met?
- Indicators are the organizing principle of your evaluation
- Indicators should reflect your project’s unique goals and aspirations
- Indicators should clearly describe what it looks like when/as you meet your goal(s)
  - Indicators need to be highly descriptive and can include both qualitative and quantitative measures
  - Some indicators are more “measurable” than others
- Indicators guide your data collection

Data: What do you need to examine in order to find out if you’re meeting your indicators?
- Curriculum
  - Review of curriculum units/lesson plans
  - Classroom observations (of student impact)
  - Student skills assessments
  - Review of student work
- Professional Development
  - Teacher interviews
  - Professional development plans
  - Review of developed lesson plans, etc.
  - Classroom observation (of pedagogy)
- Infrastructure
  - Classroom observations
  - Surveys
  - Cost data
  - Help desk data
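
As a small illustration (not part of the MassCUE materials), survey data like the kind listed above can be tallied against an indicator in a few lines. The responses, the "weekly use" indicator, and the 80% target below are all hypothetical.

```python
# Hypothetical example: tally survey responses against an indicator such as
# "at least 80% of teachers report using technology weekly or more often".
# All data and thresholds here are invented for illustration.

responses = ["weekly", "daily", "monthly", "weekly", "never",
             "daily", "weekly", "weekly", "monthly", "daily"]

# Response categories that count toward the indicator
meets = {"daily", "weekly"}

share = sum(r in meets for r in responses) / len(responses)
indicator_met = share >= 0.80   # hypothetical target

print(f"{share:.0%} of respondents report at least weekly use")
print("Indicator met" if indicator_met else "Indicator not yet met")
```

The same pattern works for any categorical survey item: define which responses count toward the indicator, compute the share, and compare it to the target the team set when writing the indicator.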

Coordinating with Existing Data
- What data is already collected and available for you to use?
- Outcomes data on student achievement?
- Existing student assessments?
- You may choose to use these other data sources in your analysis to address aspects of your indicators.
- Sometimes, this additional data will be one of the sources that you use to “triangulate” your findings.

Showing Impact and Change: How do you determine that what you see happening in your program is different from the status quo?
- Is there a control/comparison group available?
- Are pre-measures available for comparison?
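
When pre-measures exist, the simplest comparison is the average gain from pre to post for the same participants. This sketch is purely illustrative; the scores are invented and any real analysis would need a defensible design (comparison group, matched measures, adequate sample size).

```python
# Hypothetical pre/post comparison: mean gain between a pre-measure and a
# post-measure for the same students. All scores are invented.
from statistics import mean, stdev

pre  = [62, 70, 55, 68, 74, 60]
post = [70, 75, 63, 72, 80, 66]

gains = [b - a for a, b in zip(pre, post)]
mean_gain = mean(gains)

# A simple standardized effect size: mean gain / SD of the gains
effect = mean_gain / stdev(gains)

print(f"Mean gain: {mean_gain:.1f} points (effect size ~ {effect:.2f})")
```

A positive mean gain alone does not show the program caused the change; that is exactly why the slide asks about comparison groups.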

Analysis and Findings
- Data analysis (comparing data to indicators) will show the degree to which actions (PD, pedagogy, student work, infrastructure implementation) come together to produce the intended result.
- Findings are the results of this analysis.
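
The "compare data to indicators" step can be pictured as pairing each indicator with a measured value and a target, yielding one finding per indicator. The indicator names, measured values, and targets below are hypothetical, chosen only to show the shape of the comparison.

```python
# Hypothetical sketch of comparing data to indicators: each indicator is
# paired with (measured value, target), and the comparison yields a finding.
# Indicator names, values, and targets are invented for illustration.

indicators = {
    "Teachers integrate technology weekly": (0.70, 0.80),  # (measured, target)
    "Students produce digital portfolios":  (0.90, 0.75),
    "Help desk tickets resolved in 48h":    (0.60, 0.90),
}

findings = {name: measured >= target
            for name, (measured, target) in indicators.items()}

for name, met in findings.items():
    print(f"{'MET    ' if met else 'NOT MET'}  {name}")
```

In practice many indicators are qualitative and the "comparison" is a judgment against a descriptive rubric rather than a numeric threshold, but the logic (indicator, evidence, finding) is the same.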

Recommendations
- Findings (analysis) lead to recommendations and reporting.
- Recommendations lead to new action items.
  - What are you going to do to implement the recommendations?

Developing Indicators
- Indicators are the organizing principle of your evaluation
- Indicators should reflect your project’s unique goals and aspirations
- Your goals need to clearly state the purpose of your project
  - What is it that you intend to create with this project?
- Indicators should clearly describe what it looks like when/as you meet your goal(s)
  - Indicators need to be highly descriptive and can include both qualitative and quantitative measures
  - Some indicators are more “measurable” than others
- Indicators guide your data collection

Goal Statement:

Project / Plan
- Actions: What will your project be doing?
- Desired Outcomes: What will be the desired outcome of these actions?

Evaluation
- Indicator: What do these desired outcomes look like in the context of your project?
- Data Avenues/Sources and Questions/Probes: What do you need to find out in order to assess how effectively your actions are producing the desired outcomes and fulfilling the indicators?
- Data Analysis: How closely does your data match your indicators?

Rows of the planning matrix: Curriculum, PD, Infrastructure, etc.

Day 1 Wrap-Up
- Questions?
- Daily Evaluation
- See you tomorrow!