Welcome! These slides are designed to help you think through presenting your evaluation planning and results. Feel free to pick and choose the slides that are appropriate for your presentation. Whether you use one slide or all of them, we hope you find them helpful. Some of the language or content in the presentation may not fit your community or your particular audience. Don't be afraid to edit any of the content, provide more explanation, or use different examples when you present. This can be technical information, and you will be the best judge of what works for your audience. Don't forget to save this as a new document, change the background, and erase the notes, brackets, and examples before presenting!

Guide to the Slides The BLUE slides are designed to provide standard language regarding the Tribal MIECHV grant and evaluation requirements; these can be used without edits. Throughout the GREEN slides you will see brackets ([ ]) that instruct you to insert content from your Section Six document. The YELLOW slides are designed to give you examples of ways to think about presenting the material.

What is program evaluation? Program evaluation is the systematic collection of data to assess program processes or outcomes. The goal of Tribal MIECHV program evaluation is to answer questions relevant to each community and to contribute to the broader knowledge base of the field.

Rigorous Evaluation The Tribal MIECHV grants require rigorous evaluation. For this grant, rigor means:
Credibility: ensuring that what is intended to be evaluated is actually being evaluated, and that the proposed data collection and analysis appropriately answer the research questions of interest.
Applicability: ensuring that results can be generalized beyond this project, and that the reader can believe the results accurately represent a population or context.
Consistency: ensuring that the process and method are articulated in advance and closely followed.
Neutrality: producing results that are as objective as possible while acknowledging the bias that may be brought to data collection, analysis, and interpretation of the results.

Evaluation Goals Building community knowledge: [Insert content from Section Six that describes your rationale for exploring your particular question. How will your project provide your community with needed information?] Adding to the "knowledge base": Most home visiting research does not account for the unique cultures and contexts of tribal communities or provide information about what strategies work in tribal contexts. [Insert content from the Summary of the Current Knowledge Base section of your Section Six]

Evaluation Planning

[Describe your community’s specific planning process] How did you get community input? How did you engage with technical assistance? Who approved your questions and design?

Evaluation Question [Insert your evaluation question from your Section Six document (you may want to present the question in PICO format: Population, Intervention, Comparison, Outcome)]

Importance of Evaluation Question [Describe why your program selected the question you did. Why is it an important undertaking for the community? What is the value of this effort to your program and your community?]

How will we answer our question? [Describe the design you selected] Why did you select the design? How will it allow you to answer your question? What are some limitations of your design?

Evaluation Diagram [Insert type of evaluation design (e.g., waitlist control design, historical comparison, randomized controlled trial)] [Describe the evaluation diagram notation (X = intervention, O = observation, T1 = time point one, etc.)] [Insert evaluation diagram] You can insert the diagram from your Section Six document as an image or create the diagram in PowerPoint using text and shapes (Insert > Shapes).

Evaluation Outcome Measures [Insert brief description of each of your stated outcomes and their corresponding measures. This content can come from your Planned Measures and Instruments Section.]

Initial Results [Insert bar or line graph of initial results. Graphs can be easily inserted from Excel into PowerPoint.] One helpful resource for choosing a graph type and for tips on displaying data is Using Graphics to Report Evaluation Results (Michaud, 2003).
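
If you prefer to script your chart rather than build it in Excel, the following is a minimal Python sketch using matplotlib; the groups, scores, and file name are hypothetical placeholders, so substitute your own results before using it.

    import matplotlib.pyplot as plt
    import numpy as np

    # Hypothetical pre/post mean scores for two groups; replace with your results.
    groups = ["Home Visiting", "Services as Usual"]
    pre_scores = [3.1, 3.0]    # mean score at first observation (T1)
    post_scores = [4.2, 3.3]   # mean score at second observation (T2)

    x = np.arange(len(groups))  # one slot per group on the x-axis
    width = 0.35                # width of each bar

    fig, ax = plt.subplots()
    ax.bar(x - width / 2, pre_scores, width, label="Pre (T1)")
    ax.bar(x + width / 2, post_scores, width, label="Post (T2)")
    ax.set_xticks(x)
    ax.set_xticklabels(groups)
    ax.set_ylabel("Mean parenting score")
    ax.set_title("Initial Results (hypothetical data)")
    ax.legend()
    fig.savefig("initial_results.png", dpi=200)  # image you can drop into a slide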

Implications of Results [This content should draw on the Summary of Current Knowledge Base section of your Section Six] Are your evaluation results similar to or different from what you expected and from what you found in the literature? How will your evaluation results inform the field? Have your evaluation results prompted you to modify your program? Have your evaluation results changed how your agency does x, y, z?

Community-wide impact Community impact: [Insert content on ways you have seen the evaluation effort impact your community (if you have). If you haven't reached this phase of the evaluation, what are some potential ways the process or results could change service delivery for this or other programs, community partnerships, program capacity, mindsets about data, etc.?] Sustainability: [How will your results improve your agency's ability to secure future funding? In what other ways might your evaluation improve your ability to continue your program?]

Lessons Learned [Insert content on lessons learned] What has gone well in your evaluation? What would you do differently in the evaluation? How might what you learned benefit other communities?

Evaluation Question - Example Do families who receive home visiting demonstrate improved parenting attitudes and behaviors compared to families who receive services as usual? Community members told us that promoting positive parenting was a critical need. For our home visiting program to be successful, we will need to promote positive attitudes and actions with young parents. The evaluation effort helps us figure out whether our services are working in this way.
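
Mapped onto the PICO format mentioned earlier, this example question breaks down roughly as follows:

Population: families served by (or eligible for) the home visiting program
Intervention: home visiting services
Comparison: families who receive services as usual
Outcome: parenting attitudes and behaviors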

How will we answer our question? - Example We chose to use a "waitlist control design": families that receive home visiting immediately are compared to those who are waitlisted for the services. The design allowed our program to see whether home visiting was improving parenting while still serving everyone. Drawbacks of the design: some families had to wait for services, and other factors could be affecting parents during the "waiting" time.

Evaluation Diagram - Example Waitlist Control Design: R = randomly assigned participants, O = observation point, X = intervention
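
As a rough sketch of how this design is commonly drawn (the exact diagram from the original slide is not reproduced here), each row is one randomly assigned group, read left to right over time, with illustrative observation points:

    Immediate group:  R   O1   X   O2
    Waitlist group:   R   O1        O2   X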

Evaluation Outcome Measures - Example

Outcome               Measure
Parenting Attitudes   Keys to Interactive Parenting Scale (KIPS): a structured observational parenting assessment
Parenting Behavior    KIPS

Initial Results - Example [Example graph of initial results]

Next steps What we plan to do next: [Provide some examples of specific next steps for how evaluation findings will be used, or of changes that will be made]