Professional Development: Evaluation Strategies to Maximize Impact

Professional Development: Evaluation Strategies to Maximize Impact
Jeffri Brookfield, National Center for Systemic Improvement & IDEA Data Center
Chelsea Guillen, Early Intervention Training Program at the University of Illinois

Purposes
- To examine a framework for developing an evaluation of improvement strategies that rely on professional development.
- To examine the usefulness of information yielded by various measurement tools/approaches.
- To examine sample evaluation plans with attention to pivotal decision points in the implementation process.

Importance of Evaluating Professional Development
Professional development (PD) is costly, too often ineffective, and still necessary to establish changes in professional practice. A good evaluation plan for PD initiatives will:
- Answer the most important questions,
- In the most useful manner,
- Allowing for timely corrections/modifications,
- While minimizing the cost and labor burden of evaluation.

Implementing a PD improvement strategy
Implementation steps: Develop training & select trainees → Conduct training → Implement practice → Implementation support
Evaluation topics at each step:
- What content? Who will participate?
- How did participants perceive the training?
- Did participants acquire new knowledge/skills?
- Did the intervention get implemented as planned?
- What implementation support is needed?
- Did implementation improve after support was offered?

Measures: Satisfaction/Perception
- How well did the training meet your needs?
- Were the trainers knowledgeable?
- Did you gain sufficient training in order to…?
- What was most valuable (least valuable) about the training?
- And so forth…

Example Satisfaction Surveys
- Satisfaction with a consultant
- Satisfaction with a training
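
However a survey is formatted, the responses only pay off once they are tallied and reviewed. Below is a minimal sketch of how Likert-scale satisfaction ratings might be summarized; the item names, the 1-5 scale, and the "favorable" cutoff are illustrative assumptions, not drawn from any specific state survey.

```python
# Summarize hypothetical post-training satisfaction ratings (1-5 Likert scale).
from statistics import mean

responses = [
    {"met_needs": 4, "trainer_knowledge": 5, "sufficient_training": 3},
    {"met_needs": 5, "trainer_knowledge": 4, "sufficient_training": 4},
    {"met_needs": 3, "trainer_knowledge": 5, "sufficient_training": 2},
]

for item in responses[0]:
    ratings = [r[item] for r in responses]
    favorable = sum(rating >= 4 for rating in ratings) / len(ratings)
    print(f"{item}: mean={mean(ratings):.2f}, favorable={favorable:.0%}")
```

A low mean on a single item (here, sufficient_training) can flag where the training design, rather than the trainers, needs adjustment.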

Measures: Knowledge/Skill
- Written assessment: multiple choice, short answer
- Written product: IFSP, behavioral intervention plan, assessment report
- Other…

Example Knowledge/Skills Measures
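
One common way to use such measures is a pre/post comparison. A minimal sketch, assuming hypothetical percent-correct scores from a written assessment given before and after training; the gain threshold is illustrative, not a published standard:

```python
# Compare hypothetical pre/post knowledge-assessment scores (percent correct).
scores = {
    "participant_1": {"pre": 55, "post": 85},
    "participant_2": {"pre": 70, "post": 90},
    "participant_3": {"pre": 60, "post": 65},
}

MIN_GAIN = 10  # illustrative threshold for flagging follow-up

for pid, s in scores.items():
    gain = s["post"] - s["pre"]
    flag = "  <- consider follow-up support" if gain < MIN_GAIN else ""
    print(f"{pid}: pre={s['pre']} post={s['post']} gain={gain}{flag}")
```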

Measures: Observational (direct or video)
- Rubric
- Checklist
- Scale
- PLACHECK, event, duration, number, frequency, rate…

Sample Observational Measures
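
Event-based observational data like these reduce to a few standard summaries: counts, rates, and total durations. A minimal sketch, assuming a hypothetical log of timed events from a single 30-minute observation; the behavior names and times are invented for illustration:

```python
# Summarize a hypothetical timed observation log into count, rate, and duration.
from collections import defaultdict

OBSERVATION_MINUTES = 30
events = [  # (behavior, start_minute, end_minute)
    ("responsive_comment", 2.0, 2.2),
    ("responsive_comment", 9.5, 9.8),
    ("child_directed_play", 12.0, 18.0),
    ("responsive_comment", 25.0, 25.3),
]

summary = defaultdict(lambda: {"count": 0, "minutes": 0.0})
for behavior, start, end in events:
    summary[behavior]["count"] += 1
    summary[behavior]["minutes"] += end - start

for behavior, s in summary.items():
    print(f"{behavior}: count={s['count']}, "
          f"rate={s['count'] / OBSERVATION_MINUTES:.2f}/min, "
          f"duration={s['minutes']:.1f} min")
```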

Participants & Initial Training
Characteristics of participants:
- Any prerequisites?
Characteristics of the training:
- Explicit explanation & illustration of practice
- Opportunities for practice
- Reflection
- Sufficient duration and intensity
- …

Immediately after training
- Knowledge acquired?
- Satisfaction with training experience?
- Confidence in skills?
- Action plan for implementation of new skills?

Transfer of training to practice
- What practices are being implemented?
- Fidelity of implementation?
- Frequency/intensity of practice?
- Adherence to a standard?
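
Fidelity and adherence questions are often answered with an implementation checklist scored as the percent of items observed. A minimal sketch, with hypothetical item names and an illustrative 80% adherence standard (not a published criterion):

```python
# Score a hypothetical implementation-fidelity checklist.
checklist = {
    "explains_purpose_to_family": True,
    "embeds_practice_in_routine": True,
    "provides_specific_feedback": False,
    "plans_next_steps_with_family": True,
}

ADHERENCE_STANDARD = 0.8  # illustrative cutoff

implemented = sum(checklist.values())
fidelity = implemented / len(checklist)
print(f"Fidelity: {implemented}/{len(checklist)} items = {fidelity:.0%}")
if fidelity < ADHERENCE_STANDARD:
    print("Below standard: flag for implementation support.")
```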

Follow-up support
- Frequency and quality of feedback?
- Support/action plan?
- Satisfaction with the support experienced?

Sample measurement tools used by states
- EITP Quality Professional Development Rubric
- RI Services Rendered Form (monitoring tool)
- GVSU Coaching Self-Assessment
- UIC ERF Observation Coaching Session Summary
- TACSEI Family Coaching Checklist
- Implementing the Pyramid Model in Home Visiting Programs: Benchmarks of Quality
- Evaluation of Coaching/Support Received
- Recommended Practices Performance Checklists
- …

Implementing a PD improvement strategy: Pivotal evaluation points
Implementation steps: Develop training & select trainees → Conduct training → Implement practice → Implementation support
Evaluation topics at each step:
- What content? Who will participate?
- How did participants perceive the training?
- Did participants acquire new knowledge/skills?
- Did the intervention get implemented as planned?
- What implementation support is needed?
- Did implementation improve after support was offered?

Planning Guide
Implementation Step: ___________________
- Evaluation question(s) [i.e., what do you need to know]?
- What data will be collected and what tools will be used?
- When and how will data be collected?
- When, how, and by whom will data be analyzed?
- Who will use the results to make decisions, and how and when will these decisions be made?
- What are potential modifications you may make to the implementation of the improvement strategy based on these data?
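
Teams tracking several implementation steps at once may find it useful to capture each completed guide as a structured record. A minimal sketch, where the field names mirror the guide's questions and all example values are hypothetical:

```python
# Represent one completed planning-guide entry as a structured record.
from dataclasses import dataclass

@dataclass
class EvaluationPlan:
    implementation_step: str
    evaluation_questions: list[str]
    data_and_tools: str
    collection_schedule: str
    analysis_plan: str
    decision_makers: str
    potential_modifications: str

plan = EvaluationPlan(
    implementation_step="Conduct training",
    evaluation_questions=["Did participants acquire new knowledge/skills?"],
    data_and_tools="Pre/post knowledge-based assessment",
    collection_schedule="First and last training sessions",
    analysis_plan="Trainers compare pre/post gains within two weeks",
    decision_makers="PD leadership team, at quarterly review",
    potential_modifications="Add practice opportunities if gains are low",
)
print(f"{plan.implementation_step}: {plan.evaluation_questions}")
```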

Discussion Questions
Is your state using any of the following?
- Post-Training Satisfaction Survey
- Pre/Post Training Knowledge-Based Assessment
- Implementation Fidelity Checklist
- Coaching Satisfaction Survey
What additional measurement tools would be useful to your state for determining how well interventions are being implemented?
What is your plan for evaluating the data being collected? How will the data be used for continuous improvement?

Contact information
Jeffri Brookfield: jbrookf@wested.org
https://ncsi.wested.org/
https://ideadata.org/
Chelsea Guillen: cguillen@Illinois.edu
http://eitp.education.illinois.edu/

Disclaimer
The contents of this presentation were developed, in part, under grants from the U.S. Department of Education, #H326R140006 and #H373Y130002. However, the contents do not necessarily represent the policy of the Department of Education, and you should not assume endorsement by the Federal Government.
Project Officers: Perry Williams, Shedeh Hajghassemali, Richelle Davis, and Meredith Miceli