
1 Jennifer Coffey, Ph.D., SPDG Program Lead, August 30, 2011

2  Performance Measurement 1: Projects use evidence-based professional development practices to support the attainment of identified competencies.
 Performance Measurement 2: Participants in SPDG professional development demonstrate improvement in implementation of SPDG-supported practices over time.

3  Performance Measurement 3: Projects use SPDG professional development funds to provide follow-up activities designed to sustain the use of SPDG-supported practices. (Efficiency Measure)
 Performance Measurement 4: Highly qualified special education teachers who have participated in SPDG-supported special education teacher retention activities remain as special education teachers two years after their initial participation in these activities.

4  2007 grantees will not be using the new program measures.
 Everyone else will have one year for practice:
› Grantees will use the revised measures this year for their APR.
› This continuation report will be a pilot.
 OSEP will learn from this round of reports and make changes as appropriate.
 Your feedback will be appreciated.
› You may continue to report on the old program measures, if you like.

5

6 In review:
 Opportunities for OSEP and the Office of Planning, Evaluation, and Policy Development to hear from you
› 4+ calls for you to give feedback
 Small group to discuss revising the program measures and creating methodology

7 › To all who joined the large group discussions
› To the small working group members:
 Patti Noonan
 Jim Frasier
 Susan Williamson
 Nikki Sandve
 Li Walter
 Ed Caffarella
 Jon Dyson
 Julie Morrison

8  Monthly Webinars – “Directors’ Calls”
› Professional Development Series
 Evaluator Community of Practice
 Resource Library
 “Regional Meetings”
 Project Directors’ Conference
 PLCs

9  Directors’ Calls focused on the specific measures and overall “how-to”
› Please provide feedback about the information and assistance you need
 Written guidance and tools to assist you
 Continuation reporting guidance webinar that will focus on the program measures
 We will be learning from you about necessary flexibility (feedback loop)

10  Projects use evidence-based professional development practices to support the attainment of identified competencies.

11 To view the SPDG Regional Meeting Materials go to: http://signetwork.org/content_pages/27

12 Go to the home page for links to each webinar segment: http://signetwork.org

13  Models of and Evaluating Professional Development
› Date: January 12, 3:00-4:30pm ET
› Speakers: Julie Morrison, Alan Wood, & Li Walter (SPDG evaluators)
 SPDG Regional Meetings
› Topic: Evidence-Based Professional Development

14  Innovation Fluency
› Date: March 24, 3:00-4:30pm ET
› Speaker: Karen Blase, SISEP
 Professional Development for Administrators
› Date: April 19, 3:00-4:30pm ET
› Speakers: Elaine Mulligan, NIUSI Leadscape; Rich Barbacane, National Association of Elementary School Principals
 Using Technology for Professional Development
› Date: May 18, 2:00-3:30pm ET
› Speaker: Chris Dede, Ph.D., Learning Technologies at Harvard’s Graduate School of Education

15 Two Types of Evidence-Based Practices
Evidence-Based Intervention Practices
 Insert your SPDG initiative here (identified competencies)
Evidence-Based Implementation Practices
 Professional Development
 Staff Competence: Selection, Training, Coaching, and Performance Assessment Drivers
 Adult learning methods/principles
 Evaluation

16 How?

17  The Program Guide articulates a comprehensive set of practices for all stakeholders.
Implementation Practices:
 Initial Training (team-based, site-level)
 Practice and Implementation (Implementation Rubric facilitates self-evaluation)
 Ongoing Coaching
 Booster Trainings (Implementation Rubric reflection on next steps)
Intervention Practices – The 5 Steps of ERIA:
 Data-informed Decision-making
 Screening and Assessment
 Progress Monitoring
 Tiered Interventions and Learning Supports
 Enhanced Literacy Instruction

18  Program Guide articulates the PD model
› introduces and illustrates
› contextualizes the training
› gets away from “you had to be there”
 Implementation Rubric operationalizes the PD model
› drives ongoing implementation
› enables fidelity checks
› is possible to evaluate
 Everyone is on the same page
 Sustainability (beyond funding, staff turnover)
 Scale-up (recruit new sites/districts, beyond SPDG)
 Diversity of approaches enabled

19 How?

20  Training must be…
› Timely
› Theory grounded (adult learning)
› Skill-based
 Information from Training feeds back to Selection and feeds forward to Coaching: Selection → Training → Coaching (Blase, VanDyke, & Fixsen, 2010)

21  Design a Coaching Service Delivery Plan
 Develop accountability structures for Coaching – Coach the Coach!
 Identify ongoing professional development for coaches
Training → Coaching → Performance Assessment (Blase, VanDyke, & Fixsen, 2010)

22  Must be a transparent process
 Use of multiple data sources
 Fidelity of implementation should be assessed at the local, regional, and state levels
 Tied to positive recognition
 Information from this driver feeds back to Selection, Training, and Coaching and feeds forward to the Organization Drivers

23  “No intervention practice, no matter what its evidence base, is likely to be learned and adopted if the methods and strategies used to teach or train students, practitioners, parents, or others are not themselves effective.” (Dunst & Trivette, 2009, “Let’s Be Pals: An Evidence-Based Approach to Professional Development”)

24 Using Research Findings to Inform Practical Approaches to Evidence-Based Practices Carl J. Dunst, Ph.D. Carol M. Trivette, Ph.D. Orelena Hawks Puckett Institute Asheville and Morganton, North Carolina Presentation Prepared for a Webinar with the Knowledge Transfer Group, U.S. Department of Health and Human Services, Children’s Bureau Division of Research and Innovation, September 22, 2009

25  “Adult learning refers to a collection of theories, methods, and approaches for describing the characteristics of and conditions under which the process of learning is optimized.”

26 Planning
 Introduce: Engage the learner in a preview of the material, knowledge, or practice that is the focus of instruction or training
 Illustrate: Demonstrate or illustrate the use or applicability of the material, knowledge, or practice for the learner
Application
 Practice: Engage the learner in the use of the material, knowledge, or practice
 Evaluate: Engage the learner in a process of evaluating the consequence or outcome of the application of the material, knowledge, or practice
Deep Understanding
 Reflection: Engage the learner in self-assessment of his or her acquisition of knowledge and skills as a basis for identifying “next steps” in the learning process
 Mastery: Engage the learner in a process of assessing his or her experience in the context of some conceptual or practical model or framework, or some external set of standards or criteria
(a) Donovan, M. et al. (Eds.) (1999). How people learn. Washington, DC: National Academy Press.

27  The smaller the number of persons participating in a training (<20), the larger the effect sizes for the study outcomes.
 The more hours of training over an extended number of sessions, the better the study outcomes.
 The practices are similarly effective when used in different settings with different types of learners.

28 Effect Sizes for Introducing Information to Learners
 Pre-class exercises: 9 studies, 9 effect sizes, mean d = 1.02 (95% CI = .63 to 1.41)
 Out-of-class activities/self-instruction: 12 studies, 20 effect sizes, mean d = .76 (95% CI = .44 to 1.09)
 Classroom/workshop lectures: 26 studies, 108 effect sizes, mean d = .68 (95% CI = .47 to .89)
 Dramatic readings: 18 studies, 40 effect sizes, mean d = .35 (95% CI = .13 to .57)
 Imagery: 7 studies, 18 effect sizes, mean d = .34 (95% CI = .08 to .59)
 Dramatic readings/imagery: 4 studies, 11 effect sizes, mean d = .15 (95% CI = −.33 to .62)

29 Effect Sizes for Illustrating/Demonstrating the Learning Topic
 Using learner input for illustration: 6 studies, 6 effect sizes, mean d = .89 (95% CI = .28 to 1.51)
 Role playing/simulations: 20 studies, 64 effect sizes, mean d = .87 (95% CI = .58 to 1.17)
 Real-life example/real life + role playing: 6 studies, 10 effect sizes, mean d = .67 (95% CI = .27 to 1.07)
 Instructional video: 5 studies, 49 effect sizes, mean d = .33 (95% CI = .09 to .59)

30 Effect Sizes for Learner Application
 Real-life application + role playing: 5 studies, 20 effect sizes, mean d = 1.10 (95% CI = .48 to 1.72)
 Problem-solving tasks: 16 studies, 29 effect sizes, mean d = .67 (95% CI = .39 to .95)
 Real-life application: 17 studies, 83 effect sizes, mean d = .58 (95% CI = .35 to .81)
 Learning games/writing exercises: 9 studies, 11 effect sizes, mean d = .55 (95% CI = .11 to .99)
 Role playing (skits, plays): 11 studies, 35 effect sizes, mean d = .41 (95% CI = .21 to .62)

31 Effect Sizes for Learner Evaluation
 Assess strengths/weaknesses: 14 studies, 48 effect sizes, mean d = .96 (95% CI = .67 to 1.26)
 Review experience/make changes: 19 studies, 35 effect sizes, mean d = .60 (95% CI = .36 to .83)

32 Effect Sizes for Learner Reflection
 Performance improvement: 9 studies, 34 effect sizes, mean d = 1.07 (95% CI = .69 to 1.45)
 Journaling/behavior suggestion: 8 studies, 17 effect sizes, mean d = .75 (95% CI = .49 to 1.00)
 Group discussion about feedback: 16 studies, 29 effect sizes, mean d = .67 (95% CI = .39 to .95)

33 Effect Sizes for Self-Assessment of Learner Mastery
 Standards-based assessment: 13 studies, 44 effect sizes, mean d = .76 (95% CI = .42 to 1.10)
 Self-assessment: 16 studies, 29 effect sizes, mean d = .67 (95% CI = .39 to .95)
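To make the confidence intervals in the preceding tables easier to compare, here is a minimal Python sketch (not part of the original slides) that lists a few of the reported values and flags practices whose 95% CI excludes zero; the numbers are transcribed from the slides above, and the variable names are illustrative only.

```python
# Minimal sketch: compare reported mean effect sizes and flag practices whose
# 95% confidence interval excludes zero (i.e., a reliably positive effect).
# Values are transcribed from the slides above; names are illustrative.
practices = [
    # (practice, mean_d, ci_low, ci_high)
    ("Pre-class exercises",               1.02,  0.63, 1.41),
    ("Real-life application + role play", 1.10,  0.48, 1.72),
    ("Role playing/simulations",          0.87,  0.58, 1.17),
    ("Instructional video",               0.33,  0.09, 0.59),
    ("Dramatic readings/imagery",         0.15, -0.33, 0.62),
]

# Sort from largest to smallest mean effect size and report CI coverage.
for name, d, lo, hi in sorted(practices, key=lambda p: p[1], reverse=True):
    reliable = lo > 0  # CI excluding zero suggests a reliably positive effect
    print(f"{name:35s} d={d:5.2f}  95% CI [{lo:5.2f}, {hi:5.2f}]  "
          f"{'reliable' if reliable else 'CI includes zero'}")
```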

34  To be most effective, professional development needs to actively involve the learners in judging the consequences of their learning experiences (evaluation, reflection, and mastery)
› Learners need to participate in learning the new knowledge or practice
› Learners need to be engaged in judging their own experience in learning and using the new material

35  Definition: Innovation Fluency refers to the degree to which we know the innovation with respect to:
› Evidence
› Program and Practice Features
› Implementation Requirements

36  After you…
› Have chosen based on student needs
› Looked for “best evidence” to address the need
 An Evidence-Based Practice or Program
 An Evidence-Informed Initiative or Framework
 Systems Change and Its Elements

37  After you…
› Have chosen based on student needs
› Looked for “best evidence” to address the need
 An Evidence-Based Practice or Program
 An Evidence-Informed Initiative or Framework
 Systems Change and Its Elements
 Then it’s time to:
› Clearly identify and operationalize the elements

38 Professional Problem Solving: 9 Critical Components
 Parent Involvement
 Problem Statement
 Systematic Data Collection
 Problem Analysis
 Goal Development
 Intervention Plan Development
 Intervention Plan Implementation
 Progress Monitoring
 Decision Making
(Professional Practices in Problem Solving: Benchmarks and Innovation Configurations, Iowa Area Education Agency Directors of Special Education, 1994)

39 Interaction of Leadership and Implementation Support Drivers Regarding Administrators
Purpose: To develop project capacity (e.g., data systems, information resources, incentives) and competency (e.g., selection, training, coaching) so administrators can implement practices with success
 Project level providing leadership: Develop systems for district and building administrators to implement practices with success
 District level providing leadership: Develop systems for building administrators to implement practices with success
 Building level providing leadership: Develop systems for building staff to implement practices with success

40 MiBLSi Statewide Structure of Support – who is supported, and how support is provided:
 Michigan Department of Education/MiBLSi Leadership → multiple district/building teams across the state: provides guidance, visibility, funding, and political support for MiBLSi
 ISD Leadership Team (regional technical assistance) → multiple schools within an intermediate district: provides guidance, visibility, funding, and political support
 LEA District Leadership Team → multiple schools within a local district: provides coaching for District Teams and technical assistance for Building Teams
 Building Leadership Team → all staff: provides guidance and manages implementation
 Building Staff → all students: provides effective practices to support students
 Students: improved behavior and reading

41 Developing Capacity Through “Manualization”
 Manuals are created to provide information and tools for implementation
 Various levels:
› District level
› Building level

42 Developing Capacity Through “Practice Profiles” (Implementation Guides)
Implementation Guides have been developed for:
 Positive Behavioral Interventions and Supports at the building level
 Reading supports at the building level
 Building Leadership Team
 District Leadership Team
Quick Guides have been developed for:
 Principals
 Coaches

43 Practice Profile: Building Leadership Team (example)

44 To Capture These Professional Development Elements
 Created a rubric for evidence-based professional development
 Implementation drivers = domains
 Each domain has components
 Each component will be measured by a panel of external evaluators
 Evaluators will likely be chosen from the U.S. Department of Education

45

46 Rules for developing good rubrics (Zhang & Fiore, 2011)
Process:
 Determine what products/practices will be evaluated.
 Define each dimension and its associated indicators.
 Determine a scale for describing the range of products/practices.
 Write descriptors for each of the categories.
 Pilot test with users and revise (an iterative process).
 Train the evaluators and check inter-rater reliability.
 Finalize the rubric and share it with evaluatees.
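As a concrete illustration of the inter-rater reliability check mentioned above, here is a minimal Python sketch (not from the slides; the ratings are invented) that computes percent agreement and Cohen's kappa for two evaluators scoring the same rubric components on a 1-4 scale.

```python
from collections import Counter

# Hypothetical ratings: two evaluators score the same rubric components (1-4 scale).
rater_a = [4, 3, 3, 2, 4, 1, 3, 2, 4, 3]
rater_b = [4, 3, 2, 2, 4, 2, 3, 2, 3, 3]
n = len(rater_a)

# Observed agreement: proportion of components rated identically.
observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n

# Expected chance agreement, from each rater's marginal rating distribution.
dist_a, dist_b = Counter(rater_a), Counter(rater_b)
expected = sum((dist_a[k] / n) * (dist_b[k] / n) for k in set(dist_a) | set(dist_b))

# Cohen's kappa corrects observed agreement for chance agreement.
kappa = (observed - expected) / (1 - expected)

print(f"Percent agreement: {observed:.2f}")
print(f"Cohen's kappa:     {kappa:.2f}")
```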

47 Use of the Rubric
 4 domains, each with 6 components:
› Selection
› Training
› Coaching
› Performance Assessment
 Components drawn from the National Implementation Research Network, Learning Forward (NSDC), and Guskey
 Each component of the domains will be rated from 1 to 4

48 Component Themes
 Assigning responsibility for major professional development functions (e.g., measuring fidelity and outcomes; monitoring coaching quality)
 Expectations stated for all roles and responsibilities (e.g., PD participants, trainers, coaches, school and district administrators)
 Data for each stage of PD (e.g., selection, training, implementation, coaching, outcomes)

49 PD Needed for SPDG Project Personnel
 Adult learning principles for coaches
 More on adult learning principles for training
 How to create a professional development plan
 How to describe the elements of the professional development plan

50 What initiatives will you report on? (from project feedback)
1. “If a SPDG has 1 Goal/Initiative, they report on the performance measure for it. If a SPDG has two Goals/Initiatives, they report on one. If they have three, they report on two. If they have four, they report on two. The pattern would be that SPDGs would report on half of their Goals/Initiatives if they have an even number of them and would report on 2 of 3, 3 of 5, 4 of 7, etc. for odd numbers. This is a simple method which could easily be applied.”

51  “An alternative would be to have SPDGs report on the measures for any Goals/Initiatives which involve PD which includes workshops/conferences designed not to just impart knowledge (Awareness Level of Systems Change Theory) but to implement an evidence-based practice (e.g., SWPBIS, Reading Strategies, Math Strategies, etc.) …”

52  “I think having the OSEP Project Officers assigned to the various states negotiate with their respective state SPDG directors which of their SPDG initiatives are appropriate for this measure. This could be done each year immediately after the annual project report is submitted to OSEP by the SPDG Directors via a phone call and email exchanges. If the negotiation could be completed in the early summer, individual meetings (if necessary) could be conducted at the OSEP Project Directors’ Conference in July. This type of negotiation could provide OSEP Project Officers with information necessary for making informed decisions about each state SPDG award for the upcoming year.”

53 We will do…
 A combination of the three ideas:
› We will only have you report on those initiatives that lead to implementation (of the practice/program you are providing training on).
› If you have 1 or 2 of these initiatives, you will report on all of them. If you have 3 or 4, you will report on 2; if you have 5, report on 3; and so on.
› This is all per discussion with your Project Officer.
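A small Python sketch (not from the slides) of the reporting rule just described, assuming the pattern continues as "report on all initiatives up to two; otherwise report on half, rounded up":

```python
import math

def initiatives_to_report(total_initiatives: int) -> int:
    """Number of implementation-focused initiatives to report on.

    Assumption: with one or two qualifying initiatives you report on all of
    them; with three or more you report on half, rounded up
    (3 -> 2, 4 -> 2, 5 -> 3, ...), per the pattern described in the slides.
    """
    if total_initiatives <= 2:
        return total_initiatives
    return math.ceil(total_initiatives / 2)

# Quick check of the pattern for 1 through 7 initiatives.
for n in range(1, 8):
    print(n, "->", initiatives_to_report(n))
```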

54 Setting Benchmarks
 “Perhaps an annual target could be that at least 60% of all practices receive a score of 3 or 4 in the first year of a five-year SPDG funding cycle; at least 70% of all practices receive a score of 3 or 4 in the second year of funding; at least 80% of all practices receive a score of 3 or 4 in the third year of funding; at least 90% of all practices receive a score of 3 or 4 in the fourth and fifth year of funding.”

55 We will do…
 This basic idea:
› 1st year of funding: baseline
› 2nd year: 50% of components will have a score of 3 or 4
› 3rd year: 70% of components will have a score of 3 or 4
› 4th year: 80% of components will have a score of 3 or 4
› 5th year: 80% of components will have a score of 3 or 4 (maintenance year)
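A minimal Python sketch (not from the slides; the scores are invented) showing how a project might check its component ratings against these yearly benchmarks, assuming the rubric's 4 domains of 6 components are each scored 1-4:

```python
# Hypothetical rubric scores: 4 domains x 6 components, each rated 1-4.
scores = {
    "Selection":              [4, 3, 3, 2, 4, 3],
    "Training":               [3, 3, 4, 2, 3, 4],
    "Coaching":               [2, 3, 3, 4, 2, 3],
    "Performance Assessment": [3, 2, 4, 3, 3, 2],
}

# Yearly targets: share of components scoring 3 or 4 (year 1 is baseline only).
targets = {2: 0.50, 3: 0.70, 4: 0.80, 5: 0.80}

# Pool all component scores and compute the share meeting the 3-or-4 criterion.
all_scores = [s for domain in scores.values() for s in domain]
share_meeting = sum(s >= 3 for s in all_scores) / len(all_scores)

for year, target in targets.items():
    status = "met" if share_meeting >= target else "not met"
    print(f"Year {year}: target {target:.0%}, observed {share_meeting:.0%} -> {status}")
```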

56 Other feedback
 Have projects fill out a worksheet with descriptions of the elements of their professional development system
› We will do this, and have the panel of evaluators work from this worksheet and any supporting documents the project provides
 Provide exemplars
› We will create practice profiles for each component to demonstrate what would receive a 4, 3, 2, or 1 rating

57 Ideas for Guidance
 “It would be helpful to have a few rows [of the rubric] completed as an example with rating scores provided.”
› We will do this
 Other ideas?

