
1 Grant Management PLC Session
Discussion facilitated by Jennifer Coffey, November 2011
Performance Measurement Discussion
Dial-in: 1-888-447-7153, Participant Code: 899594

2 Webinar Ground Rules
Mute your phones:
- To mute or un-mute, press *6
- Please do not put your phones on ‘Hold’
For webinar technical difficulties:
- Send email to adesjarl@uoregon.edu
Q & A process (audio/chat), ask questions in two ways:
1. Audio/voice
2. Type your question in the Chat Pod
Archive recording, PPT, and materials:
- To be posted to http://signetwork.org/events/108

3 Click the Person icon to:
- Raise your hand
- Agree/disagree
- Other
Click Full Screen to maximize the presentation screen.

4 Click this icon to:
- Send private messages
- Change text size or chat color
To post chat messages: type your message in the box, then hit ‘Enter’ on your keyboard to send.

5 New Resources Posted
1) Performance Reports page: Maine’s Final Report (2006-2011), http://signetwork.org/content_pages/139
2) Grant Management page: OSEP Key Resources for Grant Management (10/2011), http://signetwork.org/content_pages/139

6 Next Call
January 23, 2012, 2:00-3:00pm Eastern
Topic: Effective Online Project Management Tools
- Audrey will demonstrate Basecamp
- Round-robin sharing

7 Discussion Questions
1. How will you need to change your professional development plan to capture the components shared in the evidence-based professional development rubric?
2. Do you have an implementation fidelity measure? If not, how do you plan to find one or create one?
3. How do you track your technical assistance funding? Specifically, are initial training costs separated from ongoing technical assistance costs?

8 Performance Measurement 1: Projects use evidence-based professional development practices to support the attainment of identified competencies.
Performance Measurement 2: Participants in SPDG professional development demonstrate improvement in implementation of SPDG-supported practices over time.

9 Performance Measurement 3: Projects use SPDG professional development funds to provide follow-up activities designed to sustain the use of SPDG-supported practices. (Efficiency Measure)
Performance Measurement 4: Highly qualified special education teachers who have participated in SPDG-supported special education teacher retention activities remain as special education teachers two years after their initial participation in these activities.

10
- 2007 grantees will not be using the new program measures.
- Everyone else will have one year for practice:
  - They will use the revised measures this year for their APR.
  - This continuation report will be a pilot.
- OSEP will learn from this round of reports and make changes as appropriate.
  - Your feedback will be appreciated.
  - You may continue to report on the old program measures if you like.

11 Performance Measurement 1: Projects use evidence-based professional development practices to support the attainment of identified competencies.

12 Two Types of Evidence-Based Practices
Evidence-based intervention practices:
- Insert your SPDG initiative here (identified competencies)
Evidence-based implementation practices:
- Professional development
- Staff competence: Selection, Training, Coaching, and Performance Assessment drivers
- Adult learning methods/principles
- Evaluation

13 How?

14 The Program Guide articulates a comprehensive set of practices for all stakeholders.
Implementation practices:
- Initial training (team-based; site-level practice and implementation; Implementation Rubric facilitates self-evaluation)
- Ongoing coaching
- Booster trainings (Implementation Rubric reflection on next steps)
Intervention practices:
- The 5 Steps of ERIA
- Data-informed decision-making
- Screening and assessment
- Progress monitoring
- Tiered interventions and learning supports
- Enhanced literacy instruction

15 Training must be:
- Timely
- Theory grounded (adult learning)
- Skill-based
Information from Training feeds back to Selection and feeds forward to Coaching (Selection → Training → Coaching).
(Blase, VanDyke, & Fixsen, 2010)

16 Coaching
- Design a Coaching Service Delivery Plan
- Develop accountability structures for coaching: coach the coach!
- Identify ongoing professional development for coaches
[Driver diagram: Training → Coaching → Performance Assessment]
(Blase, VanDyke, & Fixsen, 2010)

17 Performance Assessment
- Must be a transparent process
- Use of multiple data sources
- Fidelity of implementation should be assessed at the local, regional, and state levels
- Tied to positive recognition
- Information from this driver feeds back to Selection, Training, and Coaching and feeds forward to the Organization Drivers

18 Evaluation
- Assess fidelity of implementation at all levels and respond accordingly.
- Identify outcome measures that are:
  - Intermediate and longer-term
  - Socially valid
  - Technically adequate: reliable and valid (a minimal reliability check is sketched below)
  - Relevant data that are feasible to gather, useful for decision making, widely shared, and reported frequently
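
As one concrete reading of "technically adequate: reliable," a project might check an instrument's internal consistency. The sketch below computes Cronbach's alpha in plain Python; the ratings, the item count, and the 0.70 rule of thumb are illustrative assumptions, not part of the SPDG measures.

```python
# Minimal sketch, not part of the SPDG measures: an internal-consistency
# check (Cronbach's alpha) for a rubric or survey instrument. The scores,
# item count, and the 0.70 rule of thumb below are illustrative assumptions.

def cronbach_alpha(item_scores):
    """item_scores: one list per respondent, one numeric score per item."""
    k = len(item_scores[0])  # number of items on the instrument

    def variance(values):  # sample variance (n - 1 denominator)
        mean = sum(values) / len(values)
        return sum((v - mean) ** 2 for v in values) / (len(values) - 1)

    item_vars = [variance([row[i] for row in item_scores]) for i in range(k)]
    total_var = variance([sum(row) for row in item_scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Hypothetical ratings: 4 respondents x 3 rubric items, each scored 1-5.
ratings = [[4, 5, 4], [3, 4, 3], [5, 5, 4], [2, 3, 2]]
print(f"Cronbach's alpha = {cronbach_alpha(ratings):.2f}")  # >= 0.70 is a common rule of thumb
```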

19 “Adult learning refers to a collection of theories, methods, and approaches for describing the characteristics of and conditions under which the process of learning is optimized.”

20 Planning:
- Introduce: Engage the learner in a preview of the material, knowledge, or practice that is the focus of instruction or training.
- Illustrate: Demonstrate or illustrate the use or applicability of the material, knowledge, or practice for the learner.
Application:
- Practice: Engage the learner in the use of the material, knowledge, or practice.
- Evaluate: Engage the learner in a process of evaluating the consequence or outcome of the application of the material, knowledge, or practice.
Deep Understanding:
- Reflection: Engage the learner in self-assessment of his or her acquisition of knowledge and skills as a basis for identifying “next steps” in the learning process.
- Mastery: Engage the learner in a process of assessing his or her experience in the context of some conceptual or practical model or framework, or some external set of standards or criteria.
Source: Donovan, M., et al. (Eds.) (1999). How people learn. Washington, DC: National Academy Press.

21 1. How will you need to change your professional development plan to capture the components shared in the evidence-based professional development rubric?

22 Performance Measurement 2: Participants in SPDG professional development demonstrate improvement in implementation of SPDG-supported practices over time.

23 Fidelity of implementation is traditionally defined as “the extent to which the user’s current practice matches the ideal” (Loucks, 1983).

24
- Dusenbury, Brannigan, Falco, & Hansen (2003)
- Dane & Schneider (1998)
- O’Donnell (2005)
- Blase, “Innovation Fluency” presentation: http://signetwork.org/content_pages/154
- Mowbray, Holter, Teague, & Bybee (2003)

25
- “All five studies consistently showed statistically significantly higher outcomes when the program was implemented with greater fidelity.”
- “The studies reviewed here suggest that fidelity of implementation is more probable when an intervention manual is in place that clearly defines the critical components of the intervention and articulates a theory.”
- “Distinctions should be made between measuring fidelity to the structural components of a curriculum intervention and fidelity to the processes that guide its design.”

26
- Projects will report on those initiatives that they are reporting on for Program Measure 1.
- Each initiative should have a fidelity measure that notes the presence or absence of the core features of the innovation/program/system that the initiative is focused on.

27 Use implementation measures that have already been created. For example:
- The new RTI implementation measure presented to the RTI PLC
- Literacy implementation: Planning and Evaluation Tool-Revised (PET-R)
- Schoolwide Evaluation Tool (SET)
- Others?

28
- To develop fidelity criteria, researchers often reported starting with a curriculum profile or analysis that outlined the critical components of the intervention along with an indication of the range of variations for acceptable use. The researcher or developer then outlined acceptable ranges of variation (Songer & Gotwals, 2005).
- A component checklist was then developed to record fidelity to these components (Hall & Loucks, 1977).
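
To make that concrete, here is a minimal sketch of such a component checklist in Python. The component names and the percent-present scoring rule are hypothetical assumptions; established instruments such as the SET or PET-R mentioned earlier define their own components and scoring.

```python
# Minimal sketch of a component checklist: record the presence or absence
# of an intervention's critical components and report a simple fidelity
# score. Component names and the percent-present scoring rule are
# hypothetical assumptions; real instruments define their own.

CRITICAL_COMPONENTS = [
    "universal screening administered",
    "progress monitoring schedule followed",
    "tiered interventions delivered as designed",
    "data team meets to review decisions",
]

def fidelity_score(observed):
    """observed: dict mapping component name -> True (present) / False (absent)."""
    present = sum(1 for c in CRITICAL_COMPONENTS if observed.get(c, False))
    return present / len(CRITICAL_COMPONENTS)

site_observation = {
    "universal screening administered": True,
    "progress monitoring schedule followed": True,
    "tiered interventions delivered as designed": False,
    "data team meets to review decisions": True,
}
print(f"Fidelity: {fidelity_score(site_observation):.0%}")  # -> Fidelity: 75%
```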

29 2. Do you have an implementation fidelity measure? If not, how do you plan to find one or create one?

30 Performance Measurement 3: Projects use SPDG professional development funds to provide follow-up activities designed to sustain the use of SPDG-supported practices. (Efficiency Measure)

31 3. How do you track your technical assistance funding? Specifically, are initial training costs separated from ongoing technical assistance costs?
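
One way to keep that split auditable is to tag each expenditure with a category when it is recorded, so the initial-training versus ongoing-TA totals fall out of a simple sum. The sketch below is a hypothetical illustration in Python; the category names and amounts are assumptions, not an OSEP reporting format.

```python
# Minimal sketch: tag each professional development expenditure as either
# "initial training" or "ongoing TA" so the split asked about above falls
# out of a simple sum. Categories and amounts are hypothetical.

from collections import defaultdict

expenditures = [
    {"desc": "kickoff training venue",   "category": "initial training", "amount": 4200.00},
    {"desc": "trainer travel",           "category": "initial training", "amount": 1150.00},
    {"desc": "coaching site visits, Q2", "category": "ongoing TA",       "amount": 2600.00},
    {"desc": "booster webinar platform", "category": "ongoing TA",       "amount": 300.00},
]

totals = defaultdict(float)
for item in expenditures:
    totals[item["category"]] += item["amount"]

for category, amount in sorted(totals.items()):
    print(f"{category}: ${amount:,.2f}")
# initial training: $5,350.00
# ongoing TA: $2,900.00
```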

