
1 The Process we Used and What we Learned….
Renee Bradley, Judy Shanley – OSEP
Herb Baum, Danielle Schwarzmann – ICF Macro
2009 OSEP Project Director's Meeting, July 20th, 2009

2 Thank you to the projects that participated. It's never too early to think about performance measurement…

Performance Measures: Under the Government Performance and Results Act of 1993 (GPRA), the Department has established a set of performance measures, including long-term measures, that are designed to yield information on various aspects of the effectiveness and quality of the Technical Assistance and Dissemination to Improve Services and Results for Children with Disabilities program. These measures focus on the extent to which projects provide high quality products and services, the relevance of project products and services to educational and early intervention policy and practice, and the use of products and services to improve educational and early intervention policy and practice. The grantee will be required to provide information related to these measures. The grantee also will be required to report information on the project's performance in annual reports to the Department (34 CFR 75.590).

 This language appears in applications, work scopes, and reporting requirements.

3
 Provides an aggregate picture of our performance
 Potentially affects future program funding
 Supports the profession of TA&D
 Enhances the legitimacy of the field of TA&D

4
 Annual
◦ Quality, usefulness, relevance, cost (efficiency)
 Long-term
◦ Implementation of evidence-based practices
◦ Implementation of model demonstrations

5 Annual Measure: Percentage of Special Education Technical Assistance and Dissemination (TA&D) products and services deemed to be of high quality by an independent (science) review panel (annual).
 Substance:
◦ Does the product or service reflect an evidence-based approach or one grounded in current legislation or policy?
 Communication:
◦ Is the presentation of the product or service description clear, well formatted, and organized?

6 Annual Measure: Percentage of Special Education Technical Assistance and Dissemination (TA&D) products and services deemed to be useful by an independent (science) review panel (annual).
 Ease:
◦ Is the product or service description easily understood, providing guidance and direction?
 Replicability:
◦ Will the product or service eventually be used by the target group to achieve the intended benefit?
 Sustainability:
◦ Will the product or service eventually be used in more than one setting and over time?

7 Annual Measure: Percentage of Special Education Technical Assistance and Dissemination (TA&D) products and services deemed to be relevant by an independent (science) review panel (annual).
 Need:
◦ Does the product or service solve an important problem or address a critical issue?
 Pertinence:
◦ Does the product or service relate directly to the problem or issue facing the targeted group?
 Reach:
◦ Does the product or service apply to diverse populations within the target group?

8 How Is Cost Defined?
 OMB requests each Federal program to provide information on cost.
◦ Cost is defined as the $ per unit of output.
◦ Cost includes labor and other direct costs.
◦ Cost includes only those expenses incurred by the Federal project.
◦ For TA&D, this is the cost of producing the 'best' practice/service.
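
To make the definition concrete, here is a minimal sketch of the calculation, assuming a simple (labor + other direct costs) / units formula; the function name and figures are illustrative assumptions, not OSEP data or methodology.

```python
def cost_per_unit(labor_cost: float, other_direct_costs: float, units_of_output: int) -> float:
    """Dollars per unit of output, counting only expenses incurred by the Federal project."""
    return (labor_cost + other_direct_costs) / units_of_output

# Illustrative example: a 'best' practice guide produced once, with
# $18,000 in labor and $2,000 in other direct costs.
print(cost_per_unit(18_000, 2_000, 1))  # 20000.0
```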

9
 Implemented this year for the TA&D program for the first time
 Twenty-two TA&D projects and ten State Deaf-Blind projects randomly selected
 Two long-term measures:
◦ Implementation of evidence-based practices
◦ Implementation of model-demonstration projects

10
 Percentage of school districts and service agencies receiving Special Education TA&D services regarding scientifically based or evidence-based practices for infants, toddlers, children, and youth with disabilities that implement those practices (long-term).
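
As an illustration (the figures are hypothetical, not program data): if 50 school districts received TA&D services on an evidence-based practice and 40 of them went on to implement it, the measure would be 40/50 = 80%.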

11
 The percentage of TA&D projects responsible for developing models that identify, implement, and evaluate effective models (long-term).

12 Dr. Herbert Baum and Danielle Schwarzmann, ICF Macro

13
 Selection
◦ Random selection
 Using a placemat
◦ Additional sites suggested by OSEP to ensure evidence-based practice areas were represented
 Data Collection
◦ Initial e-mail
◦ Follow-up/reminder e-mails
◦ Phone calls when necessary

14
1. E-mailed each project requesting its 'best' product/service (including costs) and a list of its services/products
2. Randomly selected a product/service as a 'typical' submission (see the sketch below)
3. Requested that the program complete the 'typical' description
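
A minimal sketch of the random draw in step 2, assuming a simple uniform choice from the submitted list; the item names and seed are illustrative, and the actual ICF Macro selection procedure is not specified in these slides.

```python
import random

# Hypothetical list of products/services a project might submit.
submitted_list = [
    "Practice brief on reading interventions",
    "Webinar series for state coordinators",
    "On-site coaching for district teams",
]

rng = random.Random(2009)  # fixed seed so the draw can be reproduced
typical_item = rng.choice(submitted_list)
print(f"Request a 'typical' description for: {typical_item}")
```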

15
 Products/services are evaluated against three criteria (a rollup sketch follows the list):
1. Quality
 Substance
 Communication
2. Relevance
 Need
 Pertinence
 Reach
3. Usefulness
 Ease
 Likelihood of Use
 Replicability
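
A sketch of how panel ratings might roll up into the annual percentages. The criteria and sub-criteria match the list above; the rule that a product must pass every sub-criterion is an assumption, since the slides do not state how ratings combine.

```python
# Sub-criteria per criterion, as listed on slide 15.
CRITERIA = {
    "Quality": ["Substance", "Communication"],
    "Relevance": ["Need", "Pertinence", "Reach"],
    "Usefulness": ["Ease", "Likelihood of Use", "Replicability"],
}

def percent_meeting(reviews, criterion):
    """Percentage of reviewed products/services rated favorably on every
    sub-criterion of the given criterion (assumed aggregation rule)."""
    passed = sum(
        all(review[criterion][sub] for sub in CRITERIA[criterion])
        for review in reviews
    )
    return 100 * passed / len(reviews)

# One hypothetical review: favorable on everything except 'Reach'.
reviews = [{
    "Quality": {"Substance": True, "Communication": True},
    "Relevance": {"Need": True, "Pertinence": True, "Reach": False},
    "Usefulness": {"Ease": True, "Likelihood of Use": True, "Replicability": True},
}]
print(percent_meeting(reviews, "Quality"))    # 100.0
print(percent_meeting(reviews, "Relevance"))  # 0.0
```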

16
 OMB requests each Federal program to provide information on cost.
 Cost is defined as the $ per unit of output.
 For TA&D, this is the cost of producing the 'best' practice/service.

17
 TA&D centers submitted forms indicating their practices/programs and where they were being implemented
 ICF Macro chose a practice and a place of implementation for which to complete a form

18 Response Rates

Best Product/Service Description
              Product   Service   Have not submitted   Response Rate
  TA&D           17        5              0                100%
  Deaf/Blind     10        0              0                100%

Typical Product/Service Description
              Product   Service   Have not submitted   Response Rate
  TA&D           12        4              6                 72%
  Deaf/Blind      7        2              1                 90%

List of Products/Services, Best Cost, and Measure 1.1
              List of Products/Services     Best Cost               Measure 1.1
              Subm.  Not subm.  Rate     Subm.  Not subm.  Rate    Subm.  Not subm.  Rate
  TA&D          20       2       91%      17       5        77%     22       0       100%
  Deaf/Blind     9       1       90%      10       0       100%      -       -         -
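
As a quick arithmetic check, each response rate equals submitted / (submitted + have not submitted); a short sketch using the TA&D figures from the table above:

```python
def response_rate(submitted, not_submitted):
    """Response rate as a percentage."""
    return 100 * submitted / (submitted + not_submitted)

print(round(response_rate(17 + 5, 0)))  # 100 -- best product/service description
print(round(response_rate(12 + 4, 6)))  # 73  -- typical description (slide reports 72%, consistent with truncation)
print(round(response_rate(20, 2)))      # 91  -- list of products/services
print(round(response_rate(17, 5)))      # 77  -- best cost
```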

19
 Consider using electronic submissions; hard copies may not need to be mailed
◦ Saves money on shipping
◦ Decreases response time
◦ Saves trees
 Submit accessible products; diminish the use of copyrighted products as exemplars
 Products, not links to websites, must be submitted
 Talk with your Project Officer and TA recipients about the program evaluation

20
 April
◦ Final review of methodology by OSEP
◦ Preparation of protocol materials
 May
◦ Select sample of projects
◦ Send requests to projects

21
 June
◦ Obtain descriptions of 'best' and 'typical' products and services
◦ Obtain lists of products and services
◦ Obtain products
◦ Obtain cost data
 July: Review by expert panels
 August: Generate measures
 September: Report findings and recommendations to OSEP

22
 Collect feedback from the 2009 samples and use it to improve the process
 Share results
 Discuss with grantees at kick-off meetings and during the year
 Explore ways to integrate content from annual continuation reports into the annual program evaluation
 Learn from other Federal agencies
 Support and engage OSEP Project Officers

23
 Comments, Questions, Concerns?
◦ What could we do to improve our process?
◦ What could we do to enhance definitions and instructions?
◦ What can we do to make life easier for you?

