Office of Special Education Programs April 13, 2017


Educational Technology, Media, and Materials Program
FY 2016 GPRA Performance Measures Briefing
Office of Special Education Programs
April 13, 2017

Briefing Objectives
- Enhance awareness of the Program GPRA Performance Measures requirements
- Provide an overview of the processes used to support ETechM2 Program GPRA performance data:
  - Collection and information gathering
  - Analysis
  - Reporting

ETechM2 Program GPRA Performance Measures Requirements

Government Performance and Results Act (GPRA)
- GPRA requires performance assessments of Government programs for purposes of assessing agency performance and improvement
- The Office of Management and Budget, together with the Federal agencies, determines how programs will be assessed
- Congress uses program performance assessment data to justify program funding

ETechM2 Program Goals and Objectives
Program Goal: To promote the development, demonstration, and use of ETechM2 program products and services to improve results for infants, toddlers, children, and youth with disabilities
- Objective 1: Improve the quality of Educational Technology, Media, and Materials projects.
- Objective 2: Improve projects' understanding of the expectations for this year's GPRA reporting process.

ETechM2 Program Measures
- Measure 1: The percentage of Educational Technology, Media, and Materials Program products and services judged to be of high quality by an independent review panel of experts qualified to review the substantive content of the products and services (Annual measure)
- Measure 2: The percentage of Educational Technology, Media, and Materials Program products and services judged by an independent review panel of qualified experts to be of high relevance to improving outcomes of infants, toddlers, children and youth with disabilities (Annual measure)
- Measure 3: The percentage of Educational Technology, Media, and Materials Program products and services judged by an independent review panel of qualified experts to be useful in improving results for infants, toddlers, children and youth with disabilities (Annual measure)

Development, Demonstration, and Use of Project Products and Services
- A product is a piece of work, in tangible or electronic form, developed and disseminated by an OSEP-funded project to inform a specific audience on a topic relevant to the improvement of outcomes for children with disabilities.
- A service is work performed by an OSEP-funded project to provide information or assistance to a specific audience on a topic relevant to the improvement of outcomes for children with disabilities.

ETechM2 Program Performance Measures Criteria
Measure 1: The percentage of Educational Technology, Media, and Materials Program products and services judged to be of high quality by an independent review panel of experts qualified to review the substantive content of the products and services.
- Substance: Does the product content or the content delivered through the service reflect evidence of conceptual soundness and quality, grounded in recent scientific evidence, legislation, policy, or accepted professional practice?
- Communication: Is the product content or the content delivered through the service presented in such a way as to be clearly understood, as evidenced by being well organized, free of editorial errors, and appropriately formatted?

Performance Measures (cont'd)
Measure 2: The percentage of Educational Technology, Media, and Materials Program products and services judged by an independent review panel of qualified experts to be of high relevance to improving outcomes of infants, toddlers, children and youth with disabilities.
- Need: Does the product content or the content delivered through the service attempt to solve an important problem or critical issue?
- Pertinence: Does the product content or the content delivered through the service tie directly to a problem or issue recognized as important by the target audience(s)?
- Reach: To what extent is the product content or the content delivered through the service applicable to diverse segments of the target audience(s)?

Performance Measures (cont'd)
Measure 3: The percentage of Educational Technology, Media, and Materials Program products and services judged by an independent review panel of qualified experts to be useful in improving results for infants, toddlers, children and youth with disabilities.
- Ease: Does the product content or the content delivered through the service address a problem or issue in an easily understood way, with directions or guidance regarding how the content can be used to address the problem or issue?
- Suitability: Does the product or service provide the target audience(s) with information or resources that can be used again or in different ways to address the problem or issue?

Performance Measures (cont'd)
There are additional measures specifically for our (327) accessible educational materials (AEM) projects, (327C) Media Description projects, and (327E) National Instructional Materials Access Center (NIMAC):
- Measure 4: The federal cost per unit of accessible educational materials funded by the Educational Technology, Media, and Materials Program.
- Measure 5: The federal cost per unit of video description funded by the Educational Technology, Media, and Materials Program.
- Measure 6: The federal cost per unit of accessible educational materials from the NIMAC funded by the Educational Technology, Media, and Materials Program.

Performance Measures (cont'd)
The cost measures are not reported in the APR; they are collected through an email request from the OSEP Project Officer.
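As a rough illustration of how a cost-per-unit measure of this kind is computed, the sketch below divides total federal funding by the number of units produced. The function name and all dollar and unit figures are hypothetical; the briefing does not specify the exact formula.

```python
def federal_cost_per_unit(total_federal_funds: float, units_produced: int) -> float:
    """Cost-per-unit measure: total federal dollars divided by units produced.

    Hypothetical sketch; the actual OSEP calculation may differ.
    """
    if units_produced <= 0:
        raise ValueError("units_produced must be positive")
    return total_federal_funds / units_produced

# Hypothetical figures, for illustration only:
# $500,000 in federal funds producing 2,000 accessible units
print(federal_cost_per_unit(500_000, 2_000))  # 250.0
```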

Overview of Data Collection, Analysis, and Reporting Processes

Collection Methodology
- The Study Group (TSG), OSEP's technical support contractor, coordinates data collection, analysis, and reporting
- Product and service descriptions and supporting materials are requested

Collection Methodology (cont'd)
Sample Selection (Annual Measures: Quality, Relevance, and Usefulness)
- Constitutes a census of all ETechM2 program Media Services grants and all other 84.327 grants receiving funds in FY 2016
- Does not include grants in the first year of operation

Collection Methodology (cont'd)
One data request to Media Services grants:
- New Media Product Description Guide
- Video clips

Two data requests to all other 84.327 grants:
- List of FY 2016 new products and services
- Description Guides for one randomly selected new product and one new service

For the 84.327 grants (not Media Services grants), TSG first asks for a list of up to 10 new products and 10 new services released in FY 2016. TSG randomly selects one item from each list; these are the new product and new service that will be reviewed.

When developing your lists, remember:
- You don't have to include 10 new products or services; 10 is just the maximum.
- Focus on listing the major products and services you released for the first time in FY 2016.
- Examples of products include software or hardware products, journal or informational articles, research reports, booklets, pamphlets (not marketing brochures about your project), web-based instructional materials, manuals, DVDs, CDs, multimedia kits or modules, and PowerPoint presentations.
- Examples of services include conducting training or technical assistance; providing captioning, video description, Braille, or other accessible formatting of text or media; leading and convening informational meetings; and responding to inquiries from a targeted population.
- For the purpose of this performance measurement review process, maintaining a website is not considered to be either a product or a service.
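The selection step above, one item drawn at random from each list of up to 10 new products and 10 new services, can be sketched as follows. The function name and the sample product/service lists are hypothetical; the briefing does not describe TSG's actual sampling procedure beyond "randomly selects one item from each list."

```python
import random

def select_for_review(new_products, new_services, seed=None):
    """Randomly select one new product and one new service for GPRA review.

    Each input list may contain up to 10 items; an empty list yields None
    (e.g., a project that released no new services). Hypothetical sketch.
    """
    rng = random.Random(seed)  # seed allows a reproducible draw for illustration
    product = rng.choice(new_products) if new_products else None
    service = rng.choice(new_services) if new_services else None
    return product, service

# Hypothetical project submission
products = ["research report", "web-based instructional module", "multimedia kit"]
services = ["captioning service", "technical assistance training"]
print(select_for_review(products, services, seed=42))
```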

Description Guides
- Guides are the primary source of information for the GPRA review
- Complete, detailed, and clearly written guides make it easier for panelists to rate product/service quality, relevance, and usefulness accurately
- Please consult the tips for completing description guides provided by TSG

The Media Product, New Product, and New Service Description Guides are very important to the GPRA review process, and they need your time and attention. The guides are the primary source of information consulted by the expert review panels in making their quality, relevance, and usefulness ratings. Although you have the option to submit supporting materials along with each guide, the panelists are not required to read through these materials in their entirety. Projects that submit complete, detailed, and clear guides make it easier for the expert review panels to rate product and service quality, relevance, and usefulness. If the guides are incomplete, you cannot expect that panelists will spend substantial amounts of time seeking out the information they need from your supporting materials. TSG has developed a set of tips for completing the guides that can be helpful; you have received, or will receive, a copy to consult when developing your guide(s). Keep in mind that TSG staff are always willing to have a conversation with you about this task and to review a draft of your response.

Data Analysis
Two expert panels review the products and services:
- ETechM2 Stakeholder Expert Panel: technology and media researchers, program administrators, and technical assistance providers
- Science Expert Panel: special education researchers with expertise in policy and evidence-based practice

Data Analysis (cont'd)
Assessment of products/services:
- Panelists rate products/services against each criterion for high quality, relevance, and usefulness using a 4-point scale ranging from very low to very high
- Products and services identified as evidence-based are reviewed for quality by the Science Expert Panel; those identified as policy-based are reviewed for quality by the Stakeholder Expert Panel
- All products and services are reviewed for relevance and usefulness by the Stakeholder Expert Panel
- Products and services with ratings of 6 or higher across the criteria are deemed of high quality, relevance, and usefulness
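A minimal sketch of the scoring logic above, assuming that the 4-point criterion ratings (1 = very low through 4 = very high) are summed across a measure's criteria and compared against the stated threshold of 6; the briefing does not spell out the exact aggregation rule, so the function names, the summing assumption, and the sample ratings below are all hypothetical.

```python
def is_high(criterion_ratings, threshold=6):
    """True if the summed 4-point criterion ratings meet the threshold.

    Assumes ratings are summed across criteria (hypothetical reading of
    "ratings of 6 or higher across the criteria"); panels may aggregate differently.
    """
    return sum(criterion_ratings) >= threshold

def percent_high(rated_items):
    """GPRA-style measure: percentage of reviewed items rated high."""
    high_count = sum(1 for ratings in rated_items if is_high(ratings))
    return 100.0 * high_count / len(rated_items)

# Hypothetical Measure 1 ratings (Substance, Communication) for four products:
# sums are 7, 4, 6, 8 -> three of four meet the threshold of 6
ratings = [(4, 3), (2, 2), (3, 3), (4, 4)]
print(percent_high(ratings))  # 75.0
```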

GPRA Data Reporting

ETechM2 Program GPRA Performance Data Reporting
1. TSG completes data collection and analysis and submits draft FY 2016 data to OSEP
2. OSEP submits FY 2016 data to ED's Office of Planning, Evaluation and Policy Development (OPEPD, Budget Services)
3. OPEPD submits FY 2016 data to the Office of Management and Budget (OMB)
4. OMB issues GPRA performance data to Congress and the public

Questions?

Note of Thanks
- Educational Technology, Media, and Materials Program Project Officers and Directors
- Dr. Patricia Gonzalez, COR, OSEP
- The Study Group: Dr. Patti Bourexis or Mr. Larry Law (StudygroupETM2@aol.com)