Table 5 and Follow-up Collecting and Reporting Follow-up Data Or What’s Behind Table 5? American Institutes for Research February 2005.



Table 5: Outcomes
- Earned a high school diploma or GED
- Entered postsecondary school
- Entered employment
- Retained employment

Table 5: Outcomes
States submit six (6) data points for each of the four outcomes. The table's columns are:
(A) Core Follow-up Outcome Measures
(B) # Participants With Main or Secondary Goal
(C) # Participants Included in Survey
(D) # Participants Responding to Survey or Used for Data Matching
(E) Response Rate or % Available for Match
(F) # Participants Achieving Outcome
(G) % Achieving Outcome

Table 5: Column (B)
All participants who had the outcome as a goal and who exited during the program year.

Table 5: Column (C)
The number of participants who were surveyed to assess whether they had achieved the outcome.
Note: C should be equal to B unless sampling was used, in which case C would be less than B. For data matching, C = [BLANK].

Table 5: Column (D)
Of those surveyed or data matched, the number who responded or who were used for the data match.
Note: D should be less than or equal to both B and C.

Table 5: Column (E)
Percent of respondents or participants successfully matched: column D divided by column B.
E = D/B * 100
Should be > 50%, but likely not 100%.

Table 5: Column (F)
Number of participants achieving the outcome.
Note: F must be less than or equal to D.

Table 5: Column (G)
Percent achieving the outcome.
G = F/D * 100
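Taken together, the column definitions above reduce to two small formulas. The sketch below is illustrative only; the function name and inputs are hypothetical and not part of any NRS tool.

```python
def table5_percentages(b, d, f):
    """Compute Table 5 columns (E) and (G).

    b: Column (B), participants with the outcome as a main or secondary goal
    d: Column (D), participants responding to the survey or used for data matching
    f: Column (F), participants achieving the outcome
    """
    e = d / b * 100  # Column (E): response rate / % available for match
    g = f / d * 100  # Column (G): percent achieving the outcome
    return round(e, 1), round(g, 1)  # rounded for reporting

# Illustrative numbers: 1,000 exiters with the goal, 600 respondents, 240 achievers
print(table5_percentages(1000, 600, 240))  # (60.0, 40.0)
```

Note that (G) divides by the respondents in column (D), not by everyone with the goal in column (B); that distinction is exactly where states go wrong on a later slide.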

Table 5: Example
Entered Employment row [the row's numeric values were garbled in the source and are omitted].

Real Data: PY 2002–03 Examples
State 1 row [most cell values were garbled in the source; the surviving values show 356 participants achieving the outcome, a 37% rate]. The slides posed one question per highlighted cell:
- What type of follow-up was used?
- Why is this number less than column B?
- What data should be reported here?
- Is this percent computed correctly?

Exercise Questions
For States 2–6:
- What method of follow-up did each state use?
- Which states, if any, filled in the table correctly? Which did not?
- Do you see any errors in the table?
- Do you have confidence in these data? Why or why not?

Exercise Example
Cells that were garbled in the source are marked […]; columns follow the Table 5 layout (B)–(G).
- State 1: B = 1,[…]; C, D, E garbled; F = 356; G = 37%
- State 2: B = 3,325; C = NOT AVAIL.; D = […]; E = 0%; F = 1,062; G = 32%
- State 3: B = 8,644; C = 0; D = 0; E = 0%; F = 4,067; G = 47%
- State 4: B = 31,[…]; C, D garbled; E = […]%; F = 21,779; G = 69%
- State 5: B = 9,633; C = 3,853; D = 1,156; E = 30%; F = 543; G = 13%
- State 6: B, C, D, E garbled; F = 96; G = 59%

Top Errors for Table 5
- Missing data: "0" entered for # surveyed in Column (C), or "0" for # responded or matched in Column (D)
- Unreasonably high response rates: many at 100%
- Unreasonably high data-matching rates: many at 100% (the DOL goal for matching is 90%)
- Unreasonable match between # who had the goal and # who responded: Column (D) = Column (B)
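Each of these errors is mechanically detectable before a report is submitted. A minimal validation sketch (the function and argument names are hypothetical, and the rules simply restate the bullets above):

```python
def check_table5_row(b, c, d, f, sampling=False, data_matching=False):
    """Return warnings for the common Table 5 errors listed above."""
    warnings = []
    if not data_matching and c == 0:
        warnings.append("(C) is 0: number surveyed is missing")
    if d == 0:
        warnings.append("(D) is 0: responses/matches are missing")
    if f > d:
        warnings.append("(F) exceeds (D): more achievers than respondents")
    if d == b and b > 0:
        warnings.append("(D) equals (B): a 100% response/match rate is suspect")
    if not sampling and not data_matching and c != b:
        warnings.append("(C) should equal (B) unless sampling was used")
    return warnings

# A row like State 3's in the exercise: zeros in (C) and (D), yet an outcome count
for w in check_table5_row(b=8644, c=0, d=0, f=4067):
    print(w)
```

A clean row returns an empty list; anything else should be resolved with the local program before the state submits.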

Top Errors for Table 5 (cont’d.)
Incorrectly computed percentages:
- Response rate calculated as D/C instead of D/B (B and C should be the same unless random sampling is used, however)
- Percent achieving the outcome computed as F/B instead of F/D
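A numeric illustration of the two denominator mistakes. The values are made up; a 1,000-person random sample is assumed so that C is smaller than B and the D/C error is visible.

```python
b = 2000  # (B) participants with the goal who exited
c = 1000  # (C) participants in the survey sample
d = 600   # (D) participants who responded
f = 240   # (F) participants achieving the outcome

correct_response_rate = d / b * 100  # E = D/B, i.e. 30%
wrong_response_rate = d / c * 100    # D/C inflates the rate to 60%

correct_pct_achieving = f / d * 100  # G = F/D, i.e. 40%
wrong_pct_achieving = f / b * 100    # F/B understates the rate at 12%

print(correct_response_rate, correct_pct_achieving)
```

The same 600 responses and 240 outcomes produce very different percentages depending on the denominator, which is why the reviewers could spot these errors from the submitted tables alone.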

PY 2002–03 Data Collection Methods

Data Matching Methodology
Produces better-quality data:
- More student coverage
- Better data validity
- Better response rate
- Matches Title I, which is better for OMB common measures
28 states use it for employment measures; we need more states!
- What are the reasons for not data matching?
- How can we promote use of this methodology?

What’s Behind the Data?
It is the state’s responsibility to ensure the accuracy and quality of follow-up measures; this requires monitoring of local program procedures.
Key steps:
- Identify students: goal setting and tracking
- Uniform data collection
- Training
- Reporting

Percent of Students with Goals
Columns: # with Goal / Total # Students / % with Goal. Digits lost in the source are marked […].
Entered Employment, average of ALL states: 6,904 / 52,548 / 10%*
- State 1: […] / […],284 / 1%
- State 2: 2,624 / 138,184 / 2%
- State 3: 1,609 / 70,893 / 2%
Entered Postsecondary, average of ALL states: 2,919 / 52,548 / 6%
- State 4: […] / […],062 / 0%
- State 5: […] / […],574 / 1%
- State 6: 14,523 / 565,311 / 3%
High School Completion, average of ALL states: 7,207 / 52,548 / 20%**
- State 7: 7,192 / 114,008 / 6%
- State 8: 2,768 / 32,492 / 9%
- State 9: 33,118 / 387,710 / 9%
* 38% Unemployed (Table 6)
** 18% Enrolled in ASE

Quality Control: Surveys
- Identify students: set student goals appropriately; by exit quarter for employment measures
- Data collection: quarterly collection for employment; state survey instrument; sufficient resources (staff, time, funds); sampling method (if applicable)

Quality Control: Surveys (cont’d.)
- Training: staff trained on survey procedures; improving the response rate
- Reporting: quarterly reporting; a database to link outcomes to programs and students

Quality Control: Data Matching
- Have a procedure for collecting and validating Social Security numbers
- Report all participants with the goal prior to eliminating invalid or missing numbers (Table 5, Col. D)
- Ensure the data carry the exit quarter, and match employment records after the correct exit quarter
- Only students with the goal are matched and/or reported; there should not be 100% with the outcome
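The matching controls above can be sketched as a small pipeline. Everything below is hypothetical: the record layout, field names, SSN check, and the match-in-the-quarter-after-exit convention are illustrative assumptions, not the actual NRS match specification.

```python
import re

def valid_ssn(ssn):
    """Rough structural check for a Social Security number (illustrative only)."""
    return bool(ssn) and re.fullmatch(r"\d{3}-?\d{2}-?\d{4}", ssn) is not None

def match_entered_employment(students, wage_records):
    """students: dicts with 'ssn', 'exit_quarter', and 'goal' keys.
    wage_records: set of (ssn, quarter) pairs from the wage database.
    Returns (column_d, column_f) for the Entered Employment row."""
    # Only students who had the goal are matched and reported
    with_goal = [s for s in students if s["goal"] == "entered_employment"]
    # Per the slide: report ALL with the goal, BEFORE dropping
    # invalid or missing SSNs (Table 5, Col. D)
    column_d = len(with_goal)
    matchable = [s for s in with_goal if valid_ssn(s["ssn"])]
    # Match employment records in the quarter after the exit quarter
    # (assumed convention for "after correct exit quarter")
    column_f = sum((s["ssn"], s["exit_quarter"] + 1) in wage_records
                   for s in matchable)
    return column_d, column_f
```

Because column (D) is counted before invalid SSNs are removed, the achievement rate F/D cannot reach 100% whenever any record lacks a matchable number, which is the sanity check the last bullet describes.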

Discussion: Issues and Strategies
- Improving reporting and collecting
- Resources
- Training on monitoring