
Program and Compliance Management Workshop: Data Element Validation—Issues from Federal Monitoring (Virtually)

Outline
– Federal Monitoring: What is the approach to federal monitoring?
– Compliance Issues: What are the consistent compliance issues?
– Areas of Concern: What are the consistent areas of concern?
– Springboard to System Improvements: How can these reviews be used as a springboard to system improvements?

Background
– Oversight agencies like GAO and OIG cited data quality issues with ETA's data (2002)
– TEN (Training & Employment Notice) issued 5/28/03 announcing the Data Validation Initiative
– TEGL (Training & Employment Guidance Letter) 3-03 issued 8/20/03 providing implementation guidance, describing the process, and offering tools
  – "…ETA will monitor the validation effort on a regular schedule."
– Guidance issued annually containing report submission deadlines and source documentation requirements

Federal Monitoring of Data Validation

In General…
– Most regions began monitoring DV after common measures were implemented
– Compliance "playbook":
  – Program Reporting Instructions
  – DV policy guidance and handbooks
  – TEGL
– Overall objective is data quality (data are accurate and reliable)

Review Approach
– Where the review takes place:
  – Traditional on-site review
  – Remote review
  – Combination
– Focus of the review:
  – Program reporting and DV
  – DV as a component of a comprehensive review
– Structure of the review team:
  – Varies based on regional protocols

Review Scope
– Programs:
  – Workforce Investment Act (WIA)
  – Trade (TAA)
  – LX (Wagner-Peyser/VETS)
– Validation cycles:
  – Most recently completed cycle
  – Additional cycle sometimes included
– Components:
  1. Report Validation
  2. Data Element Validation
  3. Process Evaluation

Components of Review: Assessment of Report Validation (RV)
– For WIA and LX only, not TAA
  – RV focuses on aggregate calculations (sketched below)
– Did the state validate its annual data submission (WIA and LX)?
– How were "high" error rates addressed?
– Generally completed up front (e.g., before the site visit)
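
As a rough illustration of what assessing "aggregate calculations" involves, the sketch below recomputes each reported figure from the underlying records and expresses the gap as an error rate. This is a hypothetical Python sketch, not the DRVS software; the function and measure names are assumptions, and the relative-difference formula is one common way to express an RV error rate.

```python
def report_validation(reported, records, measures):
    """Recompute each aggregate from the validation extract and compare
    it with the figure the state reported on its annual submission."""
    results = {}
    for name, calc in measures.items():
        validated = calc(records)
        gap = abs(reported[name] - validated)
        # Express the error relative to the validated value
        # (0.0 when the validated count itself is zero).
        results[name] = gap / validated if validated else 0.0
    return results

# Example with an illustrative measure definition:
measures = {"exiters": lambda recs: sum(1 for r in recs if r.get("exit_date"))}
```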

Components of Review: Assessment of Data Element Validation (DEV)
– DEV = record review = case file review
– Required across core workforce programs; focuses on the data elements used in performance calculations
– Federal staff revalidate a sub-sample of the state's sample (sketched below):
  – Random sample for WIA and TAA
  – All 25 records for LX
– Review data element error rates
– What has the state done with the information?
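
A minimal sketch of the revalidation arithmetic described above, assuming a simple list-of-dicts record layout; the helper names are illustrative, not ETA tooling:

```python
import random

def federal_subsample(state_sample, n=None, seed=0):
    """Draw the federal sub-sample: a random sub-sample of the state's
    DEV sample for WIA/TAA, or every record (n=None) for LX."""
    if n is None or n >= len(state_sample):
        return list(state_sample)
    return random.Random(seed).sample(state_sample, n)

def element_error_rate(records, element, documented_value):
    """Share of records where the reported element disagrees with the
    value supported by source documentation in the case file."""
    errors = sum(1 for r in records if r[element] != documented_value(r))
    return errors / len(records)
```

The resulting per-element rates are what get compared against the thresholds discussed later in the deck.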

Components of Review: Process Evaluation
– Data management, and the resulting quality of reported data, are derived from and influenced by the policies, procedures, and protocols used at the state and/or local levels:
  – Review of the reporting and validation process (including the MIS)
  – Review of policies, procedures, and protocols (including training) and how they are deployed

Review Approach and Scope in the Dallas Region (Region 4)
– Traditional on-site review (one full week) that includes a local office visit
  – Remote review only if a site visit is not possible
– Scope consists of the core workforce programs
  – Two most recent cycles, unless the prior cycle was already reviewed
  – "…the organization, data collection, report preparation and data validation work activities of the state and how these activities are managed on a daily basis."
– Joint ETA/VETS review team
  – Also includes the formula FPO and others

Review Approach and Scope in the Dallas Region (Region 4), continued
– The process evaluation component begins as soon as the review is scheduled and addresses objectives covered in the Core Monitoring Guide and Supplements
  – Review results inform subsequent program reviews
– All eleven states have been reviewed
  – A second round of reviews will commence in FY

Clarifying Accuracy Standards
– Across the core workforce programs, ETA has two published error rate thresholds:
  1. (LX) No errors allowed for LX DEV (0% errors in the minimum 25-record sample)
  2. (WIA) States submitting annual reports with RV errors that exceed 2% will not be eligible for WIA incentives for that PY (TEGL 9-07, dated 10/10/07)
– There is also a provisional error rate threshold of 5% for WIA and TAA data elements
  – Error rates exceeding 5% are cited as an "Area of Concern" only (see the sketch below)
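
The three thresholds on this slide could be expressed as follows. This is a sketch only, with constant names of my own choosing and labels that mirror the slide rather than any official classification scheme:

```python
LX_DEV_THRESHOLD = 0.00   # no errors allowed in the minimum 25-record LX sample
WIA_RV_THRESHOLD = 0.02   # above 2% RV error: no WIA incentives that PY (TEGL 9-07)
DEV_PROVISIONAL = 0.05    # above 5% on WIA/TAA elements: "Area of Concern" only

def classify(program, component, error_rate):
    """Map an observed error rate to the outcome described on the slide."""
    if program == "LX" and component == "DEV":
        return "finding" if error_rate > LX_DEV_THRESHOLD else "ok"
    if program == "WIA" and component == "RV":
        return "ineligible for incentives" if error_rate > WIA_RV_THRESHOLD else "ok"
    if program in ("WIA", "TAA") and component == "DEV":
        return "area of concern" if error_rate > DEV_PROVISIONAL else "ok"
    return "ok"
```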

Consistent "Findings" Across Regions

Non-Compliance with Exit Requirements
– Exit dates not reflective of the dates of last service
– Gaps in service spanning years
– Case management used to extend the exit date
– Hard exits utilized (the correct exit rule is sketched below):
  – Date of last contact = exit date
  – Date of employment = exit date
– Services provided within 90 days of the recorded exit
– Exit dates not consistent with the dates in the MIS
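
The exit rule behind several of these findings can be reduced to a few lines: the exit date is the date of last service, and the exit is triggered only after 90 days with no further service. A hypothetical sketch, assuming services are tracked as a simple list of dates:

```python
from datetime import date, timedelta

EXIT_GAP = timedelta(days=90)

def derive_exit_date(service_dates, as_of):
    """Exit date = date of LAST service, not date of last contact or
    date of employment; no exit while services continue within 90 days.
    Returns None while the individual is still a participant."""
    last_service = max(service_dates)
    return last_service if as_of - last_service >= EXIT_GAP else None

# Example: last service on 1/12, so by 5/1 the derived exit date is 1/12.
print(derive_exit_date([date(2011, 1, 3), date(2011, 1, 12)], date(2011, 5, 1)))
```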

Participation Cycles and Dates of Service
– Although there are clear issues around exit, there are also issues around participation cycles and dates of service in general:
  – Service provision prior to formal participation
  – Staff unclear about which services commence participation
  – Dates of service inconsistent across the file and the MIS, within the MIS, within the file, and within documents

Incorrect Capture/Coding of Required Data Elements
– Several data elements are routinely cited across reviews:
  – Ethnicity and Race
  – Disability Status
  – Veteran Status
  – UI Claimant Status
  – UC Eligible Status
  – School Status at Participation
  – Needy Family Status
  – Dislocation Date
– Example: ethnicity and race combined (in the MIS, on forms, etc.), and no response for ethnicity interpreted as a negative response (MIS default); a coding fix is sketched below
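
The ethnicity example points to a simple fix at the MIS level: treat "no response" as its own state rather than defaulting it to a negative answer, and keep ethnicity separate from the race fields. A hypothetical sketch:

```python
from enum import Enum

class Response(Enum):
    YES = "yes"
    NO = "no"
    NOT_PROVIDED = "not provided"  # distinct from NO; never a default negative

def code_ethnicity(raw_answer):
    """Map a raw intake answer to a reportable value without letting a
    blank silently become 'no' (the MIS-default error cited above)."""
    if raw_answer is None or raw_answer.strip() == "":
        return Response.NOT_PROVIDED
    return Response.YES if raw_answer.strip().lower() in ("y", "yes") else Response.NO
```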

Wage Records Not Available for Validation
Two primary issues:
1. Not accessible due to privacy concerns
2. Data not frozen or archived

Wage Records Not Available: Access to the Data
The WRIS Data-Sharing Agreement specifies:
– Info can be shared with "…auditors who are public employees seeking access to the information in the performance of their official duties."
– "The PACIA shall permit ETA … to make onsite inspections… for … conducting program audits…"

Wage Records Not Available: Data Not Frozen or Archived
– DV is a point-in-time activity, but wage records are dynamic information
  – Federal policy requires that the data be frozen or archived to allow federal reviewers to follow the same audit trail as state validators
  – Federal reviewers cannot utilize a live database (see the snapshot sketch below)
– Because the data have not been kept, there is no audit trail – also a finding (record retention)
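
One straightforward way to meet the freeze/archive expectation is to snapshot the validation extract at validation time, with a checksum to preserve the audit trail. A sketch under assumed paths and naming conventions:

```python
import hashlib
import shutil
from datetime import date

def freeze_dv_extract(extract_path, archive_dir):
    """Copy the DV extract to a dated archive file and record a SHA-256
    digest, so reviewers can later confirm the frozen data are unchanged
    instead of re-querying a live (and shifting) wage-record database."""
    frozen = f"{archive_dir}/dv_extract_{date.today().isoformat()}.csv"
    shutil.copyfile(extract_path, frozen)
    with open(frozen, "rb") as src:
        digest = hashlib.sha256(src.read()).hexdigest()
    with open(frozen + ".sha256", "w") as out:
        out.write(digest + "\n")
    return frozen, digest
```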

Record Retention
Two primary issues:
1. Participant files missing or unable to be located, or documents missing from the files
2. Wage records and validation extract files purged or simply not kept

Record Retention (2): Wage Record Data
– Example: data periodically purged from the state system, along with the DV extract files and documentation
– "Data validation results and documentation should be retained for at least three years after completion. Retention methods are at the state's discretion and may include an archived database, printed worksheets and reports, or other methods." (DRVS Handbook)

Source Documentation Incorrect and/or Lacking
– Not using the most recent guidance (sources have changed)
– Using the MIS as an allowable source (a checkmark or radio button is insufficient by itself)
– Lack of documentation to support service provision (e.g., youth follow-up services)

Source Documentation Incorrect and/or Lacking (2)
– Policy includes incorrect sources, or combines eligibility documentation with DV source documentation, resulting in incorrect DV sources
– Self-attestation forms signed by the case manager, not the participant
– Using an internal form as a "cross-match" with another database

Veterans' Services Issues
– Coding errors:
  – Individuals not counted as veterans but receiving veterans' services (from LVER/DVOP)
  – If an individual is a veteran, other veteran-related data elements are required, but the state MIS either allows no response or defaults to "no"
– Priority of Service:
  – Signage/policy indicating that 180 days of active duty service is needed to be eligible for POS
  – For priority purposes, only one day of active duty service is needed (see the sketch below)
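
The priority-of-service point reduces to a one-line test. This hypothetical sketch contrasts the correct rule with the 180-day version found on signage (other POS criteria, such as discharge status, are omitted for brevity):

```python
def meets_pos_active_duty_test(active_duty_days):
    """Correct: one day of active duty service is enough for POS."""
    return active_duty_days >= 1

def signage_rule(active_duty_days):
    """The posted rule reviewers found; 180 days is NOT required for POS."""
    return active_duty_days >= 180

assert meets_pos_active_duty_test(30) and not signage_rule(30)  # 30 days qualifies
```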

Consistent "Areas of Concern"

DEV Error Rates > 5%
– "High" error rates can only be noted as an Area of Concern
– Applies to WIA, TAA, or both
– In some cases, high error rates continue across validation cycles for the same data elements
– What has the state done to eliminate or minimize errors?

DV Results Not Utilized
– In some instances, results are not shared, or are only noted during the state's exit conference
– Conducting the required validation for compliance purposes only is a missed opportunity (at best)
– If the intent is data quality, then using the results of annual validation for continuous improvement just makes sense

Quality of Participant Files
– Lack of consistent file structure
– Documents missing
– Case notes missing, sparse, illegible, or irrelevant
– MIS contains different information
– One file used for multiple periods of participation for the same individual

Need for Policies/Procedures
– Just to report basic metrics, participant data go through many steps: gathering, data entry, case management, the extract process, data validation, etc.
  – Strong operating policies and procedures are essential at each step to maximize accuracy and minimize confusion
  – Active management of the data (at the state and local levels) is necessary
– Note: reviews have also highlighted the need for required policies (e.g., defining "youth needing additional assistance")

DV Review as a Springboard to Needed Improvements

Myth: Data Management Is Purely Technical
– Reality: in the end, data management is really about providing the best possible services
  – Reporting and validation exist to support effective service provision
– Accurate reporting is a requirement, but it is also a tool
  – The data represent actual participants and their experiences with our system!

Myth: Data Management Is Easy
– Reality: data management is HARD!
  – Business rules are complex and multi-layered
  – Data sets are large and hard to visualize
  – Specs are complex and evolving
  – Circumstances change
– …and sometimes the closer you look, the less clear things become, which is another challenge

Enter: DV Reviews
Because these reviews look at the entire system from the bottom up (e.g., how data are collected) and the top down (e.g., state policies), Program Reporting and Data Validation Reviews can identify system weaknesses and opportunities for improvement.

Got Gaps?
Data Validation Reviews can:
– Clarify federal policy and expectations
– Identify policy and procedural gaps
– Facilitate needed MIS changes, edit checks, and adjustments
– Demonstrate the need for training
– Highlight ways to improve "regular" monitoring
– And more!

Although no one can predict the future with certainty, data quality will remain a key factor in decision-making, including funding decisions.

Questions?

Resources
– The most recent source documentation for WIA data validation is part of TEGL 28-11: http://wdr.doleta.gov/directives/attach/TEGL/TEGL_28-11-Atta1.pdf
– Reporting tutorials and user guides are located on ETA's performance website
– TEGL 9-07 (2% RV error rate threshold): http://wdr.doleta.gov/directives/attach/TEGL09-07acc.pdf
– All guidance letters are posted on ETA's directives website