Practicum on DATA VALIDATION




  1. Practicum on DATA VALIDATION

  2. Overview
  • Background & Basics
  • Federal Requirements
  • Issues/Findings from Federal Reviews
  • Exercise: DEV with WIA NEG Record

  3. USDOL’s Data Validation (DV) Initiative
  • To support the President’s Management Agenda and respond to data quality issues cited by oversight agencies
  • DV Directives
    • Eight guidance letters/notices to date
    • Initial Guidance – TEN 14-02 (5/28/03)
    • TEGL 3-03 and related changes
    • Latest Guidance – TEN 9-06 (8/15/06)
  • All performance-related guidance at www.doleta.gov/performance

  4. How Does Validation Work?
  • Two separate processes are required to ensure performance data are reliable:
    • Report Validation
    • Data Element Validation
  • Report Validation (RV) ensures performance calculations are accurate
  • Data Element Validation (DEV) ensures the data used in the calculations are accurate

  5. Understanding the Distinction
  • SCENARIO
  • A State reports an Adult Entered Employment Rate of 78%, based on a numerator of 975 and a denominator of 1250
  • The EER calculation is based on the percentage of adults not employed at participation who were employed in the 1st quarter after exit
  • Other operational parameters apply; for example, Transitioning Service Members* are automatically considered not employed at participation and included in the calculations (the arithmetic is sketched below)
  *TSM – within 12 months of separation or 24 months of retirement
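To make the scenario concrete, here is a minimal Python sketch of how the EER figure could be recomputed from exiter records. The field names are invented for illustration; the actual federal reporting specifications define more parameters than the slide mentions.

```python
# Minimal sketch of the Adult Entered Employment Rate calculation
# described on the slide. Field names are illustrative, not the
# actual federal record layout.

def entered_employment_rate(exiters):
    """EER = (eligible exiters employed in Q1 after exit) / (eligible exiters).

    An exiter counts in the denominator if not employed at participation;
    TSMs are automatically treated as not employed at participation.
    """
    numerator = denominator = 0
    for rec in exiters:
        if rec["is_tsm"] or not rec["employed_at_participation"]:
            denominator += 1
            if rec["employed_q1_after_exit"]:
                numerator += 1
    rate = numerator / denominator if denominator else 0.0
    return numerator, denominator, rate

# The slide's figures: a numerator of 975 over a denominator of 1250
# yields 975 / 1250 = 0.78, i.e., the reported 78%.
```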

  6. Data Quality from Perspective of RV
  • For instance:
    • How do we know the 78% is correct?
    • Does the denominator consist of the “right” exiters (e.g., those not employed at participation)?
    • Are all TSMs included in calculations as required?
  • In other words, are the calculations correct? Did the State follow federal reporting specifications correctly?

  7. Data Quality from Perspective of DEV
  • For instance:
    • How do we know those individuals identified as TSMs were actually within 12 months of separation or 24 months of retirement from the service? (A sketch of this date check follows below.)
    • For those “employed in the 1st quarter after exit,” what if the exit date was actually in a prior quarter?
  • In other words, were the data used to generate the calculations correct to begin with?
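As a hedged illustration of the first question, the sketch below checks whether a recorded TSM flag is consistent with the underlying dates. The field names and the month arithmetic are assumptions for illustration; in practice validators compare source documents, not database fields.

```python
from datetime import date

# Illustrative check of a Transitioning Service Member flag: the
# participation date should fall within 12 months of separation or
# within 24 months of retirement. Field names are hypothetical.

def months_between(earlier, later):
    return (later.year - earlier.year) * 12 + (later.month - earlier.month)

def tsm_flag_supported(participation, separation=None, retirement=None):
    if separation and 0 <= months_between(separation, participation) <= 12:
        return True
    if retirement and 0 <= months_between(retirement, participation) <= 24:
        return True
    return False

# Example: separated 11/2002, participated 09/2003 -> within 12 months.
print(tsm_flag_supported(date(2003, 9, 16), separation=date(2002, 11, 30)))  # True
```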

  8. The Bottom Line
  • Are the calculations reported by the State accurate based on federal reporting specifications? (RV)
  • Are the data used in the calculations accurate? (DEV)
  • It’s all about Data Quality!

  9. Federal Requirements
  • Report Validation
    • Programs that submit “year-end” aggregate reports must validate their reports prior to submission
    • WIA, Wagner-Peyser, VETS (not Trade)
    • RV is largely a technical function, performed at state level by IT staff
    • NOT the focus of this session
  • Data Element Validation
    • Pertains to ALL programs (but is minimal in the case of LX)
    • Involves checking data in participant records against allowable source documentation to verify compliance with federal definitions
    • Elements “pass” or “fail” validation

  10. More on DEV
  • ETA provides Data Reporting and Validation Software (DRVS), which generates a sample of participant records to be “inspected” by State staff
  • Except in the case of labor exchange programs (LX, or Wagner-Peyser/VETS), DEV is very labor-intensive because it involves state staff conducting reviews of a sample of participant records from across the state
    • Random sample for WIA and Trade (a sampling sketch follows below)
    • Typical sample for WIA might be ~1200 records
    • Typical sample for Trade might be ~150 records
    • 25 records for LX
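The guidance does not prescribe a particular implementation, but a simple random draw like the Python sketch below captures the idea behind the sample sizes above. The actual DRVS sampling design is specified in federal guidance; the record contents and the seed here are stand-ins.

```python
import random

# Sketch of a DRVS-style simple random sample of participant records
# for DEV review. Only illustrates the idea of drawing ~1200 WIA
# records; the real sampling design lives in federal guidance.

def draw_validation_sample(records, sample_size, seed=None):
    rng = random.Random(seed)              # fixed seed => reproducible draw
    return rng.sample(records, min(sample_size, len(records)))

all_wia_records = [{"record_id": i} for i in range(50_000)]   # stand-in data
wia_sample = draw_validation_sample(all_wia_records, 1200, seed=2006)
print(len(wia_sample))  # 1200
```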

  11. More on DEV (cont’d)
  • For each participant record in the sample, a “DEV Worksheet” is generated that contains the elements selected for validation that apply to the specific participant
  • State Validators use the appropriate federal guidance (e.g., the validation handbook) to note allowable source documentation and check the accuracy of each element
  • Documentation must either MATCH the element or SUPPORT the element (see the sketch below)
  • Most source documentation is located at the One-Stop level (wage record information is stored at the State level)
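As referenced above, the pass/fail decision for each element reduces to whether documentation matches or supports the reported value. A minimal sketch, with invented names and the "supports" determination reduced to a boolean the validator supplies:

```python
# Pass/fail logic for one worksheet element: documentation must either
# MATCH the reported value exactly or otherwise SUPPORT it (a judgment
# the validator records). Names are illustrative.

def validate_element(reported_value, documented_value, validator_says_supported=False):
    if documented_value == reported_value:
        return "PASS"   # documentation MATCHES the element
    if validator_says_supported:
        return "PASS"   # documentation SUPPORTS the element
    return "FAIL"

print(validate_element("09/16/2003", "09/16/2003"))   # PASS (exact match)
print(validate_element("09/16/2003", "09/20/2003"))   # FAIL
```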

  12. Summary
  • States are required to report accurate data, and USDOL has oversight responsibility
  • USDOL requires RV and DEV (as applicable) and provides tools to assist, including software
  • Many states also use the software for reporting, although this isn’t required
  • User Guides and Handbooks for each program include allowable source documentation for critical data elements
  • Guidance states USDOL will monitor state DV efforts
    • This has begun!

  13. Issues/Findings from Federal Reviews
  • What are some of the key macro-level issues affecting states’ ability to report accurate and consistent data?
  • What are some of the key micro-level issues affecting data quality, as per federal reviews?

  14. Macro-Level Issues Related to DV
  • Issues affecting State ability to collect and report accurate and consistent data
    • Flexibility in federal guidance (what, but not how)
    • Major changes to state management information systems (MIS)
    • Limited monitoring (state and federal)
  • Issues affecting State experience/compliance with DV
    • Identifying roles of different units/staff (TAA in particular)
    • Communication of expectations and requirements to local areas
    • Lack of a comprehensive data management strategy (e.g., including monitoring of sub-grantees)

  15. Data Element Validation Issues
  • Most Common DEV Issues
    • State failure to request or ensure complete case files
    • State staff not validating wage-related information as required
    • Changes to wage record data not documented
    • Incorrect, outdated, or misapplied definitions of data elements (e.g., employment status at registration was used prior to PY05; incorrect capture of race and ethnicity)
    • Lack of MIS manuals or data collection guides to assist sub-grantees

  16. Data Element Validation Issues (cont’d)
  • Lack of compliance with federal requirements pertaining to unique identifiers (particularly for those co-enrolled in TAA and WIA)
  • Quality of case notes varies dramatically
  • Incorrect and inconsistent dates within files (dates of participation, training, training completion, exit, date of birth)
  • Although local (sub-grantee) staff have limited control over some areas, there is much that can be done locally to improve the structure and content of case files

  17. Exercise
  • Experiencing DEV: WIA NEG Case File
  • [Our thanks to the State of Tennessee]

  18. “Setting Up” The Exercise
  • We are conducting PY05 DEV, using PY05 validation policies and instructions
  • What You Have:
    • Copy of WIA NEG case file with pages numbered (1-61)
    • DEV Worksheet
    • Source Documentation Instructions
      • For PY05 validation, instructions were part of TEN 9-06, dated 8/15/06
  • PLEASE MAKE NO MARKS ON THE CASE FILE OR THE DOCUMENT CONTAINING SOURCE DOCUMENTATION; THESE MUST BE RETURNED AS IS
  • Only write on the DEV Worksheet

  19. “Setting Up” the Exercise (cont’d)
  • About This File
    • eCase Management and Activity Tracking System, or eCMATS, is Tennessee’s MIS
    • Participant is female, a single mother, under 30
    • National Emergency Grant (NEG) received as a result of the permanent closure of a facility in 2002
    • Concurrent enrollment noted (TAA, W-P, Voc Ed., Rapid Response)

  20. DEV Exercise
  • Elements to be Validated (exactly as they appear on the worksheet):
    • DislocationDate
    • ProgramParticipationDate
    • ProgramExitDate
    • NEGProject1
    • FirstCoreServiceDate
    • FirstIntensiveService
    • DateEnterTraining
    • DateExitTraining
    • TrainingService1
    • ExitEmployed1
    • ExitEmployedMatch1
  • Note: Sometimes you need to “decipher” what an element means (e.g., “ExitEmployed1” actually means employment in the first quarter after exit)

  21. Data Element: Dislocation Date
  • Called “Date of Actual Qualifying Dislocation” in the source documentation
  • The “value” is 12/06/2002
  • Allowable source documentation:
    • Verification from employer; rapid response list; notice of layoff; public announcement with follow-up cross-match with UI; self-attestation
  • Does it Pass or Fail, and based on what?

  22. Data Element: Program Participation Date
  • Called “Date of Program Participation” in the source documentation
  • The “value” is 09/16/2003
  • Allowable source documentation:
    • State MIS information
  • Does it Pass or Fail?

  23. Data Element: Program Exit Date
  • Called “Date of Exit” in the source documentation
  • The “value” is 09/30/2004
  • Allowable source documentation:
    • WIA status/exit forms, state MIS data, case notes
  • Does it Pass or Fail, and based on what?

  24. Data Element: NEG Project No.
  • Called “National Emergency Grant Project Numbers” in the source documentation
  • The “value” is 0160
  • Allowable source documentation:
    • Case notes or other file data specifying the particular layoff or emergency that precipitated enrollment. The project number for the grant(s) should be included.
  • Does it Pass or Fail?

  25. Data Element: 1st Core Service Date
  • Called “Date of First Staff Assisted Core Service” in the source documentation
  • The “value” is 09/16/2003
  • Allowable source documentation:
    • State MIS data
  • Does it Pass or Fail?

  26. Data Element: 1st Intensive Service Date
  • The “value” is 09/16/2003 (same as the 1st core service)
  • Allowable source documentation:
    • State MIS data, case notes
  • Does it Pass or Fail, and based on what?

  27. Data Element: Date Entered Training
  • The “value” is 01/06/2004
  • Allowable source documentation:
    • Cross-match between dates of service and vendor training information, vendor training documentation, state MIS, case notes
  • Does it Pass or Fail, and based on what?

  28. Data Element: Date Exited Training
  • Called “Date Completed or Withdrew from Training” in the source documentation
  • The “value” is 03/29/2004
  • Allowable source documentation:
    • Cross-match between dates of service and vendor training information, vendor training documentation, state MIS, case notes
  • Does it Pass or Fail, and based on what?

  29. Data Element: Type of Training Service
  • The “value” is 6; WIA reporting instructions contain the following codes (also restated as a lookup below):
    • 1 = OJT
    • 2 = skill upgrading and retraining
    • 3 = entrepreneurial training
    • 4 = ABE or ESL in combination with training
    • 5 = customized training
    • 6 = other occupational skills training
  • Allowable source documentation:
    • State MIS data, case notes
  • Does it Pass or Fail, and based on what?
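One way a validator or an MIS edit check might encode the table above is as a simple lookup; this sketch just restates the slide's codes.

```python
# The training-service codes from the WIA reporting instructions, as
# listed on the slide. A reported value of 6 should correspond to
# "other occupational skills training" in the case file.

TRAINING_SERVICE_CODES = {
    1: "OJT",
    2: "skill upgrading and retraining",
    3: "entrepreneurial training",
    4: "ABE or ESL in combination with training",
    5: "customized training",
    6: "other occupational skills training",
}

print(TRAINING_SERVICE_CODES[6])   # other occupational skills training
```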

  30. Data Element: Employed 1st Qtr After Exit
  • The worksheet refers to this element as “ExitEmployed1”; the source documentation refers to it as “Employed in 1st Quarter after Exit Quarter”
  • The “value” is 1, which means YES
  • Allowable source documentation:
    • UI wage records, WRIS, supplemental data sources defined by TEGL 17-05, State MIS
  • Does it Pass or Fail, and based on what? (The quarter logic is sketched below.)
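The date logic behind this element is worth spelling out, since a wrong exit date shifts which quarter gets checked, which is exactly the concern raised on slide 7. A minimal sketch, assuming calendar quarters and a set of (year, quarter) pairs with reported UI wages:

```python
from datetime import date

# Determine the first calendar quarter after the exit quarter, then
# check whether UI wage records show earnings in that quarter.
# Wage data are represented as (year, quarter) tuples for illustration.

def quarter_of(d):
    return d.year, (d.month - 1) // 3 + 1     # 09/30/2004 -> (2004, 3)

def first_quarter_after(exit_date):
    year, q = quarter_of(exit_date)
    return (year + 1, 1) if q == 4 else (year, q + 1)

def employed_q1_after_exit(exit_date, wage_quarters):
    return first_quarter_after(exit_date) in wage_quarters

# Exit 09/30/2004 -> the quarter to check is Q4 2004.
print(employed_q1_after_exit(date(2004, 9, 30), {(2004, 4)}))  # True
```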

  31. Data Element: Type of Employment Match
  • The worksheet refers to this as “ExitEmployedMatch1,” but the source documentation refers to it as “Type of Employment Match 1st Quarter After Exit Quarter”
  • The “value” is 1, which means “UI wage records and WRIS”
  • Allowable source documentation:
    • Note: Follow-up services, surveys, record sharing and/or automated record matching with other employment and administrative databases, other out-of-state wage record systems, case notes
  • Does it Pass or Fail, and based on what?

  32. Thank You!
  “In God we trust. All others must use data.” – W. E. Deming
