
The ETA 227 Report: What It Is, How It’s Used, How It’s Validated


Presentation Transcript


  1. The ETA 227 Report: What It Is, How It’s Used, How It’s Validated • Scott Gibbons • Burman Skrable

  2. What and Why of the 227 Report

  3. Why Burden You with the ETA 227? • Federal oversight of “proper and efficient” administration includes program integrity • Performance Measures • Analysis • Workload and Planning • Economic Indicators • Government-wide promotion of integrity

  4. How Is the ETA 227 Used? • Monitoring of BPC by ETA staff • Construction of Key Performance Measures • Government Performance and Results Act • UI Performs Core measure • Integrity measuring by OMB, Department of Labor CFO and OIG • Benefit-Cost and similar Analyses • Legislative and Program Development • NDNH, Tax-offset programs

  5. ETA 227 Content • Detection & Establishment of Overpayments • Who or what caused the OPs • How they were detected • Recovery and other Reconciliation Activities • Criminal & Civil Recovery Actions • Age of Overpayment Receivables

  6. Section A, Overpayments Established by Cause

  7. Section B, Overpayments Established by Mode of Detection

  8. Section C, Recovery/Reconciliation

  9. Section D, Criminal/Civil Actions

  10. Section E, Aging of OP Accounts

  11. 227 and the Reporting Process

  12. Generating the ETA 227 • All data on the ETA 227 should be traceable to data in the state’s financial accounting system • Entries must be made for all items • Amended reports: send any changes to the ETA 227 electronically • States have 3 years to amend a report

  13. Reasons for Inaccurate Reporting • Poor coordination between program offices and reporting teams • Lack of accountability in the reports process • The three I’s and Handbook 401: • Incorrect guidance • Incomplete guidance • Imprecise guidance

  14. Weighing Accuracy vs. Timeliness • Accuracy is ALWAYS more important than timeliness • If issues arise, contact us and we’ll work with you to resolve them

  15. ETA 227 Quarterly Due Dates • Quarter ending March 31 is due May 1 • Quarter ending June 30 is due August 1 • Quarter ending September 30 is due November 1 • Quarter ending December 31 is due February 1
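The pattern in the due dates above (each report is due the first day of the second month after the quarter ends) can be sketched as a small helper. This is an illustrative reconstruction of the schedule, not an official tool; the function name is hypothetical.

```python
from datetime import date

def eta227_due_date(quarter_end: date) -> date:
    """Due date is the first day of the second month after the
    quarter's end month (e.g., March 31 -> May 1)."""
    month = quarter_end.month + 2
    year = quarter_end.year
    if month > 12:
        month -= 12
        year += 1
    return date(year, month, 1)

# The four quarterly deadlines listed on the slide:
for q_end in [date(2024, 3, 31), date(2024, 6, 30),
              date(2024, 9, 30), date(2024, 12, 31)]:
    print(q_end, "->", eta227_due_date(q_end))
```

Note that the December 31 quarter rolls over into February of the following year.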

  16. Reporting Systems and Data Integrity • Rules on data that can be submitted • Basic QA/QC sits on the front end of the reporting process • Comes in two forms: Warnings and Errors • Transmittal depends on content

  17. ETA 227 Reporting: Examples of Reporting Edits

  18. Troubleshooting ETA 227 Transmittal • It is critical that the report be reviewed by BPC staff before transmittal • The report will not transmit with empty cells • Please make sure that you have up-to-date changes in the edits for the report

  19. Troubleshooting ETA 227 Transmittal • Please make sure you are using the current version of the ETA 227 (see 401 Handbook, 4th Edition, April 2007): http://wdr.doleta.gov/directives/attach/ETAH/ETHand401_4th_s04.pdf (see IV-3-1) • Data edits are found in Appendix C of Handbook 402: http://wdr.doleta.gov/directives/attach/ETAH/ETHand402_5th_Att12.pdf (see C-13) • Make sure the report is actually transmitted • Make certain that errors are addressed • Warnings are for your information

  20. Data Validation and the 227

  21. How We Validate UI Reported Counts • We independently reconstruct the counts we want to validate • This involves building a record for each transaction that goes into those counts • The result is an “audit trail” • We compare the reported counts with reconstructed counts • Reported counts are valid if within a tolerance
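The comparison step described above (reported counts are valid if they fall within a tolerance of the independently reconstructed counts) can be sketched as follows. The 2% tolerance is an assumption for illustration; actual DV tolerances are set per report and population.

```python
def counts_valid(reported: int, reconstructed: int,
                 tolerance: float = 0.02) -> bool:
    """A reported count passes if it is within the tolerance
    (an assumed 2% here) of the independently rebuilt count."""
    if reconstructed == 0:
        return reported == 0
    return abs(reported - reconstructed) / reconstructed <= tolerance

print(counts_valid(1010, 1000))  # 1% off  -> True
print(counts_valid(1100, 1000))  # 10% off -> False
```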

  22. Validation Process • Decide what must be validated • Design appropriate record for each type of transaction or status (Population) • Build file • Test file • Compare counts from tested file with reported counts

  23. Validation Concepts: Population • “Population” is the name DV gives to a group of the same type of transaction or status • Example: Overpayments Established • Validation activity is organized by Population • It’s more focused and more efficient • Often the same type of transaction or status appears in more than one report • Many UI reports combine multiple types of transactions/statuses • E.g., the 227 has overpayments established, overpayments reconciled, overpayment balances at end of quarter, criminal or civil actions

  24. Validation Concepts: Population • Benefits DV defines 15 distinct populations • These are used to validate 13 UI reports • DV uses three populations to validate the 227 report • Tax DV defines 5 populations • These are all used to validate parts of the ETA 581 report

  25. Example: Validating the Clue Fest • In the Great National Clue-Fest, hundreds of teams from throughout the U.S. play Clue to determine: • Who murdered John Boddy? • Col. Mustard, Prof. Plum, Mr. Green, Mrs. Peacock, Miss Scarlet, or Mrs. White? • What were the circumstances of his death? • What was the weapon? • In what room was he killed?

  26. Validating the Clue Fest (continued) • Monthly reporting by all teams • Counts of 288 (6 × 6 × 8) possible outcomes • Teams are told to retain for possible validation • Game sheets containing result, date of game, team name • You must validate

  27. Validating the Clue Fest (continued) • You design the Clue Data Record • Team name/ID • Date of game • Murderer • Weapon • Room • You select teams to code individual results into your record • You spot-check some records against game sheets • You enter results into Clue-DV software • DV software compares validation counts against reported counts
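The Clue Data Record and the count comparison in the slide above can be sketched in a few lines. The record fields come from the slide; the sample data and class name are invented for illustration.

```python
from dataclasses import dataclass
from collections import Counter

@dataclass(frozen=True)
class ClueRecord:
    team_id: str
    game_date: str
    murderer: str   # one of 6 suspects
    weapon: str     # one of 6 weapons
    room: str       # one of 8 rooms

# A few hypothetical records coded from game sheets:
records = [
    ClueRecord("team-01", "2010-03-05", "Col. Mustard", "Rope", "Library"),
    ClueRecord("team-02", "2010-03-06", "Miss Scarlet", "Knife", "Study"),
    ClueRecord("team-03", "2010-03-06", "Col. Mustard", "Rope", "Library"),
]

# Tally validation counts over the 6 x 6 x 8 = 288 possible outcomes;
# DV software would then compare these against the reported counts.
validation_counts = Counter((r.murderer, r.weapon, r.room) for r in records)
print(validation_counts[("Col. Mustard", "Rope", "Library")])  # -> 2
```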

  28. Validating the 227 • What should be validated? • Section A (Numbers and Dollars of Overpayments established, by Cause) • Population 12, Overpayments Established • At some point may include B. Mode of Detection • Section C (Recovery/Reconciliation) • Population 13, Overpayment Reconciliation Activities • Section E (Aging of Overpayment Accounts) • Population 14, Age of Overpayments

  29. Validating the 227 • Designing the extract file • The file is to have a record for every countable transaction or status for the report cells validated • To classify each record into one of the report cells, that data record must have a field for each classifying dimension • It’s like the game of Clue

  30. Validating the 227 • Population 12 Extract File Record • Individual ID (SSN, each OP’s unique ID) • Program Type • Type of OP (Fraud, Non-fraud) • OP Cause (Multi-claimant schemes, Claimant, Employer, SESA, Reversals, Other) • Detection Type • Date Established • UI Amount • Federal Amount
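The Population 12 extract record above maps naturally onto a typed record, and classifying each record into a Section A report cell is just a lookup on its classifying fields. The field names mirror the slide's layout, but the types and example values are illustrative assumptions, not the official extract specification.

```python
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class Pop12Record:
    op_id: str               # SSN plus each OP's unique ID
    program_type: str        # "UI", "UCFE", or "UCX"
    op_type: str             # "Fraud" or "Non-fraud"
    op_cause: str            # e.g. "Claimant", "Employer", "Reversals"
    detection_type: str
    date_established: date
    ui_amount: float
    federal_amount: float

# A hypothetical record, classified into its Section A cell
# by type of overpayment and cause:
rec = Pop12Record("123-45-6789-001", "UI", "Fraud", "Claimant",
                  "Crossmatch", date(2009, 4, 15), 850.0, 0.0)
cell = (rec.op_type, rec.op_cause)
print(cell)  # -> ('Fraud', 'Claimant')
```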

  31. Validating the 227 • Population 13 Extract File Record • Individual OP ID (SSN, each OP’s unique ID) • Program Type (UI or UCFE or UCX) • OP Type (Fraud, Non-fraud) • Type of Reconciliation Activity (Cash, Benefit Offset, etc.) • Date of Reconciliation Activity • UI Amount • Federal Amount

  32. Validating the 227 • Population 14 Extract File Record • Individual OP ID (SSN, each OP’s unique ID) • Program Type (UI or UCFE or UCX) • Date Established • Outstanding Overpayment • Active Collection (Yes or No or Dropped) • Type of Overpayment (Fraud or Non-fraud) • UI Balance at End of Quarter • Federal Balance at End of Quarter

  33. Validating the 227 • Build the extract file • Update Module 3 • Follow File Layouts • Population link off Benefits software • DV User’s Guide • Use Module 3 definitions and Rules as guide to database screens and database locations for correct elements

  34. Validating the 227 • Test the Extract File • Ensure it captures all relevant transactions • Examine records rejected as errors • Fix incorrectly-built but countable transactions • Eliminate uncountable transactions from file • Investigate samples • Does record use elements that are consistent with correct reporting? • If random sample fails, must rebuild file or correct database

  35. Validating the 227 • The extract file is the standard for judging reported counts when • All random samples pass • Validators and programmers are confident that it includes all transactions • When these conditions are met, go to RV comparison • Results cannot be transmitted unless random samples are completed

  36. Validating the 227 • Compare Validation with Reported Counts • Every reported count is validated • Pass and Fail are only judged at group level • All Groups must pass for Population to pass
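The group-level judgment described above can be sketched as: compare each group's reported and validated counts, then require every group to pass before the population passes. The 2% tolerance is an assumed figure for illustration.

```python
def group_passes(reported: int, validated: int,
                 tolerance: float = 0.02) -> bool:
    """A group's reported count passes if it falls within the
    tolerance (an assumed 2% here) of the validated count."""
    if validated == 0:
        return reported == 0
    return abs(reported - validated) / validated <= tolerance

def population_passes(groups: dict) -> bool:
    """Every count is checked, but pass/fail is judged per group;
    all groups must pass for the population to pass."""
    return all(group_passes(rep, val) for rep, val in groups.values())

# Hypothetical (reported, validated) counts by group:
groups = {"fraud": (105, 104), "nonfraud": (300, 280)}
print(population_passes(groups))  # nonfraud is >2% off -> False
```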

  37. Validating the 227 • Status of Validating Overpayment Populations

  38. Validating the 227 • Percentage Distribution of Submissions: Passes and Fails

  39. 227 and Performance Measures • How Accurate Is the Detection of Overpayments Ratio? • FY 2009 Reported Rate: 53.1% • Mean Error in Failed Validations: +10% • % of $ Failed or Not Validated: 77% • Estimated true D/O ratio: 49.0%
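The figures on the slide are consistent with a simple adjustment: shrink the reported rate by the mean overstatement applied to the share of dollars that failed or were not validated. This reconstruction of the arithmetic is an assumption on our part, but it reproduces the slide's 49.0% estimate.

```python
reported_rate = 0.531   # FY 2009 reported Detection of Overpayments ratio
mean_error = 0.10       # mean error in failed validations (+10%)
share_failed = 0.77     # share of $ failed or not validated

# Assumed reconstruction: discount the reported rate by the mean
# overstatement applied to the unvalidated share of dollars.
estimated_true = reported_rate * (1 - mean_error * share_failed)
print(round(estimated_true, 3))  # -> 0.49
```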

  40. 227 Data Analysis • Cost-benefit calculations of BPC activity

  41. 227 Reporting Issues • Coordination between BPC staff and reporting staff • Where do errors predominate? • Penalty Reporting • State practice and definitional issues

  42. 227 Reporting Issues • Fields where reporting errors predominate • Population 12 (125 errors in 480 cases) • Overpayment Cause: 29% • Date Established: 26% • Detection Type: 19% • $ Amount: 16%

  43. 227 Reporting Issues • Fields where reporting errors predominate • Population 13 (79 errors in 360 cases) • Type of Reconciliation Activity: 52% • $ Amount: 18% • Type of Overpayment: 17% • Date of Reconciliation Activity: 11%

  44. 227 Reporting Issues • Fields where reporting errors predominate • Population 14 (50 errors out of 160 cases) • UI $ Amount: 34% • Date Established: 32% • Active Collection Status: 18% • Outstanding Overpayment: 12%

  45. 227 Reporting Issues • Penalty Reporting • In cases of fraud, some states assess future weeks claimed as a penalty • The claimant must claim and serve the weeks as otherwise eligible but is not paid • Reporting: • Overpayments Established: when served, report on line 109 of Section A, Penalty • Reconciliation: report in Columns 13 or 14, Nonfraud, but in no specific section • Eight states reported Penalty in 2007, 2008, and 2009

  46. 227 Reporting Issues • Conditional overpayments • State makes payments while issue is pending • Adjudication hearing • Claimant fails to appear • OP established—Reported as $ Established • OP reversed on redetermination • State reports as $ Subtraction even if establishment, reversal are within same quarter • Result: very high Detection of Overpayments Ratio

  47. 227 Reporting Issues • OPs Established and then Subtracted in the same quarter may be inflating the D/O ratio • $ Subtracted / $ Overpaid in states with • D/O Ratio over 100%: 20.3% • D/O Ratio under 100%: 5.2%

  48. For Further Information • Scott Gibbons • (202) 693-3008 • Gibbons.Scott@dol.gov • Burman Skrable • (202) 693-3197 • Skrable.Burman@dol.gov • DV Web page: http://ows.doleta.gov/dv/

  49. End of Presentation
