Collaboration in CDR

Learn how the Central Data Repository (CDR) collaboration solution using XBRL and .NET technologies improves the definition, analysis, and management of financial data collected by FFIEC agencies.


Presentation Transcript


  1. Collaboration in CDR
  Erik Brown, Unisys Corporation
  September 19, 2006

  2. The Central Data Repository (CDR)
  • Requested by FFIEC Agencies in 2002
  • FFIEC: Federal Financial Institutions Examination Council
  • Participating Agencies:
    • Federal Deposit Insurance Corporation (FDIC)
    • Federal Reserve Board (FRB)
    • Office of the Comptroller of the Currency (OCC)
  • Vision: A Web-based solution that improves the definition, analysis, and management of financial data collected by the FFIEC agencies, using XBRL and Microsoft .NET technologies
  • Initial implementation targeted Call Report data
  • Entered production in September 2005

  3. Why Build CDR?
  • Agencies are tasked with regulating Financial Institutions
  • Regulations require reporting of data
  • Reporting requirements are forms which may change
  • Institutions send completed forms to FFIEC agencies
  • CDR goal is to automate this process

  4. Legacy View of Data
  • Each agency had a different view of the data
  • The public did not have good insight into exactly how data was evaluated

  5. CDR System Requirements What is required to receive and process regulatory data?

  6. CDR System Requirements Financial Institutions have data

  7. CDR System Requirements 1. Electronic method for submitting data

  8. CDR System Requirements 2. Mechanism for creating a shared data definition

  9. CDR System Requirements 3. Use shared data definition in submission software

  10. CDR System Requirements 4. Use shared data definition to validate submitted data

  11. CDR System Requirements 5. Use data definition to present data to agency analysts

  12. CDR System Requirements 6. Call it CDR and utilize XBRL for data interactions

  13. Collaboration in CDR

  14. What is Collaboration? “processes wherein people work together — applying both to the work of individuals as well as larger collectives and societies” (from http://en.wikipedia.org)

  15. CDR Collaboration Areas
  • Individuals
    • Metadata Administrators
    • FI Users (Filers)
    • Vendor Users
    • Agency Analysts
  • Collectives (organizations)
    • FDIC, FRB, and OCC
    • Software Vendors
    • Financial Institutions
  • Society
    • The Public

  16. Individual Collaboration
  • Metadata Administrators
  • FI Users (Filers)
  • Vendor Users
  • Agency Analysts

  17. Metadata Administrators
  • Problem: Metadata Management. Staff from three agencies need to collaborate to produce a shared Call Report definition (XBRL Taxonomy) that can be provided to agency staff and vendor users.
  • Solution: Web UI. The Web UI allows all staff to share a common view of the metadata and collaborate to evaluate changes and publish final versions.

  18. Metadata Admins and FI Users
  • Problem: Data Quality and Validation. Agencies need to electronically communicate the reporting requirements to financial institutions. This includes what data is required (concepts), the required relationships for the data (data validity), and the criteria used to evaluate the data (data quality).
  • Solution: XBRL Formula Linkbase. Extend XBRL to allow agencies to specify edits (formulas) that indicate required data relationships as well as evaluation criteria for the resulting values. Vendors incorporate these edits into their software.

  19. XBRL Formula Linkbase
  • A Collection of Business Rules (Edits)
  • Ensure valid data and quality data
  • Each edit evaluates to True or False (pass or fail)
  • Validity Edits – failure results in rejection
    • Example: the sum of values in a column must equal the value reported as the total for this column.
      Wages + Tips + Dividends = Total_Income
  • Quality Edits – failure results in an analyst query
    • Example: total deposits for the current quarter should be greater than or equal to the total deposits in the prior quarter. The formula language permits inclusion of prior-period data for the associated financial institution.
      Total_Deposits[P0] >= Total_Deposits[-P1Q]
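
  A minimal sketch of how such pass/fail edits might be evaluated in .NET code. This is not the CDR implementation or the XBRL Formula Linkbase API; the Edit type, the item names, and the "[-P1Q]" prior-quarter key are illustrative assumptions only.

    // Hypothetical sketch: each edit is a rule that evaluates to pass or fail
    // against the reported values. Validity edits gate acceptance; quality
    // edits trigger an analyst query. Names and types are invented for the demo.
    using System;
    using System.Collections.Generic;

    enum EditType { Validity, Quality }

    record Edit(string Id, EditType Type, Func<IDictionary<string, decimal>, bool> Evaluate);

    static class EditDemo
    {
        static void Main()
        {
            // Reported values; the "[-P1Q]" suffix marks a prior-quarter value.
            var data = new Dictionary<string, decimal>
            {
                ["Wages"] = 900_000m, ["Tips"] = 50_000m, ["Dividends"] = 50_000m,
                ["Total_Income"] = 1_000_000m,
                ["Total_Deposits"] = 2_345_678m, ["Total_Deposits[-P1Q]"] = 1_234_567m
            };

            var edits = new List<Edit>
            {
                // Validity edit: column values must sum to the reported total.
                new("V1", EditType.Validity,
                    d => d["Wages"] + d["Tips"] + d["Dividends"] == d["Total_Income"]),
                // Quality edit: deposits should not fall below the prior quarter.
                new("Q1", EditType.Quality,
                    d => d["Total_Deposits"] >= d["Total_Deposits[-P1Q]"])
            };

            foreach (var edit in edits)
                Console.WriteLine($"{edit.Id} ({edit.Type}): {(edit.Evaluate(data) ? "pass" : "fail")}");
        }
    }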

  20. Metadata Admins and Vendor Users
  • Problem: Taxonomy Validation. Changes to the Call Report form must be represented in the CDR system, communicated to software vendors, and verified for use in vendor software and by legacy systems.
  • Solution 1: Notifications. Users receive email notifications as a result of various events related to taxonomy development.
  • Solution 2: Taxonomy Publication Workflow. Generated taxonomies are made available to vendors and agency users for validation and verification before they are published for official use and distribution.

  21. FI Users (Filers)
  • Problem: Reporting Requirements. FI Users need to understand reporting requirements before filing their data. After filing, FI Users need to determine the status of their submission without agency intervention.
  • Solution 1: Taxonomy Publication Workflow. Vendors incorporate edits into their software, allowing filers to correct reporting errors or validation failures before filing.
  • Solution 2: Notifications. The system sends a notification to filers on the status of their submission, including detailed descriptions of edit failures which incorporate filer data values into the explanation.

  22. Two Types of Filer Notifications
  • Pending Notification
    • CDR received it: don’t call us, we’ll call you
  • Rejection Notification
    • Submission was rejected; you need to resubmit
    • Notification includes a list and description of each edit failure
    • Failed Validity Edits always result in rejection
    • Failed Quality Edits without an explanation are rejected
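
  A minimal sketch of the rejection decision described on this slide, assuming hypothetical EditResult and Classify names; this is not the actual CDR code.

    // Hypothetical sketch: a submission is rejected if any validity edit
    // failed, or if any quality edit failed without an accompanying
    // explanation; otherwise the filer receives a pending notification.
    using System.Collections.Generic;
    using System.Linq;

    record EditResult(string EditId, bool IsValidityEdit, bool Passed, string Explanation);

    static class NotificationRules
    {
        public static string Classify(IEnumerable<EditResult> results) =>
            results.Any(r => !r.Passed &&
                             (r.IsValidityEdit || string.IsNullOrWhiteSpace(r.Explanation)))
                ? "Rejection Notification"
                : "Pending Notification";
    }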

  23. Sample Quality Edit (R0240.2039)
  • Formula:
    If ( (MonthOf(Context.Period.EndDate) <> 3)
         And (cc:RIAD9106[P0] <= cc:RIAD9106[-P1Q])
         And (cc:RCON2170[P0] >= 0),
       cc:RIADA518[P0] >= (cc:RIADA518[-P1Q] - 2000),
       TRUE)
  • Edit Message for Rejection Notification:
    Income statement items are reported on a calendar year-to-date basis. Therefore, the $RIADA518[0] your bank reported this quarter for "Interest expense on Time deposits of less than $100,000 in Nontransaction accounts" (RI 2.a.(2)(c)) should be greater than or equal to the $RIADA518[-1] reported in the previous quarter. Please review your reported data, and explain or revise as appropriate.
  • Example filer values substituted into the delivered message: $1,234,567 and $2,345,678
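
  A rough translation of this edit's logic into C#, purely to make the formula's structure easier to follow; this is not the CDR formula engine, and the parameter names simply echo the MDRM codes appearing in the formula.

    // Illustrative translation of quality edit R0240.2039. [P0] denotes the
    // current-quarter value and [-P1Q] the prior-quarter value.
    using System;

    static class QualityEditR0240_2039
    {
        public static bool Evaluate(
            DateTime periodEndDate,
            decimal riad9106Current, decimal riad9106Prior,
            decimal rcon2170Current,
            decimal riadA518Current, decimal riadA518Prior)
        {
            // The comparison only applies when the reporting period does not
            // end in March and the other applicability conditions hold.
            bool applies = periodEndDate.Month != 3
                           && riad9106Current <= riad9106Prior
                           && rcon2170Current >= 0;

            // Year-to-date RIADA518 should not fall more than 2000 below the
            // prior quarter; if the edit does not apply it evaluates to TRUE.
            return !applies || riadA518Current >= riadA518Prior - 2000m;
        }
    }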

  24. FI Users and Agency Analysts
  • Problem: Data Discrepancies. While institutions can ensure that data is reported correctly, the expected validation criteria cannot always be met. As a result, filers require an electronic mechanism for detecting and explaining validation failures in the provided data.
  • Solution 1: Quality Edits
  • Solution 2: Required Edit Explanations. The taxonomy requires filers to provide textual explanations for quality edit failures. Agency analysts review these explanations when reviewing submitted data.

  25. Agency Analysts
  • Problem: Submission Review. Agencies must verify over 8000 filings in a short timeframe (most in less than two weeks).
  • Solution 1: Analyst Assignment. Agencies can designate which analysts are to review which institutions (up to 500+ per analyst).
  • Solution 2: Automated Edit Processing.
    • Filers are forced to meet all validity edits during submission.
    • Quality edits capture business rules as formulas; CDR applies these rules consistently across all filers.
    • Since the system identifies problem reports, analysts can prioritize their time to resolve (not identify) issues.
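
  A minimal sketch of how automated edit results could drive an analyst's worklist, so time goes to resolving rather than finding problem reports. The Submission type and the ordering rule are assumptions for illustration, not the CDR design.

    // Hypothetical worklist: within an analyst's assigned institutions, show
    // submissions with the most unexplained quality-edit failures first.
    using System.Collections.Generic;
    using System.Linq;

    record Submission(string InstitutionId, string AssignedAnalyst, int UnexplainedQualityFailures);

    static class AnalystWorklist
    {
        public static IEnumerable<Submission> For(string analyst, IEnumerable<Submission> all) =>
            all.Where(s => s.AssignedAnalyst == analyst)
               .OrderByDescending(s => s.UnexplainedQualityFailures);
    }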

  26. Collective and Society Collaboration
  • FDIC, FRB, and OCC
  • Software Vendors
  • Financial Institutions
  • The Public

  27. Legacy View of Data
  • Each agency had a different view of the data
  • The public did not have good insight into exactly how data was evaluated

  28. Collective & Society Collaboration
  • Agencies share a common definition of the data and evaluation criteria
  • The public can view data values and validation criteria

  29. Questions
  Erik Brown, Unisys Corporation
  erik.brown@unisys.com
