
Informatica Application ILM Streamline & Secure Nonproduction Mainframe Environments


Presentation Transcript


  1. Informatica Application ILM: Streamline & Secure Nonproduction Mainframe Environments. September 16, 2010. Scott Hagan, Data Integration Sr. Product Manager; Jay Hill, ILM Director of Product Management and Marketing. Informatica Confidential & Proprietary

  2. Agenda • Business Drivers • Building Better Test Environments • Identifying and Masking Data Automatically • Enabling Seamless Database Connectivity • Products in Action

  3. Market Driver: Data Proliferation. Customers Are Drowning in Their Own Data • Escalating storage, server, and database costs • Diminishing application and data warehouse performance • Inability to retire redundant or obsolete applications • Increasing effort spent on maintenance & compliance • More data in more places = greater risk of data breach

  4. Test Data Management: Developers & QA Struggle With Data. Why is this such a big problem? • Creating data = time consuming, laborious, costly • Gaining access = data protection legislation. More data in more places = more risk • Ensuring integrity = complex, especially if you’re federating across systems • Getting enough = load, stress, and performance testing • Storage space = expensive to maintain lots of full production copies • Getting the right quality = you need maximum code coverage

  5. Informatica Overview: Critical Infrastructure for Data-Driven Enterprises

  6. The Informatica Approach: Comprehensive, Unified, Open and Economical Platform [Platform diagram: use cases such as data warehouse, data migration, test data management & archiving, data consolidation, master data management, data synchronization, B2B data exchange, and cloud computing, spanning database, application, partner, and unstructured data, plus standards such as SWIFT, NACHA, and HIPAA]

  7. Application ILM Products & Use Cases: Improving Operational Efficiency & Compliance • Reduce storage, RDBMS license, and personnel costs • Increase performance • Reduce effort spent on maintenance & compliance • Reduce data privacy risk [Diagram: database size grows over time as inactive data accumulates and performance falls; Informatica Data Archive relocates inactive data out of production, Informatica Data Subset shrinks the development/testing/training copies (Copy 1 through Copy 3), and Informatica Data Masking protects them]

  8. Informatica Application ILM • Application ILM Enables Customers To: • Data Archive – relocate older/inactive data out of production for performance, compliance, and application retirement • Data Subset – create smaller copies of production databases for test and development purposes • Data Masking – protect sensitive information in nonproduction environments • ILM Value Proposition: • Lower storage and server costs • Improve application and query performance • Reduce time and cost of backup & batch processes • Eliminate cost and complexity by retiring legacy applications • Reduce compliance and eDiscovery expense • Prevent data breaches in nonproduction environments

  9. Building Better Test Environments: Informatica Data Subset

  10. Informatica Data Subset: Product Objectives • Objective: smaller nonproduction footprint • Method: retaining only required data • Primary challenge: enabling target application usability • Solution: Informatica Data Subset

  11. Informatica Data Subset: Benefits of Subsetting

  12. Informatica Data Subset: Lean Copies for Nonproduction Use [Diagram: a 5 TB production database is reduced by a time slice or functional slice to a 300 GB subset; each nonproduction copy is then 300 GB instead of 5 TB, producing the time and space savings shown]

  13. Informatica Data Subset: Entity Concept • Entity Definition • Logical unit to subset • Database and application level relationships • Policy scoping criteria

  14. Entities • Data Subset uses metadata-based Entities • Entities typically represent the transactions with which your application specialists interact, such as purchase orders, sales orders, or financial documents • Selection screens are also metadata-driven to allow for easy customization (the entity idea is sketched below)
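
The entity concept can be pictured as a small data structure. The following Python sketch is purely illustrative, not Informatica's metadata model; the table names, join predicates, and filter criteria are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class Entity:
    """Hypothetical stand-in for a Data Subset entity: a driving table,
    its related tables, and the filter criteria that scope the slice."""
    name: str
    driving_table: str
    related_tables: dict = field(default_factory=dict)  # table -> join predicate
    filter_criteria: str = "1=1"                        # time or functional slice

    def subset_sql(self):
        """Emit one SELECT per table; related rows inherit the entity
        filter through their join to the driving table."""
        yield f"SELECT d.* FROM {self.driving_table} d WHERE {self.filter_criteria}"
        for table, join in self.related_tables.items():
            yield (f"SELECT t.* FROM {table} t JOIN {self.driving_table} d "
                   f"ON {join} WHERE {self.filter_criteria}")

po = Entity(
    name="PurchaseOrder",
    driving_table="purchase_orders",
    related_tables={"po_lines": "t.po_id = d.po_id",
                    "po_approvals": "t.po_id = d.po_id"},
    filter_criteria="d.created_date >= DATE '2010-01-01'",  # time slice
)
for stmt in po.subset_sql():
    print(stmt)
```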

  15. Identifying and Masking Data Automatically: Informatica Data Masking

  16. Informatica Data Masking: Product Objective • Objective: protect sensitive information in nonproduction • Method: data masking • Primary challenge: creating meaningful yet de-identified data • Solution: Informatica Data Masking

  17. Informatica Data Masking: Privacy Regulations Driving Masking Initiatives

  18. Informatica Data Masking: Realistic, Masked Data to Prevent Data Breach • Masking techniques (a few are sketched below): Substitute • Key Masking • Credit Card Special Masking • SSN Special Masking • Blur • Nullify • Randomize • Expression [Diagram: production sources (CRM 20 TB, FIN 15 TB, HR 12 TB) feed nonproduction copies (QA 01–03, DEV 01–05) through combinations of subset, clone, and mask, so every nonproduction copy holds protected data]
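
A few of the listed techniques are easy to picture in code. This is a minimal Python sketch of substitute, blur, nullify, and randomize, assuming simple scalar values; it is not the product's implementation:

```python
import random

def substitute(value, lookup, rng=random):
    """Substitute: swap the real value for a realistic one from a lookup set."""
    return rng.choice(lookup)

def blur(value, pct=0.10, rng=random):
    """Blur: shift a numeric value within +/- pct of the original."""
    return round(value * (1 + rng.uniform(-pct, pct)), 2)

def nullify(value):
    """Nullify: remove the value entirely."""
    return None

def randomize_ssn(rng=random):
    """Randomize: emit a well-formed but fictitious SSN."""
    return f"{rng.randint(100, 899):03d}-{rng.randint(1, 99):02d}-{rng.randint(1, 9999):04d}"

print(substitute("Hagan", ["Avery", "Blake", "Casey"]))
print(blur(72000.00))    # a skewed salary, e.g. 69314.55
print(randomize_ssn())   # e.g. 512-07-3841
```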

  19. Informatica Data Masking: Contextually Correct, Referentially Intact Data Masking
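
Referential integrity is typically preserved by making masking deterministic: the same input always maps to the same masked output, so keys still join after masking. A hedged sketch, assuming a hash-based substitution (the salt, lookup list, and key values are illustrative):

```python
import hashlib

LAST_NAMES = ["Avery", "Blake", "Casey", "Devon", "Emery", "Finley"]

def deterministic_substitute(value, lookup, salt="demo-salt"):
    """Key-masking-style substitution: a given input always yields the
    same masked output, so foreign keys stay intact across tables."""
    digest = hashlib.sha256((salt + str(value)).encode()).digest()
    return lookup[int.from_bytes(digest[:4], "big") % len(lookup)]

# "Hill" masks to the same name in the customers table and the orders
# table, so the two still join after masking.
assert deterministic_substitute("Hill", LAST_NAMES) == deterministic_substitute("Hill", LAST_NAMES)
print(deterministic_substitute("Hill", LAST_NAMES))
```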

  20. Informatica ILM: Broad Application and Database Support • Informatica ILM solutions: Data Masking, Data Subset, Data Archive • Application-aware accelerators: Oracle E-Business, SAP, PeopleSoft, Siebel, custom/3rd party • Universal connectivity: Oracle, SQL Server, DB2 UDB, DB2 z/OS, VSAM, Sybase, Teradata, other

  21. Informatica ILM: An Enterprise Solution, Platform & Vendor Independent [Diagram of one enterprise: the Acquired Division runs reservation applications on Oracle 9i/HP-UX (10 years = 600 GB); the Shared Service Center runs a custom billing application holding invoices and contracts on IMS (7 years = 1.4 TB); the Call Center runs Siebel 7.8 service requests on DB2 for z/OS (5 years = 350 GB); Corporate HQ runs logistics and benefits applications on VSAM (600 KSDS files)]

  22. Informatica PowerExchange: Fast and Easy Access to Mainframe Sources! September 16, 2010. Scott Hagan, Data Integration Sr. Product Manager

  23. Informatica PowerExchange: What’s the Problem? You need access to mainframe data, quickly! No time or expertise to code extracts? FTPs? Queries? What about security? Speed? Recoverability? Integration support? Oh yes, and I need it yesterday!

  24. Informatica PowerExchange Informatica PowerExchange helps you to… • Unlock difficult-to-access data – mainframe, legacy, etc. • And make it available when you need it – batch, regular updates, or real time

  25. Data Integration: Traditional Methods [Diagram: a hand-coded program extracts from one or more sources, translates the data (filtering, EBCDIC-to-ASCII conversion), transports it across platforms in intermediate files, then loads it to the target database; the conversion step is sketched below]
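
The translate step is the part that usually gets hand-coded. As a point of reference, Python's standard codecs can do the EBCDIC-to-ASCII conversion directly (cp037 is the US/Canada EBCDIC code page; the sample bytes are illustrative):

```python
# "HELLO" encoded in EBCDIC code page 037 (US/Canada mainframe default)
record = b"\xc8\xc5\xd3\xd3\xd6"

text = record.decode("cp037")       # translate: EBCDIC -> Unicode
ascii_bytes = text.encode("ascii")  # re-encode for an ASCII target load
print(text, ascii_bytes)            # HELLO b'HELLO'
```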

  26. Data Integration: PowerExchange Approach NO PROGRAMMING, NO INTERMEDIATE FILES. Data is extracted using SQL, converted (EBCDIC/ASCII), filtered, and made available to the target database in memory, without any program code or FTP.
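
In the same spirit, a streaming extract-and-load can be sketched with generators so no intermediate file is ever written. This assumes generic DB-API-style cursors and the "?" parameter style (as in sqlite3); the table and column names are made up:

```python
def extract(src_cur, sql, batch=1000):
    """Stream rows from the source via SQL; nothing is staged to disk."""
    src_cur.execute(sql)
    while rows := src_cur.fetchmany(batch):
        yield from rows

def pipeline(src_cur, tgt_cur):
    """Extract, convert, and load entirely in memory, generator to generator."""
    rows = extract(src_cur, "SELECT id, name, balance FROM accounts")
    converted = ((i, name.strip(), bal) for i, name, bal in rows)  # in-flight transform
    tgt_cur.executemany("INSERT INTO accounts_copy VALUES (?, ?, ?)", converted)
```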

  27. PowerExchange – Batch: Highly Scalable Bulk Access to Data • Sources/targets: databases, data warehouses, packaged applications, mainframe and midrange, message-oriented middleware, collaboration, technology standards • Projects: data warehousing, data migration, data consolidation, application implementation, application migration, ILM, test data [Diagram: PowerExchange on both the source and target sides of the Informatica Data Integration Platform, with PowerCenter in the middle]

  28. PowerExchange – Real-time: Immediate Access to Data, Events, and Web Services • Sources: message-oriented middleware, web services, packaged applications • Projects: straight-through processing, real-time analytics, real-time warehousing, application integration • Multiple modes: batch, continuous [Diagram: PowerExchange feeding the Informatica Data Integration Platform with PowerCenter Real Time Edition]

  29. PowerExchange – Change Capture: Creation and Detection of Business Events • Sources: relational, mainframe, and midrange databases • Projects: create business events from database updates, operational data integration (ODI), master data management (MDM), trickle-feed data warehousing, data replication/synchronization • Multiple modes: batch (for initial materialization), net change, continuous capture [Diagram: the PowerExchange CDC Option feeding PowerCenter Real Time Edition through the Informatica Data Integration Platform]
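
Net-change and continuous capture can be illustrated with a simple polling loop over a change log. This sketch assumes a trigger- or log-populated change_log table with a monotonically increasing sequence column; it is a stand-in for, not a description of, PowerExchange's actual capture mechanics:

```python
import time

def net_changes(cur, last_seq):
    """Net-change read: fetch only rows logged after the last sequence seen."""
    cur.execute(
        "SELECT seq, op, pk, payload FROM change_log WHERE seq > ? ORDER BY seq",
        (last_seq,))
    return cur.fetchall()

def continuous_capture(cur, apply_change, start_seq=0, poll_secs=5):
    """Continuous mode: keep applying new changes downstream as they appear."""
    seq = start_seq
    while True:
        for seq, op, pk, payload in net_changes(cur, seq):
            apply_change(op, pk, payload)  # e.g. insert/update/delete on the target
        time.sleep(poll_secs)
```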

  30. PowerExchange Run-Time: Batch Data Movement (Test Data Creation?) [Architecture diagram: a PowerExchange Listener in the source operating environment serves remote data – mainframe and midrange, packaged applications, relational and flat files – as data records; SQL data maps provide non-relational access; PowerCenter tools, user applications, and standards/messaging consume the data for ETL, EAI, and BI targets]
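
The "data map" idea, a description that lets non-relational records be read as if they were rows, can be sketched as follows. The field layout, code page, and numeric encoding here are hypothetical:

```python
import struct

# Hypothetical "data map": field name -> (offset, length, decoder).
# It plays the role of the copybook-to-columns mapping on the slide.
CUSTOMER_MAP = {
    "cust_id": (0, 6,  lambda b: b.decode("cp037").strip()),
    "name":    (6, 20, lambda b: b.decode("cp037").strip()),
    "balance": (26, 4, lambda b: struct.unpack(">i", b)[0] / 100),  # binary COMP, cents
}

def map_record(raw: bytes) -> dict:
    """Project one fixed-width EBCDIC record onto named 'columns',
    making a non-relational record queryable like a relational row."""
    return {name: decode(raw[off:off + length])
            for name, (off, length, decode) in CUSTOMER_MAP.items()}
```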

  31. Test Data Management Concepts • Privacy policies at the logical level – define once, use multiple times (e.g., Substitute Last Names, Skew Salary, Substitute Credit Cards, Nullify SSNs) • Policy assignment at the physical level – reuse a policy across multiple applications (files, ERP, mainframe, DBMS) • Plans define a data subset with entities, filter criteria, and privacy policies (the three layers are sketched below)
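
The three layers, logical policies, physical assignments, and plans, can be illustrated with plain data structures. All names below are invented for illustration:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Policy:
    """Logical-level privacy policy: defined once, reused everywhere."""
    name: str
    rule: str  # e.g. "substitute", "skew", "nullify"

nullify_ssn = Policy("Nullify SSNs", "nullify")

# Physical-level assignments reuse the one logical policy across applications.
assignments = [
    ("ERP",       "EMPLOYEES.SSN",   nullify_ssn),
    ("Mainframe", "PAYROLL-REC.SSN", nullify_ssn),
    ("Files",     "extract.csv:ssn", nullify_ssn),
]

# A plan bundles an entity, its filter criteria, and the policy assignments.
plan = {
    "entity": "Customer",
    "filter": "region = 'EMEA'",
    "assignments": assignments,
}
print(len(plan["assignments"]), "assignments reuse", nullify_ssn.name)
```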

  32. Financial Services Co. and New Regulations • Financial holding company approval – in October 2008, Financial Services Co. was approved by the US Federal Reserve Board to operate as a financial holding company, allowing it to offer additional retail banking services to its customers • New regulations – Financial Services Co. is now subject to supervision by the Federal Reserve and regulation by the FDIC • Redundant work – to comply with the new regulations, many business units within Financial Services Co. are performing redundant work, such as separately complying with data privacy regulations • Self-service solution – Financial Services Co. wanted to pursue a holistic approach and build a self-service data masking solution

  33. Self-Service Data Masking Solution • Corporate IT compliance team • Reviewed regulations and determined what constitutes sensitive or private data • Built a finite list of sensitive fields that must be masked throughout the organization, and the masking rule to be used for each • Action item 1: update the Business Glossary with the corporate IT privacy policies for each sensitive field • Action item 2: build company-wide data masking policies for each sensitive field with the associated masking rule • Online banking application owner • Apply the corporate IT compliance team’s privacy policies to my online banking application

  34. Business Glossary Open the Business Glossary to define the masking policies in business terms

  35. ILM Workbench – Policies I need to create a new policy for credit cards. I’ll enter a clear name and description, then locate the available masking rules and choose the rule I want to assign. Rules can also be defined as reusable mapplets.

  36. ILM Workbench – Policy Assignment I need to apply the corporate privacy policy to my online banking application. I just profiled my source database, so now I can look for data patterns that represent credit cards (a pattern-profiling sketch follows). I’ll start by assigning a policy to the credit card columns, then locate the other sensitive columns and assign the appropriate policies.
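
Pattern-based profiling for credit cards usually combines a digit-pattern match with a Luhn checksum. A minimal sketch (the regex is deliberately loose; a real profiler would also weigh column metadata):

```python
import re

CARD_RE = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def luhn_ok(candidate):
    """Luhn checksum: filters out digit strings that cannot be card numbers."""
    digits = [int(d) for d in re.sub(r"\D", "", candidate)][::-1]
    total = sum(d if i % 2 == 0 else (d * 2 - 9 if d * 2 > 9 else d * 2)
                for i, d in enumerate(digits))
    return total % 10 == 0

def looks_like_card(value):
    match = CARD_RE.search(value)
    return bool(match) and luhn_ok(match.group())

print(looks_like_card("4111 1111 1111 1111"))  # True: the classic test PAN
print(looks_like_card("1234 5678 9012 3456"))  # False: fails the Luhn check
```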

  37. ILM Workbench – Entities I initially want to test my masking policies on a subset of data. Entities are a set of related tables with a filter criteria definition, so I’ll create a subset of data based on the Customer entity.

  38. ILM Workbench – Plans Now that I’ve reviewed my list of entities, I’m ready to create an integrated Data Subset and Data Masking plan. I’ll give the plan a clear name to show it is a subset and masking plan, add all the policy assignments I created earlier, then search for and add the Customer entity so that only a subset of the data is masked.

  39. ILM Workbench – Plans The plan is now complete and being generated. If there were any additional sensitive fields, they would have been highlighted. Before I process the plan, I’ll launch Metadata Manager and look at the data lineage from my source to my target system to validate the end-to-end definition. Now I’m ready to process the plan; I’ll switch to the PowerCenter Workflow Monitor for detailed monitoring information.

  40. ILM Workbench – Masking Validation Once the plan completes, I’d like to validate the results to ensure masking was performed as intended. I can use rules such as these (sketched below): • SSN – all values have changed • SSN – all values came from the dataset • First name – all values have changed • Credit cards – all values have the proper format
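
Validation rules like these are straightforward to express as column-level predicates. A sketch, with toy source/target values standing in for real query results:

```python
import re

def all_changed(source_col, target_col):
    """Rule: every masked value must differ from its source value."""
    return all(s != t for s, t in zip(source_col, target_col))

def all_from_dataset(target_col, lookup):
    """Rule: every masked value must come from the substitution dataset."""
    return all(t in lookup for t in target_col)

def proper_card_format(target_col):
    """Rule: masked card numbers keep the expected 16-digit shape."""
    return all(re.fullmatch(r"\d{4}(-\d{4}){3}", t) for t in target_col)

scorecard = {
    "SSN: all values changed":   all_changed(["123-45-6789"], ["987-65-4321"]),
    "Name: values from dataset": all_from_dataset(["Avery"], {"Avery", "Blake"}),
    "Card: proper format":       proper_card_format(["4111-1111-1111-1111"]),
}
for rule, passed in scorecard.items():
    print("PASS" if passed else "FAIL", rule)
```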

  41. ILM Workbench – Masking Validation After running the validation, I can see a simple scorecard of the rules that passed or failed. Here are a few of the validation rules I created earlier; I set them up with simple operators. I can see that one SSN value didn’t pass the validation rule: the value in the source is the same as in the target.

  42. Data Masking and Data Subset Checklist • Built reusable masking policies • Reduced redundant work • Complied with data privacy regulations • Integrated subset with privacy rules • Validated masking results
