
Advanced Metadata Modeling for Oracle Business Intelligence Enterprise Edition

Alan Lee, Oracle · Raghav Venkat, City of Las Vegas

Presentation Transcript


  1. Advanced Metadata Modeling for Oracle Business Intelligence Enterprise Edition Alan Lee, Oracle Raghav Venkat, City of Las Vegas

  2. The following is intended to outline our general product direction. It is intended for information purposes only, and may not be incorporated into any contract. It is not a commitment to deliver any material, code, or functionality, and should not be relied upon in making purchasing decisions. The development, release, and timing of any features or functionality described for Oracle’s products remains at the sole discretion of Oracle.

  3. Program Agenda • Advanced Metadata Models - OBIEE on OLTP Schemas • Recent Relevant Features for OLTP Models • Case Study: City of Las Vegas • Questions and Answers

  4. Advanced Metadata Models - OBIEE on OLTP Schemas

  5. Real-Time, Low Data Latency BI • The ability to historically analyze, explore, trend, report and act on data up to the second it occurs is a real requirement for many organizations • Historically difficult to cost justify • Prohibitive infrastructure, design, implementation and ongoing maintenance investments • “What is the net value add of reporting real-time versus data a day, hour or minute older?” • The world has changed… • Critical business data today is inherently more real-time • Tools and technology have dramatically improved becoming more cost effective, e.g. in-memory • Real-time data analysis can provide a competitive differentiator • Why not real-time if possible and cost effective?

  6. BI on Transactional Schemas • RPD modeled directly over live, operational OLTP schema. All queries are inherently ‘real-time’ • Fine for tightly constrained and straightforward reporting and analysis • Beware the classic analytical query over operational schema paradox • OLTPs not designed for OLAP, e.g. OLAP queries can negatively impact operations • The more OLAP the user experience (sparsity, interactivity, custom groups, etc…), the more taxing the query

  7. OLTP Model Characteristics to Consider When Modeling for BI • Third-Normal form vs Dimensional models • Challenging to find logical tables from highly-normalized schema • Degenerate dimension, factless-fact • Extensive use of joins, especially outer joins • Potential for circular join paths • Time dimension not typically persisted in the OLTP Schema • Multi-valued lookup tables • Data security design

  8. Oracle BI Server and OLTP Models • Oracle BI has excellent, powerful and differentiating OLTP modeling capabilities • Oracle BI Server is designed to generate optimized SQL against OLTP models while maintaining data correctness • The Oracle BI Common Enterprise Information Model (CEIM) and BI Server have a number of key features that can be leveraged for OLTP models • Rich logical table definition • Alias tables • LOOKUP Function • VPD security integration • Selective physical table caching • Opaque views • Join trimming capability
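  For illustration, a minimal sketch of the LOOKUP function in a logical column expression, assuming the standard DENSE/SPARSE forms; the lookup table and column names below are hypothetical:

    -- Dense lookup: every key value is expected to exist in the lookup table
    LOOKUP(DENSE "Status Lookup"."Status Description", "Service Request"."Status Code")

    -- Sparse lookup: a default expression is supplied for keys with no match
    LOOKUP(SPARSE "Status Lookup"."Status Description", 'Unknown', "Service Request"."Status Code")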

  9. Oracle BI Server Join Trimming • Join trimming is essential to performant OLTP analytics • Oracle BI Server is designed to trim joins when not needed, provided “data correctness” is maintained • RPDs on 3NF / OLTP models require deliberate attention to avoid “over joining” • OUTER JOINing 3NF / OLTP models requires even more deliberate attention

  10. Factors that Determine Join Trimming • “Oracle BI Server Data Correctness Rules” do not trim joins that would change the cardinality of the result set • Trimming joins can dramatically improve performance, but is worthless if incorrect results are generated • Oracle BI Server rules can be manipulated through RPD configuration • Integrators / customers know their data and are the ultimate judge of data correctness • The table to be trimmed must not be referenced anywhere in the query • Tables whose columns are projected in a query or used in WHERE clauses are not trimmed • Importantly, this includes Table.Columns in Logical Table Source WHERE clauses • Join trimming rules are documented here: http://docs.oracle.com/cd/E28280_01/bi.1111/e10540/busmodlayer.htm#BIEMG4267
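  A hypothetical sketch of the effect; the table, column and join names are placeholders, and the actual generated SQL depends on the RPD:

    -- Logical query referencing only columns mapped to the ORDERS base table
    SELECT "Orders"."Order Number", "Orders"."Order Amount" FROM "OLTP Sales";

    -- Physical SQL when the CUSTOMERS join in the LTS is not referenced anywhere
    -- (not projected, not filtered, not used in the LTS WHERE clause):
    SELECT T1.ORDER_NUMBER, T1.ORDER_AMOUNT
    FROM   ORDERS T1;

    -- Once a CUSTOMERS column is projected, the join can no longer be trimmed:
    SELECT T1.ORDER_NUMBER, T1.ORDER_AMOUNT, T2.CUSTOMER_NAME
    FROM   ORDERS T1
    JOIN   CUSTOMERS T2 ON T1.CUSTOMER_ID = T2.CUSTOMER_ID;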

  11. Recent Features

  12. Join Trimming – 1..0 OUTER JOINs New in 11.1.1.7 • Department can be trimmed since it is on the 0..1 side of an outer join and it is on the right side of a LEFT OUTER JOIN (i.e. the null-supplying table) • The BI Server allows the null-supplying table on the 0..1 side of an outer join to be trimmed since, in this case, trimming Department from the query would not change the number of rows selected from the Employee table • E.g. employees with or without departments: the Department source can be trimmed if its columns are not referenced [Diagram: Employee (*) LEFT OUTER JOIN Department (0..1)]
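  A sketch of what this looks like in generated SQL, using the Employee and Department tables from the diagram; the columns and physical join keys are hypothetical:

    -- Logical query referencing only Employee columns
    SELECT "Employee"."Employee Name", "Employee"."Hire Date" FROM "HR";

    -- Department is the null-supplying table (0..1 side of the LEFT OUTER JOIN),
    -- so trimming it cannot change the number of Employee rows returned:
    SELECT T1.EMPLOYEE_NAME, T1.HIRE_DATE
    FROM   EMPLOYEE T1;

    -- If a Department column were referenced, the outer join would be kept:
    -- FROM EMPLOYEE T1 LEFT OUTER JOIN DEPARTMENT T2 ON T1.DEPT_ID = T2.DEPT_ID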

  13. Recursive Application of Join Trimming Rules New in 11.1.1.7 • Prior to 11.1.1.7, the join trimming algorithm was evaluated only for the bottom-most outer join • This limited join trimming when multiple joins were specified in an LTS, even when only columns from the base LTS table were used in the query • In 11.1.1.7 the evaluation is done recursively up through the parent nodes, i.e. join trimming rules are evaluated at each join operation for a given LTS

  14. Recursive Application of Join Trimming Rules New in 11.1.1.7 • In this example, join trimming rules for a given query will be evaluated for all joins defined in the LTS • Applies in cases where inner joins and outer joins are mixed • Significant performance benefits with high degrees of join nesting [Diagram: LTS join tree mixing inner and outer joins across Table A, Table B, Table C and Table D]
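  A sketch under an assumed join tree (Table A outer joined to Table B and inner joined to Table C, which in turn joins Table D); all table and column names are placeholders:

    -- Query referencing only Table A columns
    SELECT "Table A"."Col 1", "Table A"."Col 2" FROM "Model";

    -- Pre-11.1.1.7: only the bottom-most outer join was evaluated, so the
    -- generated SQL could still join Table C and Table D.
    -- In 11.1.1.7 the rules are re-applied at each join node, so (subject to
    -- the data correctness rules) B, C and D can all be trimmed:
    SELECT T1.COL_1, T1.COL_2
    FROM   TABLE_A T1;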

  15. NULL Un-Suppress New Option in 11.1.1.7 • Property in Answers to display rows with null values • A new UNSUPPRESSED Logical SQL predicate is added when the property is selected [Screenshots: default vs. new view property]

  16. NULL Un-Suppress New Option in 11.1.1.7 • BI Server processing is as follows: • Single queries to retrieve members from each dimension • Query to retrieve the suppressed result set • Dimension members cross joined internally • Dimension cross join result set outer joined with the suppressed result set internally • Attempting to un-suppress null values along multiple dimensions where there is an extremely high degree of sparseness in the data values will not perform well • Null values for a single dimension will perform reasonably well

  17. NULL Un-Suppress A Blast From the Past…Preserve Dimensions • Create a fact source with a “dummy” fact that will always return a value, e.g. Select 1 as col1 from dual • Join the dimension sources to the fact source with a complex join, e.g. 1=1 • Propagate an associated measure to the business model and presentation layers • Use this in the filter criteria when NULL un-suppression is desired, e.g. “dummy measure”=1 (sketched below)
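  A minimal sketch of the pieces described above; the opaque view text and the join condition come from the slide, while the logical names ("Dummy Fact"."Dummy Measure") are hypothetical:

    -- Opaque view used as the dummy fact physical source (always returns a row)
    SELECT 1 AS col1 FROM dual

    -- Complex (cross) join condition between each dimension source and the dummy fact:
    --   1 = 1

    -- Analysis filter applied when NULL un-suppression is desired:
    --   "Dummy Fact"."Dummy Measure" = 1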

  18. Preserve Dimension Screenshot [Screenshot: cross join configuration]

  19. Preserve Dimension Screenshot (cont.) • Control untenable cross joins with levels • Dimension table LTSes also serve as fast Prompt and Filter sources

  20. Preserve Dimension Pseudo SQL • WITH0: Cross Joined Dimension Only Query • WITH1: Inner Joined OLTP Query • SAWWITH0 FULL OUTER JOIN SAWWITH1
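  Expanded into a hypothetical physical query shape; table and column names are placeholders and the actual generated SQL varies by model:

    WITH
    SAWWITH0 AS (
      -- 1: cross-joined, dimension-only query preserving all member combinations
      SELECT t.FISCAL_YEAR, d.DEPARTMENT_NAME
      FROM   TIME_DIM t, DEPARTMENT d
    ),
    SAWWITH1 AS (
      -- 2: inner-joined OLTP query returning only combinations that have data
      SELECT t.FISCAL_YEAR, d.DEPARTMENT_NAME, SUM(f.AMOUNT) AS AMOUNT
      FROM   FACT_TXN f
      JOIN   TIME_DIM t   ON f.TIME_ID = t.TIME_ID
      JOIN   DEPARTMENT d ON f.DEPT_ID = d.DEPT_ID
      GROUP BY t.FISCAL_YEAR, d.DEPARTMENT_NAME
    )
    -- 3: full outer join keeps dimension combinations with no matching fact rows
    SELECT s0.FISCAL_YEAR, s0.DEPARTMENT_NAME, s1.AMOUNT
    FROM   SAWWITH0 s0
    FULL OUTER JOIN SAWWITH1 s1
      ON  s0.FISCAL_YEAR = s1.FISCAL_YEAR
      AND s0.DEPARTMENT_NAME = s1.DEPARTMENT_NAME;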

  21. Comparing Approaches Null Un-Suppress and Preserve Dimensions • NULL Un-Suppress requires no metadata modeling work, though it has performance implications • End-user training and cautions should be a consideration • Preserve Dimensions requires RPD changes with some anticipation of user behavior • Performance implications can be somewhat managed in the RPD • End-user training is a consideration

  22. Testing the RPD Design Without Generating a Query to the Database • Requirement to see the physical SQL generated by a logical query without sending a query to the database • Enabled by setting request variable: SKIP_PHYSICAL_QUERY_EXEC • Example: SET VARIABLE SKIP_PHYSICAL_QUERY_EXEC=1, LOGLEVEL=2, DISABLE_CACHE_HIT=1, DISABLE_CACHE_SEED=1: SELECT ...
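  The same example laid out as a complete logical request; the subject area and column names are placeholders:

    -- Request variables suppress physical execution and cache interaction so
    -- only the generated physical SQL appears in the query log:
    SET VARIABLE SKIP_PHYSICAL_QUERY_EXEC=1, LOGLEVEL=2,
                 DISABLE_CACHE_HIT=1, DISABLE_CACHE_SEED=1:
    SELECT "Employee"."Employee Name", "Department"."Department Name"
    FROM   "HR";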

  23. Testing the RPD Design Physical Query within NQCMD • NQCMD supports an option called –ShowQueryLog • Hidden by default • Enabled by setting the Windows environment variable SA_NQCMD_ADVANCED=Yes • Combining the –ShowQueryLog and –H options will print the query log details to the screen

  24. City of Las Vegas Case Study

  25. Case Study: City of Las Vegas Oracle Business Intelligence Raghav Venkat, City of Las Vegas

  26. City Manager – Operations • 14 Departments Under CMO • 2,000+ Employees • CLV Total Budget: $1.5B • CLV GF Budget: $485M • Founded in 1905 • City Population 600,000 • LV Valley Population: 2,000,000 • CLV Land Area: 117 Square Miles

  27. CLV’s IT Footprint • The city uses multiple applications to manage its functions • These are unique applications that are custom built or highly customized to suit the city’s business • They range from older native applications to best-in-class, up-to-date technology applications

  28. Reporting Requirements • City wide performance management initiative • A system for collecting data and measuring the achievement of goals and objectives • Monitor performance of strategies and processes interactively • Transactional Reporting to monitor sensitive functions • Analytical Reporting • Single Source of truth • A way for departments to explain the results of their work

  29. Data Sources & Fetch Strategies • OLTP schemas – Mostly 3NF, non-relational etc. • Ease of integrating data from disparate sources • Direct Connections to the OLTP Data Source • Replication of the OLTP Data Sources • Few Transformed Versions • Low Latency Data Loads

  30. CEIM – Physical Layer • Optimizing Properties of Connection Pools • Opaque Views • FK-PK Based Simple Joins • Use of Complex Joins • Aliases • Cross Database Joins

  31. CEIM – Logical Modeling • Mapping the business users’ view of data to the available source snowflakes • Model Logical Dimensions • Design Logical Facts • Ways to create Calendar Dimensions • Building Dimensional Hierarchies • Use of Factless Facts

  32. CEIM – Other Features Used in Logical Layer • Logical Table Sources • Joins in Logical Table Sources • Using WHERE clauses in LTSs • Specifying proper Content Levels • Taking Advantage of Content Fragmentation Options

  33. BI Development Process

  34. Conclusion • A solution for transactional and historical reporting • An interactive dashboard for Executives • Advanced ad-hoc capabilities for data-hungry Analysts • Quantitative Benefits • Qualitative Benefits • We just migrated to 11g and are slowly leveraging new features – Mobile and OSSM

  35. Questions & Answers
