TPC Benchmarks


Presentation Transcript


  1. TPC Benchmarks Sandeep Gonsalves CSE 8330 – Project 1 SMU May 1, 2004

  2. Overview • Benchmarks • Benchmark Wars • Establishment of the TPC • TPC Benchmarks: A, B, C, D, E, H, R, S, W • Conclusion

  3. Benchmarks • Standard for comparison of various systems performing similar operations • Set of programs that simulate a typical workload on a given system • Used to measure system performance • Metrics such as speed, performance, and price are recorded • Used to determine the optimal system
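To make the idea concrete, here is a minimal, generic sketch of what a benchmark harness does: run a fixed, well-defined workload repeatedly and report a throughput metric. The workload function is an arbitrary stand-in, not part of any TPC specification.

```python
# Generic benchmark harness sketch (illustrative only, not TPC code):
# run a fixed workload repeatedly and report executions per second.
import time

def workload():
    # Stand-in for the "typical workload" a real benchmark would define,
    # e.g. a database transaction or a query.
    sum(i * i for i in range(10_000))

def run_benchmark(iterations: int = 1_000) -> float:
    start = time.perf_counter()
    for _ in range(iterations):
        workload()
    elapsed = time.perf_counter() - start
    return iterations / elapsed  # throughput: workload executions per second

if __name__ == "__main__":
    print(f"Throughput: {run_benchmark():.1f} executions/second")
```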

  4. Domain-specific benchmarks • “No single metric can measure the performance of computer systems on all applications” [1] • Performance of a system varies tremendously from one application domain to another [1]

  5. Domain-specific benchmarks • Key criteria for a domain-specific benchmark to be useful: • Relevant • Portable • Scalable • Simple [1]

  6. Early Benchmarks • TP1 benchmark • Wisconsin benchmark

  7. Benchmark Wars • Benchmark wars happen when one vendor publishes superior results for an important benchmark evaluation and other vendors or individuals try to strike back by improving their own numbers [2] • Consequences: • Abnormal results • Waste of resources • Confusion!

  8. Initial Efforts • Working Group on Performance Measurement Standards (WG-PMS) [2] • Anon et al., 1985 • Three standard performance tests for OLTP systems • Most popular – DebitCredit [2]

  9. Establishment of the TPC • Motivation: • Need for the competition to get civilized • Put an end to manipulations • Stop the confusion in the industry • Form an industry forum to enforce OLTP performance measurement standards [2]

  10. TPC • TPC is the acronym for the Transaction Processing Performance Council • Established on 10th August, 1988 • Founder: Omri Serlin • Co-founder: Tom Sawyer • Initial 8 members: Control Data Corp, Digital Equipment Corp, ICL, Pyramid Technology, Stratus Computer, Sybase, Tandem Computers and Wang Laboratories [2]

  11. TPC • First standard TPC-A published in November, 1989 • TPC-A – Council’s version of the DebitCredit benchmark test [2] • Published the second standard TPC-B in August 1990 • TPC-B – Council’s version of the TP1 benchmark test [2]

  12. TPC • Major changes brought about by the TPC: [2] • Submission of a Full Disclosure Report (FDR) • Results must be audited by a TPC certified individual

  13. TPC Members - 2004

  14. TPC Auditors - 2004

  15. TPC Benchmark™ A • Issued in November 1989 • Obsolete as of 6th June, 1995 • Measures the performance of update-intensive database environments [5] • e.g., OLTP applications with characteristics like: • “Multiple on-line terminal sessions • Significant disk input/output • Moderate system and application execution time • Transaction integrity” [5]

  16. TPC Benchmark™ A • Measures the transactions per second (tps) of a system as a measurement of performance [5] • Measures the performance of systems in a wide area or local area network configuration [5] • Metrics are: “TPC-A local Throughput” and “TPC-A wide Throughput”, measured in tps [6] • System configurations, details of the benchmark run, and total cost are a mandatory part of the FDR [6]

  17. TPC-A Specifications [5] • Specification consists of 11 clauses • TPC-A is stated in terms of a hypothetical bank that has one or more branches, and each branch has multiple tellers • Each customer of the bank has an account • Transactions denote customer operations, such as withdrawals and deposits, performed by a teller at a branch

  18. TPC-A Specifications • Fig 4: ER Diagram of the components of the TPC Benchmark™ A database [5]

  19. TPC-A Specifications [5] • Any commercially available database management system (DBMS), database server, file system, etc. can be used to implement the database • The system under test must support the ACID properties of transaction processing systems • Horizontal partitioning of files/tables is permitted • Vertical partitioning of files/tables is not permitted
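To make the specification concrete, the sketch below implements a DebitCredit/TPC-A-style transaction in Python with sqlite3. The table and column names are a simplified approximation of the branch/teller/account/history structure described above, not the official schema, and the scaling, field-size and timing rules of the specification [5] are omitted.

```python
# Simplified sketch of a TPC-A / DebitCredit-style database and transaction
# using sqlite3. Table and column names are approximations, not the official
# TPC-A schema; scaling rules, field sizes and response-time rules are omitted.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE branch  (branch_id INTEGER PRIMARY KEY, balance INTEGER);
    CREATE TABLE teller  (teller_id INTEGER PRIMARY KEY, branch_id INTEGER, balance INTEGER);
    CREATE TABLE account (account_id INTEGER PRIMARY KEY, branch_id INTEGER, balance INTEGER);
    CREATE TABLE history (account_id INTEGER, teller_id INTEGER, branch_id INTEGER,
                          amount INTEGER, ts TEXT);
    INSERT INTO branch  VALUES (1, 0);
    INSERT INTO teller  VALUES (1, 1, 0);
    INSERT INTO account VALUES (1, 1, 1000);
""")

def debit_credit(account_id, teller_id, branch_id, amount):
    """Apply one deposit/withdrawal as a single atomic transaction (ACID)."""
    with conn:  # commits on success, rolls back on any error
        conn.execute("UPDATE account SET balance = balance + ? WHERE account_id = ?",
                     (amount, account_id))
        conn.execute("UPDATE teller SET balance = balance + ? WHERE teller_id = ?",
                     (amount, teller_id))
        conn.execute("UPDATE branch SET balance = balance + ? WHERE branch_id = ?",
                     (amount, branch_id))
        conn.execute("INSERT INTO history VALUES (?, ?, ?, ?, datetime('now'))",
                     (account_id, teller_id, branch_id, amount))

debit_credit(account_id=1, teller_id=1, branch_id=1, amount=-100)  # a withdrawal
print(conn.execute("SELECT balance FROM account WHERE account_id = 1").fetchone())  # (900,)
```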

  20. TPC-A Results [5] • Throughput of the system in units of transactions per second is: • “tpsA-Local” - local area networks • “tpsA-Wide” - wide area networks • Cost is given as price/tpsA • “What is tested and/or emulated is priced and what is priced is tested and/or emulated” • 5-year maintenance pricing must be included
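As a quick illustration of the price/performance metric, with entirely made-up numbers (a real result is derived from the fully priced, audited configuration, including the 5-year maintenance pricing):

```python
# Illustration of the TPC-A price/performance calculation with invented
# numbers; a reported result requires the complete audited priced configuration.
total_priced_configuration = 1_500_000.0  # hypothetical system price over 5 years, in dollars
measured_throughput = 300.0               # hypothetical tpsA-Local result

price_per_tps = total_priced_configuration / measured_throughput
print(f"Price/performance: ${price_per_tps:,.0f} per tpsA")  # -> $5,000 per tpsA
```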

  21. TPC-A FDR Requirements [5] • Identification of sponsor of the benchmark • Participating companies • Program listings • List of settings for parameters and options that are tunable by a customer and have been altered from the defaults in actual products • Auditor’s name, address, phone number along with a copy of the auditor’s attestation letter • Etc…

  22. TPC-A FDR Requirements [5] • In short: all that a customer would need to replicate the results! • The FDR document must be available to the public at a reasonable cost • Official language is English

  23. TPC Benchmark™ B • Approved in August 1990 • Obsolete as of 6th June, 1995 • Quite different from TPC-A • Database Stress Test • Not OLTP oriented • It focuses on database management systems (DBMS) applications and on the back-end database server [8]

  24. TPC-B Specifications [8] • Specified in terms of a hypothetical bank • Measures the total number of simultaneous transactions that a system can carry out • No users, communication lines or terminals used • Equivalent to electronic data processing (EDP) batch processing applications • Metrics – Throughput: “tpsB” and the associated price-per-tps

  25. TPC Benchmark™ C • Approved on July 23, 1992 • Currently in use, with its latest version being 5.2 • It is an OLTP benchmark • It is centered on the transactions of a wholesale supplier managing orders, i.e., an order-entry environment [14] • TPC-C metrics are the new-order transaction rate (tpmC) and price/performance ($/tpmC) [14]

  26. TPC Benchmark™ C • It measures the entire business operation and is a more extensive and complex yardstick for measuring the performance of an OLTP system [14] • Current version is the 20th revision since 1992 [12]
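To give a feel for the order-entry environment, here is a drastically simplified sketch of a new-order-style transaction in Python with sqlite3: it records an order and its line items and adjusts stock in one atomic unit. The schema and logic are purely illustrative; the actual TPC-C specification defines a richer schema, a mix of transaction types, scaling rules and response-time constraints [14].

```python
# Drastically simplified sketch of an order-entry ("new-order"-style)
# transaction using sqlite3. Illustrative only; not the TPC-C schema or logic.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE item       (item_id INTEGER PRIMARY KEY, price_cents INTEGER);
    CREATE TABLE stock      (item_id INTEGER PRIMARY KEY, quantity INTEGER);
    CREATE TABLE orders     (order_id INTEGER PRIMARY KEY AUTOINCREMENT,
                             customer_id INTEGER, total_cents INTEGER);
    CREATE TABLE order_line (order_id INTEGER, item_id INTEGER,
                             quantity INTEGER, amount_cents INTEGER);
    INSERT INTO item  VALUES (1, 999), (2, 450);
    INSERT INTO stock VALUES (1, 100), (2, 100);
""")

def new_order(customer_id, lines):
    """lines: list of (item_id, quantity). Runs as one atomic transaction."""
    with conn:
        cur = conn.execute("INSERT INTO orders (customer_id, total_cents) VALUES (?, 0)",
                           (customer_id,))
        order_id = cur.lastrowid
        total = 0
        for item_id, qty in lines:
            (price,) = conn.execute("SELECT price_cents FROM item WHERE item_id = ?",
                                    (item_id,)).fetchone()
            conn.execute("UPDATE stock SET quantity = quantity - ? WHERE item_id = ?",
                         (qty, item_id))
            amount = price * qty
            total += amount
            conn.execute("INSERT INTO order_line VALUES (?, ?, ?, ?)",
                         (order_id, item_id, qty, amount))
        conn.execute("UPDATE orders SET total_cents = ? WHERE order_id = ?",
                     (total, order_id))
        return order_id

oid = new_order(customer_id=42, lines=[(1, 3), (2, 2)])
print(conn.execute("SELECT total_cents FROM orders WHERE order_id = ?", (oid,)).fetchone())
# -> (3897,) i.e. $38.97 with the made-up prices above
```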

  27. TPC Benchmark™ D • Approved on April 5, 1995 • Obsolete as of April 6, 1999 • Decision support benchmark • Metrics are: • TPC-D Composite Query-per-Hour Metric (QphD@Size) • TPC-D Price/Performance ($/QphD@Size) • Systems Availability Date [18]

  28. TPC Benchmark™ D • “It illustrates decision support systems that: • Examine large volumes of data; • Execute queries with a high degree of complexity; • Give answers to critical, frequently-asked business questions” [18] • Its successors are TPC-H and TPC-R

  29. TPC Benchmark™ H • Approved on February 26, 1999 • Current version is 2.1.0 which was specified on August 14, 2003 • Decision support benchmark • It executes a set of queries against a standard database under controlled conditions to evaluate the performance of various decision support systems [20]

  30. TPC Benchmark™ H • Metrics are: • TPC-H Composite Query-per-Hour Metric (QphH@Size) which is the performance metric • TPC-H Price/Performance ($/QphH) which is the price-performance metric • The Systems Availability Date [20]
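The composite figure reflects two runs: a single-query-stream "power" test and a multi-query-stream "throughput" test. To the best of my recollection the specification combines the two as a geometric mean; the sketch below uses invented numbers, and the TPC-H specification [20] is the authoritative definition.

```python
# Hedged sketch of how the TPC-H composite metric combines the power and
# throughput tests (geometric mean, as I recall it; see [20] for the
# authoritative definition). All numbers are invented.
import math

power_at_size = 12_000.0        # hypothetical Power@Size result
throughput_at_size = 9_000.0    # hypothetical Throughput@Size result

qphh_at_size = math.sqrt(power_at_size * throughput_at_size)
print(f"QphH@Size (illustrative): {qphh_at_size:,.0f}")   # about 10,392
```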

  31. TPC Benchmark™ R • Approved on February 26, 1999 • Current version is 2.1.0 which was specified on August 14, 2003 • Decision support benchmark • Similar to TPC-H

  32. TPC Benchmark™ R • The TPC-R metrics are: • TPC-R Composite Query-per-Hour Metric (QphR@Size) which is the performance metric • TPC-R Price/Performance ($/QphR) which is the price-performance metric • System Availability date [23]

  33. TPC Benchmark™ W • TPC’s latest benchmark • First version 1.0 was approved on December 9, 1999 • Transactional web e-Commerce benchmark that models a retail store on the Internet, such as an online bookstore [24] • TPC-W enables the testing of environments where a host of servers perform different functions [25]

  34. TPC Benchmark™ W • TPC-W metrics are: • Number of web interactions processed per second (WIPS), which is the performance metric • The associated price per WIPS ($/WIPS) • The availability date of the priced configuration [26] • It can verify the performance measurements of a variety of e-commerce servers in a real-world Internet environment [25]
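With invented numbers, the two primary metrics would be computed along these lines (a reportable result requires the specified workload mix, a full measurement interval, and an audited priced configuration [26]):

```python
# Hedged illustration of the TPC-W metrics with invented numbers.
web_interactions = 3_600_000      # hypothetical interactions completed in the interval
measurement_interval_s = 3_600    # hypothetical measurement interval, in seconds
priced_configuration = 250_000.0  # hypothetical total price of the tested configuration, dollars

wips = web_interactions / measurement_interval_s
print(f"WIPS: {wips:.0f}")                                                # -> 1000
print(f"Price/performance: ${priced_configuration / wips:.2f} per WIPS")  # -> $250.00
```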

  35. Aborted benchmark efforts • TPC-S • Server version of TPC-C • Did not receive sufficient Council support • Potential for abnormally high performance ratings [22]

  36. Aborted benchmark efforts • TPC-E • “Enterprise” benchmark • No complex benchmark available that could stress large enterprise systems • Could cause confusion • Would cause vendors to spend additional resources • Was germane to only a very small number of vendors competing in that arena [22]

  37. What's up with the TPC • As of March 2004: • Working on a new TPC-E specification • Working on a new decision support benchmark: TPC-DS • Carrying out revisions of TPC-H, TPC-R, and TPC-W

  38. Conclusion - TPC • Fair competition in the industry • End to benchmark wars • Widely used by all vendors • Continuous evolution of its benchmarks

  39. Conclusion - TPC Log

  40. References • [1] Jim Gray: Database and Transaction Processing Performance Handbook. The Benchmark Handbook, 1993. Online edition. http://www.benchmarkresources.com/handbook/index.html • [2] Omri Serlin: The History of DebitCredit and the TPC. The Benchmark Handbook, 1993. Online edition. http://www.benchmarkresources.com/handbook/index.html • [3] Anon et al., “A Measure of Transaction Processing Power”, Datamation, Vol. 31, No. 7, April 1985, pp. 112-118. • [4] Gray, J. N., Reuter, A., Transaction Processing: Concepts and Techniques, Morgan Kaufmann, San Mateo, CA, 1993, pp. 11-12, 168.

  41. References • [5] TPC BENCHMARK™ A Standard Specification, Revision 2.0, 7 June 1994. http://www.tpc.org/tpca/spec/tpca_current.pdf • [6] TPC-A. http://www.tpc.org/tpca/default.asp • [7] Hanson, Robert J., TPC Benchmark B - What It Means and How to Use It, AT&T Global Information Solutions. http://www.tpc.org/tpcb/default.asp • [8] TPC BENCHMARK™ B Standard Specification, Revision 2.0, 7 June 1994. http://www.tpc.org/tpcb/spec/tpcb_current.pdf

  42. References • [9] Bramer, Brian. System Benchmarks, De Montfort University, UK. http://www.cse.dmu.ac.uk/~bb/Teaching/ComputerSystems/SystemBenchmarks/BenchMarks.html#introduction • [10] Levine, Charles. SIGMOD '97 Industrial Session 5, 5/29/97. http://www.tpc.org/information/sessions/sigmod/indexc.htm • [11] Özsu, M. T., Valduriez, P., Principles of Distributed Database Systems, Second Edition, Prentice Hall, Englewood Cliffs, NJ, 1999. • [12] TPC BENCHMARK™ C Standard Specification, Revision 5.2, December 2003. http://www.tpc.org/tpcc/spec/tpcc_current.pdf

  43. References • [13] Patterson, D. A., Hennessy, J. L., Computer Architecture: A Quantitative Approach, Morgan Kaufmann, San Mateo, CA, 1990, Chapter 1. • [14] Raab, François, et al. Overview of the TPC Benchmark C: The Order-Entry Benchmark. http://www.tpc.org/tpcc/detail.asp • [15] TPC-C. http://www.tpc.org/tpcc/default.asp • [16] TPC-D. http://www.tpc.org/tpcd/default.asp

  44. References • [17] Stephens, Jack. TPC-D: The Industry Standard Decision Support Benchmark. http://www.tpc.org/information/sessions/sigmod/indexc.htm • [18] TPC BENCHMARK™ D (Decision Support) Standard Specification, Revision 2.1. http://www.tpc.org/tpcd/spec/tpcd_current.pdf • [19] TPC-D. http://www.tpc.org/tpcd/default.asp • [20] TPC BENCHMARK™ H (Decision Support) Standard Specification, Revision 2.1.0. http://www.tpc.org/tpch/spec/tpch2.1.0.pdf

  45. References • [21] Levine, Charles. TPC Benchmarks, Microsoft. http://research.microsoft.com/~gray/WICS_96_TP/ • [22] Draft White Paper, August 2, 1999. Object Management Group. www.omg.org/docs/bench/99-08-02.ps • [23] TPC-R. http://www.tpc.org/tpcr/default.asp • [24] TPC Benchmark™ W (Web Commerce) Specification, Version 1.8, Feb 19, 2002. http://www.tpc.org/tpcw/spec/tpcw_V1.8.pdf

  46. References • [25] Smith, Wayne D. TPC-W*: Benchmarking An Ecommerce Solution, Revision 1.2, Intel Corporation. http://www.tpc.org/tpcw/TPC-W_wh.pdf
