
Applied Software Performance Engineering


Presentation Transcript


  1. Applied Software Performance Engineering Presented By Steve Feldman April 13, 2005

  2. Welcome • Session Objectives: • Introduction to Software Performance Engineering • Understanding the SPE Methodology • Applying SPE to Building Block® Development • Performance Analysis Tools • Innovation • Thinking about Performance Early in the Lifecycle • Results/Outcomes • A new approach to design and development

  3. Introduction: About Your Presenter • What do I do at Blackboard? • Director, Software Performance Engineering and Architecture • Part of Product Development, but I interface with every department in Blackboard. • A few key points… • At Blackboard since the Fall of 2003. • Worked on AP2, AP3 and R7.0 • Manage a team of four developers/engineers.

  4. Introduction to SPE • Q: What is software performance? A: Any characteristic of a software product that can be quantifiably measured. • Q: Why is software performance important? A: The world we live in today is becoming more digitally sophisticated. We expect our digital transactions to be faster than our paper transactions. • Q: How do we manage performance in the SDLC? A: Plan, predict, prove and improve performance throughout the SDLC.

  5. Introduction to SPE • SPE Methodology: • Data Collection & Usage Analysis • Complete End-to-End Performance Engineering • Modeling, Profiling and Simulation • End-to-End Performance Testing • Refactoring and Optimizing • Strategy, Methodology and Best Practices

  6. Introduction to SPE • Performance is at the top of everyone’s mind. • If performance is poor, software adoption will decline and potentially cause usage attrition or adoption of alternative products/processes. • Performance failures cost institutions large amounts of unplanned money. • Performance problems can create a huge trust barrier between the user and the provider of technology. • So then what is SPE?

  7. Introduction to SPE SPE is a methodology that… • Provides a systematic, quantitative approach to constructing software systems that meet performance objectives. • Provides a software-oriented approach to architecture, design and implementation choices. • Prescribes principles and performance patterns for creating responsive software. • Prescribes performance antipatterns for recognizing and correcting common problems, along with the data required for evaluation, procedures for obtaining performance specifications, and guidelines for the types of evaluation to be conducted at each development stage.

  8. Introduction to SPE • SPE is a seven-step methodology focused primarily on solving the software performance model. • The software performance model consists of optimizing the design patterns and implementation of the underlying code for the greatest performance impact. • Identifying and refactoring software anti-patterns for performance gain. • Also included in this methodology is an understanding of how to solve the system performance model. • The system performance model consists of optimizing the configuration and deployment options of a system in order to yield the greatest performance impact without software modifications. • Making the user experience (response time) optimal.
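
A concrete illustration of refactoring a software-level anti-pattern, in the spirit of the point above: the sketch below (plain JDBC; class, table and column names are hypothetical, not Blackboard APIs) replaces one SQL execution per item in a loop with a single set-based query.

    import java.sql.*;
    import java.util.*;

    public class EnrollmentCounts {

        // Anti-pattern: one database round trip per course id (N queries for N ids).
        static int countSlow(Connection con, List<Integer> courseIds) throws SQLException {
            int total = 0;
            for (Integer id : courseIds) {
                try (PreparedStatement ps = con.prepareStatement(
                        "SELECT COUNT(*) FROM enrollment WHERE course_id = ?")) {
                    ps.setInt(1, id);
                    try (ResultSet rs = ps.executeQuery()) {
                        rs.next();
                        total += rs.getInt(1);
                    }
                }
            }
            return total;
        }

        // Refactored: a single query, one round trip regardless of list size.
        // Assumes a non-empty, modest-sized id list.
        static int countFast(Connection con, List<Integer> courseIds) throws SQLException {
            String placeholders = String.join(",", Collections.nCopies(courseIds.size(), "?"));
            try (PreparedStatement ps = con.prepareStatement(
                    "SELECT COUNT(*) FROM enrollment WHERE course_id IN (" + placeholders + ")")) {
                for (int i = 0; i < courseIds.size(); i++) {
                    ps.setInt(i + 1, courseIds.get(i));
                }
                try (ResultSet rs = ps.executeQuery()) {
                    rs.next();
                    return rs.getInt(1);
                }
            }
        }
    }

The behavior is identical; what changes is the number of SQL executions per scenario, which is exactly the kind of software resource the model tracks.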

  9. Understanding the SPE Methodology *
  • Assess Performance Risk: Assess the performance risk at the outset of the project (during requirements); identify, qualify and mitigate.
  • Identify Critical Use Cases: Look for use cases where there is a risk that, if performance goals are not met, the system will fail or be less than successful; critical actions important to responsiveness as seen by the user.
  • Select Key Performance Scenarios: The most frequently executed scenarios, or those that are critical to the perceived performance of the system; each performance scenario corresponds to a workload.
  • Establish Performance Objectives: Specify the quantitative criteria for evaluating the performance characteristics of the system under development.
  • Construct Performance Models: Modeling techniques for representing the software processing steps for the performance model.
  • Determine Software Resource Requirements: Determination of software resource utilization (business logic execution) to appropriately measure the effect of software as it scales in usage; identification of performance anti-patterns targeted for refactoring.
  • Determine System Resource Requirements: Determination of system resource requirements utilized by the software under a given workload; used for sizing and capacity models.
  * SPE Methodology, Dr. Connie Smith and Dr. Lloyd Williams

  10. SPE: Assessing Risk • Distinguish between new development and refactoring. • Work with requirements specialists to understand the problem/domain issues for the development. • Understand potential inter- versus intra-system integration. • Apply common sense and basic business logic from past experiences and similar development projects.

  11. SPE: Identify Critical Use Cases • Most important operations/actions in the feature or system to be developed. • Responsiveness driven. • Risk driven. • You look for use cases where there is a risk that, if performance goals are not met, the system will fail or be less than successful.

  12. SPE: Select Key Performance Scenarios • Unlikely that all critical use cases will be important to performance. • The key performance scenarios are those that are executed frequently, or those that are critical to the perceived performance of the system. • Each performance scenario corresponds to a workload. • Performance scenarios represented through sequence diagrams augmented with some useful extensions.

  13. SPE: Establish Performance Objectives • Identify and define performance objectives. • Specify the quantitative criteria for evaluating the performance characteristics of the system under development. • Response time, throughput, or constraints on resource usage • Identify and define workload objectives. • Specify the level of usage for the scenario. • They are specified as an arrival rate (e.g., number of Web site hits per hour), number of concurrent users or number of parallel transactions.
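
As a worked example of a workload objective (illustrative numbers only, not from the deck), Little's Law relates these quantities: concurrent users = arrival rate × (response time + think time).

    // Hypothetical numbers: 18,000 hits/hour, 2 s response time target, 28 s think time.
    public class WorkloadMath {
        public static void main(String[] args) {
            double arrivalPerSec   = 18_000 / 3600.0;   // 5 requests per second
            double responseTimeSec = 2.0;
            double thinkTimeSec    = 28.0;
            double concurrentUsers = arrivalPerSec * (responseTimeSec + thinkTimeSec);
            System.out.printf("~%.0f concurrent users%n", concurrentUsers);   // ~150
        }
    }

Stating the objective both ways (arrival rate and concurrent users) keeps the load test and the capacity model consistent.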

  14. SPE: Performance Modeling • Use of Execution Graphs to represent software processing steps in the performance model. • The sequence diagram representations of the key performance scenarios are translated into execution graphs. • (Diagram: reverse engineering)
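
A minimal sketch (hypothetical step names and timings) of what solving an execution graph amounts to: each processing step carries a resource demand, and the scenario's total demand is the sum of its steps weighted by how often they repeat.

    import java.util.List;

    public class ExecutionGraphRollup {

        record Step(String name, double cpuMs, int repetitions) {}

        static double totalCpuMs(List<Step> steps) {
            double total = 0;
            for (Step s : steps) {
                total += s.cpuMs() * s.repetitions();   // weight each node by its loop count
            }
            return total;
        }

        public static void main(String[] args) {
            List<Step> viewGrades = List.of(
                    new Step("authenticate",    5.0,  1),
                    new Step("loadCourseList", 12.0,  1),
                    new Step("loadGradeRow",    1.5, 40),   // once per student in the course
                    new Step("renderPage",      8.0,  1));
            System.out.printf("Estimated CPU demand: %.1f ms%n", totalCpuMs(viewGrades));   // 85.0 ms
        }
    }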

  15. SPE: Determine Software Resource Requirements • Understand at an atomic level the types of software resource requirements. • Method Calls • SQL Executions • Data Loading • Caching • Understand the effects of this software on inter- and intra-system components. • Understand from both a best-case and a worst-case perspective. • 90% of the time • Peak time.
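
One way to make these counts visible (a minimal sketch, not the built-in Blackboard profiling mentioned later) is to tally each kind of software resource used during one pass through a scenario.

    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;
    import java.util.concurrent.atomic.LongAdder;

    public class ResourceTally {
        private final Map<String, LongAdder> counts = new ConcurrentHashMap<>();

        public void record(String resource) {
            counts.computeIfAbsent(resource, k -> new LongAdder()).increment();
        }

        public void report() {
            counts.forEach((resource, n) -> System.out.printf("%-12s %d%n", resource, n.sum()));
        }

        public static void main(String[] args) {
            ResourceTally tally = new ResourceTally();
            // Calls like these would be placed along the code path under study.
            tally.record("method.call");
            tally.record("sql.execute");
            tally.record("sql.execute");
            tally.record("cache.hit");
            tally.report();
        }
    }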

  16. SPE: Resource Requirements • Apply Workload Understanding • Based on findings in the software resource requirements phase, you should be able to understand at an atomic level... • CPU cycles consumed by the method call and/or SQL operation. • Network overhead (packet transfer) • Memory requirements • Processing Thread/DB Connection Requirements
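
A back-of-the-envelope sizing sketch tying these together (illustrative numbers, not measurements): utilization of a resource is throughput times the service demand per request, and the number of busy threads or database connections is throughput times how long each request holds one.

    public class CapacityMath {
        public static void main(String[] args) {
            double requestsPerSec = 50;       // peak workload
            double cpuSecPerReq   = 0.012;    // CPU seconds consumed per request
            double dbHoldSec      = 0.040;    // seconds a DB connection is held per request

            double cpuUtilization  = requestsPerSec * cpuSecPerReq;   // 0.60 of one CPU
            double busyConnections = requestsPerSec * dbHoldSec;      // 2.0 connections busy on average

            System.out.printf("CPU utilization %.0f%%, busy DB connections %.1f%n",
                    cpuUtilization * 100, busyConnections);
        }
    }

Pool sizes are then set with headroom above these averages to absorb bursts.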

  17. Applying SPE to Blackboard Building Block Development: Software Performance Requirements • What’s the intended audience of the function? • What’s the frequency of use of the function? • How many touch points (integration)? • What is the function doing? • Processing/Transacting? • Navigating? • Calculating? • Loading? • How much data is the function working with? • What’s the integrity of the transaction? • Is it a candidate for asynchronous processing? • What’s the response time expectation of the user? • What are the resource requirements to perform the function?

  18. Applying SPE to Blackboard Building Block Development: Software Design
  • Traceability: Requirements; Verification/Testability
  • Workflow(s): Breadth (number of discrete paths); Depth (how many steps are in each path or branch?); Complexity (average number of branches)
  • Testability: Encapsulation (how does a requirements specification point map to a unit test?); Coverage (what function points can’t reasonably be tested at the unit test level?); Instrumentation (can the design accommodate instrumentation to help testing?)
  • Function Points/Unit Tests
  • Data Model: Size (number of entity types in the model); Growth/Capacity (volume driven by usage); Complexity (relationships, opacity of values); Input/Validation (internationalized, user provided); Statistics impact (data that affects the stats schema)
  • Data Access: Frequency (page-level, session-level, cacheable); Volatility (same parameters?); Volume (small, medium, large datasets? entity size?); Complexity (number of joins? outer joins?)
  • File System
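
To make the "cacheable" data-access case concrete, a minimal sketch (hypothetical names, not a Blackboard API): data read on every page but rarely changed can be loaded once and reused, trading a little memory for far fewer SQL executions.

    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;
    import java.util.function.Function;

    public class CourseTitleCache {
        private final Map<Integer, String> titles = new ConcurrentHashMap<>();
        private final Function<Integer, String> loader;   // e.g., a SQL lookup by course id

        public CourseTitleCache(Function<Integer, String> loader) {
            this.loader = loader;
        }

        public String title(int courseId) {
            // Only the first request per course reaches the database.
            return titles.computeIfAbsent(courseId, loader);
        }

        public void invalidate(int courseId) {
            titles.remove(courseId);   // call when the underlying row changes
        }
    }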

  19. Applying SPE to Blackboard Building Block Development: Iterative Software Development • Software Development is an Iterative Process • Develop for Functionality • Focus on agility • Eloquence Comes through Iterations • Identify Optimal Performance Patterns • Eliminate Unnecessary Performance Anti-Patterns • Profile and Instrument • Refactor Based on Findings
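
A minimal sketch of "profile and instrument" between iterations (illustrative only; real profiling would use the tools listed later): time the same operation before and after a refactoring so the gain, or regression, is measured rather than guessed.

    public class IterationTimer {
        static double averageMs(Runnable work, int iterations) {
            long start = System.nanoTime();
            for (int i = 0; i < iterations; i++) {
                work.run();
            }
            return (System.nanoTime() - start) / 1_000_000.0 / iterations;
        }

        public static void main(String[] args) {
            double before = averageMs(() -> { /* previous implementation */ }, 1_000);
            double after  = averageMs(() -> { /* refactored implementation */ }, 1_000);
            System.out.printf("avg before %.3f ms, after %.3f ms%n", before, after);
        }
    }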

  20. Applying SPE to Blackboard Building Block Development: Iterative Software Development • Software Development is an Iterative Process • Be willing to scrap your development efforts between iterations. • Focus on simplicity • Complexity comes from a lack of understanding of what it is you are designing or developing. • Decompose your development efforts into manageable components

  21. Applying SPE to Blackboard Building Block Development: Iterative Software Development

  22. Applying SPE to Blackboard Building Block Development: Iterative Software Development • A pattern is a common solution to a problem that occurs in many different contexts. • Patterns capture knowledge about “best practices” in software design for reuse and application. • Antipatterns are similar to design patterns in that they document recurring solutions to common design problems. • Antipatterns produce negative consequences. • Antipatterns document common mistakes made during software development.
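
One frequently documented performance antipattern is the "one-lane bridge": a single shared lock that forces concurrent requests through one at a time. A minimal sketch (hypothetical names) of the problem and one remedy:

    import java.util.HashMap;
    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;

    public class SessionRegistry {

        // Antipattern: every reader and writer serializes on the same monitor,
        // so one slow caller delays every concurrent request.
        private final Map<String, Object> sessionsLocked = new HashMap<>();
        public synchronized Object getLocked(String id)           { return sessionsLocked.get(id); }
        public synchronized void   putLocked(String id, Object s) { sessionsLocked.put(id, s); }

        // Remedy: a concurrent map removes the single global lock, so callers
        // working on independent keys no longer queue behind each other.
        private final Map<String, Object> sessions = new ConcurrentHashMap<>();
        public Object get(String id)           { return sessions.get(id); }
        public void   put(String id, Object s) { sessions.put(id, s); }
    }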

  23. Performance Analysis Tools • Tools for Different Purposes • Modeling/Diagramming • Entity Relationships, Activity and Deployment • Developing • IDE (Eclipse) • Textual Editing • Load Generation • Grinder, Apache JMeter and Volano • LoadRunner, Segue • Profiling • Application Layer • JProbe, PerformaSure, OptimizeIt, Wily Enterprise, HPJmeter • Database Layer • Hotsos, IronEye, Built-in Blackboard Profiling
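
For scale, a toy load-generation sketch in the spirit of the tools above (Grinder, JMeter); a real test should use those tools, which handle ramp-up, think time and reporting. The URL and counts below are placeholders.

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;
    import java.util.concurrent.ConcurrentLinkedQueue;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.TimeUnit;

    public class MiniLoadTest {
        public static void main(String[] args) throws Exception {
            HttpClient client = HttpClient.newHttpClient();
            HttpRequest request = HttpRequest.newBuilder(URI.create("http://localhost:8080/")).GET().build();

            int virtualUsers = 10, requestsPerUser = 20;
            ExecutorService pool = Executors.newFixedThreadPool(virtualUsers);
            ConcurrentLinkedQueue<Long> latenciesMs = new ConcurrentLinkedQueue<>();

            for (int u = 0; u < virtualUsers; u++) {
                pool.submit(() -> {
                    for (int i = 0; i < requestsPerUser; i++) {
                        long start = System.nanoTime();
                        try {
                            client.send(request, HttpResponse.BodyHandlers.discarding());
                        } catch (Exception e) {
                            // a real harness would count failures separately
                        }
                        latenciesMs.add((System.nanoTime() - start) / 1_000_000);
                    }
                });
            }
            pool.shutdown();
            pool.awaitTermination(5, TimeUnit.MINUTES);

            double avg = latenciesMs.stream().mapToLong(Long::longValue).average().orElse(0);
            System.out.printf("%d samples, average %.1f ms%n", latenciesMs.size(), avg);
        }
    }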

  24. Thank You For Participating • Innovating Together in ‘05: • Performance is at the top of everyone’s mind. • Design for Performance • Develop for Performance • Pattern Identification • Antipattern Identification • Profile/Instrument • Resources Available: • www.cmg.org • www.perfeng.com • www.acm.org • Follow up Contact(s): • Steve Feldman (sfeldman@blackboard.com) • IF YOU ONLY REMEMBER 1 THING: • Performance should not be an afterthought, but an upfront effort.
