
Towards an understanding of Decision Complexity in IT Configuration






Presentation Transcript


  1. Towards an understanding of Decision Complexity in IT Configuration • Bin Lin, Department of Electrical Engineering & Computer Science, Northwestern University, binlin@cs.northwestern.edu • Aaron Brown, IBM T.J. Watson Research Center, abbrown@us.ibm.com

  2. Context: Quantifying IT Process Complexity
  • Technical problem
  • Identify metrics and develop a methodology for quantifying the exposed operational complexity of IT processes
  • Importance
  • The complexity of systems-management processes drives labor cost
  • Labor cost reductions are extremely important to services/outsourcing organizations and their customers
  • A quantitative framework for complexity can guide process improvements that reduce labor cost
  • It can also identify opportunities for deploying autonomic computing in IT environments

  3. Previous work
  • Introduced a model of configuration complexity and demonstrated its value for a change management system
  • Defined metrics that capture configuration complexity, including execution complexity, parameter complexity, and memory complexity
  • Example: complexity of manual J2EE provisioning
  • Execution: 59 steps, 27 context switches
  • Parameters: 32 parameters used 61 times, 18 uses outside their source context; source score 125
  • Memory (LIFO stack model): size max 8, avg 4.4
  • (The slide also tabulated the corresponding values for the automated process; the table structure was lost in extraction.)
  See: Brown, A.B., A. Keller, and J.L. Hellerstein. A Model of Configuration Complexity and Its Application to a Change Management System. Proceedings of the Ninth IFIP/IEEE International Symposium on Integrated Network Management (IM 2005), Nice, France, May 2005.
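  The three metric families on this slide can be tallied mechanically from a recorded procedure trace. The sketch below is an illustration only: the trace layout (`Step` records with a context name and a list of parameter uses) and the simplified push/pop memory rule are assumptions, not the IM 2005 paper's exact scoring formulas.

```python
# Illustrative tally of execution, parameter, and memory complexity from a
# procedure trace. The data layout and scoring rules are assumptions made
# for this sketch, not the published model's exact definitions.
from dataclasses import dataclass, field

@dataclass
class Step:
    context: str                                 # e.g. "shell", "admin console"
    params: list = field(default_factory=list)   # list of (name, in_source_context?)

def complexity_summary(steps):
    exec_steps = len(steps)
    # Execution complexity: count of steps and context switches between them.
    switches = sum(1 for a, b in zip(steps, steps[1:]) if a.context != b.context)
    # Parameter complexity: distinct parameters, total uses, out-of-context uses.
    uses = [(p, in_src) for s in steps for (p, in_src) in s.params]
    distinct = {p for p, _ in uses}
    outside = sum(1 for _, in_src in uses if not in_src)
    # Memory complexity (LIFO stack model, simplified): push a parameter the
    # first time it appears, pop it when it is used again.
    stack, depths = [], []
    for p, _ in uses:
        if p in stack:
            stack.remove(p)
        else:
            stack.append(p)
        depths.append(len(stack))
    return {
        "steps": exec_steps,
        "context_switches": switches,
        "params": len(distinct),
        "param_uses": len(uses),
        "uses_outside_source": outside,
        "mem_max": max(depths, default=0),
        "mem_avg": sum(depths) / len(depths) if depths else 0.0,
    }
```

  Running this over a full 59-step trace would yield the kind of summary the slide reports for J2EE provisioning.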

  4. Next Step: Decision Complexity
  • Previous metrics assume expert skill and do not consider complexity arising from decision-making
  • Goals:
  • Capture the complexity impact of decisions along a specific procedure's path, parameterized by skill level
  • Understand the overall complexity across all possible procedures (the procedure design space)
  • Quantify the tradeoff between flexibility and simplicity

  5. Decision Complexity (an initial model & methodology)
  • Factors that affect complexity:
  • Constraints, e.g. compatibility between software products, capabilities of a machine
  • Consequences, e.g. functionality, performance
  • Levels of guidance, e.g. documentation, previous configuration experience
  • Manifestations: task time, user-perceived difficulty, error probability
  • A starting point to drive data collection (a user study); once we have real-world data, refine the model
  • Example decision in an install/config procedure for a J2EE app: need enterprise clustering? If no, install Cloudscape + WAS Express; if yes, install DB2 UDB + WAS ND

  6. Model details: levels of guidance
  • Global information
  • E.g. documentation, design guides, deployment patterns
  • Short-term goal-oriented information
  • E.g. wizard-based prompts indicating the appropriate next step
  • Confounding information
  • E.g. alternate configuration instructions for a different platform than the target
  • Position information
  • E.g. feedback on the current state of the system and the effect of the previous action
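  Because the study toggles these guidance levels independently across testcases, they can be modeled as composable flags on a decision point. The encoding below is a sketch of one way to do that; the names mirror the slide, while the representation itself is an assumption.

```python
# Sketch: the four guidance levels as independent, composable flags, so a
# testcase can enable any subset. Representation is illustrative only.
from enum import Flag, auto

class Guidance(Flag):
    NONE = 0
    GLOBAL = auto()       # documentation, design guides, deployment patterns
    GOAL = auto()         # wizard-style prompts for the appropriate next step
    CONFOUNDING = auto()  # instructions for a different platform than the target
    POSITION = auto()     # feedback on current state / effect of the last action

# Example: a wizard with state feedback but no global documentation.
g = Guidance.GOAL | Guidance.POSITION
```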

  7. Decision Complexity (challenge & solution)
  • It is hard to conduct a full user study validating the model (constraints, consequences, levels of guidance) on real IT processes
  • Solution: measure decision complexity in a simplified analogous domain, route-planning
  • i.e. navigating a car from one point to another

  8. Decision Complexity (user study design)
  • Web-based study
  • Larger subject pool
  • Accurate timing data
  • Standardized information
  • Questionnaire to collect user background
  • Recording user interactions
  • Time spent at each decision point
  • Comparison between the user's path & the optimal path
  • User ranking of the complexity of the testcases

  9. Testcase selection
  • Testcases are different combinations of factors:
  • Static traffic
  • Dynamic traffic
  • Expert path
  • GPS
  • Difference in travel times
  • Position information
  • Selected the 10 most relevant testcases
  • Example: dynamic traffic (speed updates) + expert path
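  Enumerating the factor combinations makes the selection step concrete: the full cross product is larger than ten, so the authors hand-picked the most relevant subset. The factor names below come from the slide; the enumeration itself, and treating each factor as a binary toggle, are assumptions for illustration.

```python
# Sketch: enumerate candidate testcases as combinations of the study factors,
# from which the 10 most relevant were hand-selected. Binary encoding of each
# factor is an assumption for this illustration.
from itertools import product

factors = {
    "traffic": ["static", "dynamic"],
    "expert_path": [False, True],
    "gps": [False, True],
    "travel_time_diff": [False, True],
    "position_info": [False, True],
}
candidates = [dict(zip(factors, combo)) for combo in product(*factors.values())]
# 2 * 2 * 2 * 2 * 2 = 32 raw combinations; the study kept the 10 most relevant.
```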

  10. User Study: Overview & Analysis Approach
  • Overview
  • 3 experiments, 10 testcases with 1 warm-up
  • 1st stage: 35 users
  • 2nd stage: 23 users, with a refined experiment
  • Metrics
  • Average time spent per step (i.e. time / no. of steps)
  • User rating (at the end of each experiment)
  • Error rate (user picked a non-optimal path)
  • Analysis approach
  • Step I: general statistical analysis of all data
  • Each testcase measured as an independent data point
  • Goal: identify the factors that explain the most variance
  • Step II: pair-wise testcase comparisons
  • Gives more insight into the specific effects of each factor value
  • Goal: remove inter-user variance
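  The two objective metrics above are simple to compute from the logged interactions. The sketch below assumes a log layout (total time plus per-trial user and optimal paths); that layout is illustrative, not the study's actual data format.

```python
# Sketch of the slide's two objective metrics, computed from logged data.
# The record layout (total time, lists of user/optimal paths) is assumed.

def avg_time_per_step(total_time_s, n_steps):
    # Average time spent per step: total task time divided by step count.
    return total_time_s / n_steps

def error_rate(user_paths, optimal_paths):
    # A trial counts as an error when the user's path deviates from optimal.
    errors = sum(1 for u, o in zip(user_paths, optimal_paths) if u != o)
    return errors / len(user_paths)
```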

  11. Summary of results
  • The factors have significantly different impacts on user-perceived difficulty than on objective measures (e.g. time and error rate)
  • Time is influenced by:
  • Constraints: static constraints > dynamic; static constraints > no constraints
  • Guidance (goal): without short-term goal-oriented guidance > with such guidance
  • Rating is influenced by:
  • Guidance (goal)
  • Guidance (position): without position guidance > with such guidance
  • Constraints: static constraints > dynamic
  • Error rate: hard to draw statistical conclusions, except:
  • Error rate is reduced when guidance (goal) is present
  • Error rate is reduced when guidance (position) is not present

  12. Summary of results (cont.)
  • Depending on its goal (user rating, time, or error rate), optimizing for lower complexity will have a different focus; examples:
  • An installation procedure with easily located, clear information about the next step (e.g. wizard-based prompts) will reduce both task time and user-perceived complexity
  • A procedure with feedback on the current state of the system and the effect of the previous action (e.g. message windows following a button press) will reduce user-perceived complexity, but is unlikely to improve task time or error rate
  • Omitting positional feedback (i.e. not showing users the effects of their actions) may, counterintuitively, increase user accuracy, but at the cost of significantly higher perceived complexity and task time

  13. Proposal for a new user study
  • Validate the model in the IT configuration domain

  14. Analogy between the two studies
  • Driving time per segment ↔ number of features achieved per step
  • Global map ↔ flowchart of the overall process (text)
  • Traffic ↔ software compatibility / machine capacity limits
  • Goal (reach the destination) ↔ achieve the maximum number of features

  15. Further step
  • Apply the model to assess IT decision complexity
  • Complexity (constraints, guidance, consequences, …) manifests as measures that map onto operational quantities, by skill level:
  • AvgTimePerStep → operation time → cost ($)
  • Rating (user-perceived complexity) → skill levels
  • Error rate → probability (downtime)
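  The mapping on this slide suggests a simple expected-cost calculation: measured time feeds labor cost, and error rate feeds downtime risk. The sketch below is a hedged illustration of that idea; the linear form and every rate in it are assumptions, not values or formulas from the study.

```python
# Sketch: turning the study's measures into an operational cost estimate.
# The linear model and all parameters are illustrative assumptions.

def estimated_cost(avg_time_per_step_s, n_steps, hourly_rate_usd,
                   error_rate, downtime_cost_usd):
    # Labor cost: total operation time (in hours) times the labor rate.
    labor = (avg_time_per_step_s * n_steps / 3600.0) * hourly_rate_usd
    # Risk cost: error probability times the cost of the resulting downtime.
    risk = error_rate * downtime_cost_usd
    return labor + risk
```

  A procedure redesign that lowers either the time per step or the error rate would show up directly in this estimate, which is the kind of guidance the complexity framework aims to give.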

  16. Conclusions
  • We investigated decision complexity in IT configuration procedures
  • Used a carefully mapped analogous domain to explore the complexity space
  • Conducted an extensive user study
  • Obtained quantitative results showing the key factors
  • Provided some guidance for system designers seeking to reduce complexity
  • Next step: explore further in a simulated IT environment
