
Exploitation of OA techniques to support IA & decision making



Presentation Transcript


  1. Exploitation of OA techniques to support IA & decision making 9th Feb 2010 Colin Drysdale E-mail : colin.drysdale@atkinsglobal.com

  2. Overview • What is Operational Analysis (OA)? • Difficulties facing a decision maker – how can the analytical community help? • Ways to inform a decision maker • Ways to ensure the decision maker understands the information • An example of a technique that sets out to both inform & make the information understandable

  3. What is Operational Analysis (OA)? • OA is the name given by the military to Operational Research (OR) • OR, also known as Operations Research or Management Science (OR/MS), is the discipline of applying advanced analytical methods to help make better decisions • Source: The OR Society (http://www.orsoc.org.uk)

  4. Typical outputs of OA • The articulation of analytical problems to be solved • Identification of the solution space • Mathematical theories • Suggested policies that could optimise systems or satisfy goals • Logical structuring of issues • Leading to insights • Measures of Effectiveness • Rather than Measures of Performance

  5. Difficulties facing decision makers

  6. Difficulties facing decision makers • The influence of “providers” over “deciders” • The maturity & availability of solutions • Irreconcilable starting positions • Multiple goals & constraints • Uncertainty & risks • Trade-offs • Funds (Not an exhaustive list)

  7. What makes decisions succeed or fail? [Diagram: an AND-tree of conditions leading to an informed investment decision. The decision is made when the decision maker heeds the info', understands the info', and is informed. Being informed requires rigorous, uncertainty-bounded "true" cost info' for all options and rigorous, uncertainty-bounded operational effectiveness info' for all options. The cost branch requires that the method for combining cost estimates in scenarios is fit for purpose and that the cost estimates are fit for purpose; the effectiveness branch requires that the method for combining performance data in scenarios is fit for purpose and that the performance data are fit for purpose. Both branches require that the assumptions are valid (fit for purpose) and that the scenarios are fit for purpose.]

  8. How can the analytical community help decision makers? • We can calculate and collate information to inform decision makers • Inform them of the cost implications of investment options • Inform them of the operational effectiveness impact of the options

  9. How can we help decision makers understand the gathered information? • One way to inform decision makers that also seeks to help them understand the gathered information is Multi-Criteria Decision Analysis (MCDA) • An example is coming up shortly • Another way is to present information in more than one format • E.g. both tables & graphs • Yet another way is to translate the implications of the numbers back into words for the decision maker

  10. Introducing MCDA as one way to inform decision makers

  11. Background to MCDA What MCDA is: • MCDA is the name given to a group of techniques by the OR Society • A technique for supporting decision makers who are trying to simultaneously satisfy multiple, potentially conflicting goals • MCDA techniques pop up in all sorts of domains under different names, conducted in different ways to different standards of analytical rigour • MCDA is a technique for helping decision makers make choices • Arguably it does not: • do predictive modelling • produce Measures of Effectiveness

  12. One way to inform decision makers • It is difficult to view MCDA as an "advanced" analytical technique • It is classed as a form of Soft OA • MCDA requires little or no mathematical knowledge to use • But beware of this • Like any technique it has pitfalls; these can be avoided by good practitioners who understand the features of the decision to be supported • Decision-making culture • Procedures • Standards of analytical rigour

  13. What does MCDA do? • Produces a Figure of Merit (FOM) • Combines disparate criteria or factors • Provides the relative ranking of the options

  14. MCDA sequence of events • Build a network of criteria / requirements • Choose criteria that can be measured, predicted, estimated, judged, or populated from available data • Define the relationship between performance and worth • Weight the links within the network • Evaluate criteria at nodes on the network (& then calculate the scores) • Conduct sensitivity analysis • Review findings with decision maker
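The core of the sequence above is a weighted-sum roll-up into a Figure of Merit. A minimal sketch, assuming hypothetical options, criteria names, scores, and weights (none of these values come from the presentation):

```python
# Minimal weighted-sum MCDA sketch. All option names, criterion scores
# (already converted to 0..1 "worth"), and weights are invented examples.

def fom(scores, weights):
    """Combine per-criterion value scores (0..1) into a Figure of Merit."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "link weights must sum to 1"
    return sum(weights[c] * scores[c] for c in weights)

# Hypothetical options, each already scored 0..1 against each criterion
options = {
    "Light A": {"vision": 0.9, "presence": 0.6, "cost": 0.3},
    "Light B": {"vision": 0.5, "presence": 0.8, "cost": 0.9},
}
weights = {"vision": 0.5, "presence": 0.3, "cost": 0.2}

# Relative ranking of the options by Figure of Merit
ranking = sorted(options, key=lambda o: fom(options[o], weights), reverse=True)
print(ranking)
```

The ranking, not the absolute FOM values, is the headline output; the later steps (sensitivity analysis, review with the decision maker) test how robust that ranking is.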

  15. MCDA Example

  16. How MCDA works – simple example Warning! – this simplified example contains some elements not appropriate to MOD Business Case support • Equipment investment decision • Equipment – Bicycle front light • Scenario 1 • 10 mile daily commute after dark on a mixture of lit town roads and unlit dual carriageway, single carriageway, & single track roads with passing places • Requirements: • Vision – to enable no reduction in speed from daylight conditions • Presence – no reduction in recognition by other road users compared to daylight conditions • All-weather capable • Lightweight • Reliable – very low probability of failure in use • Reliable – graceful degradation or failure warning • Available – for expected maximum mission duration • User able to choose where to fix the light

  17. 1. MCDA network of criteria [Diagram: criteria network with All Weather, Purchase Cost, Running Costs, Weight, Vision, Presence, Availability, and Fixing options all feeding into a single node, FOM Scenario 1.]

  18. 1. MCDA network of criteria WARNING: This is not necessarily how you should combine cost information for Government investment decisions; consult the Treasury "Green Book". [Diagram: the same network expanded with sub-criteria: Luminous flux feeds Vision; Flashing mode feeds Presence; Mission length supported and Failure warning feed Availability; Helmet, Oversize, and Head feed Fixing options.]

  19. 2. Criteria that can be "measured" [Diagram: the network annotated with how each leaf criterion is populated — measured quantity (measurand input), yes/no question (Boolean input), or subjective SME judgement. Purchase Cost (£), Running Costs (£ per annum), Weight (g), Luminous flux (lm), and Mission length supported (hr) are measured quantities; Flashing mode, Failure warning, and the Helmet, Oversize, and Head fixing options are Y/N questions; at least one criterion is flagged as having no reliable information.]

  20. 3. Performance & worth relationships [Graphs: each measured quantity is converted to a value score (0 ≤ score ≤ 1) via a performance-to-value curve. Luminous flux: axis from 0 through 100 to 1,000 lm; Weight: axis from 0 to 500 g; Burn time: axis from 1 to 3 hr.]

  21. 3. Performance & worth relationships [Graphs: decreasing value curves for costs. Purchase Cost: axis with ticks at £75, £100, £200, and £300; Running Cost: axis with ticks at £10, £15, and £30 per annum.]

  22. 4. Weight contributions to outcomes – weight network links [Diagram: each link in the network carries a weight, and the weights into any node sum to 1 (ΣWi = 1). For example, Mission length supported (0.7) and Failure warning (0.3) feed Availability; Helmet, Oversize, and Head each contribute 0.33 to Fixing options; Vision carries a weight of 0.5 into FOM Scenario 1.]
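Because the network is hierarchical, scores roll up level by level, with the incoming weights at every node summing to 1. A minimal sketch of that roll-up, assuming an invented tree and invented leaf scores (only the ΣWi = 1 constraint is taken from the slide):

```python
# Hierarchical roll-up of a weighted criteria network (sketch).
# The tree shape, weights, and leaf scores are illustrative, not the slide's.

tree = {
    "FOM": {"Vision": 0.5, "Presence": 0.3, "Cost": 0.2},
    "Vision": {"Luminous flux": 0.5, "Flashing mode": 0.5},
}

# Leaf criteria already converted to 0..1 value scores
leaf_scores = {"Luminous flux": 0.8, "Flashing mode": 1.0,
               "Presence": 0.6, "Cost": 0.4}

def node_score(name):
    """Score a node: leaves return their value; internal nodes combine children."""
    if name in leaf_scores:
        return leaf_scores[name]
    children = tree[name]
    assert abs(sum(children.values()) - 1.0) < 1e-9, "weights into a node must sum to 1"
    return sum(w * node_score(child) for child, w in children.items())

print(round(node_score("FOM"), 3))
```

The recursion mirrors the diagram: sub-criteria combine into Vision, Presence, and so on, and those intermediate scores combine into the FOM.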

  23. Aspects of good practice

  24. Sensitivity Analysis • Getting initial results out of the models may feel like arriving at the destination … • This is NOT correct • the analyst's job is not done • there is a need to conduct sensitivity analysis • to understand how uncertainty affects the ranking of the options • indeed, once uncertainty is considered it may not be possible to discriminate between the options
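One simple form of the sensitivity analysis described above is to perturb the link weights and count how often the option ranking changes. A sketch under invented data (the options, scores, weights, and perturbation size are all hypothetical, not from the presentation):

```python
# Weight-perturbation sensitivity check (sketch; all values are invented).
import random

random.seed(0)

options = {"A": {"vision": 0.9, "cost": 0.4},
           "B": {"vision": 0.5, "cost": 0.9}}
base_weights = {"vision": 0.6, "cost": 0.4}

def rank(weights):
    """Rank options by weighted-sum Figure of Merit, best first."""
    fom = {o: sum(weights[c] * s for c, s in scores.items())
           for o, scores in options.items()}
    return sorted(fom, key=fom.get, reverse=True)

flips = 0
trials = 1000
for _ in range(trials):
    # Randomly perturb each weight, then renormalise so the weights sum to 1
    w = {c: max(0.0, v + random.uniform(-0.15, 0.15))
         for c, v in base_weights.items()}
    total = sum(w.values())
    w = {c: v / total for c, v in w.items()}
    if rank(w) != rank(base_weights):
        flips += 1

print(f"ranking changed in {flips}/{trials} perturbed runs")
```

A high flip rate is exactly the situation the slide warns about: once uncertainty is considered, the options may not be reliably discriminable.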

  25. 5. Evaluate criteria – for each Option - Top FOM results

  26. Flaws with these answers • Option 1, The Dog ear, “Hedo” should not come out as the best option as it is not really bright enough to be fit for purpose in this Scenario • It is successful because of its light weight and low purchase price

  27. Flaws with these answers • Options 13 & 14, The Faith, “Four 1” and The “Four 1+1” should not be equal in merit

  28. 7. Review findings with decision maker • This technique should not be used without involving decision makers in the relevant parts of the process • Review findings with the decision maker • Sometimes the analysis (or thought process) does not get as far as MCDA results • We gain some insight that reframes or satisfactorily solves the exam question before the analysis is concluded

  29. What MCDA can include • Scenarios • Linear or non-linear conversion of performance against each criterion into some form of worth to the decision maker (e.g. military worth) • Sensitivity analysis (e.g. to address uncertainty & risk)

  30. How I actually made the decision

  31. How I actually made the decision [Scatter chart: luminous flux (lumens, 0–1,000) against purchase cost (£0–300) for the candidate lights, including the Faith "Four 1", Chilly "Devil", Faith "Two 1", Faith "Sight", and Dog ear "Hedo".]

  32. Conclusion Previously: • Discussed the role of causal mapping to make the development of MCDA networks adequately rigorous to support Business Case decisions Today: • Stated IA and OA studies can inform decision makers • Considered failure modes for decisions & how the analytical community can help decision makers succeed • Explained what MCDA is and what it does • Identified that there are good and bad practices • Suggested some issues for discussion covering which practices are acceptable for any given purpose

  33. Thank you for your time Any Questions? Colin Drysdale E-mail : colin.drysdale@atkinsglobal.com

  34. END Colin Drysdale E-mail : colin.drysdale@atkinsglobal.com

  35. Spares Colin Drysdale E-mail : colin.drysdale@atkinsglobal.com

  36. Options and data

  37. Evaluated criteria – evaluate network “leaf” nodes

  38. And the answer was … Because I was just generating an example, I did not conduct any sensitivity analysis

  39. 4. Objective & Threshold definition, or "effectiveness envelope" [Graph: value (0 to 1.0) against top speed (miles per hour), with breakpoints at 100, 150, and 200 mph illustrating threshold and objective levels.]
