The History of Hypervolume


  1. The History of Hypervolume. Lyndon While, Walking Fish Group, School of Computer Science & Software Engineering, The University of Western Australia. wfg.csse.uwa.edu.au (Work performed by Luigi Barone, Lucas Bradstreet, Phil Hingston, Simon Huband, and Lyndon While)

  2. Background • An optimisation problem is one where the performance of a solution is measured on a continuous scale • we usually don’t expect to find an optimal solution • A multi-objective optimisation problem (MOOP) is one where, in addition, the performance of a solution is measured by more than one objective • e.g. for vehicles: safety vs. acceleration • An algorithm for solving a MOOP returns a set of solutions offering varying trade-offs between the objectives • e.g. a Hummer vs. a Volvo vs. a Porsche • How can we compare such sets? • i.e. how can we compare algorithms?

  3. A 2-objective MOOP – objective space. [Figure: points A–E and A’–D’ plotted against Objective 1 and Objective 2, both maximised.] • Improving in one objective means downgrading in at least one other • NB: C’ dominates D
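
To pin down the dominance relation used throughout the talk, here is a minimal sketch, assuming both objectives are maximised; the coordinates are hypothetical stand-ins for C’ and D, which the figure does not preserve:

```python
def dominates(a, b):
    """True if point a dominates point b when every objective is maximised:
    a is no worse than b everywhere and strictly better somewhere."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

# Hypothetical coordinates: c_prime beats d in objective 1 and ties in objective 2.
c_prime, d = (3.0, 3.0), (2.0, 3.0)
assert dominates(c_prime, d) and not dominates(d, c_prime)
```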

  4. Hypervolume – a metric for comparing sets • The hypervolume of a set is the size of the portion of objective space dominated by the members of the set • also called the S-metric, or the Lebesgue measure, or Klee’s measure • Hypervolume captures in one scalar both the convergence and the spread of the set • Hypervolume has nicer mathematical properties than many other metrics • But hypervolume is expensive to calculate
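
As a concrete rendering of the definition, a minimal 2D sketch (an illustration, not WFG code; maximisation assumed): sweep the points from the best value of the first objective downwards and sum the rectangle each non-dominated point adds above the reference point.

```python
def hypervolume_2d(points, ref):
    """Area dominated by `points` and bounded below by `ref`,
    with both objectives maximised."""
    area, y = 0.0, ref[1]
    # Sweep objective 1 from best to worst; each non-dominated point
    # adds a rectangle above the best objective-2 value seen so far.
    for x1, x2 in sorted(points, key=lambda p: p[0], reverse=True):
        if x2 > y:                       # dominated points add nothing
            area += (x1 - ref[0]) * (x2 - y)
            y = x2
    return area

# A hypothetical 3-point front: 3*1 + 2*1 + 1*1 = 6
print(hypervolume_2d([(3, 1), (2, 2), (1, 3)], ref=(0, 0)))  # 6.0
```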

  5. Hypervolume in 2D. [Figure: the fronts {A, B, C, D, E} and {A’, B’, C’, D’} with their dominated regions shaded down to the reference point.] • Hypervolume {A, B, C, D, E} = 11 • Hypervolume {A’, B’, C’, D’} = 12

  6. Hypervolume in 3D

  7. Algorithms for calculating hypervolume • Inclusion-exclusion, O(n 2^m) for m points in n objectives: impractical • LebMeasure • HSO • optimised HSO ← WFG • IHSO ← WFG • IIHSO ← WFG • FPL • BROY • Approximation algorithms not discussed today

  8. Hypervolume by slicing objectives (HSO, 2001) • Each slice has known thickness • The kth slice has k points • But some of them are dominated in the remaining objectives [Figure: a front cut along one objective into slabs of known thickness.]
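
A minimal recursive sketch of the slicing idea (an illustration, not the WFG implementation; maximisation against a reference point assumed): slice along the first objective, multiply each slab’s known thickness by the hypervolume of its one-dimension-smaller cross-section, and recurse.

```python
def hso(points, ref):
    """Hypervolume by Slicing Objectives, naive recursive sketch.
    All objectives are maximised; `ref` bounds the dominated region."""
    if len(ref) == 1:                    # 1D base case: longest interval
        return max(p[0] for p in points) - ref[0]
    pts = sorted(points, key=lambda p: p[0], reverse=True)
    volume, upper, slab = 0.0, pts[0][0], []
    for p in pts:
        if p[0] < upper:                 # close the current slice:
            # thickness times the volume of its cross-section
            volume += (upper - p[0]) * hso(slab, ref[1:])
            upper = p[0]
        slab.append(p[1:])               # p covers this slice and all below
    return volume + (upper - ref[0]) * hso(slab, ref[1:])

print(hso([(3, 1), (2, 2), (1, 3)], ref=(0, 0)))  # 6.0, as in the 2D sketch
```

The slide’s caveat is exactly what this naive version ignores: points carried into a slice may be dominated once the sliced objective is gone, and pruning them is where the optimised variants save work.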

  9. LebMeasure (LM, 2003) • A exclusively dominates the yellow shape • A lops off the pink hyper-cuboid • A is replaced by three “spawns” • But A2 is dominated

  10. Timeline: 2003 – 2006 • HSO known to be O(m^n) • LM believed to be O(n^2 m^3) • “proof” published in 2003 • confusion between LM’s space complexity (which is amazingly good) and its time complexity • Proof that LM is in fact exponential, O(m^n), published by While at EMO in 2005 • Empirical demonstration that HSO substantially outperforms LM published by While et al. in IEEE TEC in 2006

  11. Making HSO faster – reordering objectives • If we process the last objective… • A dominates B, which dominates C, etc. • Every slice will have exactly one point • Best-case performance! • If we process the first objective… • No point dominates any other point in the remaining objectives • The kth slice will have k points • Worst-case performance!

  12. Timeline: 2005 • Objective-reordering heuristics which improve the performance of HSO by 25–98% published by While et al. at CEC in 2005 • i.e. up to 50x speed-up! • The best heuristic, MWW (“minimising worst-case work”), works by estimating for each objective the amount of work that will be required if that objective is processed
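
The published MWW estimate is not reproduced on the slide, so the sketch below is only in its spirit: score each objective by a cheap proxy for the work HSO would face if that objective were sliced first (here, how many points would survive dominance pruning in the remaining objectives), and slice the cheapest objective first.

```python
def weakly_dominates(a, b):
    """True if a is at least as good as b in every (maximised) objective."""
    return all(x >= y for x, y in zip(a, b))

def work_proxy(points, k):
    """Illustrative stand-in for MWW's work estimate: the number of points
    still non-dominated once objective k is removed."""
    reduced = [p[:k] + p[k + 1:] for p in points]
    return sum(
        not any(j != i and weakly_dominates(reduced[j], reduced[i])
                for j in range(len(reduced)))
        for i in range(len(reduced))
    )

def cheapest_objective(points):
    """Objective to slice first: the one with the smallest work proxy."""
    return min(range(len(points[0])), key=lambda k: work_proxy(points, k))
```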

  13. In-line hypervolume. [Figure: a front A–E plus a candidate point p.] • Hypervolume is also used within the operation of multi-objective EAs: • to promote diversity • to aid in selection • for archiving purposes • What we need to calculate now is how much hypervolume a new solution adds to an existing set
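
Spelled out, the quantity needed is the exclusive hypervolume of the new point p. The obvious pre-IHSO way to get it is two full hypervolume calculations; a sketch reusing the hypothetical hso() helper from earlier:

```python
def exclusive_contribution(p, front, ref, hv=hso):
    """Hypervolume that p adds to `front`: HV(front + {p}) - HV(front).
    This naive difference is what IHSO computes directly, and much faster."""
    return hv(front + [p], ref) - hv(front, ref)

# With the hypothetical front above, p adds 1.25 units of area.
print(exclusive_contribution((2.5, 2.5), [(3, 1), (2, 2), (1, 3)], (0, 0)))
```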

  14. Incremental HSO (IHSO, 2008) • The kth slice has k points • But some of them are dominated in the remaining objectives • Or p itself may be dominated in the remaining objectives [Figure: slicing only the exclusive contribution of p.]

  15. Timeline: 2006 – 2008 • Use of HSO for in-line hypervolume calculation published by Bradstreet et al. at CEC in 2006 • IHSO published by Bradstreet et al. in IEEE TEC in 2008 • including point- and objective-reordering optimisations • won a UWA best paper award this year! • Use of IHSO for in-line hypervolume calculation published by Bradstreet et al. at CEC in 2007 • substantially faster than the 2006 work

  16. Iterated IHSO (IIHSO, 2009) – back to the metric • Need to calculate only the “bottom part” of each slice
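
At its crudest, the iterated idea reads as a telescoping sum: the total hypervolume equals each point’s exclusive contribution to the points processed before it. A naive sketch built on the helpers above (the real IIHSO computes each contribution with IHSO and evaluates only the bottom part of each slice):

```python
def iterated_hv(points, ref, hv=hso):
    """Total hypervolume as a running sum of exclusive contributions:
    sum of HV(p1..pk) - HV(p1..p(k-1)) telescopes to HV(all points)."""
    total, front = 0.0, []
    for p in points:
        total += hv(front + [p], ref) - (hv(front, ref) if front else 0.0)
        front.append(p)
    return total

print(iterated_hv([(3, 1), (2, 2), (1, 3)], ref=(0, 0)))  # 6.0 again
```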

  17. Timeline: 2008 – 2009 • IIHSO paper currently under review by IEEE TEC • with good heuristics, IIHSO substantially outperforms all previously known algorithms on typical data in 5+D • Publication held up partly by philosophical differences about empirical vs. theoretical analyses • how important is it to know the complexity of a heuristic-based algorithm? • Should we prefer • the algorithm with the best worst-case complexity? • the algorithm with the best “average” performance?

  18. We now have worldwide competition! • Remember in 2003 they thought it was all over!

  19. IIHSO vs. BROY vs. FPL: performance • Random data in 7D (averages of 200 distinct fronts)

  20. IIHSO vs. BROY vs. FPL: variation • Random data with 640 points in 6D (1,000 distinct fronts)

  21. IIHSO vs. BROY vs. FPL vs. earlier HSOs • How many points can be processed in 10s? (random data)

  22. Future plans • Further adaptation of IHSO for in-line calculations • Possibility of reducing duplicate calculations • Possibility of optimising BROY • Get Lucas to complete his PhD!

  23. Any questions?
