
Surrogate-based constrained multi-objective optimization



  1. Surrogate-based constrained multi-objective optimization Aerospace design is synonymous with long-running and computationally intensive simulations, which are employed in the search for optimal designs in the presence of multiple, competing objectives and constraints. The difficulty of this search is often exacerbated by numerical 'noise' and inaccuracies in simulation data, and by the frailties of complex simulations, which often fail to return a result. Surrogate-based optimization methods can be employed to solve, mitigate, or circumvent the problems associated with such searches. This presentation gives an overview of constrained multi-objective optimization using Gaussian process based surrogates, with an emphasis on dealing with real-world problems. Alex Forrester, 3rd July 2009

  2. Coming up: • Surrogate model based optimization – the basic idea • Gaussian process based modelling • Probability of improvement and expected improvement • Missing data • Noisy data • Constraints • Multiple objectives

  3. Surrogate model based optimization • Surrogate used to expedite search for global optimum • Global accuracy of surrogate not a priority • [Flowchart: preliminary experiments → sampling plan → observations → construct surrogate(s) → search infill criterion (optimization using the surrogate(s)) → add new design(s), looping back to observations] • Branch points: design sensitivities available? multi-fidelity data? constraints present? noise in data? multiple design objectives?

  4. Gaussian process based modelling

  5. Building Gaussian process models, e.g. Kriging • Sample the function to be predicted at a set of points

  6. Correlate all points using a Gaussian type function

  7. 20 Gaussian “bumps” with appropriate widths (chosen to maximize likelihood of data) centred around sample points

  8. Multiply by weightings (again chosen to maximize likelihood of data)

  9. Add together to predict function Kriging prediction True function
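The build sequence of slides 5–9 (sample, correlate with a Gaussian-type function, weight the bumps, add together) can be sketched as below. This is an illustrative minimal Kriging model, not the presenter's code; `theta` is held fixed here, whereas the slides choose the bump widths to maximize the likelihood of the data.

```python
import numpy as np

def kriging_fit(X, y, theta):
    """Fit a minimal Kriging model with a Gaussian correlation of width theta.
    (theta is fixed here; the slides tune it by maximizing the likelihood.)"""
    n = len(X)
    # Correlate all sample points using a Gaussian-type function
    Psi = np.exp(-theta * (X[:, None] - X[None, :]) ** 2)
    ones = np.ones(n)
    # Constant mean term, then the weightings on the Gaussian "bumps"
    mu = (ones @ np.linalg.solve(Psi, y)) / (ones @ np.linalg.solve(Psi, ones))
    w = np.linalg.solve(Psi, y - mu)
    return mu, w

def kriging_predict(x, X, mu, w, theta):
    # Weighted sum of Gaussian bumps centred on the sample points
    psi = np.exp(-theta * (x - X) ** 2)
    return mu + psi @ w
```

Because the model interpolates, the prediction passes exactly through every sample point, which is what makes the error estimates of the later slides possible.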

  10. Optimization

  11. Polynomial regression based search (as Devil’s advocate)

  12. Gaussian process prediction based optimization

  13. Gaussian process prediction based optimization (as Devil’s advocate)

  14. But, we have error estimates with Gaussian processes

  15. Error estimates used to construct improvement criteria Probability of improvement Expected improvement

  16. Probability of improvement • Useful global infill criterion • Not a measure of improvement, just the chance there will be one

  17. Expected improvement • Useful metric of actual amount of improvement to be expected • Can be extended to constrained and multi-objective problems
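The two infill criteria of slides 16 and 17 have closed forms under a Gaussian prediction. A sketch (function names are mine; `yhat` and `s` stand for the surrogate's prediction and error estimate at a candidate point, `ymin` for the best value observed so far):

```python
from math import erf, exp, pi, sqrt

def norm_cdf(z):
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def norm_pdf(z):
    return exp(-0.5 * z * z) / sqrt(2.0 * pi)

def prob_improvement(yhat, s, ymin):
    """Chance that the prediction N(yhat, s^2) falls below the best value ymin:
    a useful global infill criterion, but not a measure of how much is gained."""
    return norm_cdf((ymin - yhat) / s) if s > 0 else 0.0

def expected_improvement(yhat, s, ymin):
    """Expected amount of improvement on ymin under the prediction N(yhat, s^2)."""
    if s <= 0:
        return 0.0
    z = (ymin - yhat) / s
    return (ymin - yhat) * norm_cdf(z) + s * norm_pdf(z)
```

Both criteria vanish at sample points (where `s` is zero), so maximizing them naturally balances exploiting the prediction against exploring regions of high estimated error.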

  18. Missing Data

  19. What if design evaluations fail? • No infill point augmented to the surrogate • model is unchanged • optimization stalls • Need to add some information or perturb the model • add random point? • impute a value based on the prediction at the failed point, so EI goes to zero here? • use a penalized imputation (prediction + error estimate)?
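The imputation options on the slide can be written as one small helper (the function and its `penalty` parameter are illustrative, not from the slides):

```python
def impute(yhat, s, penalty=1.0):
    """Value assigned to a failed design evaluation.

    With penalty = 0 the plain prediction is imputed, which drives the expected
    improvement to zero at the failed point; a penalized imputation
    (prediction + error estimate) also discourages later infill points from
    landing in the failing region."""
    return yhat + penalty * s
```

The imputed value is then treated as an ordinary observation when the surrogate is rebuilt, so the optimization no longer stalls on the unchanged model.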

  20. Aerofoil design problem • 2 shape functions (f1,f2) altered • Potential flow solver (VGK) has ~35% failure rate • 20 point optimal Latin hypercube • max{E[I(x)]} updates until within one drag count of optimum

  21. Results

  22. A typical penalized imputation based optimization

  23. Four variable problem • f1, f2, f3, f4 varied • 82% failure rate

  24. A typical four variable penalized imputation based optimization • Legend as for the two variable problem • Red crosses indicate imputed update points • Regions of infeasible geometries are shown in dark blue • Blank regions represent flow solver failure

  25. ‘Noisy’ Data

  26. ‘Noisy’ data • Many data sets are corrupted by noise • We are usually interested in deterministic ‘noise’ • ‘Noise’ in aerofoil drag data due to discretization of Euler equations

  27. Failure of interpolation based infill • Surrogate becomes excessively snaky • Error estimates increase • Search becomes too global

  28. Regression improves model • Add regularization constant to correlation matrix • Last plot of previous slide improved

  29. Failure of regression based infill • Regularization assumes error at sample locations (brought in through lambda in equations below) • Leads to expectation of improvement here • Ok for stochastic noise • Search stalls for deterministic simulations

  30. Use “re-interpolation” • Error due to noise ignored using new variance formulation (equation below) • Only modelling error • Search proceeds as desired
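The variance formulation referred to on the slide is not reproduced in this transcript, so the sketch below shows one way to realize the idea (names and the exact formulas are my assumptions): regularize the fit with a constant `lam` on the correlation matrix diagonal, but compute the error estimate from the interpolating formula, so the estimated error returns to zero at the sample points instead of expecting improvement there.

```python
import numpy as np

def fit_and_errors(x, X, y, theta, lam):
    """Regularized Kriging prediction at x, plus two error estimates: a
    regression-style error (nonzero at samples) and a re-interpolation style
    error that ignores the 'noise' absorbed by lam (zero at samples)."""
    n = len(X)
    Psi = np.exp(-theta * (X[:, None] - X[None, :]) ** 2)
    A = Psi + lam * np.eye(n)            # regularized correlation matrix
    one = np.ones(n)
    mu = (one @ np.linalg.solve(A, y)) / (one @ np.linalg.solve(A, one))
    r = np.linalg.solve(A, y - mu * one)
    psi = np.exp(-theta * (x - X) ** 2)
    yhat = mu + psi @ r                  # smoothed prediction
    sigma2 = ((y - mu * one) @ r) / n    # process variance estimate
    s2_regress = sigma2 * (1.0 + lam - psi @ np.linalg.solve(A, psi))
    s2_reinterp = sigma2 * max(0.0, 1.0 - psi @ np.linalg.solve(Psi, psi))
    return yhat, s2_regress, s2_reinterp
```

With `s2_reinterp` driving the expected improvement, only modelling error is counted, so the search proceeds as desired on deterministic simulations.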

  31. Two variable aerofoil example • Same parameterization as missing data problem • Coarse mesh causes ‘noise’

  32. Interpolation – very global

  33. Regression - stalls

  34. Re-interpolation – searches local basins, but finds global optimum

  35. Constrained EI

  36. Probability of constraint satisfaction • g(x) is the constraint function • F = G(x) − gmin is a measure of feasibility, where G(x) is a random variable

  37. It’s just like the probability of improvement, but with a limit, not a minimum Constraint function Prediction of constraint function Constraint limit Probability of satisfaction

  38. Constrained probability of improvement • Probability of improvement conditional upon constraint satisfaction • Simply multiply the two probabilities:

  39. Constrained expected improvement • Expected improvement conditional upon constraint satisfaction • Again, a simple multiplication:
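The multiplications of slides 38 and 39 look as follows in code. This is a sketch under one stated assumption: the constraint is taken in the form g(x) ≤ gmin, so feasibility probability is the Gaussian CDF of the constraint prediction at the limit (flip the sign for the opposite convention).

```python
from math import erf, exp, pi, sqrt

def _cdf(z):
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def _pdf(z):
    return exp(-0.5 * z * z) / sqrt(2.0 * pi)

def constrained_ei(yhat, s, ymin, ghat, sg, gmin):
    """Expected improvement on ymin multiplied by the probability that the
    constraint prediction N(ghat, sg^2) satisfies the limit gmin
    (assumed form: g(x) <= gmin)."""
    z = (ymin - yhat) / s
    ei = (ymin - yhat) * _cdf(z) + s * _pdf(z)
    p_feasible = _cdf((gmin - ghat) / sg)
    return ei * p_feasible
```

The product form means a candidate point needs both a promising objective prediction and a good chance of feasibility before it is selected as an infill point.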

  40. A 1D example

  41. After one infill point

  42. A 2D example

  43. Multi-objective EI

  44. Pareto optimization • We want to identify a set of non-dominated solutions • These define the Pareto front • We can formulate an expectation of improvement on the current non-dominated solutions
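Identifying the non-dominated set is a simple filter (minimization assumed in every objective; function names are mine):

```python
def dominates(a, b):
    """a dominates b when a is no worse in every objective and strictly
    better in at least one (all objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Non-dominated subset of a list of objective tuples."""
    return [p for p in points if not any(dominates(q, p) for q in points)]
```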

  45. Multi-dimensional Gaussian process • Consider a 2 objective problem • The random variables Y1 and Y2 have a 2D probability density function:

  46. Probability of improving on one point • Need to integrate the 2D pdf:

  47. Integrating under all non-dominated solutions: • The EI is the first moment of this integral about the Pareto front (see book)
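The first-moment EI calculation is left to the book, but the integral itself can be sketched for the probability of improvement. Assumptions in this sketch: two objectives with independent Gaussian predictions, both minimized, and a front sorted ascending in the first objective (hence descending in the second); the integral then splits into closed-form strips between consecutive front points.

```python
from math import erf, sqrt

def _cdf(x, mu, s):
    return 0.5 * (1.0 + erf((x - mu) / (s * sqrt(2.0))))

def prob_pareto_improvement(front, mu1, s1, mu2, s2):
    """Probability that independent predictions Y1 ~ N(mu1, s1^2) and
    Y2 ~ N(mu2, s2^2) land in the region dominated by no front member."""
    front = sorted(front)   # ascending in objective 1, descending in objective 2
    # Left of the whole front: improvement regardless of Y2
    p = _cdf(front[0][0], mu1, s1)
    # Between consecutive front points: must undercut the point to the left
    for (f1a, f2a), (f1b, _) in zip(front, front[1:]):
        p += (_cdf(f1b, mu1, s1) - _cdf(f1a, mu1, s1)) * _cdf(f2a, mu2, s2)
    # Right of the whole front: must undercut the last point's second objective
    f1m, f2m = front[-1]
    p += (1.0 - _cdf(f1m, mu1, s1)) * _cdf(f2m, mu2, s2)
    return p
```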

  48. A 1D example
