
Engineering Optimization






Presentation Transcript


  1. Engineering Optimization: Concepts and Applications • Fred van Keulen • Matthijs Langelaar • CLA H21.1 • A.vanKeulen@tudelft.nl

  2. Summary of single-variable methods • Bracketing, combined with: • 0th order: dichotomous sectioning, Fibonacci sectioning, golden ratio sectioning, quadratic interpolation • 1st order: cubic interpolation, bisection method, secant method • 2nd order: Newton method • And many, many more! • In practice, additional “tricks” are needed to deal with: multimodality, strong fluctuations, round-off errors, divergence

  3. Unconstrained optimization algorithms • Single-variable methods • Multiple-variable methods: 0th order (direct search methods), 1st order and 2nd order (descent methods)

  4. Test functions • Comparison of performance of algorithms: mathematical convergence proofs, or performance on benchmark problems (test functions) • Example: Rosenbrock’s function (“banana function”), f(x1, x2) = 100(x2 − x1²)² + (1 − x1)², optimum at (1, 1)

  5. Test functions (2) • Quadratic function, optimum at (1, 3) • A function with many local optima, global optimum at (0, 0)

  6. Random methods • Random jumping method (random search): generate random points, keep the best • Random walk method: generate random unit direction vectors, walk to the new point if it is better, decrease the step size after N steps
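
  A minimal Python sketch of both random methods (the quadratic test function, bounds, step-size schedule and iteration counts are illustrative assumptions, not taken from the slides):

      import numpy as np

      rng = np.random.default_rng(0)

      def f(x):                              # quadratic test function, optimum at (1, 3)
          return (x[0] - 1)**2 + (x[1] - 3)**2

      # Random jumping (random search): generate random points, keep the best.
      lo, hi = np.array([-5.0, -5.0]), np.array([5.0, 5.0])
      best = min((lo + rng.random(2) * (hi - lo) for _ in range(1000)), key=f)

      # Random walk: random unit directions, move only if better,
      # decrease the step size after every N steps.
      x, step, N = np.zeros(2), 1.0, 50
      for i in range(1000):
          d = rng.standard_normal(2)
          d /= np.linalg.norm(d)             # random unit direction vector
          if f(x + step * d) < f(x):
              x = x + step * d               # walk to the new point if better
          if (i + 1) % N == 0:
              step *= 0.9                    # step-size reduction (illustrative)
      print(best, x)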

  7. Simulated annealing (Metropolis algorithm) • Random method inspired by a natural process: annealing • Heating of metal/glass to relieve stresses • Controlled cooling to a state of stable equilibrium with minimal internal stresses • Probability of an internal energy change ΔE follows Boltzmann’s probability distribution function, P ∝ exp(−ΔE / (kT)) • Note: some chance of an energy increase exists! • S.A. is based on this probability concept

  8. Simulated annealing algorithm • Set a starting “temperature” T, pick a starting design x, and obtain f(x) • Randomly generate a new design y close to x • Obtain f(y) and accept the new design if it is better; if it is worse, generate a random number r ∈ [0, 1] and accept the new design when r < exp(−(f(y) − f(x)) / T) • Stop if the design has not changed in several steps; otherwise, update the temperature and repeat
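
  The same steps as a Python sketch (the cooling factor 0.95, the neighborhood size 0.5 and the stopping rule are illustrative choices; the acceptance test is the Metropolis criterion from the slide):

      import numpy as np

      rng = np.random.default_rng(1)

      def f(x):                              # multimodal test function (illustrative)
          return (x[0] - 1)**2 + (x[1] - 3)**2 + 2 * np.sin(5 * x[0])

      x = np.zeros(2); fx = f(x)             # starting design
      T = 10.0                               # starting "temperature"
      stall = 0
      while stall < 200:                     # stop if unchanged for many steps
          y = x + 0.5 * rng.standard_normal(2)   # new design close to x
          fy = f(y)
          # Accept if better; if worse, accept with probability
          # exp(-(f(y) - f(x)) / T), tested against a random number r.
          if fy < fx or rng.random() < np.exp(-(fy - fx) / T):
              x, fx, stall = y, fy, 0
          else:
              stall += 1
          T *= 0.95                          # cooling scheme (illustrative)
      print(x, fx)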

  9. Simulated annealing (3) • As the temperature reduces, the (negative) exponent −(f(y) − f(x)) / T of an uphill step becomes increasingly negative, so the probability of accepting a bad step reduces as well • Accepting bad steps (energy increase) is likely in the initial phase, but less likely at the end • Temperature zero: basic random jumping method • Variants: several steps before the test, cooling schemes, …

  10. Random methods properties • Very robust: also work for discontinuous / non-differentiable functions • Can find the global minimum • Last resort: when all else fails • S.A. is known to perform well on several hard problems (traveling salesman) • Quite inefficient, but can be used in an initial stage to determine a promising starting point • Drawback: results are not repeatable

  11. Cyclic coordinate search • Search alternately in each coordinate direction • Perform a single-variable optimization along each direction (line search): minimize f(x + αei) over the step size α • Directions are fixed: can lead to slow convergence
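
  A Python sketch of one possible implementation, using SciPy’s scalar minimizer for the line searches (the Rosenbrock test function and the cycle count are illustrative):

      import numpy as np
      from scipy.optimize import minimize_scalar

      def f(x):                              # Rosenbrock, optimum at (1, 1)
          return 100 * (x[1] - x[0]**2)**2 + (1 - x[0])**2

      x = np.array([-1.2, 1.0])
      for cycle in range(50):
          for i in range(len(x)):            # line search along each axis e_i
              e = np.eye(len(x))[i]
              a = minimize_scalar(lambda t: f(x + t * e)).x
              x = x + a * e
      print(x)                               # approaches (1, 1) only slowly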

  12. Powell’s Conjugate Directions method • Adjusting the search directions improves convergence • Idea: after the steps of cycle i, replace the first direction with the combined direction of that cycle to form the directions for cycle i+1 • Guaranteed to converge in n cycles for quadratic functions! (theoretically)
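
  A simplified Python sketch of the idea (it omits Powell’s safeguards such as direction-dropping rules; the test function and cycle count are illustrative):

      import numpy as np
      from scipy.optimize import minimize_scalar

      def f(x):                              # Rosenbrock, optimum at (1, 1)
          return 100 * (x[1] - x[0]**2)**2 + (1 - x[0])**2

      def line_search(x, d):
          a = minimize_scalar(lambda t: f(x + t * d)).x
          return x + a * d

      x = np.array([-1.2, 1.0])
      D = list(np.eye(2))                    # start from the coordinate directions
      for cycle in range(30):
          x0 = x.copy()
          for d in D:                        # steps in cycle i
              x = line_search(x, d)
          new_d = x - x0                     # combined direction of the cycle
          if np.linalg.norm(new_d) > 1e-12:
              new_d /= np.linalg.norm(new_d)
              D = D[1:] + [new_d]            # replace the first direction
              x = line_search(x, new_d)      # search along the new direction
      print(x)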

  13. Nelder and Mead Simplex method • Simplex: figure of n + 1 points in R^n (e.g. a triangle with function values f = 10, f = 7, f = 5 at its vertices) • Gradually move toward the minimum by reflection of the worst point • For better performance: expansion/contraction and other tricks
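
  The method, including the expansion/contraction logic, is available in SciPy; a minimal usage sketch on Rosenbrock’s function:

      import numpy as np
      from scipy.optimize import minimize

      def f(x):                              # Rosenbrock, optimum at (1, 1)
          return 100 * (x[1] - x[0]**2)**2 + (1 - x[0])**2

      res = minimize(f, x0=np.array([-1.2, 1.0]), method="Nelder-Mead")
      print(res.x)                           # close to (1, 1)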

  14. Biologically inspired methods • Popular: inspiration for algorithms from biological processes: • Genetic algorithms / evolutionary optimization • Particle swarms / flocks • Ant colony methods • Typically make use of population (collection of designs) • Computationally intensive • Stochastic nature, global optimization properties

  15. Genetic algorithms • Based on Darwin’s theory of evolution: survival of the fittest • Objective = fitness function • Designs are encoded in chromosomal strings, ~ genes, e.g. the binary string 1 1 0 1 0 0 1 0 1 1 0 0 1 0 1, with segments encoding x1 and x2

  16. GA flowchart • Create initial population • Evaluate fitness of all individuals • Test termination criteria: quit if met • Select individuals for reproduction • Create new population (reproduction, crossover, mutation) • Repeat from the fitness evaluation

  17. GA population operators • Reproduction: • Exact copy/copies of individual • Crossover: • Randomly exchange genes of different parents • Many possibilities: how many genes, parents, children … • Mutation: • Randomly flip some bits of a gene string • Used sparingly, but important to explore new designs

  18. Population operators • Crossover (single-point, after the fourth bit): Parent 1: 1 1 0 1 | 0 0 1 0 1 1 0 0 1 0 1 and Parent 2: 0 1 1 0 | 1 0 0 1 0 1 1 0 0 0 1 give Child 1: 0 1 1 0 0 0 1 0 1 1 0 0 1 0 1 and Child 2: 1 1 0 1 1 0 0 1 0 1 1 0 0 0 1 • Mutation (one random bit flipped): 1 1 0 1 0 0 1 0 1 1 0 0 1 0 1 → 1 1 0 1 0 1 1 0 1 1 0 0 1 0 1
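
  A Python sketch of the two operators on 15-bit strings as in the example above (the crossover point and the mutation rate are illustrative parameters):

      import numpy as np

      rng = np.random.default_rng(2)

      def crossover(p1, p2, point=4):
          # Single-point crossover: exchange everything after `point`.
          c1 = np.concatenate([p2[:point], p1[point:]])
          c2 = np.concatenate([p1[:point], p2[point:]])
          return c1, c2

      def mutate(gene, rate=0.05):
          # Randomly flip bits with a small probability (used sparingly).
          flips = rng.random(len(gene)) < rate
          return np.where(flips, 1 - gene, gene)

      p1 = np.array([1,1,0,1,0,0,1,0,1,1,0,0,1,0,1])   # parent 1 from the slide
      p2 = np.array([0,1,1,0,1,0,0,1,0,1,1,0,0,0,1])   # parent 2 from the slide
      c1, c2 = crossover(p1, p2)
      print(c1, c2, mutate(p1))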

  19. Particle swarms / flocks • No genes and reproduction, but a population that travels through the design space • Derived from simulations of flocks/schools in nature • Individuals tend to follow the individual with the best fitness value, but also determine their own path • Some randomness is added to give exploration properties (“craziness parameter”)

  20. PSO algorithm • Initialize locations and speeds of the individuals (randomly) • Evaluate fitness • Update best scores: individual (y) and overall (Y) • Update velocity and position: v ← c0 v + c1 r1 (y − x) + c2 r2 (Y − x) and x ← x + v, where r1, r2 are random numbers between 0 and 1, and c1, c2 control “individual behavior” vs “social behavior”
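
  A Python sketch of the update loop (the coefficients c0, c1, c2, the swarm size and the fitness function are illustrative assumptions):

      import numpy as np

      rng = np.random.default_rng(3)

      def fitness(x):                        # minimize; optimum at (1, 3)
          return np.sum((x - np.array([1.0, 3.0]))**2, axis=1)

      n, dim = 20, 2
      x = rng.uniform(-5, 5, (n, dim))       # random initial locations
      v = rng.uniform(-1, 1, (n, dim))       # random initial speeds
      y = x.copy()                           # individual best positions
      Y = x[np.argmin(fitness(x))].copy()    # overall best position
      c0, c1, c2 = 0.7, 1.5, 1.5             # c1 vs c2: individual vs social behavior
      for it in range(100):
          r1, r2 = rng.random((n, dim)), rng.random((n, dim))  # in [0, 1)
          v = c0 * v + c1 * r1 * (y - x) + c2 * r2 * (Y - x)   # velocity update
          x = x + v                                            # position update
          better = fitness(x) < fitness(y)
          y[better] = x[better]              # update individual best scores
          Y = y[np.argmin(fitness(y))].copy()  # update overall best
      print(Y)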

  21. Summary 0th order methods • Nelder-Mead beats Powell in most cases • Robust: most can deal with discontinuity etc. • Less attractive for many design variables (>10) • Stochastic techniques: • Computationally expensive, but • Global optimization properties • Versatile • Population-based algorithms benefit from parallel computing

  22. Unconstrained optimization algorithms • Single-variable methods • Multiple-variable methods: 0th order, 1st order, 2nd order

  23. Steepest descent method • Move in the direction of the largest decrease in f: from the Taylor expansion f(x + d) ≈ f(x) + ∇f^T d, the best direction is d = −∇f • Example (contour plot over x1, x2 with −∇f directions at points where f = 7.2, f = 1.9, f = 0.044): with a fixed step size, divergence occurs! Remedy: line search
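
  A Python sketch of steepest descent with an exact line search at every step (the badly scaled quadratic test function and the tolerance are illustrative):

      import numpy as np
      from scipy.optimize import minimize_scalar

      A = np.array([[10.0, 0.0],             # badly scaled quadratic: f = 0.5 x^T A x
                    [ 0.0, 1.0]])
      def f(x):    return 0.5 * x @ A @ x
      def grad(x): return A @ x

      x = np.array([1.0, 10.0])
      while np.linalg.norm(grad(x)) > 1e-8:
          d = -grad(x)                       # steepest descent direction
          a = minimize_scalar(lambda t: f(x + t * d)).x   # line search
          x = x + a * d                      # zig-zags toward the optimum (0, 0)
      print(x)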

  24. Steepest descent convergence • Zig-zag convergence behavior (see contour plot)

  25. Effect of scaling • Scaling the variables (x1, x2) → (y1, y2) helps a lot! • Ideal scaling is hard to determine (requires Hessian information)

  26. Fletcher-Reeves conjugate gradient method • Based on building a set of conjugate directions, combined with line searches • Conjugate directions: di^T A dj = 0 for i ≠ j, where A is the Hessian of a quadratic function • Conjugate directions: guaranteed convergence in N steps for quadratic problems (recall Powell: N cycles of N line searches)

  27. Fletcher-Reeves conjugate gradient method • Set of N conjugate directions: di^T A dj = 0 for i ≠ j (special case: orthogonal directions, the eigenvectors of A) • Property: searching along conjugate directions yields the optimum of a quadratic function in N steps (or less) • Optimality: ∇f = 0

  28. Conjugate directions • Find the conjugate coordinates bi: write any point as x = x1 + Σi bi di in the basis of conjugate directions • The optimization process with a line search along each di determines these coordinates one at a time

  29. Conjugate directions (2) • Optimization by line searches along conjugate directions will converge in N steps (or less): each line search fixes one conjugate coordinate, by the definition of conjugacy

  30. But … how to obtain conjugate directions? • How to generate conjugate directions with only gradient information? • Start with the steepest descent direction, d1 = −∇f1, and perform a line search along it (contours f = c, f = c+1; at the resulting point, ∇f2 is orthogonal to d1)

  31. Conjugate directions (3) • Condition for a conjugate direction: d2^T A d1 = 0 • But, in general, A is unknown! • Remedy: eliminate A using the line search result, ∇f2^T d1 = 0, and the gradients: ∇f2 − ∇f1 = A(x2 − x1) = α1 A d1

  32. Eliminating A (cont.) • Result: d2 = −∇f2 + β1 d1, with β1 = ∇f2^T(∇f2 − ∇f1) / (∇f1^T ∇f1)

  33. • Fletcher-Reeves: starting from the Polak-Ribière version, β1 = ∇f2^T(∇f2 − ∇f1) / (∇f1^T ∇f1) • Now use ∇f2^T ∇f1 = 0 • Why that last step? Because successive gradients are orthogonal for a quadratic function with exact line searches, the numerator simplifies, giving β1 = ∇f2^T ∇f2 / (∇f1^T ∇f1)

  34. Three CG variants • For general non-quadratic problems, three variants exist that are equivalent in the quadratic case: • Hestenes-Stiefel: β = ∇f2^T(∇f2 − ∇f1) / (d1^T(∇f2 − ∇f1)) • Polak-Ribière: β = ∇f2^T(∇f2 − ∇f1) / (∇f1^T ∇f1), generally best in most cases • Fletcher-Reeves: β = ∇f2^T ∇f2 / (∇f1^T ∇f1)

  35. CG practical • Start with an arbitrary x1 • Set the first search direction: d1 = −∇f1 • Line search to find the next point: x2 = x1 + α1 d1 • Next search direction: d2 = −∇f2 + β1 d1 • Repeat from step 3 • Restart every (n+1) steps, using step 2
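
  A Python sketch of this recipe with the Fletcher-Reeves β (the test function, iteration cap and tolerance are illustrative):

      import numpy as np
      from scipy.optimize import minimize_scalar

      def f(x):                              # Rosenbrock, optimum at (1, 1)
          return 100 * (x[1] - x[0]**2)**2 + (1 - x[0])**2
      def grad(x):
          return np.array([-400 * x[0] * (x[1] - x[0]**2) - 2 * (1 - x[0]),
                            200 * (x[1] - x[0]**2)])

      x = np.array([-1.2, 1.0])              # step 1: arbitrary x1
      g = grad(x); d = -g                    # step 2: steepest descent direction
      n = len(x)
      for k in range(200):
          a = minimize_scalar(lambda t: f(x + t * d)).x   # step 3: line search
          x = x + a * d
          g_new = grad(x)
          if np.linalg.norm(g_new) < 1e-8:
              break
          if (k + 1) % (n + 1) == 0:
              d = -g_new                     # restart every n+1 steps (step 2)
          else:
              beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves
              d = -g_new + beta * d          # step 4: next search direction
          g = g_new
      print(x)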

  36. CG properties • Theoretically converges in N steps or less for quadratic functions • In practice, non-quadratic functions, finite line search accuracy and round-off errors lead to slower convergence (> N steps) • After N steps / bad convergence: restart procedure, etc.

  37. Application to mechanics (FE) • Structural mechanics: the potential energy f(u) = ½ u^T K u − u^T p is a quadratic function! • Equilibrium: K u = p • CG with an exact line search needs only matrix-vector products K d: simple operations on element level. Attractive for large N!
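
  A Python sketch of linear CG applied to Ku = p (the 3×3 SPD “stiffness” matrix is an illustrative stand-in for an FE system; for a quadratic energy the exact line-search step has the closed form used below):

      import numpy as np

      K = np.array([[ 4.0, -1.0,  0.0],      # illustrative SPD "stiffness" matrix
                    [-1.0,  4.0, -1.0],
                    [ 0.0, -1.0,  4.0]])
      p = np.array([1.0, 2.0, 3.0])          # load vector

      u = np.zeros(3)
      r = p - K @ u                          # residual = -gradient of the energy
      d = r.copy()
      for k in range(len(p)):                # at most N steps for a quadratic
          Kd = K @ d                         # only matrix-vector products needed
          a = (r @ r) / (d @ Kd)             # exact line-search step length
          u = u + a * d
          r_new = r - a * Kd
          if np.linalg.norm(r_new) < 1e-12:
              break
          beta = (r_new @ r_new) / (r @ r)   # Fletcher-Reeves in residual form
          d = r_new + beta * d
          r = r_new
      print(u, np.allclose(K @ u, p))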
