
Optimization methods Review


Presentation Transcript


  1. Optimization methods Review Mateusz Sztangret, Faculty of Metal Engineering and Industrial Computer Science, Department of Applied Computer Science and Modelling, Krakow, 03-11-2010

  2. Outline of the presentation Basic concepts of optimization Review of optimization methods • gradientless methods, • gradient methods, • linear programming methods, • non-deterministic methods Characteristics of selected methods • method of steepest descent • genetic algorithm

  3. Basic concepts of optimization "Man's longing for perfection finds expression in the theory of optimization. It studies how to describe and attain what is Best, once one knows how to measure and alter what is Good and Bad… Optimization theory encompasses the quantitative study of optima and methods for finding them." (Beightler, Phillips, Wilde, Foundations of Optimization)

  4. Basic concepts of optimization Optimization /optimum/ - the process of finding the best solution. Usually the aim of optimization is to find a better solution than the one obtained previously.

  5. Basic concepts of optimization Specification of the optimization problem: • definition of the objective function, • selection of optimization variables, • identification of constraints.

  6. Mathematical definition where: • x is the vector of variables, also called unknowns or parameters; • f is the objective function, a (scalar) function of x that we want to maximize or minimize; • g_i and h_i are the constraint functions, scalar functions of x that define certain equations and inequalities that the unknown vector x must satisfy.
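The formula itself did not survive the transcript; in the standard form matching the definitions above (a reconstruction, assuming g_i are the inequality and h_i the equality constraints), the problem reads:

```latex
\min_{x \in \mathbb{R}^n} f(x)
\quad \text{subject to} \quad
g_i(x) \le 0,\; i = 1, \dots, m,
\qquad
h_i(x) = 0,\; i = 1, \dots, p.
```

A maximization problem fits the same form after replacing f with -f.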

  7. Set of allowed solutions The constraint functions define the set of allowed solutions Xd, a subset of X, that is, the set of points which we consider in the optimization process.

  8. Obtained solution A solution x* is called a global minimum if f(x*) ≤ f(x) for all x in the set of allowed solutions. A solution x* is called a local minimum if there is a neighbourhood N of x* such that f(x*) ≤ f(x) for all x in N. Neither the global nor a local minimum is ever exact, due to the limited accuracy of numerical methods and round-off error.

  9. Local and global solutions [Figure: plot of f(x) with a local minimum and the global minimum marked]

  10. Problems with multimodal objective function [Figure: multimodal f(x); two different starting points lead to different minima]

  11. Discontinuous objective function [Figure: plot of a discontinuous objective function f(x)]

  12. Minimum or maximum Every maximization problem can be restated as a minimization one, since max f(x) = -min(-f(x)). [Figure: plots of f, -f and c - f with the optimum x* marked]

  13. General optimization flowchart Start: set the starting point x(0), i = 0. Calculate f(x(i)). If the stop condition is not satisfied, set x(i+1) = x(i) + Δx(i), increase i by 1, and repeat; otherwise stop.
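The flowchart's loop can be sketched in code. This is a minimal illustration, not from the slides; the `step` function and the example update rule are assumptions standing in for whichever method supplies Δx(i):

```python
def optimize(f, step, x0, max_iter=100, tol=1e-8):
    """Generic iterative scheme from the flowchart: x(i+1) = x(i) + dx(i)."""
    x = x0
    for i in range(max_iter):
        dx = step(f, x)           # method-specific update dx(i)
        if abs(dx) < tol:         # stop condition: lack of progress
            return x, i
        x = x + dx                # x(i+1) = x(i) + dx(i)
    return x, max_iter            # stop condition: iteration limit reached

# usage: minimize f(x) = (x - 2)^2 with a fixed-step gradient update
x_min, iters = optimize(lambda x: (x - 2) ** 2,
                        step=lambda f, x: -0.1 * 2 * (x - 2),  # -eta * f'(x)
                        x0=0.0)
```

Any concrete method from the later slides (steepest descent, a line search, even a genetic algorithm) is an instance of this loop with its own update and stop condition.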

  14. Stop conditions Commonly used stop conditions are as follows: • a sufficiently good solution has been obtained, • lack of progress, • the maximum number of iterations has been reached.

  15. Classification of optimization methods

  16. Optimization methods There are several types of optimization algorithms: • gradientless methods (line search methods and multidimensional methods), • gradient methods, • linear programming methods, • non-deterministic methods.

  17. Gradientless methods • Line search methods • Expansion method • Golden ratio method • Fibonacci method • Method based on Lagrange interpolation • Multidimensional methods • Hooke-Jeeves method • Rosenbrock method • Nelder-Mead simplex method • Powell method

  18. Features of gradientless methods Advantages: • simplicity, • they do not require computing derivatives of the objective function. Disadvantages: • they stop at the first minimum found, • they require unimodality and continuity of the objective function.

  19. Gradient methods • Method of steepest descent • Conjugate gradients method • Newton method • Davidon-Fletcher-Powell method • Broyden-Fletcher-Goldfarb-Shanno method

  20. Features of gradient methods Advantages: • simplicity, • greater efficiency in comparison with gradientless methods. Disadvantages: • they stop at the first minimum found, • they require unimodality, continuity and differentiability of the objective function.

  21. Linear programming If both the objective function and the constraints are linear, we can use one of the linear programming methods: • Graphical method • Simplex method

  22. Non-deterministic methods • Monte Carlo method • Genetic algorithms • Evolutionary algorithms • strategy (1 + 1) • strategy (μ + λ) • strategy (μ, λ) • Particle swarm optimization • Simulated annealing method • Ant colony optimization • Artificial immune system

  23. Features of non-deterministic methods Advantages: • the objective function may be of any nature, • they do not require computing derivatives of the objective function. Disadvantages: • a large number of objective function calls.

  24. Optimization with constraints Ways of incorporating constraints: • External penalty function method • Internal penalty function method
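The external penalty idea can be sketched briefly. This is an illustration under assumed names and an assumed toy problem (minimize x^2 subject to x ≥ 1), not the slides' material; a crude grid search stands in for any optimizer:

```python
def penalized(f, gs, rho):
    """External penalty: the objective plus rho * sum(max(0, g_i(x))^2),
    which is zero inside the allowed set and grows outside it."""
    return lambda x: f(x) + rho * sum(max(0.0, g(x)) ** 2 for g in gs)

# minimize f(x) = x^2 subject to x >= 1, written as g(x) = 1 - x <= 0
f = lambda x: x ** 2
g = lambda x: 1.0 - x

grid = [i / 1000.0 for i in range(-2000, 2001)]
x_best = None
for rho in (1.0, 10.0, 1000.0):
    p = penalized(f, [g], rho)
    x_best = min(grid, key=p)   # any unconstrained optimizer could be used here
# as rho grows, the unconstrained minimizer approaches the constrained optimum x* = 1
```

The internal (barrier) variant instead adds a term that blows up as x approaches the boundary from inside, so iterates never leave the allowed set.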

  25. Multicriteria optimization In some cases the solved problem is defined by several objective functions. Usually, when we improve one, the others get worse. • weighted criteria method • ideal point method

  26. Weighted criteria method The method transforms the multicriteria problem into a single-criterion problem by summing the particular objective functions with weights.
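A minimal sketch of the weighted-sum scalarization (the two criteria and the equal weights are illustrative assumptions, not from the slides):

```python
def weighted_sum(objectives, weights):
    """Scalarize several criteria into one: F(x) = sum_k w_k * f_k(x)."""
    return lambda x: sum(w * f(x) for f, w in zip(objectives, weights))

# two conflicting criteria: f1 pulls x toward 0, f2 pulls x toward 2
f1 = lambda x: x ** 2
f2 = lambda x: (x - 2) ** 2
F = weighted_sum([f1, f2], [0.5, 0.5])

grid = [i / 1000.0 for i in range(-1000, 3001)]
x_best = min(grid, key=F)   # with equal weights the compromise lands at x = 1
```

Changing the weights shifts the compromise along the trade-off between the criteria.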

  27. Ideal point method In this method we choose an ideal solution which lies outside the set of allowed solutions, and then search for the optimal solution inside the allowed set as the point closest to the ideal point. The distance can be measured using various metrics. [Figure: ideal point outside the set of allowed solutions]

  28. Method of steepest descent The algorithm consists of the following steps: • Set the input data: • u0 – starting point, • maxit – maximum number of iterations, • e – required accuracy of the solution, • i = 0 – iteration number. • Compute the gradient at ui.

  29. Method of steepest descent • Choose the search direction as the negative gradient direction. • Find the optimal solution along the chosen direction (using any line search method). • If the stop conditions are not satisfied, increase i and go to step 2.

  30. Zigzag effect Let's consider the problem of finding the minimum of the function f(u) = u1^2 + 3*u2^2 with starting point u0 = [-2, 3]. [Figure: isolines of f with the zigzag path of steepest descent]
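The steps of slides 28-30 can be sketched for this very function. This is an illustrative sketch, not the slides' code; the closed-form step length is derived specifically for the quadratic f(u) = u1^2 + 3*u2^2:

```python
def steepest_descent(grad, x0, alpha_of, tol=1e-10, max_iter=1000):
    """Method of steepest descent: search along d = -grad f(x);
    the step length comes from an exact line search supplied by alpha_of."""
    x = list(x0)
    path = [tuple(x)]
    for _ in range(max_iter):
        g = grad(x)
        if sum(gi * gi for gi in g) < tol ** 2:   # stop: gradient nearly zero
            break
        d = [-gi for gi in g]                      # steepest-descent direction
        a = alpha_of(x, d)                         # exact line search
        x = [xi + a * di for xi, di in zip(x, d)]
        path.append(tuple(x))
    return x, path

# f(u) = u1^2 + 3*u2^2, grad f = [2*u1, 6*u2]
grad = lambda u: [2 * u[0], 6 * u[1]]

def alpha_of(u, d):
    # minimizes f(u + a*d) in closed form for this particular quadratic
    return -(u[0] * d[0] + 3 * u[1] * d[1]) / (d[0] ** 2 + 3 * d[1] ** 2)

x, path = steepest_descent(grad, [-2.0, 3.0], alpha_of)
# consecutive search directions are orthogonal, so the recorded path zigzags
# toward the minimum at [0, 0]
```

Plotting `path` over the isolines reproduces the zigzag of the slide: the worse the conditioning of the function (here the 1:3 axis scaling), the more pronounced the effect.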

  31. Genetic algorithm The algorithm consists of the following steps: • Creation of a baseline population. • Computing the fitness of the whole population. • Selection. • Crossing. • Mutation. • If the stop conditions are not satisfied, go to step 2.

  32. Creation of a baseline population
  Genotype    Objective function value (f(x) = x^2)
  10101010    28900
  01010101    7225
  11010100    44944
  10110110    33124
  00101011    1849
  11100100    51984

  33. Selection
  Baseline population: 10101010 01010101 11010100 10110110 00101011 11100100
  Parents' population: 11100100 11010100 11100100 01010101 10110110 10101010

  34. Roulette wheel method [Figure: roulette wheel with sectors proportional to the fitness of the individuals]
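Roulette wheel selection can be sketched as follows. A minimal illustration; the two-individual population and its fitness values are made up, not taken from the slides:

```python
import random

def roulette_select(population, fitness, rng):
    """Spin the wheel: each individual owns a sector proportional to its fitness."""
    r = rng.uniform(0.0, sum(fitness))
    acc = 0.0
    for individual, fit in zip(population, fitness):
        acc += fit
        if r <= acc:
            return individual
    return population[-1]   # guard against floating-point round-off

# with fitness 9 vs 1, the fitter individual is drawn roughly 9 times out of 10
rng = random.Random(0)
draws = [roulette_select(["weak", "fit"], [1.0, 9.0], rng) for _ in range(1000)]
```

This is how the fittest genotype of slide 33 (11100100) ends up in the parents' population more than once while the weakest (00101011) disappears.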

  35. Crossing
  Parent individual no. 1: 1 0 1 0 1
  Parent individual no. 2: 0 1 0 1 0
  A crossing point is chosen; the genes after it are swapped between the parents, producing descendant individuals no. 1 and no. 2.
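One-point crossing amounts to swapping the tails of the two genotypes. The crossing point used below (after the second gene) is an assumption for illustration, not taken from the slide's figure:

```python
def one_point_crossover(parent1, parent2, point):
    """Cut both genotypes at the crossing point and swap the tails."""
    child1 = parent1[:point] + parent2[point:]
    child2 = parent2[:point] + parent1[point:]
    return child1, child2

# the slide's parents, with an assumed crossing point of 2
c1, c2 = one_point_crossover([1, 0, 1, 0, 1], [0, 1, 0, 1, 0], 2)
```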

  36. Mutation Parent individual: 1 0 1 0 1 0 1 0

  37. Mutation For each gene a random number r is drawn and compared with the mutation probability pm; a gene is flipped when r < pm. 1 0 1 0 1 0 1 0 (draws: r>pm, r>pm, r<pm)

  38. Mutation 1 0 0 0 1 0 1 0 (draws: r<pm, r>pm, r>pm, r>pm, r<pm)

  39. Mutation 1 0 0 0 1 0 0 0 (draws: r>pm, r<pm)

  40. Mutation
  Parent individual: 1 0 1 0 1 0 1 0
  Descendant individual: 1 0 0 0 1 0 0 0
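The per-gene rule of slides 36-40 can be sketched directly; which genes flip depends on the random draws, so the slide's exact descendant is not reproduced here, and the pm value below is an illustrative assumption:

```python
import random

def mutate(genotype, pm, rng):
    """For each gene draw r in [0, 1); flip the bit when r < pm."""
    return [1 - gene if rng.random() < pm else gene for gene in genotype]

parent = [1, 0, 1, 0, 1, 0, 1, 0]
child = mutate(parent, pm=0.25, rng=random.Random(7))
# the parent list itself is left unchanged; mutate returns a new genotype
```

With pm = 0 the descendant is a copy of the parent; with pm = 1 every gene flips.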

  41. Genetic algorithm After mutation, the completed individuals are recorded in the descendant population, which becomes the baseline population for the next iteration of the algorithm. If the obtained solution satisfies the stop condition, the procedure terminates; otherwise selection, crossing and mutation are repeated.

  42. Thank you for your attention!
