
Optimization methods


Presentation Transcript


  1. Optimization methods Aleksey Minin Saint-Petersburg State University Student of ACOPhys master program (10th semester) Joint Advanced Students School Applied and COmputational Physics

  2. What is optimization?

  3. Content: • Applications of optimization • Global Optimization • Local Optimization • Discrete optimization • Constrained optimization • Real application: Bounded Derivative Network

  4. Applications of optimization • Advanced engineering design • Biotechnology • Data analysis • Environmental management • Financial planning • Process control • Scientific modeling, etc.

  5. Global or local?

  6. What is global optimization? • The objective of global optimization is to find the globally best solution of (possibly nonlinear) models, in the (possible or known) presence of multiple local optima.


  8. Branch and bound

  9. Branch and bound. Scientists are ready to carry out some experiments, but the quality of each experiment varies with its type, according to the following table:

  10. Branch and bound (search-tree diagram): from the root _ _ _ _ there are 4 possibilities for the first position, e.g. A _ _ _.

  11. Branch and bound (tree diagram: root node AAAA, value 0.55).

  12. Branch and bound, expanding Type 1: A → ADCC (0.42), B → BAAA (0.42), C → CAAA (0.52), D → DAAA (0.45); root AAAA (0.55).

  13. Branch and bound, expanding Type 2 under C: A → CABD (0.38), B → CBAA (0.39), D → CDAA (0.45); current best CABD (0.38).

  14. Branch and bound, expanding Type 3: A → CBAD (0.37), B → CDBA (0.40); current best CABD (0.38).

  15. Branch and bound, pruned tree: candidate bests CABD (0.38) and CDBA (0.40).

  16. Branch and bound
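
The assignment example above can be sketched in code. The slide's quality table did not survive the transcript, so the cost matrix below is a hypothetical stand-in, and the sketch assumes each experiment type is used exactly once; the essential idea is the bounding rule, which prunes a branch as soon as its optimistic bound cannot beat the incumbent solution.

```python
# Branch and bound for assigning experiment types A-D to 4 slots.
# NOTE: the slide's quality table is not in the transcript; this cost
# matrix is a hypothetical stand-in (COST[slot][type], lower is better).
COST = [
    [0.14, 0.11, 0.09, 0.12],  # slot 1: types A, B, C, D
    [0.13, 0.10, 0.12, 0.11],  # slot 2
    [0.12, 0.14, 0.10, 0.13],  # slot 3
    [0.16, 0.12, 0.11, 0.10],  # slot 4
]
TYPES = "ABCD"

def branch_and_bound():
    """Assign a distinct type to each slot, minimizing total cost."""
    best = {"cost": float("inf"), "plan": None}

    def lower_bound(slot, used, cost_so_far):
        # Optimistic bound: cheapest still-available type for each open slot.
        b = cost_so_far
        for s in range(slot, len(COST)):
            b += min(c for t, c in enumerate(COST[s]) if t not in used)
        return b

    def expand(slot, used, plan, cost_so_far):
        if slot == len(COST):
            if cost_so_far < best["cost"]:
                best["cost"], best["plan"] = cost_so_far, "".join(plan)
            return
        for t in range(len(TYPES)):
            if t in used:
                continue
            c = cost_so_far + COST[slot][t]
            # Prune subtrees whose bound cannot beat the incumbent.
            if lower_bound(slot + 1, used | {t}, c) >= best["cost"]:
                continue
            expand(slot + 1, used | {t}, plan + [TYPES[t]], c)

    expand(0, frozenset(), [], 0.0)
    return best["plan"], best["cost"]
```

The search still visits the whole tree in the worst case, but on typical data the bound eliminates most of it, exactly as the slides' pruned tree suggests.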

  17. Evolutionary algorithms

  18. Evolutionary algorithms
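
The evolutionary-algorithm slides are images in the original. As a minimal illustration of the idea (selection, crossover, mutation), here is a sketch of a genetic algorithm on the standard OneMax toy problem; all parameter values are illustrative assumptions, not taken from the slides.

```python
import random

def genetic_algorithm(n_bits=20, pop_size=40, generations=60, seed=0):
    """Minimal GA maximizing the number of 1-bits (the OneMax toy problem)."""
    rng = random.Random(seed)
    fitness = lambda ind: sum(ind)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        nxt = []
        while len(nxt) < pop_size:
            # Tournament selection of two parents
            a = max(rng.sample(pop, 3), key=fitness)
            b = max(rng.sample(pop, 3), key=fitness)
            cut = rng.randrange(1, n_bits)        # one-point crossover
            child = a[:cut] + b[cut:]
            for i in range(n_bits):               # bit-flip mutation
                if rng.random() < 1.0 / n_bits:
                    child[i] ^= 1
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)
```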

  19. Simulated annealing (flowchart): apply a small perturbation and repeat until a good solution is found; once T = 0, stop: solution found.

  20. Simulated annealing results

  21. Simulated annealing
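
The annealing loop from slide 19 can be sketched as follows. The test function `f` and the schedule parameters (initial temperature, cooling rate, step size) are illustrative assumptions, not taken from the slides.

```python
import math
import random

def simulated_annealing(f, x0, t0=3.0, cooling=0.995, steps=5000, seed=1):
    """Minimize f: accept uphill moves with probability exp(-dE/T), then cool T."""
    rng = random.Random(seed)
    x, fx, t = x0, f(x0), t0
    best_x, best_f = x, fx
    for _ in range(steps):
        x_new = x + rng.gauss(0.0, 0.5)        # small random perturbation
        f_new = f(x_new)
        d = f_new - fx
        if d <= 0 or rng.random() < math.exp(-d / max(t, 1e-12)):
            x, fx = x_new, f_new               # accept (always if downhill)
            if fx < best_f:
                best_x, best_f = x, fx
        t *= cooling                           # geometric cooling schedule
    return best_x, best_f

# Hypothetical multimodal test function; global minimum f(0) = 0
f = lambda x: x * x + 3.0 * math.sin(5.0 * x) ** 2
```

At high temperature the walker crosses the sin² barriers freely; as T shrinks it settles into the deepest basin it has found.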

  22. Tree annealing, developed by Bilbro and Snyder [1991]

  23. Tree annealing, developed by Bilbro and Snyder [1991]

  24. Swarm intelligence
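
The slide does not say which swarm method is meant; a common representative is particle swarm optimization (PSO), sketched below with standard (assumed) inertia and attraction weights.

```python
import random

def pso(f, dim=2, n_particles=20, iters=200, seed=2):
    """Global-best particle swarm: each particle is pulled toward its own
    best position and the swarm's best position."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-5.0, 5.0) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_f = [f(p) for p in pos]
    g_i = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g_i][:], pbest_f[g_i]
    w, c1, c2 = 0.7, 1.5, 1.5          # inertia, cognitive, social weights
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            fi = f(pos[i])
            if fi < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], fi
                if fi < gbest_f:
                    gbest, gbest_f = pos[i][:], fi
    return gbest, gbest_f
```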

  25. Tabu Search

  26. Tabu search implementation (graph diagram: node 1). Tabu list: empty

  27. Tabu search implementation (graph diagram: nodes 1–5). Tabu list: empty

  28. Tabu search implementation (graph diagram: nodes 1–5). Tabu list: 1

  29. Tabu search implementation (graph diagram: nodes 1–7). Tabu list: 1, 3

  30. Tabu search implementation (graph diagram: nodes 1–9). Tabu list: 1, 3, 6

  31. Tabu search implementation (graph diagram: nodes 1–11). Tabu list: 1, 3, 6, 9

  32. Tabu search implementation (graph diagram: nodes 1–11). Tabu list: 1, 3, 6, 9

  33. Tabu search implementation (graph diagram: nodes 1–11). Tabu list: 1, 3, 9, 6

  34. Tabu search implementation (graph diagram: nodes 1–11). Tabu list: 1, 3, 9, 6

  35. Tabu Search
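
The graph walkthrough on slides 26–34 can be condensed into code. This sketch searches over bit vectors rather than the slides' graph, but it shows the same machinery: a short-term tabu list of recent moves plus an aspiration rule that overrides the list when a move would beat the best solution so far. The objective function and tabu tenure are illustrative assumptions.

```python
from collections import deque

def tabu_search(f, x0, n_iters=100, tenure=5):
    """Tabu search over bit vectors: take the best non-tabu single-bit
    flip each step, even if it worsens the current solution."""
    x = list(x0)
    best, best_f = x[:], f(x)
    tabu = deque(maxlen=tenure)            # recently flipped positions
    for _ in range(n_iters):
        candidates = []
        for i in range(len(x)):
            y = x[:]
            y[i] ^= 1
            fy = f(y)
            # Aspiration: a tabu move is allowed if it beats the best so far.
            if i not in tabu or fy < best_f:
                candidates.append((fy, i, y))
        if not candidates:
            break
        fy, i, y = min(candidates)         # best admissible neighbor
        x = y
        tabu.append(i)                     # oldest entry falls off the deque
        if fy < best_f:
            best, best_f = x[:], fy
    return best, best_f
```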

  36. What is local optimization? • The term LOCAL refers both to the fact that only information about the function from the neighborhood of the current approximation is used in updating the approximation, and to the expectation that such methods converge to whichever local extremum is closest to the starting approximation. • The global structure of the objective function is unknown to a local method.

  37. Local optimization

  38. Gradient descent

  39. Gradient descent. Therefore we obtain a decreasing sequence: F(x0) > F(x1) > … > F(xn)
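
The update rule behind slides 38–39 can be sketched directly; the example objective F and the step size are illustrative assumptions.

```python
def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Iterate x_{k+1} = x_k - lr * grad(x_k), decreasing F at each step
    for a suitable step size lr."""
    x = list(x0)
    for _ in range(steps):
        g = grad(x)
        x = [xi - lr * gi for xi, gi in zip(x, g)]
    return x

# Assumed example: F(x, y) = (x - 1)^2 + 2*(y + 2)^2, minimum at (1, -2)
grad_F = lambda x: [2.0 * (x[0] - 1.0), 4.0 * (x[1] + 2.0)]
```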

  40. Quasi-Newton Methods • These methods build up curvature information at each iteration to formulate a quadratic model problem of the form min_x (1/2)·x^T·H·x + c^T·x + b, where the Hessian matrix H is a positive definite symmetric matrix, c is a constant vector, and b is a constant. • The optimal solution of this problem occurs where the partial derivatives with respect to x vanish: H·x* + c = 0, i.e. x* = −H^(−1)·c.

  41. Quasi-Newton Methods

  42. BFGS algorithm

  43. BFGS algorithm
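
The BFGS formulas on slides 42–43 are images in the transcript. Below is a sketch of the standard BFGS inverse-Hessian update combined with an Armijo backtracking line search, written for two dimensions so the matrix algebra stays explicit and no linear-algebra library is needed; the test objective is an assumed quadratic.

```python
def bfgs(f, grad, x0, iters=100):
    """BFGS in 2-D with explicit 2x2 matrix algebra (no numpy).
    H approximates the inverse Hessian; an Armijo backtracking line
    search keeps every step a descent step."""
    I = [[1.0, 0.0], [0.0, 1.0]]
    dot = lambda a, b: a[0] * b[0] + a[1] * b[1]
    mv = lambda M, v: [M[0][0] * v[0] + M[0][1] * v[1],
                       M[1][0] * v[0] + M[1][1] * v[1]]
    outer = lambda a, b: [[a[0] * b[0], a[0] * b[1]],
                          [a[1] * b[0], a[1] * b[1]]]
    mm = lambda A, B: [[sum(A[i][k] * B[k][j] for k in range(2))
                        for j in range(2)] for i in range(2)]
    msub = lambda A, B: [[A[i][j] - B[i][j] for j in range(2)] for i in range(2)]
    madd = lambda A, B: [[A[i][j] + B[i][j] for j in range(2)] for i in range(2)]
    smul = lambda s, A: [[s * A[i][j] for j in range(2)] for i in range(2)]

    x, H = list(x0), [row[:] for row in I]
    g = grad(x)
    for _ in range(iters):
        p = [-v for v in mv(H, g)]               # quasi-Newton direction
        alpha, fx, slope = 1.0, f(x), dot(g, p)
        while f([x[0] + alpha * p[0], x[1] + alpha * p[1]]) > fx + 1e-4 * alpha * slope:
            alpha *= 0.5                         # Armijo backtracking
            if alpha < 1e-12:
                return x
        x_new = [x[0] + alpha * p[0], x[1] + alpha * p[1]]
        g_new = grad(x_new)
        s = [x_new[0] - x[0], x_new[1] - x[1]]
        y = [g_new[0] - g[0], g_new[1] - g[1]]
        sy = dot(s, y)
        if sy > 1e-12:                           # curvature condition holds
            rho = 1.0 / sy
            A = msub(I, smul(rho, outer(s, y)))
            B = msub(I, smul(rho, outer(y, s)))
            # H <- (I - rho*s*y^T) H (I - rho*y*s^T) + rho*s*s^T
            H = madd(mm(mm(A, H), B), smul(rho, outer(s, s)))
        x, g = x_new, g_new
    return x
```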

  44. Gauss-Newton algorithm. Given m functions f1, f2, …, fm of n parameters p1, p2, …, pn (m > n), we want to minimize the sum of squares S(p) = f1(p)^2 + f2(p)^2 + … + fm(p)^2.

  45. Gauss-Newton algorithm
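
The Gauss-Newton iteration of slides 44–45 can be sketched for a concrete model. The model y = p0·exp(p1·x), the data, and the starting point are all illustrative assumptions; each step solves the 2x2 normal equations (JᵀJ)δ = −Jᵀr for the parameter update.

```python
import math

def gauss_newton(xs, ys, p, iters=20):
    """Fit the assumed model y = p0 * exp(p1 * x) by Gauss-Newton:
    at each step solve the 2x2 normal equations (J^T J) d = -J^T r."""
    p0, p1 = p
    for _ in range(iters):
        a = b = d2 = g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            e = math.exp(p1 * x)
            j0, j1 = e, p0 * x * e        # Jacobian row of residual r = p0*e - y
            r = p0 * e - y
            a += j0 * j0
            b += j0 * j1
            d2 += j1 * j1
            g0 += j0 * r
            g1 += j1 * r
        det = a * d2 - b * b
        if abs(det) < 1e-12:
            break
        # Cramer's rule on [[a, b], [b, d2]] d = [-g0, -g1]
        d0 = (-g0 * d2 + b * g1) / det
        d1 = (b * g0 - a * g1) / det
        p0, p1 = p0 + d0, p1 + d1
    return p0, p1
```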

  46. Levenberg-Marquardt. This is an iterative procedure; the initial guess is p^T = (1, 1, …, 1).

  47. Levenberg-Marquardt
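
Levenberg-Marquardt damps the Gauss-Newton normal equations with a parameter λ that grows when a step fails and shrinks when a step succeeds, interpolating between gradient descent and Gauss-Newton. The exponential model and data below are the same illustrative assumptions as above; the initial guess pᵀ = (1, 1) follows slide 46.

```python
import math

def levenberg_marquardt(xs, ys, p, iters=50, lam=1.0):
    """Fit the assumed model y = p0 * exp(p1 * x); each step solves the
    damped normal equations (J^T J + lam*I) d = -J^T r."""
    p0, p1 = p
    sse = lambda q0, q1: sum((q0 * math.exp(q1 * x) - y) ** 2
                             for x, y in zip(xs, ys))
    err = sse(p0, p1)
    for _ in range(iters):
        a = b = d2 = g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            e = math.exp(p1 * x)
            j0, j1 = e, p0 * x * e
            r = p0 * e - y
            a += j0 * j0
            b += j0 * j1
            d2 += j1 * j1
            g0 += j0 * r
            g1 += j1 * r
        A, D = a + lam, d2 + lam           # damped diagonal
        det = A * D - b * b
        d0 = (-g0 * D + b * g1) / det
        d1 = (b * g0 - A * g1) / det
        trial = sse(p0 + d0, p1 + d1)
        if trial < err:                    # success: accept, trust model more
            p0, p1, err = p0 + d0, p1 + d1, trial
            lam *= 0.1
        else:                              # failure: reject, damp harder
            lam *= 10.0
    return p0, p1
```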

  48. SQP – constrained minimization: reformulation

  49. SQP – constrained minimization. The principal idea is the formulation of a QP sub-problem based on a quadratic approximation of the Lagrangian function L(x, λ) = f(x) + Σi λi·gi(x).

  50. SQP – constrained minimization: updating the Hessian matrix
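
The core of SQP is the QP sub-problem from slide 49. The sketch below solves one equality-constrained QP step via its KKT system; the objective, constraint, and dimensions (2 variables, 1 constraint) are illustrative assumptions.

```python
def qp_kkt_step(H, g, A, b):
    """One SQP building block: solve the equality-constrained QP
        min_d 0.5 * d^T H d + g^T d   subject to  A d = b
    (2 variables, 1 constraint) via its 3x3 KKT system
        [[H, A^T], [A, 0]] [d, mu] = [-g, b]."""
    K = [[H[0][0], H[0][1], A[0]],
         [H[1][0], H[1][1], A[1]],
         [A[0],    A[1],    0.0]]
    rhs = [-g[0], -g[1], b]
    n = 3
    # Tiny Gaussian elimination with partial pivoting
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(K[r][col]))
        K[col], K[piv] = K[piv], K[col]
        rhs[col], rhs[piv] = rhs[piv], rhs[col]
        for row in range(col + 1, n):
            fct = K[row][col] / K[col][col]
            for c in range(col, n):
                K[row][c] -= fct * K[col][c]
            rhs[row] -= fct * rhs[col]
    sol = [0.0] * n
    for row in range(n - 1, -1, -1):
        sol[row] = (rhs[row] - sum(K[row][c] * sol[c]
                                   for c in range(row + 1, n))) / K[row][row]
    return sol[:2], sol[2]                 # step d and multiplier mu
```

In a full SQP loop, H would be a quasi-Newton approximation of the Lagrangian's Hessian (updated as slide 50 indicates), g the objective gradient at the current iterate, and A, b the linearized constraints.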
