
Optimization


Presentation Transcript


  1. Optimization Assoc. Prof. Dr. Pelin Gündeş gundesbakir@yahoo.com

  2. Optimization Basic Information • Instructor: Assoc. Professor Pelin Gundes (http://atlas.cc.itu.edu.tr/~gundes/) • E-mail: gundesbakir@yahoo.com • Office Hours: TBD by email appointment • Website: http://atlas.cc.itu.edu.tr/~gundes/teaching/Optimization.htm • Lecture Time: Wednesday 13:00 - 16:00 • Lecture Venue: M 2180

  3. Optimization literature Textbooks: • Nocedal J. and Wright S.J., Numerical Optimization, Springer Series in Operations Research, Springer, 636 pp, 1999. • Spall J.C., Introduction to Stochastic Search and Optimization, Estimation, Simulation and Control, Wiley, 595 pp, 2003. • Chong E.K.P. and Zak S.H., An Introduction to Optimization, Second Edition, John Wiley & Sons, New York, 476 pp, 2001. • Rao S.S., Engineering Optimization - Theory and Practice, John Wiley & Sons, New York, 903 pp, 1996. • Gill P.E., Murray W. and Wright M.H., Practical Optimization, Elsevier, 401 pp., 2004. • Goldberg D.E., Genetic Algorithms in Search, Optimization and Machine Learning, Addison Wesley, Reading, Mass., 1989. • S. Boyd and L. Vandenberghe, Convex Optimization, Cambridge University Press, 2004.(available at http://www.stanford.edu/~boyd/cvxbook/)

  4. Optimization literature Journals: • Engineering Optimization • ASME Journal of Mechanical Design • AIAA Journal • ASCE Journal of Structural Engineering • Computers and Structures • International Journal for Numerical Methods in Engineering • Structural Optimization • Journal of Optimization Theory and Applications • Computers and Operations Research • Operations Research and Management Science

  5. Optimization Course Schedule: • Introduction to Optimization • Classical Optimization Techniques • Linear programming and the Simplex method • Nonlinear programming-One Dimensional Minimization Methods • Nonlinear programming-Unconstrained Optimization Techniques • Nonlinear programming-Constrained Optimization Techniques • Global Optimization Methods-Genetic algorithms • Global Optimization Methods-Simulated Annealing • Global Optimization Methods- Coupled Local Minimizers

  6. Optimization Course Prerequisite: • Familiarity with MATLAB, if you are not familiar with MATLAB, please visit http://www.ece.ust.hk/~palomar/courses/ELEC692Q/lecture%2006%20-%20cvx/matlab_crashcourse.pdf http://www.ece.ust.hk/~palomar/courses/ELEC692Q/lecture%2006%20-%20cvx/official_getting_started.pdf

  7. Optimization • 70% attendance is required! • Grading: Homework: 15%, Mid-term projects: 40%, Final project: 45%

  8. Optimization • There will also be lab sessions for MATLAB exercises!

  9. 1. Introduction • Optimization is the act of obtaining the best result under given circumstances. • Optimization can be defined as the process of finding the conditions that give the maximum or minimum of a function. • The optimum seeking methods are also known as mathematical programming techniques and are generally studied as a part of operations research. • Operations research is a branch of mathematics concerned with the application of scientific methods and techniques to decision making problems and with establishing the best or optimal solutions.

  10. 1. Introduction • Operations research (in the US) or operational research (in the UK), abbreviated OR (yöneylem araştırması in Turkish), is an interdisciplinary branch of mathematics which uses methods such as: • mathematical modeling • statistics • algorithms to arrive at optimal or good decisions in complex problems. These problems are concerned with maximizing quantities (profit, assembly-line speed, crop yield, bandwidth, etc.) or minimizing quantities (cost, loss, risk, etc.) expressed by some objective function. • The eventual intention behind using operations research is to elicit the best possible solution to a problem mathematically, one which improves or optimizes the performance of the system.

  11. 1. Introduction

  12. 1. Introduction Historical development • Isaac Newton (1642-1727) (The development of differential calculus methods of optimization) • Joseph-Louis Lagrange (1736-1813) (Calculus of variations, minimization of functionals, method of optimization for constrained problems) • Augustin-Louis Cauchy (1789-1857) (Solution by direct substitution, steepest descent method for unconstrained optimization)

  13. 1. Introduction Historical development • Leonhard Euler (1707-1783) (Calculus of variations, minimization of functionals) • Gottfried Wilhelm Leibniz (1646-1716) (Differential calculus methods of optimization)

  14. 1. Introduction Historical development • George Bernard Dantzig (1914-2005) (Linear programming and Simplex method (1947)) • Richard Bellman (1920-1984) (Principle of optimality in dynamic programming problems) • Harold William Kuhn (1925-) (Necessary and sufficient conditions for the optimal solution of programming problems, game theory)

  15. 1. Introduction Historical development • Albert William Tucker (1905-1995) (Necessary and sufficient conditions for the optimal solution of programming problems, nonlinear programming, game theory: his PhD student was John Nash) • John von Neumann (1903-1957) (game theory)

  16. 1. Introduction • Mathematical optimization problem: minimize f0(x) subject to gi(x) ≤ 0, i = 1,…,m • f0 : Rn → R: objective function • x = (x1,…,xn): design variables (unknowns of the problem, they must be linearly independent) • gi : Rn → R (i = 1,…,m): inequality constraints • The problem is a constrained optimization problem
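
  A minimal MATLAB sketch of a problem in exactly this minimize-subject-to form, using fmincon from the Optimization Toolbox. The objective, constraint, and starting point below are illustrative assumptions, not taken from the slides:

  % Illustrative constrained problem: minimize f0(x) = (x1-1)^2 + (x2-2)^2
  % subject to the single inequality constraint g1(x) = x1^2 + x2^2 - 4 <= 0.
  f0 = @(x) (x(1) - 1)^2 + (x(2) - 2)^2;          % objective function f0: R^2 -> R
  nonlcon = @(x) deal(x(1)^2 + x(2)^2 - 4, []);   % returns [g(x), h(x)]: inequality g <= 0, no equalities
  x0 = [0; 0];                                    % starting guess for the design variables
  xstar = fmincon(f0, x0, [], [], [], [], [], [], nonlcon);
  disp(xstar)                                     % constrained minimizer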

  17. 1. Introduction • If a point x* corresponds to the minimum value of the function f (x), the same point also corresponds to the maximum value of the negative of the function, -f (x). Thus optimization can be taken to mean minimization since the maximum of a function can be found by seeking the minimum of the negative of the same function.
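
  A small MATLAB illustration of this equivalence (the function and interval are assumptions chosen for illustration): a maximum of f is found by minimizing -f with fminbnd:

  % Maximize f(x) = -(x-2)^2 + 3 on [0, 5] by minimizing its negative.
  f = @(x) -(x - 2).^2 + 3;
  [xmax, negfmax] = fminbnd(@(x) -f(x), 0, 5);      % minimize -f over the interval
  fmax = -negfmax;                                  % maximum of f is minus the minimum of -f
  fprintf('x* = %.4f, f(x*) = %.4f\n', xmax, fmax); % expect x* = 2, f(x*) = 3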

  18. 1. Introduction Constraints • Behaviour constraints: Constraints that represent limitations on the behaviour or performance of the system are termed behaviour or functional constraints. • Side constraints: Constraints that represent physical limitations on design variables such as manufacturing limitations.

  19. 1. Introduction Constraint Surface • For illustration purposes, consider an optimization problem with only inequality constraints gj(X) ≤ 0. The set of values of X that satisfy the equation gj(X) = 0 forms a hypersurface in the design space and is called a constraint surface.

  20. 1. Introduction Constraint Surface • Note that this is a (n-1) dimensional subspace, where n is the number of design variables. The constraint surface divides the design space into two regions: one in which gj(X) < 0 and the other in which gj(X) > 0.

  21. 1. Introduction Constraint Surface • Thus the points lying on the hypersurface will satisfy the constraint gj(X) critically, whereas the points lying in the region where gj(X) > 0 are infeasible or unacceptable, and the points lying in the region where gj(X) < 0 are feasible or acceptable.

  22. 1. Introduction Constraint Surface • In the figure below, a hypothetical two-dimensional design space is depicted, where the infeasible region is indicated by hatched lines. A design point that lies on one or more constraint surfaces is called a bound point, and the associated constraint is called an active constraint.

  23. 1. Introduction Constraint Surface • Design points that do not lie on any constraint surface are known as free points.

  24. 1. Introduction Constraint Surface Depending on whether a particular design point belongs to the acceptable or unacceptable regions, it can be identified as one of the following four types: • Free and acceptable point • Free and unacceptable point • Bound and acceptable point • Bound and unacceptable point

  25. 1. Introduction • The conventional design procedures aim at finding an acceptable or adequate design which merely satisfies the functional and other requirements of the problem. • In general, there will be more than one acceptable design, and the purpose of optimization is to choose the best one of the many acceptable designs available. • Thus a criterion has to be chosen for comparing the different alternative acceptable designs and for selecting the best one. • The criterion with respect to which the design is optimized, when expressed as a function of the design variables, is known as the objective function.

  26. 1. Introduction • In civil engineering, the objective is usually taken as the minimization of the cost. • In mechanical engineering, the maximization of the mechanical efficiency is the obvious choice of an objective function. • In aerospace structural design problems, the objective function for minimization is generally taken as weight. • In some situations, there may be more than one criterion to be satisfied simultaneously. An optimization problem involving multiple objective functions is known as a multiobjective programming problem.

  27. 1. Introduction • With multiple objectives there arises a possibility of conflict, and one simple way to handle the problem is to construct an overall objective function as a linear combination of the conflicting multiple objective functions. • Thus, if f1(X) and f2(X) denote two objective functions, construct a new (overall) objective function for optimization as: f(X) = α1 f1(X) + α2 f2(X), where α1 and α2 are constants whose values indicate the relative importance of one objective function to the other.
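
  A minimal MATLAB sketch of this weighted-sum approach (the two objectives, the weights, and the starting point are illustrative assumptions; fminunc is from the Optimization Toolbox):

  % Two conflicting objectives combined into one overall objective.
  f1 = @(x) (x(1) - 1)^2 + x(2)^2;        % e.g. a performance criterion
  f2 = @(x) x(1)^2 + (x(2) - 2)^2;        % e.g. a cost criterion
  alpha1 = 0.7; alpha2 = 0.3;             % relative importance of f1 versus f2
  f = @(x) alpha1*f1(x) + alpha2*f2(x);   % overall objective f = alpha1*f1 + alpha2*f2
  xstar = fminunc(f, [0; 0]);             % minimize the combined objective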

  28. 1. Introduction • The locus of all points satisfying f (X) = c = constant forms a hypersurface in the design space, and for each value of c there corresponds a different member of a family of surfaces. These surfaces, called objective function surfaces, are shown in a hypothetical two-dimensional design space in the figure below.

  29. 1. Introduction • Once the objective function surfaces are drawn along with the constraint surfaces, the optimum point can be determined without much difficulty. • But the main problem is that as the number of design variables exceeds two or three, the constraint and objective function surfaces become complex even for visualization and the problem has to be solved purely as a mathematical problem.

  30. Example Example: Design a uniform column of tubular section to carry a compressive load P=2500 kgf for minimum cost. The column is made up of a material that has a yield stress of 500 kgf/cm2, modulus of elasticity (E) of 0.85e6 kgf/cm2, and density (ρ) of 0.0025 kgf/cm3. The length of the column is 250 cm. The stress induced in this column should be less than the buckling stress as well as the yield stress. The mean diameter of the column is restricted to lie between 2 and 14 cm, and columns with thicknesses outside the range 0.2 to 0.8 cm are not available in the market. The cost of the column includes material and construction costs and can be taken as 5W + 2d, where W is the weight in kilograms force and d is the mean diameter of the column in centimeters.

  31. Example Example: The design variables are the mean diameter (d) and tube thickness (t): x1 = d, x2 = t. The objective function to be minimized is the cost: f(X) = 5W + 2d = 5ρ(πdt)l + 2d = 9.82 x1x2 + 2x1

  32. Example • The behaviour constraints can be expressed as: stress induced ≤ yield stress, stress induced ≤ buckling stress • The induced stress is given by: σi = P/(πdt) = 2500/(π x1x2)

  33. Example • The buckling stress for a pin-connected column is given by: σb = π²EI/(l²A) = π²E(d² + t²)/(8l²), where I is the second moment of area of the cross section of the column given by: I = (π/64)[(d + t)⁴ − (d − t)⁴] = (π/8) d t (d² + t²)

  34. Example • Thus, the behaviour constraints can be restated as: g1(X) = 2500/(π x1x2) − 500 ≤ 0 and g2(X) = 2500/(π x1x2) − π²(0.85×10⁶)(x1² + x2²)/(8(250)²) ≤ 0 • The side constraints are given by: 2 ≤ d ≤ 14 and 0.2 ≤ t ≤ 0.8

  35. Example • The side constraints can be expressed in standard form as: g3(X) = −x1 + 2 ≤ 0, g4(X) = x1 − 14 ≤ 0, g5(X) = −x2 + 0.2 ≤ 0, g6(X) = x2 − 0.8 ≤ 0

  36. Example • For a graphical solution, the constraint surfaces are to be plotted in a two-dimensional design space where the two axes represent the two design variables x1 and x2. To plot the first constraint surface, we have: g1(X) = 2500/(π x1x2) − 500 ≤ 0, that is, x1x2 ≥ 1.593 • Thus the curve x1x2 = 1.593 represents the constraint surface g1(X) = 0. This curve can be plotted by finding several points on the curve. The points on the curve can be found by giving a series of values to x1 and finding the corresponding values of x2 that satisfy the relation x1x2 = 1.593, as shown in the table below: x1: 2.0, 4.0, 6.0, 8.0, 10.0, 12.0, 14.0 and x2: 0.797, 0.398, 0.266, 0.199, 0.159, 0.133, 0.114

  37. Example • The infeasible region represented by g1(X) > 0 or x1x2 < 1.593 is shown by hatched lines. These points are plotted and a curve P1Q1 passing through all these points is drawn as shown:

  38. Example • Similarly the second constraint g2(X) ≤ 0 can be expressed as: x1x2(x1² + x2²) ≥ 47.3 • The points lying on the constraint surface g2(X) = 0 can be obtained as follows (these points are plotted as curve P2Q2):

  39. Example • The plotting of side constraints is simple since they represent straight lines. • After plotting all the six constraints, the feasible region is determined as the bounded area ABCDEA

  40. Example • Next, the contours of the objective function are to be plotted before finding the optimum point. For this, we plot the curves given by: f(X) = 9.82 x1x2 + 2x1 = c = constant for a series of values of c. By giving different values to c, the contours of f can be plotted with the help of the following points.
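
  A short MATLAB sketch of this contour plot, assuming the reconstructed objective 9.82·x1·x2 + 2·x1 above; the grid resolution and contour levels are illustrative choices:

  % Contours of the objective f = 9.82*x1*x2 + 2*x1 over the side-constraint box.
  [x1, x2] = meshgrid(linspace(2, 14, 200), linspace(0.2, 0.8, 200));
  f = 9.82*x1.*x2 + 2*x1;
  contour(x1, x2, f, [20 26.53 30 40 50], 'ShowText', 'on');  % include the optimal value 26.53
  xlabel('x_1 = d (cm)'); ylabel('x_2 = t (cm)');
  hold on
  fimplicit(@(a, b) a.*b - 1.593, [2 14 0.2 0.8]);            % constraint surface g_1 = 0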

  41. Example • (This slide tabulates points lying on the objective function contours f = c for each of the chosen values of c.)

  42. Example • These contours are shown in the figure below and it can be seen that the objective function cannot be reduced below a value of 26.53 (corresponding to point B) without violating some of the constraints. Thus, the optimum solution is given by point B with d*=x1*=5.44 cm and t*=x2*=0.293 cm with fmin=26.53.
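
  A MATLAB sketch that checks this graphical solution numerically, assuming the constraints reconstructed above; fmincon should converge to approximately d* = 5.44 cm, t* = 0.293 cm, f ≈ 26.53:

  % Tubular column design: minimize cost f = 9.82*x1*x2 + 2*x1
  % subject to the yield and buckling constraints and the side constraints.
  f = @(x) 9.82*x(1)*x(2) + 2*x(1);
  nonlcon = @(x) deal([2500/(pi*x(1)*x(2)) - 500; ...                  g1: yield
                       2500/(pi*x(1)*x(2)) - ...
                       pi^2*0.85e6*(x(1)^2 + x(2)^2)/(8*250^2)], []);  % g2: buckling, no equalities
  lb = [2; 0.2];  ub = [14; 0.8];                                      % side constraints on d and t
  x0 = [7; 0.4];                                                       % feasible starting point
  [xstar, fstar] = fmincon(f, x0, [], [], [], [], lb, ub, nonlcon);
  fprintf('d* = %.2f cm, t* = %.3f cm, f* = %.2f\n', xstar(1), xstar(2), fstar);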

  43. Examples Design of civil engineering structures • variables: width and height of member cross-sections • constraints: limit stresses, maximum and minimum dimensions • objective: minimum cost or minimum weight Analysis of statistical data and building empirical models from measurements • variables: model parameters • constraints: physical upper and lower bounds for model parameters • objective: minimum prediction error

  44. Classification of optimization problems Classification based on: • Constraints • Constrained optimization problem • Unconstrained optimization problem • Nature of the design variables • Static optimization problems • Dynamic optimization problems

  45. Classification of optimization problems Classification based on: • Physical structure of the problem • Optimal control problems • Non-optimal control problems • Nature of the equations involved • Nonlinear programming problem • Geometric programming problem • Quadratic programming problem • Linear programming problem
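
  Since linear programming appears in this classification (and in the course schedule), here is a minimal MATLAB sketch; the cost vector and constraints are illustrative assumptions, and linprog from the Optimization Toolbox is used rather than a hand-coded simplex method:

  % Minimize c'*x subject to A*x <= b and x >= 0.
  c = [-3; -5];              % linprog minimizes, so maximizing 3*x1 + 5*x2 means negating the costs
  A = [1 0; 0 2; 3 2];       % linear inequality constraints
  b = [4; 12; 18];
  lb = [0; 0];               % nonnegativity of the design variables
  xstar = linprog(c, A, b, [], [], lb, []);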

  46. Classification of optimization problems Classification based on: • Permissible values of the design variables • Integer programming problems • Real valued programming problems • Deterministic nature of the variables • Stochastic programming problem • Deterministic programming problem

  47. Classification of optimization problems Classification based on: • Separability of the functions • Separable programming problems • Non-separable programming problems • Number of the objective functions • Single objective programming problem • Multiobjective programming problem

  48. Geometric Programming • A geometric programming problem (GMP) is one in which the objective function and constraints are expressed as posynomials in X. A posynomial is a sum of terms, each term being a positive coefficient multiplied by the design variables raised to arbitrary real powers (for example, f(X) = 2x1^0.5 x2^−3 + 4x1x2²).

  49. Quadratic Programming Problem • A quadratic programming problem is a nonlinear programming problem with a quadratic objective function and linear constraints. It is usually formulated as follows: Minimize F(X) = c + Σi qi xi + Σi Σj Qij xi xj subject to Σi aij xi = bj (j = 1, 2, …, m) and xi ≥ 0 (i = 1, 2, …, n), where c, qi, Qij, aij, and bj are constants.
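
  A minimal MATLAB sketch of a quadratic program (the particular H, f, A, and b below are illustrative assumptions); quadprog from the Optimization Toolbox handles exactly this quadratic-objective, linear-constraint form:

  % Minimize 0.5*x'*H*x + f'*x  subject to  A*x <= b  and  x >= 0.
  H = [2 0; 0 2];            % quadratic term (symmetric matrix)
  fvec = [-4; -6];           % linear term
  A = [1 1];  b = 5;         % one linear inequality constraint
  lb = [0; 0];               % nonnegativity of the design variables
  xstar = quadprog(H, fvec, A, b, [], [], lb, []);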
