
Derivative-free Methods using Linesearch Techniques


Presentation Transcript


  1. Derivative-free Methods using Linesearch Techniques Stefano Lucidi

  2. joint work with L. Grippo (the father of the linesearch approach), F. Lampariello, M. Sciandrone, P. Tseng, G. Fasano, G. Liuzzi, V. Piccialli, F. Rinaldi (in order of appearance in this research activity)

  3. PROBLEM DEFINITION: minimize f(x), x ∈ Rⁿ, where the first-order derivatives of f are not available

  4. MOTIVATIONS: in many engineering problems the objective and constraint function values are obtained by • direct measurements • complex simulation programs; hence first-order derivatives can often be neither explicitly calculated nor approximated

  5. MOTIVATIONS: in fact • the mathematical representations of the objective function and the constraints are not available • the source codes of the programs are not available • the evaluations of the objective function and the constraints can be very expensive • the values of the objective function and the constraints can be affected by the presence of noise

  6. MOTIVATIONS: when the mathematical representations of the objective function and the constraints are not available, the first-order derivatives of the objective function and the constraints cannot be computed analytically

  7. MOTIVATIONS: when the source codes of the programs are not available, automatic differentiation techniques cannot be applied

  8. MOTIVATIONS: when the evaluations of the objective function and the constraints are very expensive, finite-difference approximations can be too expensive (they need at least n function evaluations)

  9. MOTIVATIONS: when the values of the objective function and the constraints are affected by the presence of noise, finite-difference approximations can produce very wrong estimates of the first-order derivatives
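The effect can be seen in a few lines of NumPy (an illustrative sketch, not taken from the slides: the test function, noise level, and step size are arbitrary choices):

```python
import numpy as np

# Forward-difference gradient estimates of a smooth function degrade badly
# once the function evaluations are noisy, since the noise is divided by h.
rng = np.random.default_rng(0)

def f_noisy(x, sigma):
    # true objective f(x) = ||x||^2, perturbed by Gaussian noise N(0, sigma^2)
    return float(x @ x) + rng.normal(0.0, sigma)

def fd_gradient(f, x, h):
    # forward differences: n extra function evaluations beyond f(x)
    n = x.size
    fx = f(x)
    g = np.empty(n)
    for i in range(n):
        e = np.zeros(n); e[i] = h
        g[i] = (f(x + e) - fx) / h
    return g

x = np.ones(4)                  # true gradient of ||x||^2 at x is 2x
g_clean = fd_gradient(lambda z: f_noisy(z, 0.0), x, h=1e-6)
g_noisy = fd_gradient(lambda z: f_noisy(z, 1e-3), x, h=1e-6)
# g_clean carries only the O(h) discretization error; in g_noisy the noise
# term of order sigma/h dominates and the estimate is essentially useless.
```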

  10. NUMERICAL EXPERIENCE: we considered 41 box-constrained standard test problems and perturbed such problems by corrupting the function values with a random term, where the term denotes a Gaussian distributed random number with zero mean and given variance

  11. NUMERICAL EXPERIENCE: we considered two codes: DF_box (a derivative-free method) and E04UCF (a NAG subroutine using finite-difference gradients). [The slide reports a chart of the number of failures of DF_box and E04UCF.]

  12. GLOBALLY CONVERGENT DF METHODS: direct search methods use only function values; modelling methods approximate the functions by suitable models which are progressively built and updated. Direct search methods comprise - pattern search methods, where the function is evaluated on specified geometric patterns - line search methods, which use one-dimensional minimization along suitable search directions

  13. UNCONSTRAINED MINIMIZATION PROBLEMS: min f(x), x ∈ Rⁿ, where ∇f(x) is not available and the level set L₀ = {x : f(x) ≤ f(x₀)} is compact

  14. THE ROLE OF THE GRADIENT: the gradient ∇f(x) characterizes accurately the local behaviour of f; it allows us to determine an "efficient" descent direction and a "good" step length along the direction

  15. THE ROLE OF THE GRADIENT: the directional derivative ∇f(x)ᵀd gives the rate of change of f along d; the gradient provides the rates of change of f along the 2n directions ±e₁, …, ±eₙ and characterizes accurately the local behaviour of f

  16. HOW TO OVERCOME THE LACK OF GRADIENT: a set of directions D(x) = {d¹, …, dʳ} can be associated with each x; the local behaviour of f along the directions in D(x) should be indicative of the whole local behaviour of f

  17. ASSUMPTION D: given {xₖ}, the bounded sequences of directions {dₖ¹}, …, {dₖʳ} are such that their limit points positively span Rⁿ

  18. EXAMPLES OF SETS OF DIRECTIONS: D = {±d¹, …, ±dⁿ}, where d¹, …, dⁿ are linearly independent and bounded

  19. EXAMPLES OF SETS OF DIRECTIONS (Lewis, Torczon): a positive spanning set of directions whose elements are bounded

  20. EXAMPLES OF SETS OF DIRECTIONS
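The simplest instance of such a direction set, the 2n coordinate directions, can be generated in a few lines (an illustrative sketch; the function name is my own):

```python
import numpy as np

# The classical direction set for derivative-free methods: the 2n coordinate
# directions ±e_1, ..., ±e_n. They are bounded and positively span R^n, so
# for any nonzero gradient g at least one of them satisfies d @ g < 0,
# i.e. is a descent direction even though g itself is unknown to the method.
def coordinate_directions(n):
    eye = np.eye(n)
    return np.vstack([eye, -eye])      # shape (2n, n), one direction per row

D = coordinate_directions(3)
g = np.array([0.5, -1.0, 2.0])         # an arbitrary nonzero "gradient"
best = float((D @ g).min())            # most negative directional derivative
```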

  21. UNCONSTRAINED MINIMIZATION PROBLEMS: Assumption D ensures that, by performing finer and finer samplings of f along the directions, it is possible • either to realize that the point is a good approximation of a stationary point of f • or to find a point where f is decreased

  22. GLOBAL CONVERGENCE By Assumption D we have:

  23. GLOBAL CONVERGENCE: by using directions satisfying Assumption D it is possible to characterize the global convergence of a sequence of points by means of the existence of suitable sequences of failures in decreasing the objective function along the directions

  24. GLOBAL CONVERGENCE By Assumption D we have:

  25. PROPOSITION: let {xₖ} and {dₖ¹}, …, {dₖʳ} be such that • the directions satisfy Assumption D • there exist sequences of points {yₖⁱ} and scalars {ξₖⁱ} → 0 along which the sufficient-decrease tests fail; then every accumulation point of {xₖ} is a stationary point of f

  26. GLOBAL CONVERGENCE: • the Proposition characterizes in "some sense" the requirements on the acceptable samplings of f along the directions that guarantee the global convergence • it is not necessary to perform at each point a sampling of f along all the directions • the sampling of f along all the directions can be distributed along the iterations of the algorithm

  27. GLOBAL CONVERGENCE: the use of directions satisfying Assumption D, together with producing sequences of points satisfying the hypothesis of the Proposition, are the common elements of all the globally convergent direct search methods. The direct search methods can be divided into - pattern search methods - line search methods

  28. PATTERN SEARCH METHODS. Pros: they require that the new point produces a simple decrease of f (in the line search methods the new point must guarantee a "sufficient" decrease of f). Cons: all the points produced must lie in a suitable lattice; this implies - additional assumptions on the search directions - restrictions on the choices of the steplengths (in the line search methods there are no additional requirements beyond Assumption D and the assumptions of the Proposition)
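The two acceptance criteria can be stated compactly (a hedged illustration; the constant gamma is an arbitrary small value, not one fixed by the slides):

```python
# Pattern search accepts any simple decrease; linesearch methods demand a
# "sufficient" decrease, tied to the steplength alpha through gamma > 0, so
# that accepted steps cannot shrink too fast relative to the progress made.
def simple_decrease(f_new, f_old):
    return f_new < f_old

def sufficient_decrease(f_new, f_old, alpha, gamma=1e-6):
    return f_new <= f_old - gamma * alpha ** 2
```

A tiny decrease such as f_old - 1e-9 with alpha = 1 passes the first test but fails the second, which is exactly the distinction drawn on the slide.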

  29. LINESEARCH TECHNIQUES

  30. LINESEARCH TECHNIQUES

  31. ALGORITHM DF. STEP 1: compute directions dₖ¹, …, dₖʳ satisfying Assumption D. STEP 2: minimization of f along the directions. STEP 3: compute the new point xₖ₊₁ and set k = k+1

  32. STEP 2. The aim of this step is - to detect the "promising" directions, i.e. the directions along which the function decreases "sufficiently" - to compute steplengths along these directions which guarantee both a "sufficient" decrease of the function and a "sufficient" move from the previous point

  33. LINESEARCH TECHNIQUE

  34. LINESEARCH TECHNIQUE

  35. STEP 2. The value of the initial step along the i-th direction derives from the linesearch performed along the i-th direction at the previous iteration. If the set of search directions does not depend on the iteration, this scalar should be representative of the behaviour of the objective function along the i-th direction

  36. STEP 3. Find a point xₖ₊₁ whose function value does not exceed that of the point produced at Step 2; otherwise keep the Step 2 point. Set k = k+1 and go to Step 1. At Step 3, every approximation technique can be used to produce a new, better point
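Putting Steps 1-3 together, a minimal sketch of a method of this kind is below (my own simplification, not the authors' code: it uses the 2n coordinate directions, the sufficient-decrease test f(x + a d) <= f(x) - gamma a², an expansion step with factor 1/delta, and a reduction factor theta; all constants are arbitrary choices):

```python
import numpy as np

def df_linesearch_method(f, x0, tol=1e-6, gamma=1e-6,
                         delta=0.5, theta=0.5, max_iter=2000):
    """Derivative-free linesearch sketch along the 2n coordinate directions."""
    n = x0.size
    D = np.vstack([np.eye(n), -np.eye(n)])   # directions satisfying Assumption D
    alpha = np.ones(len(D))                  # tentative steps, one per direction
    x, fx = x0.astype(float), f(x0)
    for _ in range(max_iter):
        if alpha.max() < tol:                # all samplings have become fine enough
            break
        for i, d in enumerate(D):
            a = alpha[i]
            if f(x + a * d) <= fx - gamma * a * a:        # "promising" direction
                # expansion: enlarge the step while sufficient decrease holds
                while f(x + (a / delta) * d) <= fx - gamma * (a / delta) ** 2:
                    a /= delta
                x = x + a * d
                fx = f(x)
                alpha[i] = a                 # initial step at the next iteration
            else:
                alpha[i] *= theta            # failure: reduce the tentative step
    return x, fx

# usage: a smooth function whose minimizer is (1, -3), no gradients supplied
x_star, f_star = df_linesearch_method(
    lambda z: (z[0] - 1.0) ** 2 + 2.0 * (z[1] + 3.0) ** 2,
    np.array([5.0, 5.0]))
```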

  37. GLOBAL CONVERGENCE. THEOREM: let {xₖ} be the sequence of points produced by the DF Algorithm; then there exists an accumulation point of {xₖ}, and every accumulation point of {xₖ} is a stationary point of the objective function

  38. LINEARLY CONSTRAINED MINIMIZATION PROBLEMS (LCP): min f(x) subject to linear constraints, where ∇f(x) is not available and the feasible set is compact

  39. LINEARLY CONSTRAINED MINIMIZATION PROBLEMS: given a feasible point it is possible to define • the set of the indices of the active constraints • the set of the feasible directions

  40. LINEARLY CONSTRAINED MINIMIZATION PROBLEMS: x̄ is a stationary point for Problem (LCP) if and only if ∇f(x̄)ᵀd ≥ 0 for every feasible direction d at x̄

  41. LINEARLY CONSTRAINED MINIMIZATION PROBLEMS: x̄ is a stationary point for Problem (LCP)

  42. LINEARLY CONSTRAINED MINIMIZATION PROBLEMS: given x and ε > 0 it is possible to define • an estimate of the set of the indices of the active constraints • an estimate of the set of the feasible directions; this estimate has good properties which allow us to define globally convergent algorithms
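For linear constraints aᵢᵀx ≤ bᵢ these estimates are easy to write down (an illustrative sketch; the function names and the specific test are my own, not the slides'):

```python
import numpy as np

# eps-active set: indices of constraints whose slack b_i - a_i' x is at most
# eps; a direction d counts as an estimated feasible direction if it does not
# increase any eps-active constraint.
def eps_active_set(A, b, x, eps):
    return [i for i in range(len(b)) if b[i] - A[i] @ x <= eps]

def is_estimated_feasible(A, b, x, d, eps):
    return all(A[i] @ d <= 0.0 for i in eps_active_set(A, b, x, eps))

A = np.array([[1.0, 0.0], [0.0, 1.0]])   # constraints: x1 <= 1, x2 <= 1
b = np.array([1.0, 1.0])
x = np.array([0.999, 0.0])               # x1 <= 1 is almost active here
```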

  43. ASSUMPTION D2 (an example): given xₖ and εₖ > 0, the set of directions Dₖ built from the estimated feasible directions satisfies: Dₖ is uniformly bounded

  44. ALGORITHM DFL. STEP 1: compute directions satisfying Assumption D2. STEP 2: minimization of f along the directions. STEP 3: compute the new point and set k = k+1

  45. GLOBAL CONVERGENCE. THEOREM: let {xₖ} be the sequence of points produced by the DFL Algorithm; then there exists an accumulation point of {xₖ}, and every accumulation point of {xₖ} is a stationary point for Problem (LCP)

  46. BOX CONSTRAINED MINIMIZATION PROBLEMS (BCP): min f(x) subject to l ≤ x ≤ u, where ∇f(x) is not available and the feasible set is compact; the set of coordinate directions ±e₁, …, ±eₙ satisfies Assumption D2

  47. NONLINEARLY CONSTRAINED MINIMIZATION PROBLEMS (NCP): min f(x) subject to g(x) ≤ 0, where ∇f(x) and the first-order derivatives of the constraints are not available

  48. NONLINEARLY CONSTRAINED MINIMIZATION PROBLEMS: we define the relevant sets and, given a point, the associated quantities

  49. NONLINEARLY CONSTRAINED MINIMIZATION PROBLEMS. ASSUMPTION A1: the set is compact. ASSUMPTION A2: for every feasible point there exists a vector satisfying the condition stated on the slide. Assumption A1 guarantees the boundedness of the iterates; Assumption A2 guarantees the existence and boundedness of the Lagrange multipliers

  50. NONLINEARLY CONSTRAINED MINIMIZATION PROBLEMS: we consider the following continuously differentiable penalty function, where ε > 0 is the penalty parameter:
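The transcript does not reproduce the formula, so as a stand-in here is one standard continuously differentiable choice, the quadratic penalty for constraints gᵢ(x) ≤ 0 (an assumption for illustration, not necessarily the exact function used in the talk):

```python
import numpy as np

# P(x; eps) = f(x) + (1/eps) * sum_i max(0, g_i(x))**2 is continuously
# differentiable because t -> max(0, t)**2 is C^1; shrinking eps penalizes
# constraint violations more and more severely.
def penalty(f, g_list, eps):
    def P(x):
        viol = sum(max(0.0, g(x)) ** 2 for g in g_list)
        return f(x) + viol / eps
    return P

# usage: minimize P(.; eps) with a derivative-free method for decreasing eps
f = lambda x: (x[0] - 2.0) ** 2
g = [lambda x: x[0] - 1.0]        # feasible set: x1 <= 1
P = penalty(f, g, eps=1e-4)
# at a feasible point P equals f; at an infeasible point the violation term
# divided by eps dominates
```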
