Explore advanced optimization methods, focusing on the Steepest Descent algorithm and its application in minimizing functions. We discuss the necessary and sufficient conditions for minima of functions of one and two variables, and the implications of the learning rate for algorithm stability. Learn how oscillatory trajectories affect optimization, and how to determine stable learning rates for quadratic functions. Additionally, we delve into Newton's Method, line search strategies, and initial bracketing to effectively identify relative minima.
Optimization Part II G.Anuradha
Review of previous lecture – Steepest Descent. Choose the next step so that the function decreases: $F(\mathbf{x}_{k+1}) < F(\mathbf{x}_k)$. For small changes in $\mathbf{x}$ we can approximate $F(\mathbf{x})$ by a first-order expansion: $F(\mathbf{x}_{k+1}) = F(\mathbf{x}_k + \Delta\mathbf{x}_k) \approx F(\mathbf{x}_k) + \mathbf{g}_k^T \Delta\mathbf{x}_k$, where $\mathbf{g}_k \equiv \nabla F(\mathbf{x})\big|_{\mathbf{x}=\mathbf{x}_k}$. If we want the function to decrease, we need $\mathbf{g}_k^T \Delta\mathbf{x}_k = \alpha_k \mathbf{g}_k^T \mathbf{p}_k < 0$. We can maximize the decrease by choosing the steepest descent direction $\mathbf{p}_k = -\mathbf{g}_k$, which gives the update $\mathbf{x}_{k+1} = \mathbf{x}_k - \alpha_k \mathbf{g}_k$.
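As a concrete illustration of the update rule above, here is a minimal Python sketch of fixed-step steepest descent. The function name `steepest_descent` and the quadratic test function are illustrative choices, not taken from the original slides.

```python
import numpy as np

def steepest_descent(grad, x0, alpha=0.1, tol=1e-6, max_iter=1000):
    """Minimize a function by repeatedly stepping along the negative gradient."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)                  # g_k = gradient of F at x_k
        if np.linalg.norm(g) < tol:  # stop when the gradient is (nearly) zero
            break
        x = x - alpha * g            # x_{k+1} = x_k - alpha * g_k
    return x

# Example: F(x) = x1^2 + 2*x2^2 has its minimum at the origin.
grad_F = lambda x: np.array([2.0 * x[0], 4.0 * x[1]])
print(steepest_descent(grad_F, x0=[2.0, -1.5]))
```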
Necessary and sufficient conditions for a function with a single variable. Necessary condition for a relative minimum at $x^*$: $F'(x^*) = 0$. Sufficient condition: $F'(x^*) = 0$ together with $F''(x^*) > 0$.
Functions with two variables. Necessary conditions: the gradient vanishes at the stationary point, $\nabla F(\mathbf{x}^*) = \mathbf{0}$. Sufficient conditions: $\nabla F(\mathbf{x}^*) = \mathbf{0}$ and the Hessian matrix $\nabla^2 F(\mathbf{x}^*)$ is positive definite.
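The two conditions can be checked numerically. The short Python sketch below uses an illustrative two-variable quadratic (not one from the slides): the necessary condition is that the gradient is zero at the candidate point, and the sufficient condition is that all Hessian eigenvalues are positive.

```python
import numpy as np

# Candidate minimum of F(x1, x2) = x1^2 + x1*x2 + 2*x2^2 (illustrative function).
def gradient(x):
    return np.array([2.0 * x[0] + x[1], x[0] + 4.0 * x[1]])

hessian = np.array([[2.0, 1.0],
                    [1.0, 4.0]])   # constant Hessian for a quadratic

x_star = np.array([0.0, 0.0])

necessary = np.allclose(gradient(x_star), 0.0)          # gradient must vanish
sufficient = np.all(np.linalg.eigvalsh(hessian) > 0.0)  # Hessian positive definite

print("necessary condition satisfied:", necessary)
print("sufficient condition satisfied:", sufficient)
```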
Effect of learning rate. As the learning rate increases, the trajectory becomes oscillatory, and if it is made too large the algorithm becomes unstable. For quadratic functions an upper limit on the learning rate can be derived.
Stable Learning Rates (Quadratic). For a quadratic function $F(\mathbf{x}) = \tfrac{1}{2}\mathbf{x}^T A \mathbf{x} + \mathbf{d}^T\mathbf{x} + c$, the gradient is $\nabla F(\mathbf{x}) = A\mathbf{x} + \mathbf{d}$, so the steepest descent update becomes $\mathbf{x}_{k+1} = \mathbf{x}_k - \alpha(A\mathbf{x}_k + \mathbf{d}) = [I - \alpha A]\mathbf{x}_k - \alpha\mathbf{d}$. Stability is determined by the eigenvalues of the matrix $[I - \alpha A]$, which are $(1 - \alpha\lambda_i)$, where $\lambda_i$ is an eigenvalue of $A$. Stability requirement: $|1 - \alpha\lambda_i| < 1$ for all $i$, which gives $\alpha < 2/\lambda_{\max}$.
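The sketch below demonstrates the bound on an assumed quadratic with $\mathbf{d} = \mathbf{0}$; the matrix and step sizes are illustrative values chosen to sit just below and just above the stability limit.

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 50.0]])                    # eigenvalues 2 and 50
lam_max = np.max(np.linalg.eigvalsh(A))
print("stability limit 2/lambda_max =", 2.0 / lam_max)   # 0.04

def run(alpha, steps=50):
    x = np.array([1.0, 1.0])
    for _ in range(steps):
        x = x - alpha * (A @ x)                # x_{k+1} = [I - alpha*A] x_k
    return np.linalg.norm(x)

print("alpha = 0.039 ->", run(0.039))  # below the limit: iterates shrink toward 0
print("alpha = 0.041 ->", run(0.041))  # above the limit: iterates grow (unstable)
```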
Newton’s Method. Approximate $F(\mathbf{x})$ around $\mathbf{x}_k$ by the second-order expansion $F(\mathbf{x}_k + \Delta\mathbf{x}_k) \approx F(\mathbf{x}_k) + \mathbf{g}_k^T\Delta\mathbf{x}_k + \tfrac{1}{2}\Delta\mathbf{x}_k^T A_k \Delta\mathbf{x}_k$, where $A_k$ is the Hessian at $\mathbf{x}_k$. Take the gradient of this second-order approximation and set it equal to zero to find the stationary point: $\mathbf{g}_k + A_k\Delta\mathbf{x}_k = \mathbf{0}$, which gives $\Delta\mathbf{x}_k = -A_k^{-1}\mathbf{g}_k$ and the update $\mathbf{x}_{k+1} = \mathbf{x}_k - A_k^{-1}\mathbf{g}_k$.
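A minimal Python sketch of this update follows; the function names and the quadratic test problem are assumptions for illustration. Note that for a quadratic function the local model is exact, so Newton's method reaches the minimum in a single step.

```python
import numpy as np

def newtons_method(grad, hess, x0, tol=1e-8, max_iter=50):
    """Newton's method: step to the stationary point of the local quadratic model."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        # Solve A_k * delta = -g_k rather than forming the inverse explicitly.
        delta = np.linalg.solve(hess(x), -g)
        x = x + delta
    return x

# Example: F(x) = x1^2 + x1*x2 + 2*x2^2 (illustrative); minimum at the origin.
grad_F = lambda x: np.array([2.0 * x[0] + x[1], x[0] + 4.0 * x[1]])
hess_F = lambda x: np.array([[2.0, 1.0], [1.0, 4.0]])
print(newtons_method(grad_F, hess_F, x0=[3.0, -2.0]))
```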
These ideas are used in line minimization methods and their stopping criteria: • Initial bracketing • Line searches • Newton’s method • Secant method • Sectioning method
Initial Bracketing • Helps to find an interval that contains the relative minimum • The assumed minimum must be bracketed within the starting interval • Two schemes are used for this purpose (a sketch of one common scheme follows)
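The slides mention two bracketing schemes without naming them, so the Python sketch below shows one common approach as an assumption: a forward march with an expanding step that stops once the function turns upward, returning three points that bracket a relative minimum.

```python
def bracket_minimum(f, x0=0.0, step=0.1, grow=2.0, max_iter=50):
    """Expand the step until f starts increasing, then return (a, b, c)
    with f(b) < f(a) and f(b) < f(c), so a minimum lies in [a, c]."""
    a, fa = x0, f(x0)
    b, fb = x0 + step, f(x0 + step)
    if fb > fa:                      # wrong direction: search the other way
        a, b, fa, fb = b, a, fb, fa
        step = -step
    for _ in range(max_iter):
        step *= grow                 # keep enlarging the step
        c, fc = b + step, f(b + step)
        if fc > fb:                  # function turned upward: minimum is bracketed
            return (a, b, c)
        a, fa, b, fb = b, fb, c, fc
    raise RuntimeError("no bracket found within max_iter steps")

# Example: the minimum of (x - 3)^2 lies inside the returned bracket.
print(bracket_minimum(lambda x: (x - 3.0) ** 2))
```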