Optimization Part II G.Anuradha
Review of previous lecture: Steepest Descent
Choose the next step so that the function decreases: F(x_{k+1}) < F(x_k).
For small changes in x we can approximate F(x): F(x_{k+1}) = F(x_k + Δx_k) ≈ F(x_k) + g_k^T Δx_k, where g_k = ∇F(x) evaluated at x = x_k.
If we want the function to decrease: g_k^T Δx_k = α g_k^T p_k < 0.
We can maximize the decrease by choosing p_k = -g_k, which gives the update x_{k+1} = x_k - α g_k.
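Below is a minimal Python/NumPy sketch of this steepest descent update, assuming a fixed learning rate; the names steepest_descent and grad, and the example quadratic, are illustrative and not part of the lecture.

import numpy as np

def steepest_descent(grad, x0, alpha=0.1, tol=1e-6, max_iter=1000):
    # Minimize F by repeatedly stepping along p_k = -g_k
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)                      # g_k: gradient of F at x_k
        if np.linalg.norm(g) < tol:      # stop when the gradient is (nearly) zero
            break
        x = x - alpha * g                # x_{k+1} = x_k - alpha * g_k
    return x

# Example: F(x) = x1^2 + 2*x2^2 with gradient [2*x1, 4*x2]
print(steepest_descent(lambda x: np.array([2*x[0], 4*x[1]]), x0=[2.0, 1.0]))  # approaches [0, 0]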
Necessary and sufficient conditions for a function of a single variable
Necessary condition for a minimum at x*: F'(x*) = 0 (x* is a stationary point).
Sufficient condition: F'(x*) = 0 and F''(x*) > 0 (F''(x*) < 0 gives a maximum instead).
Functions of two variables
Necessary conditions for a minimum at x*: the gradient vanishes, ∇F(x*) = 0.
Sufficient conditions: ∇F(x*) = 0 and the Hessian ∇²F(x*) is positive definite.
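As a quick illustration of these conditions, the sketch below numerically checks a candidate point: the gradient must vanish (necessary) and the Hessian must be positive definite (sufficient). The helper name check_minimum and the example function are assumptions for illustration.

import numpy as np

def check_minimum(grad, hess, x_star, tol=1e-8):
    # First-order (necessary) and second-order (sufficient) conditions at x_star
    g = np.asarray(grad(x_star))
    H = np.asarray(hess(x_star))
    necessary = np.linalg.norm(g) < tol                            # gradient vanishes
    sufficient = necessary and np.all(np.linalg.eigvalsh(H) > 0)   # Hessian positive definite
    return necessary, sufficient

# Example: F(x) = x1^2 + 2*x2^2 has a strong minimum at the origin
grad = lambda x: np.array([2*x[0], 4*x[1]])
hess = lambda x: np.array([[2.0, 0.0], [0.0, 4.0]])
print(check_minimum(grad, hess, np.array([0.0, 0.0])))  # (True, True)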
Effect of learning rate
As the learning rate increases, the trajectory becomes oscillatory, which can make the algorithm unstable. For quadratic functions an upper limit on the learning rate can be derived.
Stable Learning Rates (Quadratic)
For a quadratic F(x) = 1/2 x^T A x + d^T x + c, the steepest descent update is x_{k+1} = [I - αA] x_k - αd.
Stability is determined by the eigenvalues of this matrix: the eigenvalues of [I - αA] are (1 - αλ_i), where λ_i are the eigenvalues of A.
Stability requirement: |1 - αλ_i| < 1 for all i, i.e. α < 2/λ_max.
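The sketch below illustrates this bound on a small quadratic: it computes α_max = 2/λ_max and compares a learning rate just below and just above it. The matrix A and the starting point are made-up example values.

import numpy as np

# Quadratic F(x) = 1/2 x^T A x, so the gradient is A x and the update is x_{k+1} = (I - alpha*A) x_k
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

lam = np.linalg.eigvalsh(A)       # eigenvalues lambda_i of A
alpha_max = 2.0 / lam.max()       # stability requires alpha < 2 / lambda_max
print("largest stable learning rate:", alpha_max)

def run(alpha, steps=50):
    x = np.array([1.0, 1.0])
    for _ in range(steps):
        x = x - alpha * (A @ x)   # steepest descent step on the quadratic
    return np.linalg.norm(x)

print(run(0.9 * alpha_max))       # converges toward the minimum
print(run(1.1 * alpha_max))       # oscillates and diverges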
Newton’s Method
Start from the second-order approximation F(x_{k+1}) ≈ F(x_k) + g_k^T Δx_k + 1/2 Δx_k^T A_k Δx_k, where A_k is the Hessian at x_k.
Take the gradient of this second-order approximation and set it equal to zero to find the stationary point: g_k + A_k Δx_k = 0, so Δx_k = -A_k^{-1} g_k and x_{k+1} = x_k - A_k^{-1} g_k.
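A small sketch of this Newton step, solving A_k Δx_k = -g_k rather than forming the inverse explicitly; the function name newton_minimize and the quadratic example are illustrative assumptions.

import numpy as np

def newton_minimize(grad, hess, x0, tol=1e-8, max_iter=50):
    # Newton's method: x_{k+1} = x_k - A_k^{-1} g_k, with A_k the Hessian at x_k
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        x = x - np.linalg.solve(hess(x), g)   # solve A_k dx = g_k instead of inverting A_k
    return x

# For a quadratic, Newton's method reaches the stationary point in one step
grad = lambda x: np.array([2*x[0] + x[1], x[0] + 2*x[1]])
hess = lambda x: np.array([[2.0, 1.0], [1.0, 2.0]])
print(newton_minimize(grad, hess, [3.0, -2.0]))  # -> [0, 0]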
This part covers line minimization methods and their stopping criteria • Initial bracketing • Line searches • Newton’s method • Secant method • Sectioning method
Initial Bracketing • Helps find the interval that contains the relative minimum • The assumed minimum must be bracketed within the starting interval • Two schemes are used for this purpose (one common scheme is sketched below)
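The slide does not name the two schemes, but one common bracketing scheme expands the step until the function starts increasing again; the sketch below implements that idea. The function name bracket_minimum and the growth factor are assumptions for illustration.

def bracket_minimum(f, x0=0.0, step=0.1, grow=2.0, max_iter=50):
    # Expand the step until f turns upward; returns (a, b, c) with f(b) < f(a) and f(b) < f(c)
    a, fa = x0, f(x0)
    b, fb = x0 + step, f(x0 + step)
    if fb > fa:                      # make sure the first step goes downhill
        a, b, fa, fb = b, a, fb, fa
        step = -step
    for _ in range(max_iter):
        step *= grow                 # grow the step each iteration
        c, fc = b + step, f(b + step)
        if fc > fb:                  # function turned upward: the minimum is bracketed
            return (a, b, c) if a < c else (c, b, a)
        a, fa, b, fb = b, fb, c, fc
    raise RuntimeError("no bracket found")

print(bracket_minimum(lambda x: (x - 3.0)**2))  # an interval containing x = 3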