
Understanding Convexity and Positive Definite Algorithms for Iterative Optimization

This document walks through the basic questions behind iterative optimization algorithms, with emphasis on convexity and positive definiteness. It begins with the fundamental questions: what "f" and "x" are, how large "n" is, and how "f" and "x" are related. It then describes the two main families of algorithms, Line Search and Trust Region, and how each uses information from the current and previous iterates to decrease the objective. Finally, it surveys the search directions used in Line Search algorithms: the negative gradient, Newton, quasi-Newton, and conjugate-gradient steps.



Presentation Transcript


  1. Questions • What is “f”? • What is “x”? • How large is “n”? • What is the relation between “f” and “x”?
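
These questions are easiest to pin down against the standard unconstrained minimization problem, which (a reasonable assumption, since the slide does not state it explicitly) is the setting the rest of the slides work in:

```latex
% Assumed setting: unconstrained minimization.
% f is the objective function, x the vector of n decision variables,
% and the "relation" is that f assigns a scalar value to each x.
\min_{x \in \mathbb{R}^n} f(x),
\qquad f : \mathbb{R}^n \to \mathbb{R},
\qquad x = (x_1, \ldots, x_n)^{\top}.
```

How large n is matters in practice because it determines whether storing and factoring an n-by-n Hessian is affordable, which drives the choice among the search directions on slides 6 through 9.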

  2. Convexity
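
The slide is title-only in this transcript; the definition it presumably presents is the standard one:

```latex
% f is convex on R^n if every chord lies on or above the graph of f:
f(\lambda x + (1-\lambda) y) \;\le\; \lambda f(x) + (1-\lambda) f(y)
\qquad \text{for all } x, y \in \mathbb{R}^n \text{ and } \lambda \in [0, 1].
```

For convex objectives, every local minimizer is a global minimizer, and for twice-differentiable f convexity is equivalent to the Hessian being positive semidefinite everywhere, which connects this slide to the next one.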

  3. Positive Definite
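
Also title-only; the standard definition, presumably the content of the slide, is:

```latex
% A symmetric matrix A is positive definite when its quadratic form is
% strictly positive for every nonzero vector:
A = A^{\top} \text{ is positive definite}
\iff x^{\top} A x > 0
\quad \text{for all } x \in \mathbb{R}^n,\ x \neq 0.
```

In the algorithms below, positive definiteness of the exact or approximate Hessian is what guarantees that the Newton-type step is well defined and points in a descent direction.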

  4. Iterative Algorithms
     • Starting at x_0, generate a sequence of iterates {x_k} and terminate when no more progress can be made or a solution is obtained.
     • From one iterate x_k to the next iterate x_{k+1}, the value of the objective must decrease.
     • To generate the next iterate x_{k+1}, we need information from the current and previous iterates.
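
A minimal sketch of this generic loop, assuming a gradient-norm stopping test; the function names and tolerances are illustrative, not taken from the slides:

```python
import numpy as np

def minimize(f, grad, step, x0, tol=1e-6, max_iter=1000):
    """Generic iterative scheme: produce x_{k+1} from x_k until no more
    progress can be made or an (approximate) solution is obtained.
    `step` is a placeholder callback (x, f, grad) -> next iterate."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:   # solution obtained (to tolerance)
            break
        x_new = step(x, f, grad)      # uses information at the current iterate
        if f(x_new) >= f(x):          # objective failed to decrease: no more progress
            break
        x = x_new
    return x
```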

  5. Two Types of Algorithms
     • Line Search (two steps; sketched below)
       • Find a descent direction to move in.
       • Find a suitable step length to travel along that direction.
     • Trust Region
       • Construct a "model function" that approximates the objective function at the current iterate within a "trust region".
       • Find the minimizer of the model function.
       • If that minimizer does not produce enough reduction in the objective function, shrink the trust region and try again.
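
To make the two-step line-search structure concrete, here is a sketch of a single iteration with a backtracking (Armijo) step-length rule; `direction` stands in for any of the choices on the next four slides, and the constants are conventional defaults rather than values from the presentation:

```python
import numpy as np

def line_search_iteration(x, f, grad, direction, alpha=1.0, rho=0.5, c=1e-4):
    """One line-search iteration.
    Step 1: pick a descent direction p at the current iterate.
    Step 2: backtrack the step length until the sufficient-decrease
    (Armijo) condition f(x + a*p) <= f(x) + c*a*grad(x)^T p holds."""
    p = direction(x, grad)                          # step 1: direction to move in
    g = grad(x)
    while f(x + alpha * p) > f(x) + c * alpha * (g @ p):
        alpha *= rho                                # step 2: shrink the step length
    return x + alpha * p
```

A trust-region method, by contrast, fixes the maximum step size (the trust-region radius) first and lets the minimization of the model function choose direction and length jointly inside that region.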

  6. Search Directions in Line Search Algorithms—Negative Gradient
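
This slide presumably shows the steepest-descent choice, the simplest option:

```latex
% Steepest-descent (negative-gradient) direction at iterate x_k:
p_k = -\nabla f(x_k).
```

It requires only first derivatives and is always a descent direction, but convergence can be slow on ill-conditioned problems.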

  7. Search Directions in Line Search Algorithms—Newton Step
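
The Newton step, which this slide presumably presents, minimizes the local quadratic model of f at x_k:

```latex
% Newton direction: well defined, and a descent direction, when the
% Hessian \nabla^2 f(x_k) is positive definite.
p_k = -\left[\nabla^2 f(x_k)\right]^{-1} \nabla f(x_k).
```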

  8. Search Directions in Line Search Algorithms—Quasi-Newton Step
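
Quasi-Newton methods keep the Newton form but replace the exact Hessian with an approximation B_k built from gradient differences (BFGS is the usual update, though the slide does not say which is used) and kept positive definite:

```latex
% Quasi-Newton direction with Hessian approximation B_k:
p_k = -B_k^{-1} \nabla f(x_k).
```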

  9. Search Directions in Line Search Algorithms—Conjugate-gradient Step
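
The nonlinear conjugate-gradient direction mixes the current negative gradient with the previous direction; the Fletcher-Reeves formula shown for the coefficient is one common choice among several, assumed here since the transcript omits the slide body:

```latex
% Conjugate-gradient direction (with p_0 = -\nabla f(x_0)):
p_k = -\nabla f(x_k) + \beta_k \, p_{k-1},
\qquad
\beta_k^{\mathrm{FR}} =
\frac{\nabla f(x_k)^{\top} \nabla f(x_k)}
     {\nabla f(x_{k-1})^{\top} \nabla f(x_{k-1})}.
```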
