OR682/Math685/CSI700

  1. OR682/Math685/CSI700 Lecture 5 Fall 2000

  2. Nonlinear Equations
  • Solving f(x) = 0 (1 equation, n equations)
  • Assume that [# of equations] = [# of variables]
  • Closely related to: minimize F(x)
  • Solve: ∇F(x) = 0
  • “Always” better to use optimization software to solve optimization problems
  • Applications:
    • Nonlinear differential equations
    • Design of integrated circuits
    • Data fitting with nonlinear models (e.g., exponential terms)

  3. Examples
  • 1-variable: x² = 4 sin(x)
  • 2-variable:

  4. Solutions of Nonlinear Equations
  • Nonlinear equations can have any number of solutions:
    • No solution: exp(x) + 1 = 0
    • 1 solution: exp(−x) − x = 0
    • 2 solutions: x² − 4 sin(x) = 0
    • Infinitely many solutions: sin(x) = 0
  • Iterative methods are necessary: no general exact formulas exist, even for polynomials
  • Terminology: solution = root = zero

  5. Multiple Roots
  • A nonlinear equation can have a multiple root: f(x*) = 0 and f′(x*) = 0
  • Example: (x − 1)^k = 0, k ≥ 2
  • It is impossible to determine a multiple root to full machine accuracy
  • It is harder computationally to determine a multiple root, especially one with even multiplicity
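The degradation at a multiple root is easy to see numerically. The sketch below (Python rather than the course's Matlab; the function and starting point are illustrative) runs Newton's method on (x − 1)³, where the usual quadratic convergence collapses to linear:

```python
# Newton's method on f(x) = (x - 1)**3, which has a triple root at x = 1.
# At a root of multiplicity m, Newton's step removes only the fraction 1/m
# of the error, so convergence degrades to linear with ratio (m - 1)/m.
def f(x):
    return (x - 1.0) ** 3

def fprime(x):
    return 3.0 * (x - 1.0) ** 2

x = 2.0
errors = []
for _ in range(20):
    x = x - f(x) / fprime(x)   # Newton step
    errors.append(abs(x - 1.0))

# Successive error ratios settle at 2/3: linear convergence, C = 2/3.
ratios = [errors[k + 1] / errors[k] for k in range(5)]
```

After 20 iterations the error is still about 3·10⁻⁴, in contrast to the digit-doubling of a simple root.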

  6. Accuracy of Solutions
  • We can measure if the residual is small: |f(x̂)| ≤ tol
  • Or if the error is small (x* is the solution): |x̂ − x*| ≤ tol
  • These are related, but not equivalent
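A small numerical illustration of why the two measures can disagree (a sketch; the functions and evaluation points are chosen for illustration, not taken from the lecture):

```python
# Residual |f(x_hat)| and error |x_hat - x*| need not be small together.
x_star = 1.0  # the true root of both functions below

# Flat function: the residual is tiny even far from the root.
flat = lambda x: (x - 1.0) ** 5
x_hat = 1.1
residual_flat = abs(flat(x_hat))    # about 1e-5: looks converged
error_flat = abs(x_hat - x_star)    # 0.1: only one digit of accuracy

# Steep function: the residual is large even very close to the root.
steep = lambda x: 1.0e6 * (x - 1.0)
x_hat2 = 1.0 + 1.0e-8
residual_steep = abs(steep(x_hat2))  # about 1e-2
error_steep = abs(x_hat2 - x_star)   # about 1e-8
```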

  7. Conditioning
  • Mathematically: x* = f⁻¹(0)
  • If f(x) is insensitive to changes in x near x* (|f′(x*)| small), then computing the root is sensitive (ill-conditioned)
  • If f(x) is sensitive (|f′(x*)| large), then computing the root is insensitive (well-conditioned)
  • If we define F(y) = f⁻¹(y), then F′(0) = 1 / f′(x*)

  8. Convergence Rate
  • Measuring the speed of an iterative method
  • Define error: e_k = x_k − x*
  • For some algorithms, the error will be the length of an interval containing x*
  • The sequence converges to zero with rate r if:
    lim (k→∞) |e_{k+1}| / |e_k|^r = C, for some constant C > 0

  9. Convergence Rate (continued)
  • Some important cases:
    • Linear (r = 1): requires C < 1
    • Superlinear (r > 1): # of accurate digits gained increases at each iteration
    • Quadratic (r = 2): # of accurate digits doubles at each iteration
  • Convergence rates refer to asymptotic behavior (close to the solution); early iterations of the algorithm may produce little progress

  10. Bisection: Simple & Safe
  • Require [a, b] with f(a) · f(b) < 0
  • Reduce the interval until the error is “small”:
    while ((b − a) > tol1)
        compute the midpoint m = a + (b − a)/2
        if |f(m)| < tol2, stop
        if f(a) · f(m) < 0 then b = m, else a = m
    end
  • Matlab m-files: bisect.m
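The pseudocode above translates nearly line-for-line into the sketch below (Python rather than the slide's Matlab bisect.m; the tolerance values are illustrative), applied to the 1-variable example x² = 4 sin(x) from slide 3:

```python
import math

def bisect(f, a, b, tol1=1e-10, tol2=1e-12):
    """Bisection as on the slide: shrink [a, b] until it is shorter than
    tol1, or stop early if the midpoint residual falls below tol2."""
    if f(a) * f(b) >= 0:
        raise ValueError("f(a) and f(b) must have opposite signs")
    while (b - a) > tol1:
        m = a + (b - a) / 2.0   # midpoint; this form avoids overflow in (a+b)/2
        if abs(f(m)) < tol2:
            return m
        if f(a) * f(m) < 0:
            b = m
        else:
            a = m
    return a + (b - a) / 2.0

# Nonzero root of x**2 = 4*sin(x), bracketed on [1, 3].
root = bisect(lambda x: x * x - 4.0 * math.sin(x), 1.0, 3.0)
```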

  11. Bisection, Continued
  • Interval reduced by ½ each iteration
  • Linear convergence (r = 1, C = ½)
  • Bisection approximates f(x) by the line through [a, sign(f(a))] and [b, sign(f(b))] and determines the point m where this line is zero
  • This is a crude model of f(x)
  • What about multiple roots?
  • Matlab m-files: bisect_model.m

  12. Newton’s Method
  • Approximate f(x) by its Taylor series:
    f(x + h) ≈ f(x) + f′(x) h
  • Find the point where this line is zero:
    h = −f(x) / f′(x)
  • Repeat this computation to get Newton’s method:
    x_{k+1} = x_k − f(x_k) / f′(x_k)
  • Matlab m-files: newton_model.m, newton.m
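In code the iteration is only a few lines. The sketch below (Python, not the course's newton.m; the stopping rule and starting point are illustrative) applies it to the slide-3 example x² = 4 sin(x):

```python
import math

def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    """Newton's method: repeatedly replace x by the zero of the tangent line."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / fprime(x)   # h = f(x) / f'(x)
        x = x - step
        if abs(step) < tol:       # stop when the step is negligible
            return x
    raise RuntimeError("Newton's method did not converge")

f = lambda x: x * x - 4.0 * math.sin(x)
fp = lambda x: 2.0 * x - 4.0 * math.cos(x)
root = newton(f, fp, x0=3.0)
```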

  13. Newton’s Method: Convergence
  • Note: e_k = x_k − x*, so x* = x_k − e_k. Thus
    e_{k+1} ≈ [f″(x*) / (2 f′(x*))] e_k²
  • Quadratic convergence (r = 2) if f′(x*) ≠ 0

  14. Secant Method
  • Goal: reduce the iteration cost of Newton’s method
  • Approximate f′(x) by a finite difference:
    f′(x_k) ≈ [f(x_k) − f(x_{k−1})] / (x_k − x_{k−1})
  • Superlinear convergence (r ≈ 1.6)
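A sketch of the secant iteration (Python; the starting points are illustrative): structurally it is Newton's method with the derivative replaced by the finite-difference slope through the last two iterates, so each iteration needs only one new f evaluation and no derivative.

```python
import math

def secant(f, x0, x1, tol=1e-12, max_iter=50):
    """Secant method: Newton with f' replaced by a finite-difference slope."""
    f0, f1 = f(x0), f(x1)
    for _ in range(max_iter):
        slope = (f1 - f0) / (x1 - x0)   # secant-line slope
        x2 = x1 - f1 / slope            # zero of the secant line
        if abs(x2 - x1) < tol:
            return x2
        x0, f0 = x1, f1                 # keep the two most recent iterates
        x1, f1 = x2, f(x2)
    raise RuntimeError("secant method did not converge")

root = secant(lambda x: x * x - 4.0 * math.sin(x), 1.0, 3.0)
```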

  15. Safeguarded Methods
  • Newton, secant methods:
    • Fast close to the solution
    • Potentially unreliable (esp. away from the solution)
  • Bisection (and other) methods:
    • Slow to converge
    • Reliable
  • Safeguarded method:
    • Monitor the performance of the fast method
    • Use the slow, safe method to guarantee convergence
    • Near the solution, the slow method is usually not needed
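One simple way to combine the two, sketched below under the assumption that a sign-change bracket [a, b] is available: take the Newton step whenever it stays inside the current bracket, and fall back to a bisection step otherwise. This is a toy version of the idea; production safeguarded solvers (e.g. Brent-style hybrids) are more refined.

```python
import math

def safeguarded_newton(f, fprime, a, b, tol=1e-12, max_iter=100):
    """Newton iteration protected by a bisection bracket: a Newton step is
    accepted only if it lands inside [a, b]; otherwise bisect."""
    fa, fb = f(a), f(b)
    if fa * fb >= 0:
        raise ValueError("need a sign change on [a, b]")
    x = a + (b - a) / 2.0
    for _ in range(max_iter):
        fx = f(x)
        if fx == 0.0:
            return x
        # Shrink the bracket: replace the endpoint whose sign matches f(x).
        if fa * fx < 0:
            b, fb = x, fx
        else:
            a, fa = x, fx
        if (b - a) < tol:
            return a + (b - a) / 2.0
        fpx = fprime(x)
        if fpx != 0.0:
            step = fx / fpx
            if abs(step) < tol:
                return x - step
            if a < x - step < b:     # Newton step stays in the bracket
                x = x - step
                continue
        x = a + (b - a) / 2.0        # safe fallback: bisection step
    return x

root = safeguarded_newton(lambda x: x * x - 4.0 * math.sin(x),
                          lambda x: 2.0 * x - 4.0 * math.cos(x),
                          1.0, 3.0)
```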

  16. Systems of Nonlinear Equations
  • Much more difficult than the scalar case
  • Theoretical analysis harder, behavior of roots potentially stranger
  • No absolutely safe, reliable method
  • Costs rise rapidly with # of variables
  • Can only guarantee that an algorithm converges to a local solution of:
    minimize ‖f(x)‖

  17. Newton’s Method
  • In n dimensions:
    x_{k+1} = x_k + s_k, where J(x_k) s_k = −f(x_k)
    (J = Jacobian matrix of f)
  • Quadratic convergence rate (if assumptions satisfied)
  • Matlab m-files: newton_s.m
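For n = 2 the Newton equations can be solved by hand, which makes the structure easy to see. The sketch below (Python; the example system x² + y² = 1, x − y = 0 is an assumption for illustration, not from the slides) forms the Jacobian and solves the 2×2 system J s = −f with Cramer's rule:

```python
import math

def newton_system_2d(F, J, x, y, tol=1e-12, max_iter=50):
    """Newton's method in 2 variables: solve J(x_k) s_k = -F(x_k),
    then update x_{k+1} = x_k + s_k. The 2x2 solve uses Cramer's rule."""
    for _ in range(max_iter):
        f1, f2 = F(x, y)
        (a, b), (c, d) = J(x, y)        # J = [[a, b], [c, d]]
        det = a * d - b * c
        # s = J^{-1} (-f), written out via Cramer's rule.
        sx = (-f1 * d + f2 * b) / det
        sy = (f1 * c - f2 * a) / det
        x, y = x + sx, y + sy
        if math.hypot(sx, sy) < tol:    # stop when the step is negligible
            return x, y
    raise RuntimeError("Newton's method did not converge")

# Circle intersected with the line y = x: root at (sqrt(2)/2, sqrt(2)/2).
F = lambda x, y: (x * x + y * y - 1.0, x - y)
J = lambda x, y: ((2.0 * x, 2.0 * y), (1.0, -1.0))
root = newton_system_2d(F, J, 1.0, 0.5)
```

For larger n, the explicit solve is replaced by a factorization of J, which is the O(n³) cost discussed on the next slide.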

  18. Newton’s Method (continued)
  • Computational costs:
    • O(n²) to compute the Jacobian
    • O(n³) to solve the Newton equations
  • Alternative methods: analogs of the secant method
  • Safeguards:
    • Essential to guarantee convergence
    • “line search” or “trust region”

  19. Matlab Software
  • 1-variable: fzero
  • n-variable: fsolve

  20. For Next Class
  • Homework: see web site
  • Reading: Heath, chapter 7
