
The linear system


Presentation Transcript


  1. The linear system • The problem: solve Ax = b • Suppose A is invertible; then there exists a unique solution x = A^{-1} b • How can the solution be computed efficiently in practice?
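
As a point of reference (not part of the slides), a minimal NumPy sketch of the dense direct solve; the matrix and right-hand side below are made up purely for illustration:

```python
import numpy as np

# Hypothetical 3x3 system, purely for illustration.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 4.0, 1.0],
              [0.0, 1.0, 4.0]])
b = np.array([1.0, 2.0, 3.0])

x = np.linalg.solve(A, b)          # dense direct solve (LAPACK, O(n^3) work)
print(np.allclose(A @ x, b))       # True: x is the unique solution since A is invertible
```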

  2. Review of direct methods • Gaussian elimination with pivoting • Memory cost: O(n^2) • Computational cost: O(n^3) • Practical only for moderate n, e.g. n <= 1000 • LU decomposition • Memory cost: O(n^2) • Computational cost: O(n^3) for the factorization, then O(n^2) per solve • Practical only for moderate n, e.g. n <= 1000 • Well suited to solving the same system with many different right-hand sides
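
A minimal sketch (not from the slides) of reusing one LU factorization for several right-hand sides, assuming SciPy is available:

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

A = np.random.rand(4, 4) + 4 * np.eye(4)   # hypothetical well-conditioned matrix
lu, piv = lu_factor(A)                      # O(n^3): done once, with partial pivoting

for b in (np.ones(4), np.arange(4.0)):      # many right-hand sides, same A
    x = lu_solve((lu, piv), b)              # O(n^2) per solve
    assert np.allclose(A @ x, b)
```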

  3. Review of direct methods • For tridiagonal matrices • Thomas algorithm based on Crout factorization • Memory cost: O(n) & Computational cost: O(n) • Can be extended to banded matrices • For linear systems from discretization of the Poisson equation by FDM • Direct Poisson solver based on FFT • Memory cost: O(n) & Computational cost: O(n log n) • For linear systems from discretization of elliptic equations by FEM • Multigrid method (MG) or algebraic multigrid method (AMG) • Memory cost: O(n) & Computational cost: O(n) • For linear systems from the integral formulation of the Poisson equation • Fast multipole method • Memory cost: O(n) & Computational cost: O(n)
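
The Thomas algorithm itself is not written out on the slide; here is a minimal sketch of the standard O(n) forward-elimination / back-substitution form (the array conventions are my own):

```python
import numpy as np

def thomas_solve(a, b, c, d):
    """Solve a tridiagonal system with sub-diagonal a, diagonal b,
    super-diagonal c and right-hand side d (1-D arrays of length n).
    a[0] and c[-1] are unused.  O(n) memory and O(n) work."""
    n = len(b)
    cp = np.empty(n)
    dp = np.empty(n)
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    # Forward elimination.
    for i in range(1, n):
        denom = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / denom if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / denom
    # Back substitution.
    x = np.empty(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x
```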

  4. Iterative methods • Aim: to solve large sparse linear systems • Basic iterative methods • Jacobi method • Gauss-Seidel method • Successive over-relaxation method (SOR) • Krylov subspace (modern iterative) methods • Steepest descent method • Conjugate gradient (CG) method • GMRES for nonsymmetric systems
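
The conjugate gradient method is only named on this slide; the following is a textbook-style sketch for symmetric positive definite A, not the deck's own code. Only matrix-vector products with A are required, which is what makes it attractive for large sparse systems:

```python
import numpy as np

def conjugate_gradient(A, b, x0=None, tol=1e-10, max_iter=1000):
    """Conjugate gradient for symmetric positive definite A."""
    x = np.zeros_like(b) if x0 is None else x0.copy()
    r = b - A @ x                      # residual
    p = r.copy()                       # search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)      # step length along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p  # new A-conjugate direction
        rs_old = rs_new
    return x
```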

  5. Basic iterative methods • Rewrite A = M - N, a matrix splitting with M easy to invert • Fixed-point iteration: M x^(m+1) = N x^(m) + b, i.e. x^(m+1) = M^{-1}(N x^(m) + b) = x^(m) + M^{-1}(b - A x^(m))
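
A minimal sketch of this generic splitting iteration (the function name and stopping test are my own choices); Jacobi, Gauss-Seidel and SOR all fit this template with different choices of M and N:

```python
import numpy as np

def splitting_iteration(M, N, b, x0, tol=1e-10, max_iter=500):
    """Fixed-point iteration for A = M - N: solve M x^(m+1) = N x^(m) + b.
    M should be cheap to invert (diagonal, triangular, ...)."""
    x = x0.copy()
    for m in range(max_iter):
        x_new = np.linalg.solve(M, N @ x + b)
        if np.linalg.norm(x_new - x, np.inf) < tol:
            return x_new, m + 1
        x = x_new
    return x, max_iter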

  6. Jacobi iterative method • The linear system Ax = b with splitting A = D - L - U (D diagonal, -L strictly lower part, -U strictly upper part) • Equation form: x_i^(m+1) = (b_i - sum_{j != i} a_ij x_j^(m)) / a_ii • Matrix form: x^(m+1) = D^{-1}((L + U) x^(m) + b)
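
A compact NumPy sketch of the Jacobi matrix form above (names and stopping criterion are illustrative):

```python
import numpy as np

def jacobi(A, b, x0, tol=1e-10, max_iter=500):
    """Jacobi iteration: x^(m+1) = D^{-1}(b - (A - D) x^(m))."""
    d = np.diag(A)                      # diagonal entries a_ii
    off = A - np.diagflat(d)            # A with its diagonal removed, i.e. -(L + U)
    x = x0.copy()
    for m in range(max_iter):
        x_new = (b - off @ x) / d
        if np.linalg.norm(x_new - x, np.inf) < tol:
            return x_new, m + 1
        x = x_new
    return x, max_iter
```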

  7. Jacobi iterative method • An example • The method • Initial guess

  8. Jacobi iterative method • The results

  9. Gauss-Seidel method • Idea: use the new values as soon as they are available • Equation form: x_i^(m+1) = (b_i - sum_{j < i} a_ij x_j^(m+1) - sum_{j > i} a_ij x_j^(m)) / a_ii • Matrix form: x^(m+1) = (D - L)^{-1}(U x^(m) + b)
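
A minimal component-wise sketch of the Gauss-Seidel update above (again, names and stopping test are illustrative):

```python
import numpy as np

def gauss_seidel(A, b, x0, tol=1e-10, max_iter=500):
    """Gauss-Seidel sweep: each x_i is overwritten with the newest values."""
    n = len(b)
    x = x0.copy()
    for m in range(max_iter):
        x_old = x.copy()
        for i in range(n):
            s_new = A[i, :i] @ x[:i]           # already-updated components, j < i
            s_old = A[i, i + 1:] @ x_old[i + 1:]  # not-yet-updated components, j > i
            x[i] = (b[i] - s_new - s_old) / A[i, i]
        if np.linalg.norm(x - x_old, np.inf) < tol:
            return x, m + 1
    return x, max_iter
```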

  10. Gauss-Seidel method • An example • The method • Initial guess

  11. Gauss-Seidel method • The results

  12. SOR method • Idea: improve the Gauss-Seidel method by taking a linear combination of the old value and the new Gauss-Seidel value, weighted by a relaxation parameter omega • Equation form: x_i^(m+1) = (1 - omega) x_i^(m) + omega (b_i - sum_{j < i} a_ij x_j^(m+1) - sum_{j > i} a_ij x_j^(m)) / a_ii • Matrix form: x^(m+1) = (D - omega L)^{-1}(((1 - omega) D + omega U) x^(m) + omega b)
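
A minimal sketch of the SOR sweep, written as a weighted blend of the old value and the Gauss-Seidel value; omega = 1 recovers Gauss-Seidel:

```python
import numpy as np

def sor(A, b, x0, omega=1.5, tol=1e-10, max_iter=500):
    """SOR sweep: x_i <- (1 - omega) * old value + omega * Gauss-Seidel value."""
    n = len(b)
    x = x0.copy()
    for m in range(max_iter):
        x_old = x.copy()
        for i in range(n):
            gs = (b[i] - A[i, :i] @ x[:i] - A[i, i + 1:] @ x_old[i + 1:]) / A[i, i]
            x[i] = (1.0 - omega) * x_old[i] + omega * gs
        if np.linalg.norm(x - x_old, np.inf) < tol:
            return x, m + 1
    return x, max_iter
```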

  13. Convergence analysis • General form of basic iterative methods: x^(m+1) = R x^(m) + c • Exact solution: x = R x + c • Define the error at the m-th iteration: e^(m) = x^(m) - x • Error equations: e^(m+1) = R e^(m), hence e^(m) = R^m e^(0)

  14. Convergence analysis • Convergence means e^(m) = R^m e^(0) -> 0 for every initial error, i.e. R^m -> 0 • Lemma: For any square matrix R, there exists a nonsingular matrix T such that T^{-1} R T = J, the Jordan canonical form of R

  15. Convergence analysis • Definition: Spectral radius of R: rho(R) = max |lambda|, taken over all eigenvalues lambda of R • Lemma: For any square matrix R, R^m -> 0 as m -> infinity if and only if rho(R) < 1 • Theorem: The iterative method x^(m+1) = R x^(m) + c converges to the exact solution of Ax = b for every initial guess iff rho(R) < 1
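
A small sketch (not from the slides) that builds the Jacobi and Gauss-Seidel iteration matrices for a given A and checks the criterion rho(R) < 1 numerically:

```python
import numpy as np

def spectral_radius(R):
    """rho(R) = max |lambda_i(R)|."""
    return max(abs(np.linalg.eigvals(R)))

def jacobi_iteration_matrix(A):
    """R_J = D^{-1}(L + U) = I - D^{-1} A for the splitting A = D - L - U."""
    D = np.diagflat(np.diag(A))
    return np.eye(len(A)) - np.linalg.solve(D, A)

def gauss_seidel_iteration_matrix(A):
    """R_GS = (D - L)^{-1} U."""
    D_minus_L = np.tril(A)      # lower triangle of A including diagonal = D - L
    U = -np.triu(A, 1)          # strict upper part of A with sign flipped = U
    return np.linalg.solve(D_minus_L, U)

# The iteration converges for every initial guess iff the spectral radius is < 1:
# spectral_radius(jacobi_iteration_matrix(A)) < 1, etc.
```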

  16. Convergence rate • Thm: For the iterative method x^(m+1) = R x^(m) + c, suppose q := ||R|| < 1 for some induced matrix norm; then • The iterative method converges for every initial guess • Linear convergence rate with q < 1: ||e^(m)|| <= q ||e^(m-1)|| • Error bound: ||e^(m)|| <= q^m ||e^(0)||
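
A quick numerical check of the bound ||e^(m)|| <= q^m ||e^(0)|| for the Jacobi iteration on a hypothetical strictly diagonally dominant matrix (for which q = ||R_J||_inf = 1/2):

```python
import numpy as np

A = np.array([[4.0, 1.0, 0.0],
              [1.0, 4.0, 1.0],
              [0.0, 1.0, 4.0]])     # hypothetical, strictly row diagonally dominant
b = np.array([1.0, 2.0, 3.0])
x_exact = np.linalg.solve(A, b)

D = np.diagflat(np.diag(A))
R = np.eye(3) - np.linalg.solve(D, A)   # Jacobi iteration matrix R_J
q = np.linalg.norm(R, np.inf)           # q = ||R_J||_inf = 0.5 < 1

x = np.zeros(3)                         # initial guess x^(0) = 0
e0 = np.linalg.norm(x - x_exact, np.inf)
for m in range(1, 8):
    x = np.linalg.solve(D, b - (A - D) @ x)         # one Jacobi step
    err = np.linalg.norm(x - x_exact, np.inf)
    assert err <= q**m * e0 + 1e-12                 # linear-rate error bound holds
```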

  17. Proof for convergence rate • Fact: ||e^(m)|| = ||R e^(m-1)|| <= ||R|| ||e^(m-1)|| = q ||e^(m-1)|| • Error bound: applying this repeatedly gives ||e^(m)|| <= q^m ||e^(0)|| • Another error bound: ||x^(m) - x|| <= q/(1 - q) ||x^(m) - x^(m-1)|| • Error bound in terms of the first step: ||x^(m) - x|| <= q^m/(1 - q) ||x^(1) - x^(0)||

  18. Convergence results • If A is strictly row diagonally dominant, then both the Jacobi and Gauss-Seidel methods converge • The Gauss-Seidel method converges if A is symmetric positive definite • The relaxation parameter lying in (0, 2) is a necessary condition for convergence of the SOR method; in addition, if A is symmetric positive definite, this condition is also sufficient for convergence of SOR

  19. Convergence results • Definition: A is strictly row diagonally dominant if |a_ii| > sum_{j != i} |a_ij| for every row i • Examples • Thm: If A is strictly row diagonally dominant, it is invertible!
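
A one-function sketch of checking this definition numerically (the function name is my own):

```python
import numpy as np

def is_strictly_row_diagonally_dominant(A):
    """Check |a_ii| > sum_{j != i} |a_ij| for every row i."""
    diag = np.abs(np.diag(A))
    off_row_sums = np.abs(A).sum(axis=1) - diag
    return bool(np.all(diag > off_row_sums))
```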

  20. Convergence results • Thm: If A is strictly row diagonally dominant, then both the Jacobi and Gauss-Seidel methods converge. In fact, the iteration matrices satisfy ||R_GS||_inf <= ||R_J||_inf < 1 • Proof

  21. Convergence results • Thm: Let A be a symmetric positive definite matrix; then the Gauss-Seidel method converges for any initial guess • Proof: See details in class • Remark: There are linear systems for which the Jacobi method converges but the Gauss-Seidel method diverges, e.g.

  22. Convergence results • Thm: For the SOR method, the iteration matrix L_omega satisfies rho(L_omega) >= |omega - 1| • Thus the relaxation parameter lying in (0, 2) is necessary for SOR to converge • Proof:

  23. Convergence results • Thm: If A is symmetric positive definite, then rho(L_omega) < 1 for all omega in (0, 2). That is, SOR converges for every omega in (0, 2) • Proof: See details in class • Remark: • Over-relaxation: 1 < omega < 2 • Under-relaxation: 0 < omega < 1 • Optimal relaxation parameter: the omega in (0, 2) that minimizes rho(L_omega); a numerical sketch follows below
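
The slide leaves the optimal parameter unspecified; one simple, purely numerical sketch is to scan omega over (0, 2) and pick the value minimizing the spectral radius of the SOR iteration matrix L_omega = (D - omega L)^{-1}((1 - omega) D + omega U):

```python
import numpy as np

def sor_iteration_matrix(A, omega):
    """L_omega for the splitting A = D - L - U."""
    D = np.diagflat(np.diag(A))
    L = -np.tril(A, -1)         # strictly lower part of A with sign flipped
    U = -np.triu(A, 1)          # strictly upper part of A with sign flipped
    return np.linalg.solve(D - omega * L, (1.0 - omega) * D + omega * U)

def best_omega(A, omegas=np.linspace(0.05, 1.95, 191)):
    """Scan omega in (0, 2) and return the value minimizing rho(L_omega)."""
    rho = [max(abs(np.linalg.eigvals(sor_iteration_matrix(A, w)))) for w in omegas]
    return omegas[int(np.argmin(rho))], min(rho)
```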
