
Simple iteration procedure




Presentation Transcript


  1. Simple iteration procedure. Solve A x = b, starting from a known approximate solution x_k: compute the residue r_k = b - A x_k, apply the preconditioner M, and correct: x_{k+1} = x_k + M^{-1} r_k; then repeat the cycle (residue, preconditioner, correction). Preconditioning: Jacobi (M = diagonal of A), Gauss-Seidel (M = lower triangle of A).
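The residue/correction loop above can be sketched in a few lines of numpy; the 2x2 matrix, right-hand side, and all names are illustrative, not from the slides:

```python
import numpy as np

def simple_iteration(A, b, M, x0, n_iter=50):
    """Preconditioned simple iteration:
    residue r_k = b - A x_k, correction x_{k+1} = x_k + M^{-1} r_k."""
    x = x0.copy()
    for _ in range(n_iter):
        r = b - A @ x                   # residue
        x = x + np.linalg.solve(M, r)   # apply preconditioner, correct
    return x

# Jacobi preconditioner: M = diagonal of A
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
M_jacobi = np.diag(np.diag(A))
x = simple_iteration(A, b, M_jacobi, np.zeros(2))
```

Gauss-Seidel would instead take `M = np.tril(A)`, the lower triangle of A.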

  2. The error obeys e_{k+1} = (I - M^{-1} A) e_k, so convergence requires that the spectral radius (magnitude of the largest eigenvalue) of I - M^{-1} A be smaller than 1; that radius also sets the convergence rate. From now on, we replace M^{-1} A by A and M^{-1} b by y, i.e. we work with the preconditioned system. Field of values: F(A) = { (x^H A x) / (x^H x) }, whose location in the complex plane governs the methods that follow.
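The convergence criterion can be checked numerically; a small sketch (matrix and names illustrative), computing the spectral radius of the Jacobi iteration matrix:

```python
import numpy as np

def spectral_radius(B):
    """Magnitude of the largest eigenvalue of B."""
    return max(abs(np.linalg.eigvals(B)))

A = np.array([[4.0, 1.0], [1.0, 3.0]])
M = np.diag(np.diag(A))                  # Jacobi preconditioner
G = np.eye(2) - np.linalg.solve(M, A)    # iteration matrix I - M^{-1} A
rho = spectral_radius(G)                 # < 1 means the iteration converges
```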

  3. Orthomin(1) and Orthomin(2). The iterates are built in a Krylov subspace, and each step makes the new residue orthogonal to a subspace: Orthomin(1) projects on A r_k; Orthomin(2) projects on the larger subspace spanned by A r_k and A r_{k-1}. Beyond that, projecting on all previous values of A r_i gives Orthodir(n), i.e. GMRES.

  4. Key elements of iterative solvers:
  • A good preconditioner (often based on nearest interactions).
  • Fast matrix-vector multiplication (e.g. based on Fast Multipoles), to get far below N^2 complexity, e.g. N log^2 N.
  • An iterative technique with good convergence/stability properties, to get far below N iterations.
  Together these bring the cost far below the O(N^3) of direct solution, roughly (number of iterations) x N log^2 N.

  5. Modified Gram-Schmidt (MGS) algorithm
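The slide's algorithm is the standard modified Gram-Schmidt, which orthogonalizes each column against the already-computed q vectors one at a time; a minimal numpy version (function name illustrative):

```python
import numpy as np

def mgs(A):
    """Modified Gram-Schmidt QR factorization, A = Q R.
    Each q_j is normalized, then immediately subtracted out of
    all remaining columns (better numerical stability than
    classical Gram-Schmidt)."""
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    V = A.astype(float).copy()
    for j in range(n):
        R[j, j] = np.linalg.norm(V[:, j])
        Q[:, j] = V[:, j] / R[j, j]
        for i in range(j + 1, n):
            R[j, i] = Q[:, j] @ V[:, i]
            V[:, i] -= R[j, i] * Q[:, j]
    return Q, R
```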

  6. MGS defines a QR decomposition, A = Q R. For overdetermined problems (more rows than columns), the QR solution R x = Q^T b is the least-squares solution, i.e. it satisfies the normal equations: A^T A x = A^T b.
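Both routes can be compared numerically; a sketch using numpy's built-in reduced QR (the random system is illustrative):

```python
import numpy as np

# Overdetermined system: more equations than unknowns (6 x 3)
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 3))
b = rng.standard_normal(6)

# QR route: solve R x = Q^T b
Q, R = np.linalg.qr(A)                    # reduced QR, A = Q R
x_qr = np.linalg.solve(R, Q.T @ b)

# Normal-equations route: (A^T A) x = A^T b
x_ne = np.linalg.solve(A.T @ A, A.T @ b)
```

Both give the same least-squares solution; the QR route avoids squaring the condition number of A.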

  7. Arnoldi algorithm: MGS applied to the Krylov subspace {v, A v, A^2 v, ...}. Each new vector w = A q_j is orthogonalized against all previous vectors q_1 .. q_j, then normalized to give q_{j+1}.
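A minimal sketch of the Arnoldi iteration as described (names illustrative); the orthogonalization coefficients are stored in the Hessenberg matrix discussed on the next slides:

```python
import numpy as np

def arnoldi(A, v, k):
    """k steps of Arnoldi: MGS on the Krylov basis {v, Av, A^2 v, ...}.
    Returns Q (n x (k+1), orthonormal columns) and the
    (k+1) x k upper Hessenberg matrix H."""
    n = len(v)
    Q = np.zeros((n, k + 1))
    H = np.zeros((k + 1, k))
    Q[:, 0] = v / np.linalg.norm(v)
    for j in range(k):
        w = A @ Q[:, j]
        for i in range(j + 1):           # MGS sweep against q_0 .. q_j
            H[i, j] = Q[:, i] @ w
            w -= H[i, j] * Q[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        Q[:, j + 1] = w / H[j + 1, j]    # normalize
    return Q, H
```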

  8. Primitive GMRES. Principle: the solution belongs to the (orthogonalized) Krylov subspace, x_k = Q_k y, with the coefficients y obtained in the least-squares sense, for instance with the help of a QR solution. The residue r_k = b - A Q_k y is then "minimized over the subspace".
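A minimal "primitive GMRES" sketch along these lines, solving the small least-squares problem with numpy's `lstsq` rather than the QR/Givens machinery of the later slides (names and the breakdown tolerance are illustrative; x_0 = 0 is assumed):

```python
import numpy as np

def gmres_primitive(A, b, k):
    """Build k Arnoldi vectors from b, then pick the coefficients y
    minimizing ||b - A Q_k y||_2 = ||beta e_1 - H y||_2."""
    n = len(b)
    beta = np.linalg.norm(b)
    Q = np.zeros((n, k + 1))
    H = np.zeros((k + 1, k))
    Q[:, 0] = b / beta
    for j in range(k):
        w = A @ Q[:, j]
        for i in range(j + 1):
            H[i, j] = Q[:, i] @ w
            w -= H[i, j] * Q[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        if H[j + 1, j] < 1e-14 * beta:   # (near) breakdown: Krylov space exhausted
            H = H[: j + 2, : j + 1]
            Q = Q[:, : j + 2]
            break
        Q[:, j + 1] = w / H[j + 1, j]
    kp1, kk = H.shape
    e1 = np.zeros(kp1)
    e1[0] = beta
    y, *_ = np.linalg.lstsq(H, e1, rcond=None)   # least-squares coefficients
    return Q[:, :kk] @ y
```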

  9. Arnoldi algorithm: the orthogonalization coefficients h_{ij} form an upper Hessenberg matrix, zero below the first subdiagonal. [Slide figure: structure of the Hessenberg matrix, showing e.g. h_33 and h_43 as the last nonzeros of column 3.]

  10. Arnoldi recursion. Recursion formula: A Q_k = Q_{k+1} H_k, with dimensions (n x n)(n x k) = (n x (k+1))((k+1) x k). Proof: column j of both sides equals A q_j = sum_{i=1..j+1} h_{ij} q_i, which is exactly what the MGS loop computes (j=1: the i=1,2 loop; j=k: the i=1..k loop, plus the normalization from the previous iteration).

  11. QR minimization: ||b - A Q_k y|| = ||beta e_1 - H_k y||, where beta e_1 is all 0's except entry 1 (as proven earlier), and the residual norm is the leftover (k+1)-th component after the minimization. The minimization is carried out through a QR decomposition of H_k, built as a succession of orthogonal transformations F with F F^H = I.

  12. Recursive orthogonal transformations for H. The orthogonalisation realised at the previous iteration is reused: apply the same operations to the new column of the Hessenberg matrix, then find one new operation to null the remaining subdiagonal entry h_{k+1,k}.

  13. Givens rotations. To null h_{k+1,k}, rotate rows k and k+1 with c = h_{kk}/d and s = h_{k+1,k}/d, where d = sqrt(h_{kk}^2 + h_{k+1,k}^2), if d != 0; if d = 0, use the identity rotation (c = 1, s = 0). Expression of the residual: ||r_k||_2 is the norm of the (k+1)-th component of the rotated right-hand side beta e_1.
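A sketch of this rotation and of the resulting Hessenberg least-squares solve, with the residual norm read off the last rotated component (names illustrative, assuming a full-column-rank Hessenberg matrix):

```python
import numpy as np

def givens(a, b):
    """Rotation [c s; -s c] zeroing b in (a, b)^T:
    c = a/d, s = b/d with d = sqrt(a^2 + b^2); identity if d = 0."""
    d = np.hypot(a, b)
    if d == 0.0:
        return 1.0, 0.0
    return a / d, b / d

def hessenberg_lstsq(H, beta):
    """Solve min_y ||beta e_1 - H y||_2 for a (k+1) x k upper
    Hessenberg H by sweeping Givens rotations down the subdiagonal.
    Returns (y, residual norm = |last rotated rhs component|)."""
    H = H.astype(float).copy()
    kp1, k = H.shape
    g = np.zeros(kp1)
    g[0] = beta
    for j in range(k):
        c, s = givens(H[j, j], H[j + 1, j])
        for col in range(j, k):          # rotate rows j, j+1 of H
            t = c * H[j, col] + s * H[j + 1, col]
            H[j + 1, col] = -s * H[j, col] + c * H[j + 1, col]
            H[j, col] = t
        g[j], g[j + 1] = c * g[j] + s * g[j + 1], -s * g[j] + c * g[j + 1]
    y = np.linalg.solve(H[:k, :k], g[:k])   # now upper triangular
    return y, abs(g[k])
```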
