
Lecture 8 - Iterative Systems of Equations






Presentation Transcript


  1. Lecture 8 - Iterative Systems of Equations CVEN 302 June 19, 2002

  2. Lecture’s Goals • Iterative Techniques • Jacobi method • Gauss-Seidel method • Relaxation technique

  3. Iterative Techniques • Gaussian Elimination and the Gauss-Jordan Method solve simultaneous linear algebraic equations in a fixed number of steps; these techniques are known as direct methods. Problems can arise from round-off errors and from zeros on the diagonal. • One means of obtaining an approximate solution to the equations is to start from an “educated guess” and refine it repeatedly.

  4. Iterative Methods • We will look at three iterative methods: • Jacobi Method • Gauss-Seidel Method • Successive Over-Relaxation (SOR)

  5. Convergence Restrictions • There are two conditions for the iterative methods to converge. • It is necessary that one coefficient in each equation is dominant. • It is sufficient that the diagonal is dominant, i.e., that the matrix is diagonally dominant.
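In symbols, strict diagonal dominance requires each diagonal entry to outweigh the rest of its row (a standard statement of the sufficient condition, not reproduced on the slide):

$$|a_{ii}| > \sum_{j \ne i} |a_{ij}|, \qquad i = 1, \dots, n$$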

  6. Jacobi Iteration • If the diagonal is dominant, the matrix can be rewritten in the following form
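The rewritten form itself did not survive the transcript; the standard Jacobi component form, which solves the $i$-th equation for $x_i$ using only the previous iterate, is:

$$x_i^{(k+1)} = \frac{1}{a_{ii}}\Big(c_i - \sum_{j \ne i} a_{ij}\, x_j^{(k)}\Big)$$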

  7. Jacobi Iteration • The technique can be written in a shorthand fashion, where D is the diagonal of A, A″ is A with the diagonal removed, and c is the right-hand side of the equations.
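In matrix notation this shorthand reads (a reconstruction from the definitions in the bullet above):

$$\mathbf{x}^{(k+1)} = D^{-1}\big(\mathbf{c} - A''\,\mathbf{x}^{(k)}\big)$$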

  8. Jacobi Iteration • The technique solves for the entire set of x values in each iteration. • The values are not updated until an iteration is completed.

  9. Example (Jacobi Iteration)

      4x1 +  2x2       = 2
      2x1 + 10x2 + 4x3 = 6
             4x2 + 5x3 = 5

      Solution: (x1, x2, x3) = (0.41379, 0.17241, 0.86206)

  10. Jacobi Example

  11. Jacobi Example

  12. Jacobi Example • Formulation of the matrix
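The matrix formulation is not reproduced in the transcript; for the slide-9 system, solving each equation for its diagonal unknown gives the Jacobi updates (a reconstruction, consistent with the iterates tabulated on slide 13):

$$x_1^{(k+1)} = \tfrac{1}{4}\big(2 - 2x_2^{(k)}\big),\quad x_2^{(k+1)} = \tfrac{1}{10}\big(6 - 2x_1^{(k)} - 4x_3^{(k)}\big),\quad x_3^{(k+1)} = \tfrac{1}{5}\big(5 - 4x_2^{(k)}\big)$$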

  13. Jacobi Iteration

      Iteration    1       2       3       4       5       6       7
      x1           0.5     0.2     0.45    0.324   0.429   0.376   0.42
      x2           0.6     0.1     0.352   0.142   0.248   0.16    0.204
      x3           1       0.52    0.92    0.718   0.886   0.802   0.872

  14. Jacobi Program • The computer program is set up to do the Jacobi method for any size square matrix: Jacobi(A,b) • The program also has options for the maximum number of iterations, nmax, and the tolerance, tol: Jacobi(A,b,nmax,tol)
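A minimal Python sketch of such a routine (a reconstruction, not the course's code; only the Jacobi(A,b,nmax,tol) signature comes from the slide, lower-cased here per Python convention, and the zero starting vector and infinity-norm stopping test are assumptions):

    import numpy as np

    def jacobi(A, b, nmax=100, tol=1e-6):
        """Jacobi iteration: every component of the new iterate is computed
        from the previous iterate; nothing is updated mid-sweep."""
        A, b = np.asarray(A, float), np.asarray(b, float)
        n = len(b)
        x = np.zeros(n)                          # assumed starting guess
        for _ in range(nmax):
            x_new = np.empty(n)
            for i in range(n):
                # off-diagonal terms use only the OLD iterate x
                s = A[i, :i] @ x[:i] + A[i, i+1:] @ x[i+1:]
                x_new[i] = (b[i] - s) / A[i, i]
            if np.max(np.abs(x_new - x)) < tol:  # infinity-norm test
                return x_new
            x = x_new
        return x

    # The slide-9 system: converges to about (0.41379, 0.17241, 0.86207)
    A = [[4, 2, 0], [2, 10, 4], [0, 4, 5]]
    b = [2, 6, 5]
    print(jacobi(A, b))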

  15. Gauss-Seidel Iterations • The Gauss-Seidel technique is similar to the Jacobi iteration technique with one difference: • The method updates the results continuously, using each new value as soon as it is computed to accelerate convergence to a solution.

  16. Gauss-Seidel Model • The Gauss-Seidel algorithm: • The solution vector is updated term by term, as each new component is computed.
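The algorithm itself is not reproduced in the transcript; the standard Gauss-Seidel update, which uses each new component as soon as it is available, is:

$$x_i^{(k+1)} = \frac{1}{a_{ii}}\Big(c_i - \sum_{j<i} a_{ij}\, x_j^{(k+1)} - \sum_{j>i} a_{ij}\, x_j^{(k)}\Big)$$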

  17. Example (Gauss-Seidel Iteration)

      4x1 +  2x2       = 2
      2x1 + 10x2 + 4x3 = 6
             4x2 + 5x3 = 5

      Solution: (x1, x2, x3) = (0.41379, 0.17241, 0.86206)

  18. Gauss-Seidel Example • Formulation of the matrix problem
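The formulation is not reproduced in the transcript; for this system the Gauss-Seidel updates differ from the Jacobi ones only in using the freshly computed components (a reconstruction, consistent with the table on slide 19):

$$x_1^{(k+1)} = \tfrac{1}{4}\big(2 - 2x_2^{(k)}\big),\quad x_2^{(k+1)} = \tfrac{1}{10}\big(6 - 2x_1^{(k+1)} - 4x_3^{(k)}\big),\quad x_3^{(k+1)} = \tfrac{1}{5}\big(5 - 4x_2^{(k+1)}\big)$$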

  19. Gauss-Seidel Iteration

      Iteration    1       2       3       4       5       6       7
      x1           0.5     0.25    0.345   0.384   0.401   0.408   0.411
      x2           0.5     0.31    0.231   0.197   0.183   0.177   0.175
      x3           0.6     0.75    0.815   0.842   0.854   0.858   0.858

  20. Gauss-Seidel Program • The computer program is set up to do the Gauss-Seidel method for any size square matrix: Seidel(A,b) • The program also has options for the maximum number of iterations, nmax, and the tolerance, tol: Seidel(A,b,nmax,tol)
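A matching Python sketch (again a reconstruction; only the Seidel(A,b,nmax,tol) signature comes from the slide):

    import numpy as np

    def seidel(A, b, nmax=100, tol=1e-6):
        """Gauss-Seidel iteration: components are overwritten in place, so
        later components in the same sweep already see the new values."""
        A, b = np.asarray(A, float), np.asarray(b, float)
        n = len(b)
        x = np.zeros(n)
        for _ in range(nmax):
            x_old = x.copy()
            for i in range(n):
                # x[:i] already holds NEW values; x[i+1:] still holds old ones
                s = A[i, :i] @ x[:i] + A[i, i+1:] @ x[i+1:]
                x[i] = (b[i] - s) / A[i, i]
            if np.max(np.abs(x - x_old)) < tol:
                return x
        return x

    A = [[4, 2, 0], [2, 10, 4], [0, 4, 5]]
    b = [2, 6, 5]
    print(seidel(A, b))   # same solution as Jacobi, in fewer sweeps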

  21. Example - Iteration with No Diagonal Dominance

      3x1 - 3x2 + 5x3 = 4
       x1 + 2x2 - 6x3 = 3
      2x1 -  x2 + 3x3 = 1

      Solution: (x1, x2, x3) = (1.00, -2.00, -1.00)

  22. Using the GS Algorithm • Using the Gauss-Seidel program with the A matrix and b vector from the system above. Solution: (x1, x2, x3) = (1.00, -2.00, -1.00)

  23. Gauss-Seidel Example

  24. Gauss-Seidel Example • The program will work with the equations:
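The equations themselves are not shown in the transcript; solving each equation of the slide-21 system for its diagonal unknown (a reconstruction that matches the iterates on slide 25) gives:

$$x_1 = \tfrac{1}{3}(4 + 3x_2 - 5x_3),\quad x_2 = \tfrac{1}{2}(3 - x_1 + 6x_3),\quad x_3 = \tfrac{1}{3}(1 - 2x_1 + x_2)$$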

  25. Using a Gauss-Seidel Iteration

      Iteration     1        2        3         4         5         6        7
      x1            1.33     2.630    3.412     2.296    -1.375     6.078   -7.620
      x2            0.833   -0.648   -5.113   -10.584   -11.989    -3.698   14.769
      x3           -0.278   -1.635   -3.645    -4.725    -2.746     3.153   10.336

      The iterates diverge: the matrix is not diagonally dominant, so convergence is not guaranteed.

  26. Successive Over-Relaxation • The technique is a modification of the Gauss-Seidel method with an additional parameter, w, that may accelerate the convergence of the iterations. • The weighting parameter w has two useful ranges: 0 < w < 1 and 1 < w < 2. If w = 1, the method reduces to the Gauss-Seidel technique.

  27. SOR Method • The SOR algorithm is defined as follows. • The only difference from Gauss-Seidel is the weighting parameter, w.
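The defining equation is missing from the transcript; the standard SOR update blends each old component with its Gauss-Seidel value (writing the slides' w as $\omega$):

$$x_i^{(k+1)} = (1-\omega)\, x_i^{(k)} + \frac{\omega}{a_{ii}}\Big(c_i - \sum_{j<i} a_{ij}\, x_j^{(k+1)} - \sum_{j>i} a_{ij}\, x_j^{(k)}\Big)$$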

  28. Weighting Parameter • If the parameter w is less than 1, the residuals are under-relaxed. • If w = 1, the method is identical to Gauss-Seidel. • If 1 < w < 2, the residuals are over-relaxed, which will generally help accelerate the convergence of the solution.

  29. Example of SOR

      4x1 +  2x2       = 2
      2x1 + 10x2 + 4x3 = 6
             4x2 + 5x3 = 5

      Solution: (x1, x2, x3) = (0.41379, 0.17241, 0.86206)

  30. SOR Example • Formulation of the SOR Algorithm

  31. Effects of the w Parameter • Using the SOR program SOR(A,b,w,nmax,tol) with nmax = 50 and tol = 0.000001.
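The slide's results table did not survive the transcript; a Python sketch of the experiment (a reconstruction, with only the SOR(A,b,w,nmax,tol) signature taken from the slide, and the returned iteration count my addition so the effect of w can be compared):

    import numpy as np

    def sor(A, b, w=1.0, nmax=50, tol=1e-6):
        """SOR: each component is a blend of its old value and its
        Gauss-Seidel value, weighted by w (w = 1 is plain Gauss-Seidel)."""
        A, b = np.asarray(A, float), np.asarray(b, float)
        n = len(b)
        x = np.zeros(n)
        for k in range(1, nmax + 1):
            x_old = x.copy()
            for i in range(n):
                s = A[i, :i] @ x[:i] + A[i, i+1:] @ x[i+1:]
                x_gs = (b[i] - s) / A[i, i]           # Gauss-Seidel value
                x[i] = (1 - w) * x_old[i] + w * x_gs  # relaxed update
            if np.max(np.abs(x - x_old)) < tol:
                return x, k                           # converged after k sweeps
        return x, nmax

    A = [[4, 2, 0], [2, 10, 4], [0, 4, 5]]
    b = [2, 6, 5]
    for w in (0.8, 1.0, 1.2):
        x, k = sor(A, b, w=w, nmax=50, tol=0.000001)
        print(w, k, x)   # iteration count shows how w affects convergence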

  32. Summary • Convergence conditions need to be met in order for iterative techniques to converge. • The Jacobi method updates the values only after each full iteration. • Gauss-Seidel updates the values continuously within each iteration. • SOR (Successive Over-Relaxation) uses weighted residuals to accelerate the convergence.

  33. Homework • Check the Homework webpage
