
Objectives


Presentation Transcript


  1. Objectives
  • Linear Equations
  • Solving Linear Equations: Point-Jacobi Method; L-U Decomposition / Gaussian Elimination
  • Non-Linear Equations
  • Solving Non-Linear Equations in 1D: Interval Bisection Method; Newton's Method
  • Solving Systems of NLE's: Newton's Method
  • Good Reference: Heath, Scientific Computing, McGraw-Hill, 2005

  2. Linear Equations
  • Effects are directly proportional to causes
  • Examples:
    f(x) = a·x + b (scalar)
    F({x}) = [A]{x} + {b}, where [A] is an m×n matrix of coefficients and {x}, {b} are column vectors

  3. Linear Equations
  • Example in building energy modeling: steady-state conduction in a multi-component wall system
  • {Q(T)} = ({k}A/L)[M]{T}, where [M] is a coefficient matrix coupling the layer temperatures

  4. Solving Linear Equations: Fixed Point Iteration/ Jacobi Iteration
  • If we start with an initial guess for the solution, we can see how close this guess is, which informs our next guess, and so on until we arrive at the solution
  • In fixed-point iteration schemes, we put the equation in the form x = f(x) and use successive guesses for x (RHS) to get our next guess (LHS)

  5. Solving Linear Equations: Fixed Point Iteration/ Jacobi Iteration
  • Illustration of Jacobi iteration for a scalar: solve 3x = 6 by iterating
  • We need a term "x" by itself on one side
  • Break the x term into a lone x and what remains, i.e. x + 2x = 6
  • Keep the 2x on the left and move the lone x to the right (2x = -x + 6), then rearrange to give x = -x/2 + 3

  6. Solving Linear Equations: Fixed Point Iteration/ Jacobi Iteration
  • Now that we have the equation in the proper form, make an initial guess for x
  • x0 = … say 7
  • Plug into the right side only to get x = -7/2 + 3 = -0.5
  • -0.5 is now our second guess, x1

  7. Solving Linear Equations: Fixed Point Iteration/ Jacobi Iteration
  • If we continue in this manner, we approach the solution x = 2
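A minimal Python sketch of this scalar iteration (the starting guess of 7 comes from the slides; the fixed iteration count is an arbitrary choice):

```python
# Jacobi-style fixed-point iteration for 3x = 6, rewritten as x = -x/2 + 3.
x = 7.0                          # initial guess x0 used on the slides
for k in range(10):
    x = -x / 2 + 3               # plug the current guess into the right-hand side only
    print(f"x{k + 1} = {x}")
# iterates: -0.5, 3.25, 1.375, 2.3125, ... approaching the solution x = 2
```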

  8. Solving Linear Equations: Fixed Point Iteration/ Jacobi Iteration
  • We can extend this to systems of linear equations
  • Given [A]{x} = {b}
  • "Split" matrix [A] into [D] and [C], where [D] holds the diagonal elements of [A] and [C] = [A] - [D]
  • What results is [D]{x} = -[C]{x} + {b}, or {x} = -[D]⁻¹[C]{x} + [D]⁻¹{b}

  9. Solving Linear Equations: Fixed Point Iteration/ Jacobi Iteration
  • {x} = -[D]⁻¹[C]{x} + [D]⁻¹{b}
  • We now have {x} isolated on one side
  • Do the same process as for the scalar equation: make an initial guess for {x}, {x0}; plug it into the right side only; the resulting value of the left side becomes the next guess, {x1}, and so on until convergence
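A minimal numpy sketch of the same scheme for a system; the function name and the 2×2 test matrix are illustrative assumptions, not values from the slides (the matrix is diagonally dominant, so the iteration converges):

```python
import numpy as np

def jacobi(A, b, x0, tol=1e-10, max_iter=500):
    """Jacobi iteration: split A = D + C and iterate x <- D^{-1} (b - C x)."""
    D = np.diag(np.diag(A))          # diagonal part of A
    C = A - D                        # everything off the diagonal
    D_inv = np.diag(1.0 / np.diag(A))
    x = x0.astype(float)
    for _ in range(max_iter):
        x_new = D_inv @ (b - C @ x)  # x = -D^{-1} C x + D^{-1} b
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Example system (diagonally dominant, so the iteration converges).
A = np.array([[4.0, 1.0], [2.0, 5.0]])
b = np.array([9.0, 13.0])
print(jacobi(A, b, x0=np.zeros(2)))   # close to np.linalg.solve(A, b)
```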

  10. Solving Linear Equations: Fixed Point Iteration/ Jacobi Iteration
  • Convergence
  • What if we wrote the original scalar equation as x = -2x + 6 and did the same type of iteration?

  11. Solving Linear Equations: Fixed Point Iteration/ Jacobi Iteration
  • Depending on the original formulation of the iteration equation, the iteration may or may not converge
  • In general, given ax = b, we convert it to (a - C)x + Cx = b
  • Then our iteration scheme is: xk+1 = -[C/(a - C)]·xk + b/(a - C)
  • which results in the error being multiplied by the same factor every step: (xk+1 - x*) = -[C/(a - C)]·(xk - x*)

  12. Solving Linear Equations: Fixed Point Iteration/ Jacobi Iteration
  • Under what conditions will this converge as k→∞? When the iteration factor satisfies |C/(a - C)| < 1
  • Similarly for systems of equations: the Jacobi iteration converges when every eigenvalue of the iteration matrix -[D]⁻¹[C] has magnitude less than 1 (spectral radius < 1); any matrix norm of -[D]⁻¹[C] below 1 is sufficient
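A small numpy check of this condition, assuming the Jacobi split A = D + C from the earlier slides: the iteration matrix is -D⁻¹C, and the scheme converges when its spectral radius (largest eigenvalue magnitude) is below 1. The example matrices here are made up for illustration:

```python
import numpy as np

def jacobi_spectral_radius(A):
    """Spectral radius of the Jacobi iteration matrix -D^{-1} C, where A = D + C."""
    D_inv = np.diag(1.0 / np.diag(A))
    C = A - np.diag(np.diag(A))
    G = -D_inv @ C
    return max(abs(np.linalg.eigvals(G)))

# Scalar analogue of the two rearrangements of 3x = 6:
#   x = -x/2 + 3  -> iteration factor |1/2| < 1  (converges)
#   x = -2x + 6   -> iteration factor |2|   > 1  (diverges)
print(jacobi_spectral_radius(np.array([[4.0, 1.0], [2.0, 5.0]])))  # < 1, converges
print(jacobi_spectral_radius(np.array([[1.0, 3.0], [3.0, 1.0]])))  # > 1, diverges
```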

  13. Solving Linear Equations: Gaussian Elimination
  • A single equation is easy enough: f(x) = 0 = a·x + b, so x = -b/a
  • We can extend this to a system of equations if we can transform the matrix into triangular form, so that each row can be solved for one variable in turn
  • Doesn't require iteration; it is a direct calculation
  • Used in MATLAB and other software to do matrix division
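For reference, the numpy analogue of that kind of matrix "division" is a direct solver call (numpy uses an LU factorization with pivoting internally); the matrix values here are assumed for illustration:

```python
import numpy as np

A = np.array([[3.0, 2.0], [1.0, 4.0]])   # example coefficients (assumed)
b = np.array([7.0, 9.0])
x = np.linalg.solve(A, b)                # direct solve, no iteration
print(x)                                 # [1. 2.]
```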

  14. Solving LE's: Gaussian Elimination
  • Gaussian Elimination, L-U Factorization, L-U Decomposition
  • Given Ax = b, where A is an m×n matrix
  • Transform it into something of the form [L]{x} = {c}, where [L] is a "lower triangular" matrix (what if it is "upper triangular" instead?)
  • A lower triangular system is solved from the top down by forward substitution; an upper triangular one from the bottom up by back substitution

  15. Solving LE's: Gaussian Elimination
  • How do we transform our original equation?
  • Multiplying both sides of any matrix equation by a matrix [M] does not change the solution, as long as [M] is non-singular
  • Therefore we can keep premultiplying, [Mn]…[M2][M1][A]{x} = [Mn]…[M2][M1]{b}, until we arrive at [N]{x} = [P]{b}, where [N] = [Mn]…[M2][M1][A] is lower or upper triangular

  16. Solving LE's: Gaussian Elimination
  • How do we transform A into N?
  • Start with a simple example: Ax = b, with A a 2×2 matrix and b a 2×1 vector
  • If we pre-multiply both sides by a suitable elimination matrix M, we get Nx = MAx = Mb, with N upper triangular
  • Can we now solve for all values of x?

  17. Solving LE's: Gaussian Elimination
  • We transformed a 2×2 matrix into an upper triangular matrix, N
  • We can do this for any size non-singular matrix by using the following transformation

  18. Solving LE's: Gaussian Elimination
  • Given Ax = b, with the entries of A being aij:
    for k = 1 to n-1                      (loop over columns)
      if akk ≠ 0                          (avoid dividing by 0)
        for i = k+1 to n
          mik = aik / akk                 (divides each entry below the diagonal by the diagonal entry for that column)
        end
        for j = k+1 to n
          for i = k+1 to n
            aij = aij - mik·akj           (transforms each member of the lower part of the matrix)
          end
        end
      end
    end
  • What results is a transformed version of A which is upper triangular
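A runnable Python version of this loop; as an extension beyond the slide's pseudocode, it also applies the same row operations to b and finishes with back substitution so that x can actually be recovered (the function name and test values are assumptions for illustration):

```python
import numpy as np

def gaussian_elimination(A, b):
    """Forward elimination to upper triangular form, then back substitution.
    Follows the loop on the slide (no pivoting), plus the update of b."""
    A = A.astype(float).copy()
    b = b.astype(float).copy()
    n = len(b)
    for k in range(n - 1):                       # loop over columns
        if A[k, k] == 0.0:                       # avoid dividing by 0 (no pivoting here)
            raise ZeroDivisionError("zero pivot; row pivoting would be needed")
        for i in range(k + 1, n):
            m = A[i, k] / A[k, k]                # multiplier m_ik
            A[i, k + 1:] -= m * A[k, k + 1:]     # transform the lower part of the matrix
            A[i, k] = 0.0
            b[i] -= m * b[k]                     # same row operation applied to b
    x = np.zeros(n)                              # back substitution on the triangular system
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
    return x

# Example system (values assumed for illustration):
A = np.array([[2.0, 1.0, -1.0], [-3.0, -1.0, 2.0], [-2.0, 1.0, 2.0]])
b = np.array([8.0, -11.0, -3.0])
print(gaussian_elimination(A, b))                # [ 2.  3. -1.], matching np.linalg.solve(A, b)
```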

  19. Solving LE's: Gaussian Elimination
  • With the process on the previous slide, we can transform any non-singular matrix, with a few stipulations
  • Much more information in the handouts on the process, derivation, etc.
  • One example problem now

  20. Solving LE's: Gaussian Elimination
  • Given a system Ax = b, solve for x

  21. Solving LE's: Gaussian Elimination
  • Premultiply first by an elimination matrix M1, and then by M2
  • We get M2M1Ax = Nx = M2M1b, where N is upper triangular

  22. Solving LE's: Gaussian Elimination
  • The resulting triangular system is easily solved by back substitution to give x
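The numerical matrices from this example did not survive the transcript, so here is the same two-step procedure on an assumed 3×3 system, building M1 and M2 explicitly and checking the result with numpy:

```python
import numpy as np

# Assumed 3x3 example system (not the one shown on the slide).
A = np.array([[2.0, 1.0, -1.0], [-3.0, -1.0, 2.0], [-2.0, 1.0, 2.0]])
b = np.array([8.0, -11.0, -3.0])

M1 = np.eye(3)                      # zeros out column 0 below the diagonal
M1[1, 0] = -A[1, 0] / A[0, 0]
M1[2, 0] = -A[2, 0] / A[0, 0]
A1 = M1 @ A

M2 = np.eye(3)                      # zeros out column 1 below the diagonal
M2[2, 1] = -A1[2, 1] / A1[1, 1]
N = M2 @ A1                         # upper triangular: N = M2 M1 A

c = M2 @ M1 @ b
x = np.linalg.solve(N, c)           # back substitution, done here by numpy
print(N)                            # upper triangular matrix
print(x)                            # [ 2.  3. -1.], same as np.linalg.solve(A, b)
```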

  23. Non-Linear Equations
  • Arise often in building energy modeling applications
  • e.g. calculate the radiation heat transfer from a roof when the sky temperature Tsky is known and the roof temperature Troof is the quantity of interest:
    Q(Troof) = F·ε·σ·Aroof·(Tsky⁴ - Troof⁴)

  24. Solving Non-Linear Equations
  • Analytical solutions are rarely possible
  • Often large systems of equations
  • Must use numerical techniques
  • Will introduce two: Interval Bisection (single equation) and Newton's Method (single equation or system)

  25. Solving NLE's: Interval Bisection
  • All NLE's can be written as homogeneous equations, i.e. f(x) = 0
  • Therefore, all solutions of NLE's are "zero-finding" exercises
  • If we know an approximate interval where f(x) crosses the x-axis, we can make the interval smaller and smaller until we have a tiny interval in which the solution lies

  26. Solving NLE's: Interval Bisection
  [Figure: plot of f(x) versus x with the bracketing interval endpoints x = a and x = b marked]

  27. Solving NLE's: Interval Bisection
  [Figure: the same plot of f(x) versus x, now with the interval midpoint x = (a+b)/2 added]

  28. Solving NLE's: Interval Bisection
  • Is the zero between x = a and x = (a+b)/2, or between x = (a+b)/2 and x = b? (whichever sub-interval shows a sign change in f)
  • Repeat the process again until the interval is smaller than a certain tolerance

  29. Solving NLE's: Interval Bisection
  • Example: find a solution of f(x) = (x - 2)² - 4
  • We know the answer is between, say, -3 and 1.1
  • Solution: f(x) = x² - 4x = x(x - 4), so the roots are x = 0 and x = 4; bisection on [-3, 1.1] closes in on x = 0
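A minimal bisection sketch for this example, bracketing the root of f(x) = (x - 2)² - 4 between -3 and 1.1 (the function and interval come from the slide; the tolerance and helper name are arbitrary choices):

```python
def bisect(f, a, b, tol=1e-8):
    """Interval bisection: halve [a, b] until it is smaller than tol.
    Assumes f(a) and f(b) have opposite signs, so a root is bracketed."""
    fa = f(a)
    while (b - a) > tol:
        m = (a + b) / 2.0
        fm = f(m)
        if fa * fm <= 0:     # the sign change (and hence the root) is in the left half
            b = m
        else:                # otherwise it is in the right half
            a, fa = m, fm
    return (a + b) / 2.0

f = lambda x: (x - 2.0) ** 2 - 4.0
print(bisect(f, -3.0, 1.1))  # ~0.0, the root of x(x - 4) inside [-3, 1.1]
```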

  30. Solving NLE's: Newton's Method
  • With interval bisection, we only looked at the sign of the function at a given point
  • If we also look at the derivative of the function, we can find a solution faster
  • Newton's Method uses a Taylor Series expansion: f(x+h) = f(x) + f'(x)·h + higher order terms

  31. Solving NLE's: Newton's Method
  • If we drop the higher order terms we get f(x+h) ≈ f(x) + f'(x)·h
  • If we start at some value of x (an initial guess), we want to find a value of h for which f(x+h) is as close to 0 as possible
  • Setting f(x) + f'(x)·h = 0 gives h = -f(x)/f'(x)
  • We then evaluate the function and its derivative at (x+h) and start the process again

  32. Solving NLE's: Newton's Method
  • Mathematically:
    k = 0
    x0 = initial guess
    while |f(xk)| > tolerance
      xk+1 = xk - f(xk)/f'(xk)
      k = k + 1
    end
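A direct Python translation of this loop, applied to the f(x) = (x - 2)² - 4 example used elsewhere in the deck; the derivative is supplied by hand, and an iteration cap is added as a safeguard that the slide's pseudocode omits:

```python
def newton(f, fprime, x0, tol=1e-10, max_iter=50):
    """Newton's method: step by h = -f(x)/f'(x) until |f(x)| is within tolerance."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) <= tol:
            return x
        x = x - fx / fprime(x)   # x_{k+1} = x_k - f(x_k) / f'(x_k)
    return x

f = lambda x: (x - 2.0) ** 2 - 4.0
fprime = lambda x: 2.0 * (x - 2.0)
print(newton(f, fprime, x0=-2.0))    # converges to the root x = 0 in a few steps
```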

  33. Solving NLE's: Newton's Method
  • Graphically: [Figure: f(x) plotted against x, starting from the initial guess x0; each Newton step follows the tangent line down to the x-axis]

  34. Solving NLE's: Newton's Method
  • This can be extended to systems of NLE's
  • Instead of the derivative we use the Jacobian matrix: [Jf]ij = ∂fi/∂xj
  • The truncated Taylor series is then f(x+s) ≈ f(x) + Jf(x)·s
  • And we use the iteration:
    x0 = initial guess
    solve Jf(xk)·sk = -f(xk) for sk
    xk+1 = xk + sk
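A numpy sketch of the system version; the helper name and the two-equation test problem (a circle intersected with a hyperbola) are assumptions for illustration, with the Jacobian coded by hand:

```python
import numpy as np

def newton_system(F, J, x0, tol=1e-10, max_iter=50):
    """Newton's method for systems: solve J(xk) sk = -F(xk), then xk+1 = xk + sk."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) <= tol:
            return x
        s = np.linalg.solve(J(x), -Fx)   # Newton step from the linear solve
        x = x + s
    return x

# Assumed example: x0**2 + x1**2 = 4 and x0*x1 = 1.
F = lambda x: np.array([x[0]**2 + x[1]**2 - 4.0, x[0]*x[1] - 1.0])
J = lambda x: np.array([[2.0*x[0], 2.0*x[1]], [x[1], x[0]]])
print(newton_system(F, J, x0=[2.0, 0.5]))    # one intersection point, near (1.932, 0.518)
```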

  35. Solving NLE's: Newton's Method
  • Example: solve f(x) = (x - 2)² - 4 = 0, starting at, say, x = -2
  • Notice how much faster Newton's Method converges than interval bisection
  • For linear equations it converges in one step. Why? Because the truncated Taylor series is exact for a linear function: with f(x) = a·x + b, the first step gives x1 = x0 - (a·x0 + b)/a = -b/a, the exact root
  • Newton's Method has "quadratic convergence"; interval bisection has "linear convergence"
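A short script comparing the two methods on this example (the tolerances and the comparison itself are added for illustration; iteration counts will differ with other tolerances):

```python
# f(x) = (x - 2)^2 - 4 has a root at x = 0; compare bisection on [-3, 1.1]
# with Newton's method started at x0 = -2, counting iterations for each.
def f(x):
    return (x - 2.0) ** 2 - 4.0

a, b, n_bisect = -3.0, 1.1, 0            # interval bisection
while (b - a) > 1e-10:
    m = (a + b) / 2.0
    if f(a) * f(m) <= 0:
        b = m
    else:
        a = m
    n_bisect += 1

x, n_newton = -2.0, 0                    # Newton's method
while abs(f(x)) > 1e-10:
    x = x - f(x) / (2.0 * (x - 2.0))
    n_newton += 1

print(n_bisect, n_newton)                # roughly 36 halvings vs. about 5 Newton steps
```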

  36. Summary
  • 2 methods for solving systems of linear equations
  • 2 methods for solving non-linear equations
  • Discussed convergence and computational efficiency
  • Please contact me with any questions about this or the rest of class. Jordan Clark, jdclark@utexas
