
Programming assignment #2 Results Numerical Methods for PDEs Spring 2007








  1. Programming assignment #2 Results Numerical Methods for PDEs Spring 2007 Jim E. Jones

  2. Assignment #2 • Forward difference method (explicit) • Backward difference method (implicit) • Crank-Nicolson method (implicit) Assignment #2 is due Wednesday, Feb 21. You will code up these three methods for a particular problem and look at accuracy and stability issues.
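For reference, with mesh ratio r = k/h² and w_i^n approximating u(ih, nk), the update formulas for the three methods (consistent with the coefficients in the MATLAB listings later in these slides) can be written as:

```latex
% Forward difference (explicit):
w_i^{n+1} = r\left(w_{i-1}^{n} + w_{i+1}^{n}\right) + (1-2r)\,w_i^{n}
% Backward difference (implicit):
-r\,w_{i-1}^{n+1} + (1+2r)\,w_i^{n+1} - r\,w_{i+1}^{n+1} = w_i^{n}
% Crank-Nicolson (implicit):
-\tfrac{r}{2}\,w_{i-1}^{n+1} + (1+r)\,w_i^{n+1} - \tfrac{r}{2}\,w_{i+1}^{n+1}
  = \tfrac{r}{2}\left(w_{i-1}^{n} + w_{i+1}^{n}\right) + (1-r)\,w_i^{n}
```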

  3. Assignment #2 The model problem (from the Burden and Faires text, which contains results for this example) is the heat equation u_t = u_xx on 0 < x < 1 with u(0,t) = u(1,t) = 0 and u(x,0) = sin(πx). The PDE solution is u(x,t) = e^(−π²t) sin(πx).

  4. Assignment #2 • Your job is to experiment with different values of h and k. Do your best to investigate numerically some of the issues we’ve talked about in lecture. • Stability: Run at least two problems with forward differences: one that satisfies the stability condition and one that does not. Comment on your observations. We’ve not seen it yet, but the other two methods are unconditionally stable. • Convergence: Backward and forward differencing have truncation error O(k + h²). Crank-Nicolson is O(k² + h²). Calculate the errors you see and comment on how they agree, or not, with these truncation error results. • Comparison: Comment on the relative strengths and weaknesses of the three methods.

  5. Forward.m

    function [ w ] = Forward(N,T)
    % Explicit forward-difference method for u_t = u_xx on [0,1].
    % N: number of spatial intervals, T: number of time steps to reach t = 1.
    h = 1/N;
    k = 1/T;
    r = k/(h*h);                       % mesh ratio; stable only if r <= 1/2
    wnew = zeros(N+1,1);
    wold = zeros(N+1,1);
    % initial condition (boundary values stay zero)
    for i = 2:N+1
        wold(i) = PDEsolution((i-1)*h,0);
    end
    % march T steps of size k to t = 1
    for tstep = 1:T
        for i = 2:N
            wnew(i) = r*(wold(i-1)+wold(i+1)) + (1-2*r)*wold(i);
        end
        wold = wnew;
    end
    w = wnew;
    % infinity norm of the error at t = 1
    emax = 0;
    for i = 2:N
        err = wnew(i) - PDEsolution((i-1)*h,1);
        if abs(err) > emax
            emax = abs(err);
        end
    end
    emax                               % no semicolon: display the error

    function u = PDEsolution(x,t)
    u = exp(-pi*pi*t)*sin(pi*x);

[Figure omitted: space-time grid for the example N = 7, T = 3, with spatial index i = 1..8 and time levels tstep = 0..3]
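As a quick cross-check on the r = k/h² ≤ 1/2 stability condition before running the MATLAB code, here is a small pure-Python sketch of the same explicit scheme (a translation, not the assignment's files; `forward_heat` is a name chosen here for illustration):

```python
import math

def forward_heat(N, T):
    """Explicit forward-difference solve of u_t = u_xx on [0,1] with
    u(x,0) = sin(pi*x) and zero boundary values; returns the infinity
    norm of the error against u(x,t) = exp(-pi^2 t) sin(pi*x) at t = 1."""
    h, k = 1.0 / N, 1.0 / T
    r = k / (h * h)                      # mesh ratio; stable only if r <= 1/2
    w = [math.sin(math.pi * i * h) for i in range(N + 1)]
    for _ in range(T):
        # one explicit step; boundary values remain zero
        w = ([0.0]
             + [r * (w[i-1] + w[i+1]) + (1 - 2*r) * w[i] for i in range(1, N)]
             + [0.0])
    amp = math.exp(-math.pi ** 2)        # amplitude of the true solution at t = 1
    return max(abs(w[i] - amp * math.sin(math.pi * i * h))
               for i in range(N + 1))

stable = forward_heat(10, 400)     # r = 0.25: satisfies the condition
unstable = forward_heat(10, 100)   # r = 1.0: violates it
```

With r = 0.25 the error is tiny (on the order of 1e-6); with r = 1.0, rounding errors in the high-frequency modes are amplified every step and the computed "solution" blows up, matching the * (diverged) entries in the tables that follow.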

  6. Backward.m

    function [ w ] = Backward(N,T)
    % Implicit backward-difference method for u_t = u_xx on [0,1].
    h = 1/N;
    k = 1/T;
    r = k/(h*h);
    wnew = zeros(N-1,1);
    wold = zeros(N-1,1);
    % initial condition at the N-1 interior points
    for i = 1:N-1
        wold(i) = PDEsolution(i*h,0);
    end
    % sub-, main, and super-diagonals of the tridiagonal system
    a = -r*ones(N-1,1);
    b = (1+2*r)*ones(N-1,1);
    c = -r*ones(N-1,1);
    for tstep = 1:T
        wnew = tridisolve(a,b,c,wold);
        wold = wnew;
    end
    w = wnew;
    % infinity norm of the error at t = 1
    emax = 0;
    for i = 1:N-1
        err = wnew(i) - PDEsolution(i*h,1);
        if abs(err) > emax
            emax = abs(err);
        end
    end
    emax                               % no semicolon: display the error

[Figure omitted: space-time grid for the example N = 7, T = 3, with interior index i = 1..6 and time levels tstep = 0..3]
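Backward.m and CrankN.m call tridisolve, which is assumed here to be a standard tridiagonal (Thomas algorithm) solver taking the sub-, main, and super-diagonals a, b, c and the right-hand side d. For anyone without that file, an equivalent sketch in Python (the MATLAB version is analogous):

```python
def tridisolve(a, b, c, d):
    """Thomas algorithm: solve a tridiagonal system with subdiagonal a,
    diagonal b, superdiagonal c, and right-hand side d (all length n;
    a[0] and c[-1] are ignored). No pivoting, which is safe for the
    diagonally dominant matrices these methods produce."""
    n = len(d)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):                 # forward elimination
        m = b[i] - a[i] * cp[i-1]
        cp[i] = c[i] / m
        dp[i] = (d[i] - a[i] * dp[i-1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):        # back substitution
        x[i] = dp[i] - cp[i] * x[i+1]
    return x
```

For example, the 3-by-3 system with diagonals (-1, 2, -1) and right-hand side [1, 0, 1] has solution [1, 1, 1], which this routine recovers. Each solve is O(n) work, which is why the implicit methods remain cheap per step.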

  7. CrankN.m

    function [ w ] = CrankN(N,T)
    % Crank-Nicolson (implicit) method for u_t = u_xx on [0,1].
    h = 1/N;
    k = 1/T;
    r = k/(h*h);
    wnew = zeros(N-1,1);
    wold = zeros(N-1,1);
    for i = 1:N-1
        wold(i) = PDEsolution(i*h,0);
    end
    % tridiagonal system for the implicit half of the step
    a = -(r/2)*ones(N-1,1);
    b = (1+r)*ones(N-1,1);
    c = -(r/2)*ones(N-1,1);
    d = zeros(N-1,1);
    for tstep = 1:T
        % explicit half of the step forms the right-hand side d
        d(1) = (r/2)*wold(2) + (1-r)*wold(1);
        for i = 2:N-2
            d(i) = (r/2)*(wold(i-1)+wold(i+1)) + (1-r)*wold(i);
        end
        d(N-1) = (r/2)*wold(N-2) + (1-r)*wold(N-1);
        wnew = tridisolve(a,b,c,d);
        wold = wnew;
    end
    w = wnew;
    % infinity norm of the error at t = 1
    emax = 0;
    for i = 1:N-1
        err = wnew(i) - PDEsolution(i*h,1);
        if abs(err) > emax
            emax = abs(err);
        end
    end
    emax                               % no semicolon: display the error

[Figure omitted: space-time grid for the example N = 7, T = 3, with interior index i = 1..6 and time levels tstep = 0..3]

  8. Forward Differences: Results [Table omitted: infinity norm of error at T=1; * denotes method diverged]

  9. Forward Differences: Results • Stability? [Table omitted: infinity norm of error at T=1; * denotes method diverged]

  10. Forward Differences: Results • Stability? r = k/h² [Table omitted: infinity norm of error at T=1, highlighting the rows r = 1.0 and r = 0.25; * denotes method diverged]

  11. Forward Differences: Results • Accuracy? [Table omitted: infinity norm of error at T=1; * denotes method diverged]

  12. Forward Differences: Results • Accuracy? O(k + h²) [Table omitted: infinity norm of error at T=1; * denotes method diverged]
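One way to check the O(k + h²) estimate numerically: hold r = k/h² fixed, so halving h quarters k, and both error terms shrink by a factor of 4; the measured maximum error should then drop by roughly 4 per refinement. A pure-Python sketch of that experiment (names are illustrative, not the assignment's files):

```python
import math

def forward_error(N, T):
    """Max error at t = 1 of the explicit scheme for u_t = u_xx,
    u(x,0) = sin(pi*x), zero boundaries (the slides' model problem)."""
    h, k = 1.0 / N, 1.0 / T
    r = k / (h * h)
    w = [math.sin(math.pi * i * h) for i in range(N + 1)]
    for _ in range(T):
        w = ([0.0]
             + [r * (w[i-1] + w[i+1]) + (1 - 2*r) * w[i] for i in range(1, N)]
             + [0.0])
    amp = math.exp(-math.pi ** 2)
    return max(abs(w[i] - amp * math.sin(math.pi * i * h)) for i in range(N + 1))

# halve h, quarter k, so r = 0.25 in both runs
e_coarse = forward_error(5, 100)
e_fine = forward_error(10, 400)
ratio = e_coarse / e_fine            # expect roughly 4
```

The observed ratio is close to 4, consistent with the O(k + h²) truncation error when k and h² are refined together.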

  13. Backward Differences: Results [Table omitted: infinity norm of error at T=1; * denotes method diverged]

  14. Backward Differences: Results • Stability? [Table omitted: infinity norm of error at T=1; * denotes method diverged]

  15. Backward Differences: Results • Accuracy? [Table omitted: infinity norm of error at T=1; * denotes method diverged]

  16. Backward Differences: Results • Accuracy? O(k + h²) [Table omitted: infinity norm of error at T=1; * denotes method diverged]

  17. Crank-Nicolson: Results [Table omitted: infinity norm of error at T=1; * denotes method diverged]

  18. Crank-Nicolson: Results [Table omitted: infinity norm of error at T=1; * denotes method diverged] Note that the k values are much larger than in the previous tables, meaning fewer time steps and less work; they are also reduced by a factor of 2 (not 4) in each successive row.

  19. Crank-Nicolson: Results • Stability? [Table omitted: infinity norm of error at T=1; * denotes method diverged]

  20. Crank-Nicolson: Results • Accuracy? [Table omitted: infinity norm of error at T=1; * denotes method diverged]

  21. Crank-Nicolson: Results • Accuracy? O(k² + h²) [Table omitted: infinity norm of error at T=1; * denotes method diverged]
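A quick numerical check of the O(k² + h²) claim: take k = h so the total error behaves like O(h²), and halving h (and k) should cut the error by about 4. The sketch below re-implements Crank-Nicolson in Python with an inline Thomas solve (a translation for illustration, not the assignment's MATLAB files):

```python
import math

def cn_error(N):
    """Crank-Nicolson for u_t = u_xx, u(x,0) = sin(pi*x), zero boundaries,
    with k = h = 1/N; returns the max error at t = 1."""
    h = 1.0 / N
    k = h                                # so the O(k^2 + h^2) error is O(h^2)
    r = k / (h * h)
    n = N - 1                            # interior unknowns at x = (i+1)h
    w = [math.sin(math.pi * (i + 1) * h) for i in range(n)]
    a = [-r / 2] * n                     # sub-diagonal
    b = [1 + r] * n                      # main diagonal
    c = [-r / 2] * n                     # super-diagonal
    for _ in range(N):                   # N steps of size k reach t = 1
        # explicit half of the step forms the right-hand side
        d = [(r / 2) * ((w[i-1] if i > 0 else 0.0) + (w[i+1] if i < n-1 else 0.0))
             + (1 - r) * w[i] for i in range(n)]
        # Thomas algorithm for the implicit half
        cp, dp = [0.0] * n, [0.0] * n
        cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
        for i in range(1, n):
            m = b[i] - a[i] * cp[i-1]
            cp[i] = c[i] / m
            dp[i] = (d[i] - a[i] * dp[i-1]) / m
        w[n-1] = dp[n-1]
        for i in range(n - 2, -1, -1):
            w[i] = dp[i] - cp[i] * w[i+1]
    amp = math.exp(-math.pi ** 2)
    return max(abs(w[i] - amp * math.sin(math.pi * (i + 1) * h)) for i in range(n))

ratio = cn_error(20) / cn_error(40)   # expect roughly 4
```

Note that r = k/h² = N here, far above 1/2, yet the runs stay stable, which is the unconditional stability the slides mention; the error ratio near 4 under simultaneous halving of h and k is the O(k² + h²) behavior.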
